This is a master's-level overview of the mathematical concepts needed to fully grasp the art of derivatives pricing, and a must-have for anyone considering a career in quantitative finance in industry or academia. Starting from the foundations of probability, this textbook allows students with limited technical background to build a solid knowledge of the most important principles. It offers a unique compromise between intuition and mathematics, even when discussing abstract ideas such as change of measure. Mathematical concepts are introduced first through toy examples, before moving on to finance cases in both discrete and continuous time. Throughout, numerical applications and simulations illuminate the analytical results. End-of-chapter exercises test students' understanding, with solved exercises at the end of each part to aid self-study. Additional resources are available online, including slides, code and an interactive app.
This paper investigates a well-known downside protection strategy, constant proportion portfolio insurance (CPPI), in defined contribution (DC) pension fund modeling. Under discrete-time trading, a CPPI investor faces the risk of the portfolio value hitting the floor, the process of guaranteed portfolio values. In this paper, we ask how to deal with this so-called ‘gap risk’, which may arise from uncontrollable events that cause a sudden drop in the market. In the market model considered, the risky asset price and the labor income are continuous-time stochastic processes, whereas trading is restricted to discrete time. In this setting, an exotic option (the ‘cushion option’) is proposed with the aim of reducing the risk that the portfolio value falls below the defined floor. We analyze the effectiveness of the proposed option for a DC plan CPPI strategy through Monte Carlo simulations and sensitivity analyses with respect to the parameters reflecting different setups.
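To make the mechanics concrete, the sketch below simulates a bare-bones discrete-time CPPI strategy on a single risky asset and estimates the probability of a gap event by Monte Carlo. It is a minimal illustration, not the paper's model: the multiplier, volatility and all other parameters are hypothetical, and the labor income process and the proposed cushion option are omitted.

```python
import numpy as np

def simulate_cppi(v0=100.0, floor0=80.0, m=5.0, r=0.02, mu=0.06,
                  sigma=0.3, dt=1/12, n_steps=120, rng=None):
    """Discrete-time CPPI on a GBM risky asset; all parameters hypothetical."""
    rng = rng if rng is not None else np.random.default_rng(0)
    v, floor = v0, floor0
    for _ in range(n_steps):
        cushion = max(v - floor, 0.0)
        exposure = min(m * cushion, v)        # risky allocation, no leverage
        risky_ret = np.exp((mu - 0.5 * sigma**2) * dt
                           + sigma * np.sqrt(dt) * rng.standard_normal()) - 1.0
        v = exposure * (1.0 + risky_ret) + (v - exposure) * (1.0 + r * dt)
        floor *= 1.0 + r * dt                 # floor accrues at the safe rate
        if v <= floor:                        # gap event: floor breached
            return True
    return False

# Monte Carlo estimate of the gap-risk probability under discrete trading
n_paths = 10_000
gaps = sum(simulate_cppi(rng=np.random.default_rng(i)) for i in range(n_paths))
print(f"estimated P(gap event) = {gaps / n_paths:.4f}")
```

Under continuous trading a CPPI portfolio never breaches the floor along continuous paths; the gap events counted here arise precisely because rebalancing happens only once per period.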
Policy studies assume the existence of baseline parameters – such as honest governments doing their best to create public value, publics responding in good faith, and both parties relying on a policy-making process which aligns with the public interest. In such circumstances, policy goals are expected to be produced through mechanisms in which the public can articulate its preferences and policy-makers are expected to listen to what has been said in determining their governments' courses of action. While these conditions are found in some governments, there is evidence from around the world that much policy-making occurs without these pre-conditions and processes. Unlike situations which produce what can be thought of as ‘good’ public policy, ‘bad’ public policy is a more common outcome. How this happens and what makes for bad public policy are the subjects of this Element. This title is also available as Open Access on Cambridge Core.
In today’s insurance market, numerous cyber insurance products provide bundled coverage for losses resulting from different cyber events, including data breaches and ransomware attacks, with each category of incident carrying its own coverage limit and deductible. Although this gives prospective buyers more flexibility in customizing their coverage and helps sellers better manage their risk exposures, it complicates the decision-making process in determining the optimal amount of risk to retain and transfer for both parties. This article aims to build an economic foundation for these incident-specific cyber insurance products, focusing on how incident-specific indemnities should be designed to achieve Pareto optimality for both the insurance seller and the buyer. Real data on cyber incidents are used to illustrate the feasibility of this approach. Several implementation improvements for practicality are also discussed.
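As a point of reference, the basic payout structure of such a product can be sketched in a few lines: each incident category has its own deductible and limit, and the insurer pays the layer in between. The category terms and claims below are hypothetical, and the sketch does not reproduce the article's Pareto-optimal design.

```python
def indemnity(loss, deductible, limit):
    """Per-incident payout: the layer between the deductible and the limit."""
    return min(max(loss - deductible, 0.0), limit)

# Hypothetical per-category terms (deductible, limit), for illustration only
terms = {"data_breach": (50_000, 1_000_000), "ransomware": (100_000, 2_000_000)}

claims = [("data_breach", 300_000), ("ransomware", 2_500_000)]
for incident, loss in claims:
    d, lim = terms[incident]
    paid = indemnity(loss, d, lim)
    print(f"{incident}: loss={loss:,}, insurer pays {paid:,.0f}, "
          f"insured retains {loss - paid:,.0f}")
```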
Chapter 15 discusses the new Digital Operational Resilience Act (DORA) in the context of cryptoassets and decentralised finance. Section 15.1 introduces the cybersecurity challenge, while Section 15.2 explains DORA’s objectives, approach, and its link to MiCA. Section 15.3 provides an analysis of DORA’s scope, and Section 15.4 gives an overview of DORA’s tools, explaining each of DORA’s Chapters II–VII. Section 15.5 then delves into crypto-specific matters, explaining how MiCA and DORA apply in combination and analysing the difficult issues of applying DORA’s concepts of “financial entities” and “ICT third-party service providers” in the context of decentralised finance, including fully decentralised crypto networks. Section 15.6 concludes.
Novel methods of data collection and analysis can enhance traditional risk management practices that rely on expert engineering judgment and established safety records, provided key conditions are met: analysis is linked to the decisions it is intended to support, standards and competencies remain up to date, and assurance and verification activities are performed. This article elaborates on these conditions. Engineers are required to perform calculations in order to support decision-making. Since humans are famously weak natural statisticians, rather than asking stakeholders to implicitly assimilate data and arrive at a decision, we can instead rely on subject matter experts to explicitly define risk management decision problems. The results of engineering calculations can then also communicate which interventions (if any) are considered risk-optimal. It is also proposed that the next generation of engineering standards should learn from the success of open-source software development in community building. Interacting with open datasets and code can promote engagement, identification (and resolution) of errors, training and, ultimately, competence. Finally, the profession’s tradition of independent verification should also be applied to the complex models that will increasingly contribute to the safety of the built environment. Model assurance will need to keep pace with model development so that suitable use cases can be identified as adequately safe. These are considered increasingly important components in ensuring that methods of data-centric engineering can be safely and appropriately adopted in industry.
Written accounts suggest there were major changes in agricultural practices in Anatolia as the region switched between Roman, Byzantine, Arab and Turkic control, yet archaeological evidence of these changes is offered only on a site-by-site basis. This article presents the first synthesis of archaeobotanical, palynological and zooarchaeological evidence for changes in plant and animal husbandry in Anatolia through the first and second millennia AD. Available data indicate a minimal role of climate change in agricultural shifts but offer evidence for substantial changes towards short-term-return agricultural strategies in response to declining personal security, changing patterns of military provisioning and distinct taxation regimes.
Once fidelity and equivalence are abandoned, how can successful translation be understood? Risk management offers an alternative way of looking at the work of translators and their social function. It posits that the greater the cultural differences, the greater the risks of failed communication. What can be done to manage those risks? Drawing on the ways translators and interpreters handle intercultural encounters by adjusting what is said, this essay outlines a series of strategies that can be applied to all kinds of cross-cultural communication. Practical examples are drawn from a wide range of contexts, from Australian bushfires to court interpreting in Barcelona, with special regard for the new kinds of risks presented by machine translation and generative AI. The result is a critical view of the professionalization of translation, and a fresh account of democratized translation as a rich human activity in the service of cross-cultural cooperation.
This article discusses EU supply chain legislation in light of the recently adopted Corporate Sustainability Due Diligence Directive (CSDDD), which aims to reduce negative sustainability impacts in global supply chains with regard to a list of human rights and environmental standards specified in its Annex I.
We argue that the CSDDD marks a fundamental change on the EU level, from disclosure duties to mandating prevention of, and compensation for, adverse sustainability impacts in supply chains.
We further find that the CSDDD is a legal transplant combining the principles laid down in the OECD Guidelines for Multinational Enterprises on Responsible Business Conduct and those of the UN Guiding Principles on Business and Human Rights, along with elements of the French supply chain legislation of 2017 (which relies on a private enforcement model) and the German supply chain law of 2021 (which is based on a public enforcement model). Like all legal transplants, the resulting text prompts general questions about consistency, and it specifically raises doubts as to whether combining all of the components of a private and a public enforcement model is proportionate to the purpose of the CSDDD, which is to ensure that companies take effective steps to counter violations of human rights and environmental standards in global supply chains. The scope provisions (which capture smaller in-scope EU firms while leaving non-EU peers of a similar size aside), paired with a significant compliance burden, provide grounds to argue that the CSDDD harms the competitiveness of these smaller in-scope EU companies, and thus the EU economy at large.
Multiple instances of safeguarding failures, and criticisms of poor process and weak governance, have afflicted the Church of England for many years, despite repeated assurances that ‘Lessons would be learned’. An Independent Safeguarding Board was formed and then abolished without being replaced. A report by Professor Alexis Jay, former Chair of the Independent Inquiry into Child Sexual Abuse, recommending the creation of two independent charities to oversee Church safeguarding, has been passed to a Response Group and is being resisted by various groups within the Church. This article examines the management of safeguarding within the overall governance of the organisation, compares issues within the Church with those exposed by the Post Office Horizon scandal, and considers the potential role of the audit function in addressing safeguarding matters as part of its oversight of risk management and corporate governance.
The growing concern over cyber risk has become a pivotal issue in the business world. Firms can mitigate this risk through two primary strategies: investing in cybersecurity practices and purchasing cyber insurance. Cybersecurity investments reduce the probability of compromise, while cyber insurance transfers potential losses to insurers. This study employs a network model for the spread of infection among interconnected firms and investigates how firms’ decisions affect one another. We analyze a non-cooperative game in which each firm aims to optimize its objective function through its choices of cybersecurity level and insurance coverage ratio. We find that each firm’s cybersecurity investment and insurance purchase are strategic complements. Within this game, we derive sufficient conditions for the existence and uniqueness of a Nash equilibrium and demonstrate its inefficiency. These theoretical results form the foundation for our numerical studies, allowing us to compute firms’ equilibrium decisions on cybersecurity investments and insurance purchases across various network structures. The numerical results shed light on the impact of network structure on equilibrium decisions and explore how varying insurance premiums influence firms’ cybersecurity investments.
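For readers who want to experiment, the sketch below sets up a stylized stand-in for this kind of game: a handful of firms on a network each choose a security level and a coverage ratio to minimize a cost combining investment, a loaded premium, expected retained loss and a variance-style risk penalty, with an equilibrium sought by best-response iteration over a grid. The network, the cost function and all parameters are hypothetical illustrations, not the paper's specification.

```python
import numpy as np
from itertools import product

# Hypothetical 4-firm network and parameters, for illustration only
A = np.array([[0, 1, 1, 0], [1, 0, 1, 0], [1, 1, 0, 1], [0, 0, 1, 0]])
n, L, p0, beta = len(A), 1.0, 0.3, 0.4           # loss size, attack/contagion rates
kappa, theta, gamma = 0.5, 0.2, 2.0              # investment cost, loading, risk aversion
grid = np.linspace(0.0, 1.0, 51)

def infection_prob(i, s):
    """Direct attack plus one-step contagion from compromised neighbours."""
    direct = p0 * (1 - s[i])
    contagion = beta * (1 - s[i]) * sum(A[i, j] * p0 * (1 - s[j]) for j in range(n))
    return min(direct + contagion, 1.0)

def cost(i, si, ci, s):
    s2 = s.copy(); s2[i] = si
    q = infection_prob(i, s2)
    premium = (1 + theta) * ci * q * L                    # loaded premium
    retained = (1 - ci) * q * L                           # expected retained loss
    risk_pen = gamma * ((1 - ci) * L) ** 2 * q * (1 - q)  # variance-style penalty
    return kappa * si ** 2 + premium + retained + risk_pen

s, c = np.full(n, 0.5), np.full(n, 0.5)
for _ in range(100):                                      # best-response dynamics
    s_old, c_old = s.copy(), c.copy()
    for i in range(n):
        s[i], c[i] = min(product(grid, grid), key=lambda sc: cost(i, sc[0], sc[1], s))
    if np.allclose(s, s_old) and np.allclose(c, c_old):
        break
print("security levels:", s.round(2), "coverage ratios:", c.round(2))
```

The grid search deliberately avoids closed-form first-order conditions, so the toy stays correct even when the optimum sits on a boundary; varying the loading theta shows how premium changes feed back into security choices.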
Get up to speed with the fundamentals of how electricity markets are structured and operated with this comprehensive textbook, which presents key topics in electricity market design, including power system and power market operations, transmission, unit commitment, demand response, and risk management. It includes over 140 practical examples, inspired by real industry applications, connecting key theoretical concepts to practical scenarios in electricity market design, and features over 100 coding-based examples and exercises, with selected solutions for readers. It further demonstrates how mathematical programming models are implemented in an industry setting. Requiring no experience in power systems or energy economics, this is the ideal introduction to electricity markets for senior undergraduate and graduate students in electrical engineering, economics, and operations research, and a robust introduction to the field for professionals in utilities, energy policy, and energy regulation. The book is accompanied online by datasets, AMPL code, supporting videos, and full solutions and lecture slides for instructors.
This paper extends previous research on using quantum computers for risk management to a substantial, real-world challenge: constructing a quantum internal model for a medium-sized insurance company. Leveraging the author’s extensive experience as the former Head of Internal Model at a prominent UK insurer, we closely examine the practical bottlenecks in developing and maintaining quantum internal models. Our work seeks to determine whether a quadratic speedup through quantum amplitude estimation can be realised for problems at an industrial scale. It also builds on previous work that explores the application of quantum computing to the problem of asset liability management in an actuarial context. Finally, we identify both the obstacles and the potential opportunities that emerge from applying quantum computing to the field of insurance risk management.
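For context, the quadratic speedup in question is usually stated as a sample-complexity comparison; a brief sketch of the standard scaling (following Brassard et al., 2002) is given below.

```latex
% Estimating an expectation (e.g. a tail probability of the loss
% distribution): classical Monte Carlo error with N samples versus
% quantum amplitude estimation (QAE) error with N oracle queries.
\epsilon_{\mathrm{MC}} = O\!\left(N^{-1/2}\right), \qquad
\epsilon_{\mathrm{QAE}} = O\!\left(N^{-1}\right)
% Equivalently, a target error \epsilon needs O(1/\epsilon^2) classical
% samples but only O(1/\epsilon) quantum queries: the quadratic speedup.
```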
A reflective analysis is presented on the potential added value that actuarial science can contribute to the field of health technology assessment. This topic is discussed based on the experience of several experts in health actuarial science and health economics. Different points are addressed, such as the role of actuarial science in health, actuarial judgment, data inputs and their quality, modeling methodologies and the use of decision-analytic models in the age of artificial intelligence, and the development of innovative pricing and payment models.
Fires are among the most feared incidents that can occur in a hospital. Hospital fires disrupt care continuity, may require the evacuation of patients, and can result in injuries or even deaths. The aim of this study is to gain insight into hospital fires in the Netherlands over a 20-year period.
Methods
Systematic scoping review of news articles mentioning hospital fires in the Netherlands retrieved from the LexisNexis database, Google, Google News, PubMed, and EMBASE between 2000 and 2020. Hospital fires were included if they were associated with the closure of hospital departments or intervention units and/or evacuations. The cause, location, involved departments, need for evacuation, and the number of casualties were evaluated.
Results
Twenty-four major hospital fires were identified. More than half of these were caused by technical failures, and in 6 cases (25%), the fires were attributed to patients. In 71% of the incidents, acute care departments were affected by the fire. Twenty fires (83%) resulted in the evacuation of patients. In 2 cases, the fire resulted in the death of a patient.
Conclusions
Patient-attributed fires are a significant cause of major hospital fires in the Netherlands. Prevention and mitigation measures should be implemented accordingly.
The project aimed to characterize the exposure to seismic hazard in the emergency area of a high-complexity hospital in Cali, Colombia.
Methods
The occupancy of the emergency area was analyzed over 6 months, determining the value of material elements exposed to the seismic hazard. Four phases were executed: search for pre-existing information, occupancy analysis, evaluation of exposed assets, and results analysis. The information was analyzed using a Geographic Information System (GIS), which allowed the visualization of demographic behavior in different locations and times.
Results
The results confirmed that the seismic hazard is high, exacerbated by local geomechanical characteristics. It was observed that the average occupancy of most studied areas exceeded capacity. The value of the exposed assets was estimated at COP 3 221 008 640 (USD 959 844.76), the demolition value at COP 10 582 770 000 (USD 3 153 613.49), and the reconstruction value at COP 30 293 640 275 (USD 9 027 356.03). In the worst-case scenario, the losses were equivalent to 12.4% of the hospital’s annual budget.
Conclusions
The data allow the hospital to take preventive measures and educate the staff to identify and mitigate critical areas. They also contribute to knowledge of the approximate value of economic losses and the impact of potential human losses.
Risk is a central concept in modern regulatory studies. Chapter 2 introduces the general idea of ‘risk’ and helps readers grasp its scientific and practical relevance for regulation. It also offers an overview of the importance of risk in scholarly work and policy-making, emphasizing the extensive and diverse nature of risk studies across academic disciplines, from ‘technical’ quantitative methods to sociological critique. The chapter explains how risk identification, risk assessment, and risk management are conventionally understood and highlights their shortcomings and complexities. Additionally, it discusses the trend of ‘riskification’ – the tendency to frame a growing number of issues in the language of risk.
This final chapter demonstrates how the catastrophe (CAT) models described in previous chapters can be used as inputs for CAT risk management. CAT model outputs, which can translate into actionable strategies, are risk metrics such as the average annual loss, exceedance probability curves, and values at risk (as defined in Chapter 3). Practical applications include risk transfer via insurance and CAT bonds, as well as risk reduction, consisting of reducing exposure, hazard, or vulnerability. The forecasting of perils (such as tropical cyclones and earthquakes) is explored, as well as strategies of decision-making under uncertainty. The overarching concept of risk governance, which includes risk assessment, management, and communication between various stakeholders, is illustrated with the case study of seismic risk at geothermal plants. This scenario exemplifies how CAT modelling is central in the trade-off between energy security and public safety and how large uncertainties impact risk perceptions and decisions.
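To illustrate how these metrics come out of a simulated loss set, the short sketch below computes an average annual loss, a value at risk and an exceedance probability curve from a hypothetical heavy-tailed sample of annual losses; the distribution and figures are illustrative only, not taken from the chapter.

```python
import numpy as np

# Hypothetical simulated annual losses from a CAT model (e.g. $m), illustration only
rng = np.random.default_rng(42)
annual_losses = rng.pareto(2.5, size=100_000) * 10.0   # heavy-tailed toy sample

aal = annual_losses.mean()                             # average annual loss
var_99 = np.quantile(annual_losses, 0.99)              # 99% value at risk

# Exceedance probability (EP) curve: P(annual loss > x) over a threshold grid
thresholds = np.linspace(0.0, np.quantile(annual_losses, 0.999), 50)
ep = [(annual_losses > x).mean() for x in thresholds]

print(f"AAL = {aal:.1f}, VaR(99%) = {var_99:.1f}")
print(f"EP at largest threshold = {ep[-1]:.4f}")
```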
With the start of the pandemic in 2020, central banks re-deployed unconventional policy tools used for the Global Financial Crisis but also added new and even less conventional ones. The steps they took followed a textbook description of a risk management approach to monetary policy when operating close to the effective lower bound on interest rates.[1] This involved loosening policy aggressively and working with many different instruments simultaneously to maximise policy impact. Judging from the aftermath, they were successful. They steered the economy through a crisis and avoided a depression. One could argue that the episode has shown that monetary policy can be effective also in a low interest rate environment if central banks are willing and able to deploy multiple measures with scale and speed. This suggests that central banks are well equipped for the future, irrespective of the path of interest rates.
Focusing on the physics of the catastrophe process and addressed directly to advanced students, this innovative textbook quantifies dozens of perils, both natural and man-made, and covers the latest developments in catastrophe modelling. Combining basic statistics, applied physics, natural and environmental sciences, civil engineering, and psychology, the text remains at an introductory level, focusing on fundamental concepts for a comprehensive understanding of catastrophe phenomenology and risk quantification. A broad spectrum of perils is covered, including geophysical, hydrological, meteorological, climatological, biological, extraterrestrial, technological and socio-economic, as well as events caused by domino effects and global warming. Following industry standards, the text provides the necessary tools to develop a CAT model from hazard to loss assessment. Online resources include a CAT risk model starter-kit and a CAT risk modelling ‘sandbox’ with a Python Jupyter tutorial. Every process, described by equations, (pseudo)codes and illustrations, is fully reproducible, allowing students to solidify knowledge through practice.