14-1190b-innovation-managing-risk-evidence
transport disruption, wind damage, airport closures, or health effects), rather than any pre-defined meteorological thresholds. Although the uncertainty in the weather (and climate) variables is quantified using probabilistic forecast systems, uncertainty in the downstream models tends not to be included in determining these risks. For that to happen, much more research needs to be done in delivering an end-to-end assessment of uncertainties, and therefore of risk.
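The gap described above can be made concrete with a small sketch. Here a probabilistic wind-speed forecast ensemble is pushed through a toy downstream impact model whose own parameters are also sampled, so that both sources of uncertainty reach the final risk estimate. Everything here (the ensemble statistics, the impact model, its parameters) is invented for illustration, not drawn from any operational system.

```python
import random

random.seed(42)

def impact_model(wind_speed, threshold, slope):
    """Toy damage model: zero impact below a wind threshold, then linear."""
    return max(0.0, slope * (wind_speed - threshold))

# 50-member forecast ensemble of peak gusts in m/s (made-up numbers).
ensemble = [random.gauss(28.0, 4.0) for _ in range(50)]

losses = []
for wind in ensemble:
    # Sample the impact model's uncertain parameters for each member --
    # the step the text notes is usually omitted in practice.
    threshold = random.gauss(25.0, 2.0)
    slope = random.gauss(0.03, 0.005)
    losses.append(impact_model(wind, threshold, slope))

# End-to-end probability of any disruption, reflecting both uncertainties.
prob_disruption = sum(l > 0 for l in losses) / len(losses)
print(f"P(disruption) = {prob_disruption:.2f}")
```

Omitting the inner sampling step and fixing `threshold` and `slope` at point estimates recovers the current practice the text criticises: forecast uncertainty is propagated, model uncertainty is not.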
THE INDUSTRY PERSPECTIVE
Rowan Douglas (Chairman, Willis Research Network)
As a focal point for risk, it is unsurprising that the global insurance sector has grappled with the challenges of understanding and managing extremes. Indeed, the past quarter of a century provides an informative journey of how the sector has achieved far greater resilience to catastrophic risks through the interplay of reforms in the scientific evaluation, prudential regulation and capital management of natural disaster risk.
Following a period of unprecedented losses from the mid-1980s, culminating with Hurricane Andrew (which hit the United States in 1992), the insurance industry faced a near-existential crisis. Confidence in the global risk-sharing mechanism was in structural disarray as firms across the private, public and mutual sectors became insolvent or impaired. After approximately 300 years of successful operation, the insurance sector's modus operandi of relying on historical experience to evaluate risk was unsustainable.
As a consequence, traditional sources of insurance capital dried up and insurance capacity retreated: coverage became unavailable, unaffordable or severely restricted. But insurance is a necessity — it is a prerequisite for many forms of economic activity — so there was no shortage of demand for the situation to be resolved. Over the next five years, three seemingly unrelated forces converged to create the conditions for a transformation in how the insurance sector confronted extreme risk.
The first was the intervention of new 'smart' capital. The shortage of capacity had sharply increased prices and there was money to be made from underwriting risk, but this new capital demanded a new approach to risk evaluation, and to managing risk within more tolerable parameters at an individual policy and portfolio level.
The second was a quantitative revolution that embraced the developments in mainstream software and computing, as well as specialist expertise in emerging software firms known as catastrophe risk modelling companies. These firms began to develop robust methodologies to understand the potential locations and forces of threats such as extreme windstorms or earthquakes; the locations, characteristics and vulnerabilities of exposed buildings; and potential financial losses.
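The modelling chain described above — hazard intensity at a site, the exposed portfolio, a vulnerability curve, and a financial loss module — can be sketched in a few lines. The portfolio, gust values and vulnerability curve below are entirely hypothetical, chosen only to show how the pieces fit together.

```python
# Exposure: (site_id, insured value in US$, vulnerability factor) -- invented.
portfolio = [
    ("site_a", 2_000_000, 0.8),
    ("site_b", 5_000_000, 1.0),
    ("site_c", 1_000_000, 1.3),
]

def damage_ratio(gust, factor):
    """Toy vulnerability curve: no damage below 20 m/s, capped at total loss."""
    return min(1.0, max(0.0, factor * (gust - 20.0) / 40.0))

def event_loss(gusts):
    """Financial module: sum ground-up losses across the whole portfolio."""
    return sum(value * damage_ratio(gusts[site], factor)
               for site, value, factor in portfolio)

# Hazard module output for one synthetic windstorm: peak gusts (m/s) per site.
event = {"site_a": 35.0, "site_b": 28.0, "site_c": 42.0}
print(f"Modelled event loss: US${event_loss(event):,.0f}")
```

A full catastrophe model runs this chain over tens of thousands of synthetic events to build a loss distribution; this sketch shows only a single event.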
The third force was a regulatory trend. An insurance contract is a promise to pay money when a defined loss event occurs, but if there is no money to pay a claim the written promise is worthless. Until the mid-1990s, nobody had asked what level of tolerance an insurance contract should be designed to meet, for the simple reason that contracts had, in general, worked well up to that point. Should contracts operate to the 1-in-100 year risk, or 1-in-1000? Over a period of approximately five years, an emerging convention developed among regulators that insurance contracts should tolerate the 1-in-200 year level of maximum probable annual loss — that is, to perform at a 99.5% level of confidence. This level meant that insurance companies should have enough capital to meet all their policyholder obligations.
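The arithmetic behind the 1-in-200 convention is simple: a 200-year return period corresponds to an annual exceedance probability of 1/200 = 0.5%, so capital is held to the 99.5th percentile of the modelled annual-loss distribution. The sketch below makes that concrete with a simulated loss distribution whose parameters are invented purely for demonstration.

```python
import random

random.seed(7)

return_period = 200
annual_exceedance_prob = 1 / return_period      # 0.005, i.e. 0.5% per year
confidence_level = 1 - annual_exceedance_prob   # 0.995, i.e. 99.5%

# Simulate 100,000 modelled annual losses (heavy-tailed, made-up parameters).
annual_losses = sorted(random.lognormvariate(mu=15.0, sigma=1.5)
                       for _ in range(100_000))

# 1-in-200 capital requirement: the 99.5th-percentile annual loss.
index = int(confidence_level * len(annual_losses))
capital_requirement = annual_losses[index]

print(f"Annual exceedance probability: {annual_exceedance_prob:.1%}")
print(f"Capital held to the {confidence_level:.1%} confidence level")
```

The percentile is taken from a sorted sample here for transparency; a production model would use a proper quantile estimator over its full event set.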
There was initially some confusion about what these terms meant, let alone how to assess risks at these extreme levels. It presaged a revolution in the industry, which not everyone was able to adapt to. Slowly but surely, the techniques and methodologies that were required to respond to these new demands began to be developed, applied, tested and refined, and quite quickly the results began to show.
In 2005, Hurricanes Katrina, Rita and Wilma (KRW) hit the US Atlantic and Gulf Coasts, causing unparalleled losses of over US$50 billion. While the insurance and reinsurance market was put under severe stress, with major question marks over the accuracy of the modelling of these specific events, there were very few insolvencies. Over the 13 years since Hurricane Andrew, the industry had allocated a greater proportion of capital against potential natural disaster loss events, which may have lain beyond previous underwriting experience. Ultimately there was sufficient capital in the system. If KRW had hit in 1995, before such reforms had taken effect, the impact on the sector and on affected populations seeking support would have been catastrophic.
In 2011, the global industry suffered the worst year of natural catastrophe losses on record — in excess of US$120 billion — from seismic events such as the Great East Japan (Tohoku) and Christchurch earthquakes, and from weather losses such as the Thailand floods and a severe US tornado season. Yet despite this, and the wider global financial crisis, the re/insurance sector carried on almost unaffected. While there was still much to learn, it had begun to properly account for risk via the medium of the modelled world.
Finally, in the aftermath of Super Storm Sandy's impact on the New York region in 2012, the confidence in the modelling and assessment of natural disaster risk liberated over US$50 billion of new capital to respond to US disaster risk. Over a period of a quarter of a century, a new relationship between science, capital and public policy had delivered a paradigm shift in risk management and highlighted to many the dangers of overconfidence in past experience.
The international agenda for 2015 includes the renewal of the United Nations' (UN) Hyogo Framework for Action on disaster risk reduction in March, the UN's updated Sustainable Development Goals, and the UN Framework