The New York Times

February 6, 2003, Thursday

CIRCUITS

What Are the Chances?

By SETH SCHIESEL (NYT) 1966 words
A POWERFUL hurricane tears through Florida.

A nuclear power plant fails.

A space shuttle breaks up on its descent.

The world is full of risks. Some, like catching a cold, can usually be shrugged off. Others, like car crashes, are more serious, but the risks can easily be understood.

Then there are risks like nature's fury, nuclear meltdowns and spacecraft calamities: events that are infrequent yet catastrophic. Their potential damage demands that the risks be minutely assessed. Their rarity makes that task especially tough.

But a rapidly evolving set of conceptual and computing tools allows mathematicians, engineers and insurance executives to assess the risk of what are euphemistically known as low-probability, high-consequence events.

The field, known in professional jargon as probabilistic risk assessment, helps companies and government agencies decide whether they are prepared to take the chances involved.

In 1995, these tools helped a NASA consultant estimate the risk of a catastrophic space shuttle failure at 1 in 145, or about 0.7 percent, for each mission. NASA accepted that risk. Similar methods are used to estimate the health risks at toxic-waste sites, to secure nuclear laboratories, weapon stockpiles and power plants, and to determine the safety and reliability of planes and cars. They help determine home insurance rates for tens of millions of people in the United States, Europe and Japan. And now some of the techniques are being used to analyze the chances of terrorist attack.

The concepts were developed four decades ago, but recent advances in computing power have increased both the use of such analyses and the confidence in them.

''A couple of years ago the computers couldn't run these sorts of programs,'' said Detlef Steiner, a mathematician who is chief executive of the Clarendon Insurance Group of New York, the biggest subsidiary of the insurance giant Hanover Re. ''Now they can do it, no problem.''

And yet, of course, disasters still happen. What the risk analyses can do in the case of a space project, for example, is not only estimate the overall chances of a failure, but also compare the many ways it might unfold, helping engineers direct their resources, and preventive efforts, accordingly.

The idea behind probabilistic risk assessment is that mathematics can help determine the chances of a particular outcome (a power system failure, or a hurricane that destroys thousands of homes) based on what is known or estimated about the smaller variables that lead to those outcomes.

For example, companies serving the insurance industry develop models of hurricane behavior based on historical data that might include a dozen variables. Those variables would include the number of hurricanes that might strike, their initial location, their path, their size and their intensity, according to Karen M. Clark, president and chief executive of the AIR Worldwide Corporation, a developer of risk models for the insurance industry.

The analysts then try to use historical data to estimate the relative frequency of those variables.

These models might include 5,000 or 10,000 different potential hurricane patterns that have been weighted for relative frequency based on the historical record. For instance, the experts think that a storm as ferocious as Hurricane Andrew, which devastated parts of south Florida in 1992, will occur on average every 30 or 40 years.

The 5,000 or 10,000 storm patterns (some of which include no hurricanes and a few of which include Florida-destroying cataclysms) are then applied in random order to models of the properties insured by one particular company. This use of random sampling is known as Monte Carlo analysis. The results of those thousands of tests, known as iterations, are aggregated to form an overall picture of what is likely to happen.
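
In outline, such a Monte Carlo analysis fits in a few lines of code. The sketch below is purely illustrative and is not AIR Worldwide's or any insurer's actual model: the storm frequency, the loss-per-storm distribution and the number of iterations are invented values, chosen only to show the mechanics of drawing random scenarios and aggregating the results.

    import numpy as np

    # A minimal Monte Carlo sketch of one insurer's annual hurricane losses.
    # Every parameter below is invented for illustration; a real model is
    # calibrated to the historical record and to the insurer's actual portfolio.
    rng = np.random.default_rng(seed=42)

    NUM_ITERATIONS = 10_000        # one iteration = one simulated year
    MEAN_STORMS_PER_YEAR = 1.7     # assumed average count of damaging landfalls
    MEDIAN_LOSS_PER_STORM = 20e6   # assumed median loss per storm, in dollars
    LOSS_SPREAD = 1.5              # assumed spread of the heavy-tailed loss sizes

    annual_losses = np.zeros(NUM_ITERATIONS)
    for year in range(NUM_ITERATIONS):
        # How many storms strike in this simulated year (often none).
        n_storms = rng.poisson(MEAN_STORMS_PER_YEAR)
        # Draw a dollar loss for each storm; most years are modest, a few are ruinous.
        losses = rng.lognormal(np.log(MEDIAN_LOSS_PER_STORM), LOSS_SPREAD, n_storms)
        annual_losses[year] = losses.sum()

    print(f"Expected annual loss: ${annual_losses.mean() / 1e6:.0f} million")
    print(f"Worst simulated year: ${annual_losses.max() / 1e6:.0f} million")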

To illustrate this, Mr. Steiner estimated that the most likely hurricane outcome for any given year would cost his company about $50 million.

''Every 100 years we might have $600 million,'' he estimated. ''A thousand-year event might cost us a billion. But remember, a thousand-year event hasn't happened. A thousand-year event tells you Florida is gone.''
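The "100-year" and "thousand-year" figures Mr. Steiner cites correspond to high percentiles of such a simulated annual-loss distribution. The short continuation below, again with invented numbers standing in for the array produced by the previous sketch, shows how those return-period losses are read off.

    import numpy as np

    # Stand-in for the annual_losses array from the previous sketch.
    rng = np.random.default_rng(seed=42)
    annual_losses = np.array([rng.lognormal(np.log(20e6), 1.5, rng.poisson(1.7)).sum()
                              for _ in range(10_000)])

    typical_year = np.percentile(annual_losses, 50.0)   # exceeded every other year
    one_in_100   = np.percentile(annual_losses, 99.0)   # exceeded about once a century
    one_in_1000  = np.percentile(annual_losses, 99.9)   # exceeded about once a millennium

    for label, loss in [("Typical year", typical_year),
                        ("100-year loss", one_in_100),
                        ("1,000-year loss", one_in_1000)]:
        print(f"{label}: ${loss / 1e6:.0f} million")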

The insurance sector did not show much interest in probabilistic modeling until Hurricane Andrew wiped out years of profits. Even a few years ago, however, the paucity of commonly available computing power made the models much less useful.

''Five years ago, people were running these models on county-level exposure information,'' said Chris McKeown, president and chief executive of Ace Tempest Reinsurance Ltd. of Bermuda, a major property reinsurance company. (Reinsurance companies buy portfolios of insurance policies from insurers who deal with the public.) ''Now you can run these models on a street-by-street level and do it in a matter of hours.''

Jim Goodnight, chairman and chief executive of SAS, the big maker of statistical software, said that with faster processors, more advanced software and a huge availability of memory -- whether on big mainframe computers or on lashed-together PC systems -- ''the ability to do the incredibly difficult modeling is becoming more reachable every day.''

No matter how advanced the equipment, however, the difference between modeling Florida hurricanes for insurance purposes and modeling, say, a spacecraft is roughly akin to the difference between simple algebra and building a corporate spreadsheet -- same idea, much greater magnitude.

While a hurricane model might include a dozen variables, an advanced model for probabilistic risk assessment in an industrial situation -- mounting a space mission, operating a nuclear plant -- might include thousands or tens of thousands or sometimes even hundreds of thousands of pieces, each representing a separate component that could malfunction or fail. Most important, the model must be set up to describe the operational interaction among those components precisely.
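A fault-tree-style calculation gives a feel for how component-level numbers combine into a system-level estimate, although the models described here track thousands of components and, crucially, the dependencies among them. The sketch below uses invented failure probabilities and assumes independence for simplicity; it is not any agency's actual analysis.

    # Minimal fault-tree-style arithmetic with invented per-mission failure
    # probabilities.  Real analyses model dependencies among components;
    # independence is assumed here purely to keep the sketch short.

    def series(*probs):
        """System fails if ANY component fails (an OR gate)."""
        survive_all = 1.0
        for p in probs:
            survive_all *= (1.0 - p)
        return 1.0 - survive_all

    def redundant(*probs):
        """Subsystem fails only if ALL redundant units fail (an AND gate)."""
        fail_all = 1.0
        for p in probs:
            fail_all *= p
        return fail_all

    pump_a, pump_b = 0.01, 0.01    # two redundant pumps (invented values)
    controller = 0.002             # a single controller
    valve = 0.005                  # a single valve

    pump_subsystem = redundant(pump_a, pump_b)            # both pumps must fail
    system_failure = series(pump_subsystem, controller, valve)

    print(f"Pump subsystem failure probability: {pump_subsystem:.1e}")
    print(f"Overall system failure probability: {system_failure:.1e}")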

It is somewhat akin to trying to simulate each individual wind eddy within a hurricane, a herculean undertaking if it is possible at all.

The sheer number of variables is not the only hurdle. Hurricane modelers can extrapolate from a huge historical database. An engineer designing parts for a new spacecraft, nuclear installation or submarine may have to develop a computerized model to test the physical and electromagnetic properties of each component before the resulting data can be fed into a probabilistic analysis.

In that sense, the insurance-related modelers focus on effects while the industrial modelers are trying to understand root causes of potential problems.

''We pretty much understand that if a tornado rips through a trailer park that a lot of the trailers will be gone,'' said Annette MacIntyre, acting division leader for the electronics engineering technology division at Lawrence Livermore National Laboratory in Livermore, Calif. Ms. MacIntyre said that she had worked with probabilistic models for two decades and had been engaged in programs involving nuclear waste storage and energy. ''The insurance industry is mostly focused on what will happen if an event does happen. I am trying to prevent. They are trying to mitigate.''

The consensus in the risk-management industry seems to be that NASA was not much interested in probabilistic analysis until the 1986 Challenger disaster, much as the insurance industry did not pay attention until Hurricane Andrew.

''If it's a Department of Defense project, you have to meet certain standards, and the risk-analysis stuff was actually incorporated as a design tool,'' said Robert K. Weatherwax, who conducted a probabilistic study for the Air Force in the 1980's on the potential public health hazards of using plutonium in spacecraft. ''NASA never did that.''

Mr. Weatherwax, who is now president of Sierra Energy and Risk Assessment, which mostly serves the energy industry, said that NASA's traditional engineering philosophy had been to focus on backup systems as a sort of catch-all safety and reliability philosophy.

''The idea was that this would substitute for quantitative analysis,'' Mr. Weatherwax said. ''In the shuttle, though, they realized it would weigh too much and cost too much, so they couldn't have the level of redundancy they were accustomed to. And numbers were bad news to NASA. They didn't want anyone to talk about the probabilities.''

NASA declined to comment on its risk analysis procedures for this article, but since the Challenger disaster, it has clearly come to embrace probabilistic methods. It has put on at least two workshops on the subject in recent years, and it contracted with the Science Applications International Corporation in the mid-1990's to conduct the probabilistic analysis of shuttle risks that provided the 1-in-145 calculation.

The study identified seven broad categories of risks that could lead to a shuttle catastrophe. It estimated that if a catastrophe occurred, the most likely culprit, with a 37.8 percent chance, would be the shuttle's main engines.

It is unclear whether the report told NASA something the agency already knew or whether it opened the agency's eyes to a lurking problem. It is clear, however, that by 1997 the biggest shuttle upgrade program involved improving pumps for the main engines. Moreover, a 2000 report from the General Accounting Office said that of the shuttle upgrades that NASA planned to incorporate by 2005, the most expensive related to upgrading the main engines.

A category that is now a focus of the Columbia investigation, the craft's protective tiles, was considered a less likely cause -- with a 14.8 percent likelihood -- of a catastrophic failure.
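Those percentages are conditional on a catastrophe occurring, so combining them with the overall 1-in-145 estimate is simple arithmetic; the per-cause figures below are implied by the study's numbers as reported here rather than quoted from it.

    # Combining the study's figures: a 1-in-145 chance of losing the shuttle
    # on any mission, with 37.8 percent of that risk attributed to the main
    # engines and 14.8 percent to the protective tiles.
    p_catastrophe = 1 / 145                 # per-mission probability of loss

    p_engines_given_loss = 0.378            # share of risk from the main engines
    p_tiles_given_loss = 0.148              # share of risk from the tiles

    p_loss_from_engines = p_catastrophe * p_engines_given_loss
    p_loss_from_tiles = p_catastrophe * p_tiles_given_loss

    print(f"Per-mission loss probability:       {p_catastrophe:.4f} (about 0.7 percent)")
    print(f"Loss initiated by the main engines: {p_loss_from_engines:.4f}")  # about 0.0026
    print(f"Loss initiated by the tiles:        {p_loss_from_tiles:.4f}")    # about 0.0010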

Probabilistic models, of course, are only as useful as the assumptions fed into them. Moreover, they are best used when a system or piece of equipment is being designed, not after it is in the field or in space.

''The most applicability is in the manufacturing of satellites,'' said James B. Frownfelter, chief operating officer of PanAmSat, the No. 1 commercial satellite-services company. ''It is extremely important to employ these tools early in the process. Doing this at the beginning allows you to determine where to focus your testing and your overall cost profile.''

Mr. Frownfelter said that PanAmSat's contractors use probabilistic models to help assure that their craft can meet the requirement of an 80 percent chance of flawless operation for 15 years.
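Under the common simplifying assumption of a constant failure rate (an exponential lifetime model, which the article does not specify), that requirement pins down the largest tolerable rate, as the short calculation below shows.

    import math

    # An 80 percent chance of flawless operation over 15 years.  Assuming a
    # constant failure rate, reliability is R(t) = exp(-rate * t), so the
    # requirement fixes the maximum allowable rate.
    required_reliability = 0.80
    mission_years = 15.0

    max_failure_rate = -math.log(required_reliability) / mission_years
    print(f"Maximum failure rate: {max_failure_rate:.4f} per year")           # about 0.015
    print(f"Implied mean time to failure: {1 / max_failure_rate:.0f} years")  # about 67 years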

For all of the difficulties of modeling complex technical systems, however, the most daunting challenge may be modeling minds. That is because the next frontier in assessing the risks of ''low-probability, high-consequence events'' is terrorism.

In describing the challenge of modeling terrorism, Hemant H. Shah, chief executive and president of RMS, a risk-modeling firm, echoed Einstein's adage: ''Subtle is the Lord, but malicious he is not.''

''Hurricanes do not make an effort to strike your weak points,'' Mr. Shah said. ''In the case of terrorism you're dealing with a question of intent. You're modeling an adversary in the context of conflict.''

Mr. Shah's firm and others are now using advanced game theory techniques, which emulate human decision-making, to try to build terrorism models.
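The contrast with weather can be made concrete with a toy attacker-defender game. The sketch below is not RMS's model or anyone else's; the targets, damage values and hardening factor are invented, and the point is only that the adversary adapts to whatever the defender protects.

    # A toy attacker-defender game with invented numbers (not any firm's
    # actual terrorism model).  The defender hardens one of two hypothetical
    # targets; the attacker then strikes whichever yields the larger damage.
    DAMAGE = {"port": 100, "stadium": 60}   # invented damage values
    HARDENING_FACTOR = 0.3                  # damage multiplier if a target is defended

    def attacker_best_response(defended):
        """The attacker picks the target with the highest resulting damage."""
        def damage_if_hit(target):
            scale = HARDENING_FACTOR if target == defended else 1.0
            return DAMAGE[target] * scale
        best_target = max(DAMAGE, key=damage_if_hit)
        return best_target, damage_if_hit(best_target)

    # The defender anticipates that response and minimizes the worst case.
    best_defense = min(DAMAGE, key=lambda d: attacker_best_response(d)[1])
    target, damage = attacker_best_response(best_defense)

    print(f"Defend the {best_defense}; the attacker then hits the {target} for {damage:.0f}")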

Ms. MacIntyre, the risk-assessment expert from Lawrence Livermore, seemed to have one piece of advice. ''You're trying to focus on those things that are important,'' she said, speaking generally. ''You can't model all of reality. What would be the point?''



Copyright 2002 The New York Times Company