Monday, April 4, 2011

Detailed portraits of the imperceptible

Some recent articles highlight the ways that computer software, specifically simulation of complex situations, is helping us deal with the world around us.

In this weekend's New York Times, a front-page story entitled From Far Labs, a Vivid Picture of Japan Crisis describes how international nuclear scientists are attempting to use simulation software to comprehend the current situation and behavior of the Fukushima Daiichi nuclear power plant:

The bits of information that drive these analyses range from the simple to the complex. They can include everything from the length of time a reactor core lacked cooling water to the subtleties of the gases and radioactive particles being emitted from the plant. Engineers feed the data points into computer simulations that churn out detailed portraits of the imperceptible, including many specifics on the melting of the hot fuel cores.
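
To get a feel for the kind of calculation these forensic codes start from, here is a minimal sketch in Python: the Way-Wigner rule of thumb turns "how long the core ran" and "how long it has been without cooling" into a rough decay-heat estimate. This is only a textbook approximation, not the codes the article describes, and the reactor power and operating time below are assumed round numbers, not Fukushima data.

# Toy sketch: estimate decay heat from the time a core has lacked cooling.
# The Way-Wigner approximation gives decay power as a fraction of the power
# the reactor was producing before shutdown. Rated power and operating time
# are assumptions for illustration, not actual Fukushima Daiichi figures.

def decay_heat_fraction(seconds_since_shutdown, seconds_of_operation):
    """Way-Wigner rule of thumb: decay power / operating power."""
    t = seconds_since_shutdown
    T = seconds_of_operation
    return 0.066 * (t ** -0.2 - (t + T) ** -0.2)

RATED_POWER_MW = 1380.0          # assumed thermal power of one reactor unit
OPERATING_TIME_S = 365 * 86400   # assume roughly one year of operation at power

for hours in (1, 24, 24 * 7, 24 * 30):
    frac = decay_heat_fraction(hours * 3600, OPERATING_TIME_S)
    print(f"{hours:5d} h after shutdown: ~{frac * 100:.2f}% of rated power "
          f"= {frac * RATED_POWER_MW:5.1f} MW of decay heat")

Even a crude estimate like this makes the problem vivid: hours after shutdown, a core without cooling is still shedding megawatts of heat, and the real codes must track where all of that energy goes.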


This is not a simple task, and it will take a long time to complete:

the forensic modeling could go on for some time. It took more than three years before engineers lowered a camera to visually inspect the damaged core of the [Three Mile Island] Pennsylvania reactor, and another year to map the extent of the destruction.


Experts in reactor simulation have been gathering at quickly convened conferences, such as this one at Stanford two weeks ago, to discuss the work they're doing and what it might mean.

The article discusses the delicacy of trying to use these simulations as practical tools. The simulations are simply the output of computer software, and may or may not match the actual events occurring inside the reactors. They may suggest certain progressions, and people may use them to make decisions, but, as with all software, they are just tools; it is still up to the people using them to think:

A European atomic official monitoring the Fukushima crisis expressed sympathy for Japan's need to rely on forensics to grasp the full dimensions of the unfolding disaster.

"Clearly, there's no access to the core," the official said. "The Japanese are honestly blind."


Also, in The New Yorker, Raffi Khatchadourian writes a detailed post-mortem on the cleanup efforts in the Gulf of Mexico following last spring's explosion of the Deepwater Horizon drilling rig. As with the Japanese nuclear disaster, the Deepwater Horizon response involved many people working directly on the environmental impacts of the explosion, but they, too, were supported by a substantial software effort that helped them study, understand, adapt to, and deal with the problems they faced.

The article starts by explaining SCAT, the Shoreline Clean-Up Assessment Technique:

BP hired the designer of SCAT, Ed Owens, a British geologist, to implement the surveys.

He had come up with SCAT in 1989, after an oil barge collided with a tug off Washington State and released fifty-five hundred barrels of fuel, contaminating ninety-five miles of shoreline. Owens devised standard terminology for the various levels of pollution, and created surveys that allowed government responders and oil companies to trust the same data. Before that, Cramer told me, "people would look and say, 'There's a bunch of oil,' but there wasn't a real systematic process."

In Louisiana, members of the SCAT teams regarded themselves as intelligence officers for the cleanup.


This intelligence, it turns out, was crucial, for reasons broadly similar to those facing the Fukushima responders right now: it was very unclear what was going on at the site of the disaster. In Japan, that was because the action was taking place inside a reactor core sealed within several layers of containment structure; in Louisiana, it was because the action was taking place a mile below the ocean's surface:


When the rig sank to the ocean floor, it created clouds of debris, making it difficult to tell how much oil was being released. "It took probably thirty-six hours to get good imagery, because so much sediment and silt was raised when the thing crashed," Admiral Allen told me. After the sediment had cleared, days of bad weather further complicated underwater surveys of the wellhead area.


It turns out that understanding the precise details of the makeup of the spill was crucial, yet tremendously complex:

In the press, oil spills are typically judged by the amount of oil released, but volume can be a misleading standard. Wind patterns, ocean hydrodynamics, the chemistry of the oil, the temperature of the water -- all these factors are significant.


The responders had many tools available to them, and deciding what technique to use where was vital:

Even as Laferriere tried to motivate his responders for an all-out assault upon the coastline, he recognized that the principal fight against the oil was offshore, to be conducted with a weapon -- dispersants -- that many people thought was more harmful than the spill itself. "How do you view the various technologies and their ability to fight oil?" he said. "There are really two components to that. One is: How much oil do they take out of the environment? How much oil can be skimmed or burned or dispersed? Then, there is another factor that is equally important: What is the 'encounter rate' of the technology? Remember, the oil on the water is about a millimeter thick. Its area is huge. So if you can only go about a knot, which is the average skimming capacity, and less than a knot when you are burning, it is not possible, physically, even with all the vessels in the world, to keep up with the spreading of the oil."


You can just visualize these teams of engineers doing what all good engineers do: sitting together, bouncing ideas off each other, rapidly running "back of the envelope" calculations of feasibility, effectiveness, and risk, and producing plans on the spot with the information available.
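
For a concrete flavor of that arithmetic, here is a minimal back-of-the-envelope sketch in Python of the encounter-rate problem Laferriere describes, using his figures of a millimeter-thick slick and a one-knot skimming speed. The swath width is an assumed round number, and the 4.9-million-barrel total is the federal estimate quoted later in the piece.

# Back-of-the-envelope encounter-rate estimate for a single skimmer.
KNOT_M_PER_S = 0.514        # one knot in meters per second
BARREL_M3 = 0.159           # one barrel of oil in cubic meters

slick_thickness_m = 0.001   # "the oil on the water is about a millimeter thick"
skimmer_speed_kn = 1.0      # "you can only go about a knot"
swath_width_m = 15.0        # assumed skimmer/boom swath width

# Volume of slick a single vessel can sweep through per hour.
encounter_m3_per_hr = (
    skimmer_speed_kn * KNOT_M_PER_S * swath_width_m * slick_thickness_m * 3600
)

spill_m3 = 4.9e6 * BARREL_M3    # total release, per the federal estimate

print(f"One skimmer encounters roughly {encounter_m3_per_hr:.0f} m^3 of slick per hour "
      f"(~{encounter_m3_per_hr / BARREL_M3:.0f} barrels/h).")
print(f"Sweeping the entire release once at that rate would take "
      f"~{spill_m3 / encounter_m3_per_hr / 24:.0f} skimmer-days.")

Even with generous assumptions, a single vessel is hopelessly outmatched by a spreading, millimeter-thin slick, which is exactly Laferriere's point about encounter rate.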

However, information about techniques for responding to oil spills is not easy to come by, because it takes a long time to acquire:

Levine also phoned Alan Mearns, a NOAA biologist in Seattle, who had worked on the Exxon Valdez response. (He was famous for monitoring for twenty years a boulder that the cleanup did not touch. The boulder -- eventually called Mearns Rock -- recovered as quickly as the most aggressively cleaned areas.)


This paucity of information constrained the engineers, particularly in terms of risk management:

Corexit was the most studied dispersant available; any other chemical would be inherently less well understood.


Even after the spill was over, the engineers were still starved for information:

By September, the BP well had been contained, and the most pressing questions for the response were: Where had the oil gone and how much harm had it done? A team of federal scientists had estimated that the total amount of oil that spewed from the well was 4.9 million barrels. Based on this number, the response estimated that seventeen per cent of the oil had been captured directly from the wellhead. The burns had eliminated five per cent of the oil; skimming had removed three per cent; and the Corexit had dispersed sixteen per cent into the sea. Altogether, the Unified Command appears to have removed and chemically dispersed two million barrels of oil -- an amount equivalent to some of the largest spills in history. A comparable volume of oil seems to have naturally dissolved in the water column, or dispersed on its own, or simply evaporated.
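
The arithmetic in that paragraph is easy to check; a few lines of Python, using only the article's own figures, reproduce the two-million-barrel total and make the unaccounted-for remainder explicit.

# Sanity check of the quoted oil budget (fractions of the 4.9-million-barrel
# federal estimate). All figures come straight from the paragraph above.
total_barrels = 4.9e6

accounted = {
    "captured at wellhead": 0.17,
    "burned":               0.05,
    "skimmed":              0.03,
    "chemically dispersed": 0.16,
}

for label, frac in accounted.items():
    print(f"{label:>22}: {frac * total_barrels / 1e6:.2f} million barrels")

removed = sum(accounted.values()) * total_barrels
print(f"Removed or dispersed by the response: ~{removed / 1e6:.1f} million barrels")
print(f"Unaccounted for (dissolved, dispersed naturally, or evaporated): "
      f"~{(total_barrels - removed) / 1e6:.1f} million barrels")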


In other words, we just don't know, and maybe we will never know:

Clearly, it will be years before the oil's full ecological impact -- especially the sublethal effects on plants and animals -- is fully understood. Recent studies in Prince William Sound suggest that, in small ways, the ecological legacy of Exxon Valdez persists to this day.


The world is a complicated place, and we need to continue to improve our tools, and our techniques, so that we can produce the best possible "detailed portraits of the imperceptible".
