Fusion: the safe alternative
Wibbly-wobbly magnetic fusion stuff: The return of the stellarator
Artistically shaped magnets may make stellarators easier to manage than ITER.
Fusion powers the Sun, where hydrogen ions are forced together by the high pressure and temperature. The nuclei join to create helium and release a lot of energy in the process. Doing the same thing on Earth means creating the same conditions that drive hydrogen nuclei together, which is easier said than done. Humans are very clever, but achieving fusion in a magnetic bottle will probably be one of our cleverer tricks. Making that bottle is difficult, and Ars recently had the chance to visit the people and facilities behind one of our most significant attempts at it.
For most people, magnetic bottles for fusion bring to mind the tokamak, a donut-shaped device that confines the plasma in a ring. But the tokamak is just one approach; at the other extreme sits the heliac, a more complicated design that is helical in shape. Somewhere in between the two is the stellarator. Here, the required magnetic field is a bit easier to create than for a heliac, but it's still far more complicated than for a tokamak.
At the Max Planck Institute for Plasma Physics (MPIPP) in Greifswald, on Germany's Baltic coast, the latest iteration of the stellarator design is preparing to restart after its first trial run. The researchers putting it all together are understandably excited; frankly, every engineer and scientist would be excited by the prospect of turning on a new piece of hardware. But it's even more so the case at the MPIPP, since the new gear happens to be something they designed and built themselves. The stellarator is something special: the realization of a design more than 50 years in the making.
The heliac, the stellarator, and the tokamak are all trying to achieve the same thing: confine a plasma in a magnetic bottle, tightly enough to push protons close to each other. They all use a more-or-less donut shape, but that "more-or-less" involves some really important differences. Those differences make the stellarator a pretty special science and engineering challenge. To highlight that challenge, we can start with the simpler and more familiar tokamak.
The tokamak begins with a donut-shaped vacuum vessel. The magnetic field is applied by a series of flat coils that are wrapped around the tube of the donut (as in the diagram). This, along with a few other magnets, creates a magnetic field that runs in parallel lines around the interior of the donut. When a plasma is injected, its charged particles corkscrew around the field lines. At first sight this looks like it should confine the plasma in a series of tubes.
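That corkscrew (gyro) motion is straightforward to sketch numerically. Below is a minimal toy simulation using the standard Boris rotation for a proton-like particle in a uniform field; the field strength and velocities are illustrative values, not parameters of any real machine.

```python
import numpy as np

def boris_push(x, v, q_over_m, B, dt, steps):
    """Advance a charged particle through a static, uniform magnetic field
    using the Boris rotation, which conserves speed when E = 0."""
    positions = [x.copy()]
    t = 0.5 * q_over_m * B * dt                 # half-step rotation vector
    s = 2.0 * t / (1.0 + np.dot(t, t))
    for _ in range(steps):
        v_prime = v + np.cross(v, t)
        v = v + np.cross(v_prime, s)            # rotate v around B
        x = x + v * dt
        positions.append(x.copy())
    return np.array(positions), v

# Illustrative values: a proton in a 1 T field pointing along z.
B = np.array([0.0, 0.0, 1.0])                   # tesla
q_over_m = 9.58e7                               # C/kg for a proton
v0 = np.array([1.0e5, 0.0, 2.0e5])              # m/s: perpendicular + parallel parts
path, v_end = boris_push(np.zeros(3), v0.copy(), q_over_m, B, dt=1e-10, steps=2000)

# The parallel (z) motion is free, the perpendicular motion traces a circle,
# and the speed is conserved: together, a corkscrew around the field line.
print(np.linalg.norm(v_end), np.linalg.norm(v0))
```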
This doesn't happen, though. As Professor Thomas Klinger, head of the stellarator project at the MPIPP says, "The vacuum magnetic field has no confinement properties because it’s a purely toroidal field. And a purely toroid field does not confine a plasma at all; that was already realised by Fermi in 1951."
The problem is that the charged particles can drift from magnetic field line to magnetic field line. Since the magnetic field doesn't have the same strength across the cross-section of the torus, particle drift to the outside is much more energetically favorable. So the plasma simply expands outward and hits the wall.
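The size and direction of that drift can be estimated with the textbook grad-B drift formula. The numbers below are purely illustrative (a tokamak-like field that weakens with major radius), not the parameters of any particular device.

```python
import numpy as np

def grad_b_drift(m, q, v_perp, B_vec, grad_B):
    """Grad-B drift velocity: v_d = (m * v_perp**2 / (2 * q * B**3)) * (B x grad|B|)."""
    B = np.linalg.norm(B_vec)
    return m * v_perp**2 / (2.0 * q * B**3) * np.cross(B_vec, grad_B)

m_p, q_e = 1.67e-27, 1.60e-19            # proton mass (kg), elementary charge (C)
B_vec = np.array([0.0, 5.0, 0.0])        # 5 T toroidal field, along y
R = 2.0                                  # major radius (m); B falls off as ~1/R
grad_B = np.array([-5.0 / R, 0.0, 0.0])  # field weakens toward larger x (outward)
v_perp = 1.0e6                           # m/s perpendicular speed

v_d = grad_b_drift(m_p, q_e, v_perp, B_vec, grad_B)
print(v_d)  # vertical (z) drift of a few hundred m/s
```

Because ions and electrons drift in opposite vertical directions, charge separates, and the resulting electric field pushes the whole plasma outward, which is exactly the loss channel the twisted field lines described next are there to cancel.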
To obtain high plasma temperatures in a tokamak, this drift has to be stopped. To do this, a large current has to flow through the plasma. "You have to twist the magnetic field lines, which is done by the current," says Klinger. The current generates a second magnetic field, which distorts the applied field so that the field lines run in a twisted spiral.
A charged particle in the very short-term can still be thought of as corkscrewing around a single field line. But, because the field line spirals around, it is better to think of a series of nested surfaces (like a matryoshka doll), with the particles in the plasma confined on these surfaces. One consequence of this design is that, while particles still hop between field lines, they can now drift from low magnetic field to high magnetic field, and vice versa—an outward flow is no longer favorable. So, on average, the rate at which particles escape confinement is much smaller.
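The effect of the twist can be sketched with a toy "field-line footprint" calculation: each time a field line completes a toroidal circuit, its poloidal angle advances by the rotational transform. This is a schematic illustration, not a model of any real device.

```python
from fractions import Fraction

def poloidal_footprints(iota, n_circuits):
    """Poloidal angle (in turns, 0 <= theta < 1) at which a field line crosses
    a fixed toroidal plane; it advances by the rotational transform iota
    with every full toroidal circuit."""
    return [(iota * k) % 1 for k in range(n_circuits)]

# A rational transform closes on itself: iota = 1/3 revisits only 3 points.
print(len(set(poloidal_footprints(Fraction(1, 3), 30))))   # 3 distinct points
# An irrational transform keeps landing on new points, gradually tracing
# out an entire nested surface.
print(len(set(poloidal_footprints(0.381966, 30))))         # 30 distinct points
```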
Strong confinement means that the plasma has to support a large current to generate the right magnetic field shape. For the international thermonuclear experimental reactor (ITER), the plasma will generate several million amps of current. Unfortunately, the current through the plasma, the plasma density, and temperature don't end up the same everywhere, and these differences have the potential to destabilize the current.
In particular, if the current is not evenly distributed across the plasma, the lovely nested surfaces that confine the plasma may be destroyed. This process can rapidly spiral out of control, dumping all the current in the plasma to the vessel walls in an event called a disruption. A disruption is not something to be taken lightly, as Klinger notes. "A grown-up tokamak like JET [the Joint European Torus] or our ASDEX upgrade [Axially Symmetric Divertor Experiment] starts to jump in the case of a disruption," he says. "These are big machines; imagine such a big machine starts jumping."
So while the tokamak can use a self-organizing magnetic field to confine the plasma, that field is subject to various instabilities. To avoid these building into problems, the tokamak has to operate in pulsed mode (though those pulses may be hours in duration), and it requires a lot of sensors, control systems, and feedback to minimize the instabilities.
To get this right, you need a good physical model of the plasma physics. Researchers use the model to look for the telltale signs that indicate the beginning of an instability. "My modeling is mostly related to how do we control these instabilities. How do we affect these instabilities so that they either do not occur or that, when they occur, we suppress them or ameliorate their presence," says Dr. Egbert Westerhof from the Dutch Institute for Fundamental Energy Research (DIFFER).
In the tokamak, this sort of modeling is simplified by the symmetry of the device, which reduces a 3D problem to 2D. The results from these physics-based models are then used to create empirical models that do not really contain detailed physics, but they can quickly provide predictive results within some limited range of plasma properties.
This simplicity has helped produce models that can calculate the tokamak's behavior faster than the tokamak can misbehave, a necessity for a successful control system. This hasn't really happened with stellarator designs. "They are really far [ahead of us] in tokamaks because they have these models that work really well. They have been tested. And now they can actually predict the temperature and density profiles faster than real time, which is incredible. But we don’t have these models yet," explains Dr. Josefine Proll, an assistant professor at Eindhoven University of Technology.
Externally organized confinement
The stellarator has little to no current in the plasma. This is because the externally applied magnetic field has all the properties required to confine the plasma. So, although the vacuum vessel is still basically a toroid, the magnets that loop around the tube are not planar. Instead, they have the shape needed to generate a twisted magnetic field. "If you shape your field in a clever way then you can make it so that the drifts basically cancel out, at least for those that would leave the plasma," says Proll.
Theoretically, that is. In practice, well, we're still working on it. To give a magnetic field precisely the right shape requires extensive calculation at many different scales, and all of it must happen in a 3D space.
So, computer code that simulates the plasma over the entire volume of a stellarator had to be developed, and that had to wait for computers that were powerful enough to perform the calculations. "These machines, these supercomputers of the '80s, made it possible to crank through the equations, to solve the equations simultaneously, and then it was found out, okay, the stellarator needs optimization," says Klinger.
Calling it optimization kind of undersells the problem, though. Scientists had to decide which parameters of the system needed to be optimized and over what range. To make that decision more difficult, no single computer model could encompass the vast range of physics that needed to be included. To get an accurate picture of the plasma in a stellarator, you need separate models that calculate the applied magnetic field and the plasma's fluid-like behavior, called a magnetohydrodynamic model. Then, to test the magnetic field confinement against particle drift and particle collisions, you need models that track individual particles along field lines and other models that deal with diffusion. All of these models needed to be created and then verified against experimental data before optimization was even possible.
Listing image by Max Planck Institute for Plasma Physics
Building a beast of burden
This wasn't easy, but it was successful. The result is the Wendelstein 7-X stellarator. The W7-X is a beautiful design to look at, but it wasn't very simple to put together. The magnets that ring the tube are divided into five identical sections, and each section has modules that consist of two parts. The parts contain five non-planar and two planar magnets, arranged in flip-symmetric fashion (so the magnets are ordered 1, 2, 3, 4, 5:5, 4, 3, 2, 1). There are five unique non-planar magnet designs, and each had to be successfully replicated 10 times. Plus the planar magnets also had to meet the same strict set of specifications.
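The flip-symmetric layout is easy to sketch in code. The snippet below is a toy model that generates the ordering of the 50 non-planar coils (the 20 planar coils are left out).

```python
def module_coil_order(n_types=5):
    """One module: coil types 1..n in the first half, mirrored in the second,
    giving the flip-symmetric sequence 1, 2, 3, 4, 5, 5, 4, 3, 2, 1."""
    first_half = list(range(1, n_types + 1))
    return first_half + first_half[::-1]

def machine_coil_sequence(n_modules=5, n_types=5):
    """Five identical modules around the torus, each with the same pattern."""
    return module_coil_order(n_types) * n_modules

print(module_coil_order())           # [1, 2, 3, 4, 5, 5, 4, 3, 2, 1]
print(len(machine_coil_sequence()))  # 50 non-planar coils in total
```

Since only five distinct non-planar shapes exist, each design appears ten times around the machine, which is why every one of the five winding forms had to be replicated so precisely.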
Unlike planar magnets, non-planar magnets experience a force that tries to flatten them, so the winding structure for the superconducting wire had to be strong enough to withstand that force. But, getting magnets to reach specifications in terms of things like retaining their helium coolant, maintaining electrical isolation in the face of high voltages, and surviving quenches (a quench is when superconductivity is suddenly lost) was a challenge.
Building the magnets was a story six years in the making. "We did a test of each single coil; this was foreseen from the beginning. So each single coil, each single coil of the seventy coils, went through very thorough tests in Paris," Klinger tells me. "There was one coil that has seen Paris three times. We call that coil Apollo 13." Apart from confirming that the hardware could produce a magnetic field of a very precise shape, these tests were necessary to make the magnets less susceptible to cascading failure, the sort of failure that marred the startup of the LHC.
Still, that was a minor cause of stress compared to the mechanical engineering problems. Unlike the tokamak, there is no real symmetry, so the whole structure had to be modeled. The engineers used a finite element model—a standard engineering tool to help design structures—to calculate where the stress induced by the magnets would be and design the support structure to cope with that. They got it wrong. And it was only discovered after the magnets were in production.
"We had to change the entire support concept, a very fundamental concept. And the most important change was that we made the magnet system less stiff," Klinger explains. In the end, the entire structure was redesigned to allow the magnets to move by 5cm. But they all move in concert, so the relative position of all the magnets stays the same, and the magnetic field is not altered by that.
Start me up
This may not sound like much of an achievement, but the vessel that contains the magnets and the inner chamber that holds the plasma is some 16 meters across, and the magnets need to maintain a relative orientation that is accurate to within about 100 micrometers. This has to be the goal despite the fact that, when the current is turned on, the magnets shift by about 4cm. In late 2016, the magnets were switched on, and the field shape was measured. I suspect there were some quiet sighs of relief and perhaps a beer or two consumed—the measured field shape agreed with computer models to within one part in 100,000.
The agreement was so good that, even though not all of the parts were ready, they decided to do some early plasma tests. Normally, the vessel wall has to be lined with a material that absorbs the heat from the plasma. Since the energy in the plasma is rather high, the material either has to be sacrificial (meaning it's allowed to burn off), a good heat conductor, or both. But the carbon panels designed for this function had not yet been installed, so the W7-X could only be ramped up to the limit of the copper-chromium-zirconium mounts for the carbon tiles. Even so, Klinger says that they were able to run at 2 to 4 MJ and an ion temperature of 2,000 eV, which is about what they would expect given the wall limitations.
Currently, the vessel is open, and engineers are placing 8,000 carbon tiles. That will allow researchers to run at energies up to 80MJ, which should demonstrate two very important points. First, they hope to confirm the model predictions for plasma confinement: do they get the plasma density and ion temperature that is predicted? And since magnetic confinement fusion systems all follow the same scaling, they can compare their results to those from tokamaks. Given that the Wendelstein 7-X has a larger volume than the ASDEX upgrade but is smaller than JET, Klinger expects performance to fall somewhere in between the two.
The second major goal is to show that the stellarator is, indeed, stable enough to run continuously. However, that cannot be demonstrated at full power yet. The graphite tiles are not water-cooled, so the W7-X can only be run at full power for 10 seconds. The models predict that the stellarator should settle down to continuous operations, but the settling takes longer than 10 seconds. To test this, the researchers will have to run at lower power, which should still be good enough to demonstrate a certain amount of stability (or not—that's the point of doing experiments).
The heat is on
A major part of the experimental plan is to see if the researchers can successfully incorporate a diverter into their design. What is a diverter? Essentially, all magnetic confinement schemes leak. The plasma is going to hit the wall, and the plasma is energetic: it is going to heat the wall material, possibly blow holes in it, and definitely blast contaminants from the wall into the plasma.
The magnetic field, however, can be constructed such that there are specific locations at which the plasma escapes. At these locations, you can devote considerable attention to little details like having a material that doesn't easily ablate, having good heat conductivity and cooling systems, and pumping away all the material that does ablate. These structures are called diverters. Tokamak experiments have incorporated diverters for years, but stellarators have not.
The critical point is not just showing that a diverter can be engineered but that the plasma escapes in a predictable way at the diverter. That is, it is not just the thermal and mechanical engineering of the hardware at the plasma-facing material and mount; the plasma and magnetic field models also need to be tested at operating conditions. All of them need to work to create a diverter, and, without a diverter, the stellarator will have reached the end of the line.
Assuming the diverter works, the stellarator will have one more scheduled shutdown. At this point, the water cooling will be turned on. The machine has been designed with water cooling in mind—all the pipes are installed and wind, spaghetti-like, around the vessel until they find a gap in the coil structure to escape. But water cooling and vacuum systems are not always happy companions, so rather than switch it all on now and spend the next year chasing leaks, the researchers have delayed that step. At the moment, while many components are water-cooled, the diverter sections are not.
Once the diverters are water-cooled, the W7-X will be able to operate at 10MW for half an hour. This is enough time to verify stable operation at temperatures and densities that are the maximum achievable. The data will be fusion relevant as well.
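The arithmetic behind those operating points is worth a quick back-of-envelope check:

```python
# Back-of-envelope check of the W7-X operating envelope described above.
power_w = 10e6                       # 10 MW heating power
pulse_s = 30 * 60                    # half an hour
energy_gj = power_w * pulse_s / 1e9
print(energy_gj)                     # 18.0 GJ delivered over a full pulse

# By contrast, with uncooled graphite tiles the budget is ~80 MJ total,
# which at full power lasts only seconds:
uncooled_limit_s = 80e6 / power_w
print(uncooled_limit_s)              # 8.0 s, in line with the ~10 s quoted earlier
```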
Living in an imperfect world
The stellarator of our dreams lives in computer code. We're hoping to realize it with the W7-X, but our code base is incomplete; the model used to design the stellarator does not contain all the important physics. According to Proll, a major missing component is an accurate model of turbulence.
Turbulence can cause particles to leak from the plasma. Full 3D models cannot deal with turbulence; instead, the model that Proll uses follows particles along a single flux line around the entire stellarator. These computations take on the order of 5 million CPU hours per data point.
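To put that figure in perspective, here is a rough conversion to wall-clock time; the core count is a hypothetical cluster size, chosen only for illustration.

```python
# 5 million CPU hours per data point, spread over an assumed 20,000-core machine.
cpu_hours_per_point = 5e6
cores = 20_000                         # hypothetical core count
wall_clock_days = cpu_hours_per_point / cores / 24
print(round(wall_clock_days, 1))       # ~10.4 days of the whole machine per point
```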
For example, turbulence will result in the plasma density fluctuating in space and time. But do these fluctuations grow under all conditions? Or, do they reach some point where the properties of the rest of the plasma put a cap on fluctuation growth? This point, where the turbulence stops being strongly influenced by the rest of the plasma, is called saturation. "There are a few really cool phenomena that we’re studying at the moment, and one of the main things is what happens at saturation. So, [they are] what really kind of stops the instabilities from the turbulence... from growing," says Proll.
So far, optimization has been performed with the influence of turbulence included as an extra form of diffusion. But it's possible that this isn't the right approach or that the effective speed of the diffusion is quite different from that in the model.
While a stellarator is stable compared to a tokamak, turbulence is probably going to be something that should be minimized through dynamic control, meaning we track the plasma and intervene to control turbulence as it develops. But the models that Proll uses are too slow for this. Instead, the goal is to use these CPU-intensive calculations to come up with empirical models that only require a few input parameters, like the magnetic field geometry and current plasma parameters, to predict the onset of turbulence.
One of the questions I had for Klinger was how a PhD program works on a project like the stellarator. In my experience, students are basically given some lab space, a bunch of hardware, and some general ideas of what might be interesting to do with said hardware. Then, your academic advisor awaits results, while occasionally breaking things in the lab under the guise of helping you out. Clearly this is not the approach at a facility like the W7-X, which has taken years of planning.
Creativity is key to science. So how do you keep creativity in such a strictly planned experiment, I wondered?
The staff at MPIPP have a clever system to keep student creativity in the research mix. Students, along with their advisors, submit proposals for experiments on the W7-X. Then, if the proposal is accepted, at their allotted time, the student takes command in the control room. The magnets are configured per their specs, and the instrumentation teams take data with the student. (I'd imagine that this is quite an experience and I admit to some jealousy.)
The staff at the MPIPP are also very careful to take on their students at a time when the W7-X will be up and running. Thus, with assembly and the initial data run over, the first wave of students has obtained all its experimental data and is completing its studies. The W7-X will start up again in 2018, so the next wave of students will start soon, allowing enough time for them to be prepared for operations. It is all very carefully organized and quite foreign—but in a good way—to my experience in research.
Part of the W7-X's funding comes from a European consortium called EUROfusion. But if you look on the EUROfusion website, there is barely a mention of the stellarator. Instead, all the glory goes to ITER. The reasons for the focus are some key design differences between ITER and the W7-X.
According to Professor Tony Donné, program manager at EUROfusion, the plan is for ITER to do a whole lot of interesting fusion physics that will test many of the physical principles upon which a commercial reactor could be based. That doesn't just include the plasma confinement and control systems but also diverters to absorb heat and walls that breed tritium fuel from lithium so that deuterium-tritium fusion can be used. The W7-X, on the other hand, cannot handle tritium. Plasma parameters can be explored, but nuclear fusion is not in the plan.
The step after ITER for EUROfusion is DEMO, a demonstration power plant. This, according to Donné, will also be a tokamak design. The design for DEMO is not set in stone yet and could conceivably be a stellarator. However, a lot of the initial design work and studies for DEMO are underway, so a decision to change to a stellarator would have to be made soon, probably before the W7-X experimental program is complete.
Accordingly, Professor Marco de Baar, of the Dutch institute DIFFER, suggests that stellarators, should the W7-X deliver, could end up being a second-generation fusion power plant.
It would be second generation because there are numerous issues yet to solve. You have to have walls that breed tritium, which no one knows how to do in a stellarator. And, periodically, the interior has to be cleaned, which involves robotic handling. "The design and construction of such a machine is very difficult, the maintenance of such a machine is even more difficult. In ITER already we have to think of robotic access using haptic master-slave systems, but at least you have some sort of symmetry, and you have some ports that you can use to bring your robotic systems into the vessel," explains de Baar. These devices provide multiple types of feedback to their operators to help them navigate the interior of ITER.
In a stellarator, access ports are restricted by the strange magnetic field, and there is no symmetry. Yet, Klinger is hopeful the robots will be ready. "We really have to count on the advances in engineering, in robotics, and robots getting better and better and better. Just compare the robots nowadays with the robots 20 years ago; I’m pretty relaxed about that."
No matter what form a fusion power generator takes, fusion as a viable power source is not a certainty. Unlike most green energy solutions, the initial investment is huge: the magnets are big, expensive, and only part of the cost. Companies would also be taking on an enormous liability should they choose to construct them.
And, frankly, the time scale at which fusion generators could be attached to the grid is not going to help us much. As de Baar puts it, "Let me make it very clear: fusion is, if you simply look at the carbon dioxide goals we have, fusion would be too late to bring those carbon dioxide emissions down at the rate that we need. If renewables do the job, fusion could become part of a network of dispatchable power generation units. However, if renewables don’t do the job, fusion will be too late to prevent serious damage. In that scenario, we would find ourselves in a bad situation for a period of time that extends beyond when the first fusion reactors come on line."
So the stellarator remains an exciting scientific and engineering achievement, one the world should be eagerly awaiting. It represents hope for the future, just not the hope most people would assign to fusion.
June 22nd 2018
As Fukushima residents return, some see hope in nuclear tourism
On a cold day in February, Takuto Okamoto guided his first tour group to a sight few outsiders had witnessed in person: the construction cranes looming over Japan's Fukushima Daiichi nuclear plant.
Seven years after a deadly tsunami ripped through the Tokyo Electric Power (9501.T) plant, Okamoto and other tour organisers are bringing curious sightseers to the region as residents who fled the nuclear catastrophe trickle back.
Many returnees hope tourism will help resuscitate their towns and ease radiation fears.
But some worry about drawing a line under a disaster whose impact will be felt far into the future. The cleanup, including the removal of melted uranium fuel, may take four decades and cost several billion U.S. dollars a year.
"The disaster happened and the issue now is how people rebuild their lives," Okamoto said after his group stopped in Tomioka, 10 kilometres (6.21 miles) south of the nuclear plant. He wants to bring groups twice a week, compared with only twice a month now.
Electronic signs on the highway to Tomioka showed radiation around 100 times normal background levels, as Okamoto's passengers peered out tour bus windows at the cranes poking above Fukushima Daiichi.
"For me, it's more for bragging rights, to be perfectly honest," said Louie Ching, 33, a Filipino programmer. Ching, two other Filipinos and a Japanese man who visited Chernobyl last year each paid 23,000 yen ($208.75) for a day trip from Tokyo.
The group had earlier wandered around Namie, a town 4 kilometres north of the plant to which residents began returning last year after authorities lifted restrictions. So far, only about 700 of 21,000 people are back - a ratio similar to that of other ghost towns near the nuclear site.
Former residents Mitsuru Watanabe, 80, and his wife Rumeko, 79, have no plans to return. They were only in town to clear out their shuttered restaurant before it is demolished, and they chatted with tourists while they worked.
"We used to pull in around 100 million yen a year," Mitsuru said as he invited the tourists inside. A 2011 calendar hung on the wall, and unfilled orders from the evacuation day remained on a whiteboard in the kitchen.
"We want people to come. They can go home and tell other people about us," Mitsuru said among the dusty tables.
Okamoto's group later visited the nearby coastline, where the tsunami killed hundreds of people. Abandoned rice paddies, a few derelict houses that withstood the wave and the gutted Ukedo elementary school are all that remain.
It's here, behind a new sea wall at the edge of the restricted radiation zone, that Fukushima Prefecture plans to build a memorial park and 5,200-square-metre (56,000-square-foot) archive centre with video displays and exhibits about the quake, tsunami and nuclear calamity.
"It will be a starting point for visitors," Kazuhiro Ono, the prefecture's deputy director for tourism, said of the centre. The Japan Tourism Agency will fund the project, Ono added.
Ono wants tourists to come to Fukushima, particularly foreigners, who have so far steered clear. Overseas visitors spent more than 70 million days in Japan last year, triple the number in 2011. About 94,000 of those were in Fukushima.
Tokyo Electric will provide material for the archive, although the final budget for the project has yet to be finalised, he said.
"Some people have suggested a barbecue area or a promenade," said Hidezo Sato, a former seed merchant in Namie who leads a residents' group. A "1" sticker on the radiation meter around his neck identified him as the first person to return to the town.
"If people come to brag about getting close to the plant, that can't be helped, but at least they'll come," Sato said. The archive will help ease radiation fears, he added.
Standing outside a farmhouse as workmen refurbished it so her family could return, Mayumi Matsumoto, 54, said she was uneasy about the park and archive.
"We haven't gotten to the bottom of what happened at the plant, and now is not the time," she said.
Matsumoto had come back for a day to host a rice-planting event for about 40 university students. Later they toured Namie on two buses, including a stop at scaffolding near the planned memorial park site to view Fukushima Daiichi's cranes.
Matsumoto described her feelings toward Tokyo Electric as "complicated," because it is responsible for the disaster but also helped her family cope with its aftermath. One of her sons works for the utility and has faced abuse from angry locals, she added.
"It's good that people want to come to Namie, but not if they just want to get close to the nuclear plant. I don't want it to become a spectacle," Matsumoto said.
Okamoto is not the only guide offering tours in the area, although visits of any kind remain rare. He said he hoped his clients would come away with more than a few photographs.
"If people can see for themselves the damage caused by the tsunami and the nuclear plant, they will understand that we need to stop it from happening again," said Okamoto, who attended university in a neighbouring prefecture. "So far, we haven't come across any opposition from the local people."
Sept 21st 2016
CC BY 2.0 Eamonn Butler
The names Chernobyl and Fukushima connote nuclear disaster. But do you remember Three Mile Island? Have you ever heard of Beloyarsk, Jaslovske, or Pickering? These names appear among the 15 most expensive nuclear disasters.
1. Chernobyl, Ukraine (1986): $259 billion
2. Fukushima, Japan (2011): $166 billion
3. Tsuruga, Japan (1995): $15.5 billion
4. Three Mile Island, Pennsylvania, USA (1979): $11 billion
5. Beloyarsk, USSR (1977): $3.5 billion
6. Sellafield, UK (1969): $2.5 billion
7. Athens, Alabama, USA (1985): $2.1 billion
8. Jaslovske Bohunice, Czechoslovakia (1977): $2 billion
9. Sellafield, UK (1968): $1.9 billion
10. Sellafield, UK (1971): $1.3 billion
11. Plymouth, Massachusetts, USA (1986): $1.2 billion
12. Chapelcross, UK (1967): $1.1 billion
13. Chernobyl, Ukraine (1982): $1.1 billion
14. Pickering, Canada (1983): $1 billion
15. Sellafield, UK (1973): $1 billion
A new study of 216 nuclear energy accidents and incidents crunches twice as much data as the previous best review, predicting that
"The next nuclear accident may be much sooner or more severe than the public realizes."
The study points to two significant issues in the current assessment of nuclear safety. First, the International Atomic Energy Agency (IAEA) serves the dual masters of overseeing the industry and promoting nuclear energy. Second, the primary tool used to assess the risk of nuclear incidents suffers from blind spots.
The conflict of interest in the first issue is clear. The second issue may not be obvious to the layperson without understanding how the industry conducts the probabilistic safety assessments (PSAs) that are the source of the standard predictions of nuclear accident risk. A PSA involves identifying every single thing that could go wrong and assigning each failure a probability. Nuclear plants are then built with layers of interlocking safety mechanisms that should reduce to near zero the probability that all of the failures necessary for a significant event ever happen at the same time.
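At its core, the PSA arithmetic is a product of small probabilities over independent layers. The numbers below are hypothetical, chosen only to illustrate the method and its blind spot.

```python
def simultaneous_failure_probability(layer_probs):
    """Chance that every safety layer fails at once, assuming the layers
    fail independently -- the central assumption of a PSA."""
    p = 1.0
    for prob in layer_probs:
        p *= prob
    return p

# Four hypothetical interlocking layers, each failing once per thousand demands:
layers = [1e-3, 1e-3, 1e-3, 1e-3]
print(simultaneous_failure_probability(layers))   # ~1e-12: effectively never

# The blind spot: a common-cause event (earthquake, flood, design flaw)
# can defeat several layers together, so the real-world risk can be far
# higher than the independence assumption suggests.
```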
It is a comprehensive and thorough method to help safety engineers reduce risks to levels that are acceptable relative to the benefits of the technology. It has certainly helped safety engineering make great strides in the effort towards 'zero accident' goals. However, the scientifically calculated risk probabilities from a PSA are only as good as the engineers' abilities to identify every single thing that could go wrong.
Every time some new thing goes wrong that wasn't thought of before, it is quickly integrated into the PSA and the assessment re-calculated and safety measures reinforced to again return the risks to the 'safe' levels. And industry keeps close track of everything that goes wrong, even when no accident occurs due to the layers of safety engineered in, which helps to fine-tune PSAs without the need for actual disasters. But every so often, a Chernobyl or Fukushima proves that our limitations outrun our technology for controlling the risks.
The new study, by researchers at the University of Sussex (England) and ETH Zurich (Switzerland), takes a different approach, submitting the data on events that have disrupted the nuclear industry to statistical analysis. The report tracks the evolution of nuclear safety engineering, which has improved with the benefit of 20/20 hindsight in the wake of each nuclear disaster. It finds that nuclear accidents have substantially decreased in frequency, largely thanks to the success of safety engineering in suppressing "moderate-to-large" incidents.
But even with these optimistic trends, the report predicts that it is more likely than not that disasters at the extreme end of the IAEA scale will occur once or twice per century. Accidents on the scale of Three Mile Island have over a 50% probability of occurring every 10-20 years.
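The "once or twice per century" figure can be read as an average event rate. A small sketch shows how such a rate translates into the chance of at least one event in a given window, assuming events arrive independently (a Poisson model); the rates used below are my own rough reading of the article's numbers, not the study's fitted parameters.

```python
import math

def prob_at_least_one(rate_per_year, years):
    """Poisson probability of at least one event in the given window."""
    return 1.0 - math.exp(-rate_per_year * years)

# If extreme (Chernobyl/Fukushima-scale) events occur about 1.5 times
# per century on average, the chance of at least one per 100 years:
print(round(prob_at_least_one(1.5 / 100, 100), 2))  # 0.78

# A Three Mile Island-scale event at roughly 1 per 15 years gives,
# over any 15-year span:
print(round(prob_at_least_one(1 / 15, 15), 2))  # 0.63
```

Note how a rate of "once or twice per century" already implies a better-than-even chance of an extreme event within any given century, consistent with the report's prediction.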
This may not spell the end of the nuclear industry though. One co-author of the study, Professor Didier Sornette, emphasizes that: "While our studies seem damning of the nuclear industry, other considerations and potential for improvement may actually make nuclear energy attractive in the future."
The papers are published in Energy Research & Social Science: "Reassessing the safety of nuclear power," and in the journal Risk Analysis: "Of Disasters and Dragon Kings: A Statistical Analysis of Nuclear Power Incidents and Accidents."
For another perspective published on TreeHugger about nuclear power, see: The debate over nuclear power: An engineer looks at the issues
The most serious nuclear accidents are explosions at nuclear power stations; such accidents are rare, but they are very serious. The Chernobyl accident is very well documented: it caused many deaths and laid waste to thousands of square miles of contaminated land, including entire cities, and wherever there was fallout, the land, the animals, and the produce were unusable.
The picture above shows radiological contamination at an amusement park that was hastily abandoned after the Chernobyl accident. Huge areas of the country were laid waste, with no possibility of human habitation for many hundreds of years. There are, of course, still wild animals living in the vicinity, and these are monitored by the authorities to determine the effects of overexposure to radiation.
There have also been cases of accidental contamination in which an x-ray machine was taken out of service, scrapped, and sent for recycling without the radiation source first being removed. The recycled steel was then used to make a great many everyday objects, and a widespread investigation revealed radiation coming from such things as restaurant furniture and white goods such as refrigerators and washing machines. Fortunately, in modern times the authorities are much more careful.
The other very serious accident occurred at a power station at Fukushima in Japan. It was caused by an earthquake just off the coast; the resulting tsunami flooded the power generating facility and crippled the electrical supply to the cooling water pumps. This accident is also well known and well documented, and sadly the leaking contamination has yet to be contained.
These accidents have led to a general distrust of nuclear power stations. Some countries have decided to phase them out altogether, whereas in other parts of the world there are plans to build dozens more. Designs are, of course, being improved constantly, and the regulation of such facilities, covering both their construction and their operation, is much tighter now than it was previously. Japan has 58 operating nuclear power stations, with plans underway for a few more.
With the realisation that demand for electricity is going to increase steadily, it is becoming increasingly clear that the only way we are going to survive an energy crisis is to build bigger and better nuclear power stations.
Fortunately, with modern communication systems these dangerous situations can be monitored easily, and warnings can be issued by local government, civil defense, the police, and local radio and television.