
ITER Forum Website Update 3.2012

B.J.Green (14/312)

1. Solar enthusiasm exceeds practicality

BY:BJORN LOMBORG From:The Australian February 20, 2012 12:00AM

http://www.theaustralian.com.au/news/world/solar-enthusiasm-exceeds-practicality/story-e6frg6ux-1226275140254

ONE of the world's biggest green-energy, public policy experiments is coming to a bitter end in Germany, with important lessons for policymakers elsewhere.

Germany once prided itself on being the "photovoltaic world champion", doling out generous subsidies totalling more than $US130 billion to citizens to invest in solar energy, according to Germany's Ruhr University.

But now the German government is vowing to cut the subsidies sooner than planned and to phase out support over the next five years. What went wrong?

There is a fundamental problem with subsidising inefficient green technology: it is affordable only if it is done in tiny, tokenistic amounts. Using their government's generous subsidies, Germans installed 7.5 gigawatts of photovoltaic capacity last year, more than double what the government deemed "acceptable". It is estimated this will lead to a $US260 ($243) hike in the average consumer's annual power bill.

According to Der Spiegel, even members of Chancellor Angela Merkel's staff are now describing the policy as a massive money pit. Philipp Rosler, Germany's Minister of Economics and Technology, has called the spiralling solar subsidies a "threat to the economy".

Germany's enthusiasm for solar power is understandable. We could satisfy all of the world's energy needs for an entire year if we could capture just one hour of the sun's energy. Even with the inefficiency of current PV technology, we could meet the entire globe's energy demand with solar panels by covering 250,000sq km, about 2.6 per cent of the Sahara Desert.
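The coverage figure above can be sanity-checked with rough numbers. In the sketch below, the Sahara's area (about 9.2 million sq km), a 24-hour average desert insolation of 250 W per sq m and a 15 per cent panel efficiency are assumed values, not figures from the article:

```python
# Back-of-envelope check of the solar coverage claim.
# Assumed inputs (not from the article): Sahara area, insolation, efficiency.
SAHARA_KM2 = 9.2e6          # total Sahara area, sq km (assumed)
PANEL_KM2 = 250_000         # panel coverage quoted in the article
INSOLATION_W_PER_M2 = 250   # 24-hour average desert insolation (assumed)
EFFICIENCY = 0.15           # current PV efficiency (assumed)

frac = PANEL_KM2 / SAHARA_KM2                       # fraction of Sahara covered
area_m2 = PANEL_KM2 * 1e6                           # sq km -> sq m
avg_power_w = area_m2 * INSOLATION_W_PER_M2 * EFFICIENCY
yearly_twh = avg_power_w * 8760 / 1e12              # average W over a year -> TWh

print(f"Fraction of Sahara covered: {frac:.1%}")    # ~2.7%, matching the ~2.6% quoted
print(f"Annual output: {yearly_twh:,.0f} TWh")      # ~82,000 TWh, the same order as
                                                    # world primary energy demand
```

Under these assumed inputs the output is of the same order as world demand, which is all the article's figure claims.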

Unfortunately, Germany, like most of the world, is not as sunny as the Sahara. And while sunlight is free, panels and installation are not. Solar power is at least four times more costly than energy produced by fossil fuels. It also has the distinct disadvantage of not working at night, when much electricity is consumed.

In the words of the German Association of Physicists, "solar energy cannot replace any additional power plants". On short, overcast winter days, Germany's 1.1 million solar-power systems generate no electricity. The country is then forced to import considerable amounts of electricity from nuclear power plants in France and the Czech Republic. When the sun failed to shine last winter, one emergency back-up plan powered up an Austrian oil-fired plant to fill the supply gap.

Indeed, despite the massive investment, solar power accounts for only about 0.3 per cent of Germany's total energy. This is one of the key reasons why Germans now pay the second-highest price for electricity in the developed world (exceeded only by Denmark, which aims to be the "world wind-energy champion"). Germans pay three times more than their American counterparts.

Moreover, this sizeable investment does remarkably little to counter global warming. Even with unrealistically generous assumptions, the unimpressive net effect is that solar power reduces Germany's CO2 emissions by roughly eight million tonnes, or about 1 per cent, for the next 20 years.

When the effects are calculated in a standard climate model, the result is a reduction in average temperature of one twenty-thousandth of a degree Celsius. By the end of the century, Germany's $US130bn solar panel subsidies will have postponed temperature increases by 23 hours.

Using solar, Germany is paying about $US1000 per tonne of CO2 reduced. The current CO2 price in Europe is $US8. Germany could have cut 131 times as much CO2 for the same price. Instead, the Germans are wasting more than 99c of every euro that they plough into solar panels.
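The arithmetic in this paragraph can be reproduced from the quoted round numbers; note that $US1000 against $US8 gives a ratio of 125, so the article's "131 times" presumably rests on unrounded inputs:

```python
# Reproducing the abatement-cost comparison from the article's round numbers.
solar_cost_per_tonne = 1000.0   # $US per tonne of CO2 avoided via solar (quoted)
ets_price_per_tonne = 8.0       # $US, EU carbon price (quoted)

ratio = solar_cost_per_tonne / ets_price_per_tonne        # tonnes abatable at the
                                                          # ETS price per solar dollar
wasted = 1 - ets_price_per_tonne / solar_cost_per_tonne   # share of spending that buys
                                                          # no extra abatement

print(f"Ratio with round inputs: {ratio:.0f}x")   # 125x (article says 131x,
                                                  # presumably from unrounded figures)
print(f"Wasted share: {wasted:.1%}")              # 99.2%, i.e. "more than 99c
                                                  # of every euro"
```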

It gets worse: because Germany is part of the EU emissions trading system, extra solar panels in Germany produce no actual CO2 reductions, because total emissions are already capped.

Instead, the Germans simply allow other parts of the EU to emit more CO2. Germany's solar panels have only made it cheaper for Portugal or Greece to use coal.

Defenders of Germany's solar subsidies also claim that they have helped to create "green jobs", but each job created by green-energy policies costs an average of $US175,000, considerably more than job creation elsewhere in the economy. And many "green jobs" are being exported to China, meaning that Europeans subsidise Chinese jobs, with no CO2 reductions.

Germany's experiment with subsidising inefficient solar technology has failed. What governments should do instead is to focus first on increasing research and development to make green-energy technology cheaper and more competitive. Production should be ramped up later.

Bjorn Lomborg is the author of The Sceptical Environmentalist and Cool It, head of the Copenhagen Consensus Centre and adjunct professor at Copenhagen Business School

2. Listed geothermal companies have been poor performers

BY:GREENCHIP: GILES PARKINSON From:The Australian March 09, 2012 12:00AM

http://www.theaustralian.com.au/business/opinion/government-needs-to-step-in-to-give-geothermal-a-push/story-e6frg9if-1226294055203

SOMETIME in the next few days, the listed geothermal company Geodynamics will spud the Habanero 4 well -- that is, start drilling operations -- in the heart of its hot dry rock geothermal prospect in the Cooper Basin in South Australia.

The well will replace the Habanero 3 well, which failed in a spectacular fashion with a blowout in 2009, and will be a critical step towards establishing a 1 megawatt pilot plant at Innamincka in 2013.

But it will, in all likelihood, be the only geothermal well to be drilled in Australia this year, despite a new tax deduction that comes into force from July 1.

And the industry, which is tipped by the government's draft energy white paper to provide up to 23 per cent of Australia's electricity generation by 2050 (and 8.4 per cent by 2030), is wondering how to unlock the funds to deliver on this potential.

The Australian Geothermal Energy Association yesterday convened a special workshop in Sydney to try to find the answer, inviting representatives from the industry, equity and debt financing specialists from the banking and super funds sector, and Energy Minister Martin Ferguson, to share their ideas.

The basic problem is that the geothermal sector and the dozen or so small companies that comprise it are found where some say they should not be, listed as public stocks on the Australian Securities Exchange.

Like other sectors in emerging industries, be they clean-tech or biotech or some other cutting-edge technology, the geothermal sector should be a largely unlisted investment play, the province of private equity and venture capital.

But because those avenues hardly exist in this country, money can only be found in the public arena, and after the problems with Habanero 3 and other disappointments, there has been less of that too.

Most geothermal stocks are trading at a fraction of the levels of two to three years ago, and this in turn has made it all but impossible for the companies to match the grants-based funding programs of the federal government.

Mark Rogers, an asset manager in the infrastructure investment division of Colonial First State, says it is a classic chicken and egg situation.

He says the technology has yet to prove itself, and until it does, it will not attract investment from the infrastructure investment sector, because these investors are inherently conservative people who only want to deal with known risks and known parameters.

Rogers says the government needs to get involved at the front end, and provide some sort of mechanism that de-risks the project, either by providing equity itself, offering tax incentives, or guaranteeing returns -- as has proven successful in Europe and elsewhere in pushing wind, solar and geothermal to the forefront. "You have either got to de-risk it, or you have to lift the return," he says.

"It's the classic chicken and egg situation. We are more than happy to buy brownfields versions of this technology after it has been around for five years or so, but it needs to be proven," he says.

"The Geodynamics experience sums it up."

This, ironically enough, was the theme of Mr Ferguson's speech to the workshop.

He said the industry was made up of a number of small companies with prospective exploration licences and volatile share prices, and they faced significant barriers, notably high upfront drilling costs.

To overcome this meant government "sharing some of the early-stage risks with industry and setting a flexible policy framework that encourages private sector investment", he said.

"I recognise that 2011 was a very challenging year for many in the geothermal sector but, with a suite of new policies coming into force, I am optimistic that geothermal can turn the corner in 2012," he said, citing the introduction of a carbon price and a tax deduction for exploration.

"While I recognise there has been a tightening of capital markets globally, now is a good time to invest in geothermal technology."

But even within the industry there appears to be disagreement about the best way forward, or how to prove the technology.

Many want to see incentives that would encourage the sort of drilling programs envisaged by the AGEA, which wants to encourage companies to drill in multiple areas and multiple situations, to test the possibility not just of hot dry rocks, but of hot sedimentary aquifers and geothermal heat, and in a range of states.

Some, however, suggest the industry would be best served by focusing on a couple of "flagship" projects that could prove the technology and make it ready for when there is increased demand for new baseload power -- something that might not actually occur until 2020.

The two most likely candidates would be the two companies that have won (but not yet spent) the $153 million awarded under the Renewable Energy Demonstration Program: Geodynamics for a 25MW demonstration project in the Cooper Basin, and Petratherm for a separate 33MW project.

Indeed, Petratherm is proposing to create a "clean energy province" in South Australia that could serve the needs of power-hungry projects such as the Olympic Dam expansion, initially with a combination of gas and wind, and then with geothermal and solar.

Susan Jeanes, the head of the AGEA, said the geothermal industry had mostly survived on private investment, with more than $500m spent over the past decade, and less than 10 per cent of that coming from government grants or incentives.

"We are confident that we can make those contributions by 2050, but the question is, how do we start?" Ms Jeanes said.

Even to meet the ABARE prediction of 4 per cent of the national supply by 2035 would require billions of dollars of investment.

But apart from a small power station that has been operating in Birdsville, Queensland, for the last 30 years, drawing energy from a relatively shallow resource of hot water, there was no operating geothermal plant in the country.

Without such plants it was impossible to estimate the costs and the potential returns.

"The main thing is that we need to do things concurrently. We just can't sit and wait until one venture is successful or not."

Finally a grant

FOR the last 18 months, the local wave energy and geothermal industries have been eagerly awaiting the first of the grants in the government's Emerging Renewables Program to be announced.

Emerging Renewables was first announced by Mr Ferguson in the lead-up to the last federal poll in August 2010, with a promise to spend $40m by the end of that financial year.

That didn't happen. But in the meantime, the fund grew as unspent monies from the Geothermal Drilling Program and the Renewable Energy Development Program were added, taking its total to $126m, with a promise to spend on wave, geothermal and enabling technologies under the auspices of the newly constructed Australian Renewable Energy Agency.

Finally, the first grant has been announced, with $1.9m to go to a $5m program by NICTA, Australia's ICT Research Centre of Excellence, to use "big data analytics" to locate geothermal energy sources beneath the surface.

It will help a leading team of university experts from four states to find "better, automated ways" to define geothermal targets, using "machine learning techniques and advanced data analytics instead of drills".

NICTA says it will work with the School of Information Technologies at the University of Sydney to develop algorithms, and the Schools of Earth Science at the Australian National University, University of Melbourne and University of Adelaide to apply these methods to the problem of geothermal target characterisation and exploration.

It will also work with listed companies Geodynamics and Petratherm, as well as GeoScience Australia and the South Australian Department of Manufacturing, Innovation, Trade, Resources and Energy.

The rest of the wave and geothermal industries will be hoping that more grants will follow soon.

Giles Parkinson is the editor of RenewEconomy.com.au

3. Wet behind the ears on climate

BY:STEWART FRANKS From:The Australian March 08, 2012 12:00AM

http://www.theaustralian.com.au/national-affairs/opinion/wet-behind-the-ears-on-climate/story-e6frgd0x-1226292581646

TIM Flannery, Australia's Chief Climate Commissioner, once declared that "even the rain that falls will not fill up the dams".

This was back in 2007 at the height of the protracted drought that afflicted eastern Australia. Now, for the second year in a row, we see the effects of El Nino's twin sister -- La Nina -- bringing extreme rainfall across great swaths of Australia. This is hardly the climate change future envisaged by Flannery.

Flannery has recently been the target of growing criticism for his wildly speculative claims, in particular from Andrew Bolt and Alan Jones.

Perhaps of even greater significance, Flannery is being publicly criticised by prominent meteorologists. Indeed, The Weather Channel's Dick Whitaker recently stated: "People ideally suited to (weather forecasting) are meteorologists. From what I can see on Tim Flannery, meteorology wasn't one of his specialties."

In response to this growing criticism, Flannery has declared that the recent "big wet" cannot be taken as evidence that climate change is not happening -- it is merely an interlude before we continue with the drying of the continent.

In a statement of extreme chutzpah, he also has declared that interpreting the recent wet is merely confusing weather with climate.

In a recent opinion piece published by The Daily Telegraph (March 2), Flannery stated: "Despite our wet summer, the long-term trend shows that southeastern Australia is getting drier. This is the difference between weather and climate. Weather is about what's happening day-to-day and year-to-year. But climate refers to weather trends over the long term.

"Records over the past 40 years indicate that rainfall has been steadily declining in southern and eastern Australia and studies by the CSIRO show that it's likely to continue."

The first thing to note about Flannery's recent statements is that they do not include an admission that he got it wrong regarding future rainfall not filling the dams. He did unequivocally get it wrong; it was a stupid comment to make and it was inevitable that it would eventually be seen as such.

That Flannery appears to be defending his alarmism by pointing to others confusing weather for climate just provides another example (if one were needed) of his ignorance of the science of climate variability in eastern Australia.

In fact, Flannery's error was to confuse climate variability for climate change.

The observed history of Australian climate is a lot longer than just the recent 40 years cited by Flannery. Australia regularly experiences epochs lasting between 20 and 40 years when extreme floods cluster, only to be succeeded by a similarly long period when droughts dominate and flooding is only occasional.

Between 1910 and 1945, a dominance of El Nino with only a few La Nina events led to long-term and persistent drought in Australia.

About 1945, we experienced a major change in climate from this previously El Nino-dominated regime to one dominated by La Nina events. This led to as much as a threefold rise in the average annual maximum flood across eastern Australia.

The dominance of La Nina and an associated southward shift in the location of the Intertropical Convergence Zone meant that tropical deluges of rain were frequent and extreme.

About 1975 we returned to El Nino event dominance. Once again, Australia suffered repeated droughts and very few floods, culminating in the terrible drought that Flannery thought would never end.

The past few years have brought a return to dominant La Nina conditions.

It is therefore no surprise that we have witnessed flooding of a magnitude last seen in 1974, the last of the strong La Nina events before the present period. Nor is it a surprise that the dams are full or filling.

This cycling of El Nino and La Nina dominance on periods of about 20 to 40 years has been associated with a long-term climate mode known as the Pacific Decadal Oscillation, sometimes known as the Interdecadal Pacific Oscillation.

The PDO-IPO is characterised by long-term trends in warming and cooling in the mid-latitudes of the Pacific Ocean, which in turn are related to variable periods of global warming and cooling.

Given Flannery's penchant for CO2-driven climate change, it is perhaps no wonder that this climate mode is largely ignored in his predictions; it appears to be a complication that the climate models still cannot reproduce. How it relates to increases in atmospheric CO2 concentrations remains largely unknown.

Despite our uncertainty about the PDO-IPO, one thing should be abundantly clear: to look at simple trends across a relatively short 40-year period is meaningless. If one looks at the trends in eastern Australian climate from 1950 to the present, one can see a marked, statistically significant decline in rainfall and flood risk.

However, if one looks at a similar length of records from, say, 1925 to 1975, we see a statistically significant trend, but in the opposite direction: upward. If Flannery were hawking his climate change message back in 1975, he would probably be claiming that the carbon climate future would be one of permanent flood.

Relatively short trends are clearly irrelevant given the multidecadal variability of eastern Australian climate driven by El Nino-La Nina Southern Oscillation and the PDO-IPO.

Flannery in his opinion piece has also stated: "Some commentators jump on any cold spell or rainy period to claim climate change is not happening. This cherry-picking is irresponsible and misleading."

It is also true that some commentators jumped on the recent drought to claim climate change was happening. This cherry-picking is indeed irresponsible and entirely misleading.

Stewart Franks is associate professor in the school of engineering at the University of Newcastle specialising in hydro-climatic variability and hydrological modelling.

4. Billions blown away on wind power, says British study

BY:DAVID CROWE, NATIONAL AFFAIRS EDITOR From:The Australian March 09, 2012 12:00AM

http://www.theaustralian.com.au/national-affairs/billions-blown-away-on-wind-power-says-british-study/story-fn59niix-1226294168155?from=hot-topics-home

GOVERNMENTS are squandering billions of dollars on "uneconomic" wind farms, according to a landmark study that undermines the case for Labor's huge renewable energy subsidies.

Investment in wind turbines will fail to cut enough greenhouse gas emissions to justify their cost, economists warned yesterday after a detailed British analysis released this week.

The conclusions challenge a cornerstone of Labor's climate change policy as the federal government pours taxpayer funds into wind projects using direct subsidies, a planned $10 billion investment fund and renewable energy targets.

In a finding with direct relevance to Australia, the study by University of Edinburgh economics professor Gordon Hughes warns that using wind turbines to cut emissions costs 10 times the price of a gas-fired power station.

"Wind power is an extraordinarily expensive and inefficient way of reducing CO2 emissions when compared with the option of investing in efficient and flexible gas combined-cycle plants," he concludes.

Professor Hughes, a commissioner on Britain's Infrastructure Planning Commission and a former World Bank senior adviser, conducted his study for the Global Warming Policy Foundation, which is chaired by former Conservative chancellor Nigel Lawson.

The British study warns of the rising cost to consumers of wind power subsidies on the grounds that governments could achieve the same environmental benefits by other means at much lower cost.

Comparing a £13 billion ($19bn) outlay on a combined-cycle gas plant against a £120bn outlay on wind farms, Professor Hughes found the renewable energy option was too expensive by any standard.

Wind power would cut emissions at an average cost of £270 a tonne, he estimated, but meeting Britain's greenhouse targets in this way would cost about £78bn a year or 4.4 per cent of the nation's GDP.
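As a quick consistency check, the capital outlays quoted above imply a cost ratio close to the "10 times" figure cited earlier in the article; the sketch uses only the article's round numbers:

```python
# Consistency check on the quoted Hughes figures: wind vs gas capital outlay.
gas_outlay_bn = 13.0     # combined-cycle gas build-out, GBP billions (quoted)
wind_outlay_bn = 120.0   # wind farm build-out, GBP billions (quoted)

ratio = wind_outlay_bn / gas_outlay_bn
print(f"Wind outlay is ~{ratio:.1f}x the gas option")   # ~9.2x, broadly in line
                                                        # with the "10 times" cited
```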

Professor Hughes also warned that greenhouse gas emissions might be higher using wind turbines because the energy supply could be intermittent and would need back-up systems powered by fossil fuels.

"Any reduction in CO2 emissions due to additional wind generation will certainly be much lower than the headline figures quoted by lobbyists for renewable energy," Professor Hughes writes.

"Without some fundamental technical change, onshore wind power is going to remain uneconomic, especially if external costs are taken into account."

Frontier Economics managing director Danny Price said Australian policies to favour wind farms, such as the mandatory renewable energy target and direct subsidies, would be judged a "gigantic waste of money" in retrospect. "The real problem is they've put in place a scheme for renewables where the only real option is wind," he said. "But it is just so incredibly costly it's not funny."

Policies should favour a wider array of renewable projects instead, he said.

Anthony Owen of the International Energy Policy Institute in Adelaide said the British findings translated to Australian conditions, with the central conclusions not only being the high cost of wind power but also the fact that it was not zero-emission technology, despite common belief.

"The high capital cost of wind makes it particularly unattractive to private power generation companies in the absence of government subsidies," Professor Owen told The Australian.

"Gas-fired power generation is very flexible and relatively cheap in terms of capital costs.

"In the form of combined cycle gas turbine technology, it is also a much lower emitter of greenhouse gases - roughly 40 per cent of equivalent coal, in operation."

Professor Owen said Australia's wind resources had different characteristics to those in Britain, complicating comparison, but in principle the results carried over.

5. National Academy calls for fusion energy roadmap

08 Mar 2012

Interim report recommends making effective use of the National Ignition Facility to assess energy-generating potential.

http://optics.org/news/3/3/12

A committee at the US National Academy of Sciences (NAS) has recommended that the National Ignition Facility (NIF) becomes a major part of a drive to assess the feasibility of inertial fusion as a practical source of energy.

At present, the 192-laser NIF is focused primarily on fundamental science and addressing technical issues related to stewardship of the country’s nuclear weapons stockpile – but in recent months the project’s senior management team have increasingly highlighted the energy-generating potential of the technology.

Now, an interim report based on the first four meetings of the Academy's Committee on the Prospects for Inertial Confinement Fusion Energy Systems has concluded that enough has been achieved to warrant significant further investigation.

“An intense national campaign is under way to achieve ignition conditions on the NIF, and there has been considerable initial technical progress towards this major goal, although progress has been slower than initially anticipated,” wrote the committee.

Its recommendation is that “planning should begin for making effective use of NIF as one of the major program elements in an assessment of the feasibility of inertial fusion energy”.

In its current guise, NIF is designed to deliver single, intense bursts of laser energy to a deuterium-tritium target. At the recent Photonics West conference, NIF’s director for laser fusion energy Mike Dunne said he was hopeful that fusion with energy gain (or “burn”) would be demonstrated for the first time this year.

However, NIF has not been designed as a practical source of energy. For that to happen, a deuterium-tritium target would need to be ignited many times per second, and although Dunne and NIF director Ed Moses have outlined ways in which streams of targets could be dropped into a chamber and ignited with a high-repetition-rate laser, these ideas remain at a very early stage.

Technology roadmap

In its report, the NAS committee highlighted that it was pleased to see that the inertial confinement community has begun a process to develop a consensus on critical issues and future activities geared towards fusion energy.

“This important effort should be encouraged, with the overall goal of developing options for a community-based roadmap for the development of inertial fusion as a practical energy source,” it wrote.

One of only two conclusions that the committee has reached so far is that it is too early to identify any particular type of laser or other technology as the preferred option to drive inertial confinement fusion in an energy plant.

NIF’s current design features flashlamp-pumped Nd:glass lasers that emit in the infrared and are up-converted to the ultraviolet region before being aimed at the target. Both Dunne and Moses have said previously that the rapid progress made in high-efficiency laser diodes since NIF was first built make these devices more obvious choices for a future fusion energy facility – with Moses having identified diode pumping of ytterbium-doped strontium fluoro-apatite (Yb:S-FAP) crystals as a strong candidate technology.

The NAS committee suggests that krypton fluoride excimer lasers – capable of direct ultraviolet emission and widely deployed in lithography stepper systems used by the semiconductor industry – might also be an option.

Its interim report is based on four initial committee meetings that took place in 2011, and was submitted to the National Research Council (NRC) for review back in August 2011. Following two subsequent meetings, NAS is set to publish its final report on the topic this summer.

The committee says that the interim report is aimed at assisting the US Department of Energy in planning its future-year budget requests for inertial fusion energy, while the final report is due to provide a full assessment of the prospects for the approach, with specific regard to cost targets and R&D objectives associated with developing a demonstration plant.

In last month’s FY2013 budget request, the DOE said that it intended to cut its spending on science related to “high energy density laboratory plasmas” (HEDLP), which covers inertial fusion energy, from the $24.7 million enacted in 2012 to $16.9 million. The department added that spending priorities would need to be reassessed, and that the final NAS/NRC study report would inform a competitive review of the program.


6. Laser fusion nears crucial milestone

National Ignition Facility approaches energy break-even point, but uncertainty over next step persists.

Eric Hand07 March 2012

http://www.nature.com/news/laser-fusion-nears-crucial-milestone-1.10175

This could be the year the National Ignition Facility (NIF) finally lives up to its name. The facility, which boasts the world’s largest laser, is designed to trigger fusion by imploding a target pellet of hydrogen isotopes, thereby releasing more energy than will go into the shot. NIF’s managers think that the end of their two-year campaign for break-even energy, or ‘ignition’, is in sight. “We have all the capability to make it happen in fiscal year 2012,” says Ed Moses, director of the US$3.5-billion facility, at the Lawrence Livermore National Laboratory in California.

But even if the champagne corks do get popped, the method — a form of ‘inertial confinement’ fusion — faces an uncertain future. Would success mean that the US Department of Energy (DOE) will be ready to develop it into an economically viable energy source? And if so, is NIF’s laser-based approach the best one? An interim report released on 7 March by a US National Academies panel concludes that it is still too early to tell, and recommends that fusion scientists explore alternative technologies for imploding the fuel.

Glen Wurden, a plasma physicist at Los Alamos National Laboratory in New Mexico, agrees, saying that scientists working on inertial confinement should be wary of putting all their eggs in the laser basket. “It’s premature right now,” he says. He points to the troubles that have plagued a competing approach to fusion — magnetic confinement — and its flagship project ITER, a $21-billion international fusion experiment under construction at St-Paul-lez-Durance, France. Wurden blames ITER’s delays and ballooning costs on a premature commitment to a technology known as a tokamak, a doughnut-shaped cage within which powerful electromagnets confine a fusion plasma.

Despite early confidence, bolstered by favourable computer models, NIF too has lagged behind schedule. "It thought it had ignition in the bag," says Wurden. Instead, NIF's approach to heating and compressing the hydrogen isotopes has proved troublesome. In what is known as indirect drive, the laser's multiple beams are focused at the openings in a pencil-eraser-sized gold cylinder called a hohlraum, blasting the insides to create X-rays. The X-rays then heat and squeeze the fuel pellet inside the hohlraum to produce fusion. But unexpectedly turbulent interactions between the laser light and the plasma inside the hohlraum sap energy from the beams. That could wipe out any gains as NIF managers ramp up the laser energy to the threshold needed for ignition.

The NIF team has made steady progress, however. When the push for ignition began 18 months ago, the facility was achieving 1% of the conditions thought to be needed for ignition. Now the figure stands at 10%, and the pace is quickening: a record 57 shots were taken in January alone (see ‘Power play’). The team is also studying an array of tweaks, including encasing the fuel in beryllium or diamond instead of plastic and changing the hohlraum material or its shape. Moses says that it might also be possible to crank up NIF’s peak energy from the 1.8 megajoules estimated to be needed for break-even to 2.2 megajoules.

Still, as the National Academies’ report notes, other approaches might provide an easier route to ignition and ultimately to a practical power plant. But who will pay for developing them? Most of the research on inertial confinement in the United States and worldwide has been supported by national security and weapons complexes, which want to study fusion for weapons purposes, not for electric power. Today, laser fusion in the United States makes its home within the National Nuclear Security Administration, the branch of the DOE responsible for stewardship of the nuclear stockpile.

Within the DOE’s Office of Science, almost no money goes to inertial confinement research. The vast majority supports magnetic confinement fusion, and increasingly, that money is paying for ITER. Stephen Dean, president of Fusion Power Associates, an advocacy group based in Gaithersburg, Maryland, says that even if the academies’ panel in its final report calls for a robust inertial confinement energy programme, the research will struggle to find a home with the Office of Science. “I think it’ll just ignore it,” he says. “ITER is obviously its top priority, and they’re struggling like mad to save it.”

The $460 million requested for the NIF effort in 2013 will allow it to vary its approach. For example, plasma physicists at the University of Rochester in New York want to adapt NIF’s lasers so that they can implode a hydrogen-isotope pellet directly and dispense with the hohlraum.

The NIF scientists aren’t waiting for alternative approaches to catch up, however. Even before achieving ignition, they are racing to plan their next project, a demonstration power plant that they call LIFE, for Laser Inertial Fusion Energy. To be economic, the plant would have to produce more than 50 times more energy from each shot than it puts in, and would have to boost repetition rates from a few shots a day to 15 per second — no mean feat.
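The scale of that challenge can be illustrated with some back-of-envelope arithmetic. The per-shot laser energy (~2 MJ, roughly NIF-class) and the ~30% thermal-to-electric conversion efficiency below are assumptions for illustration, not figures from the article; only the 50× gain and 15 shots per second come from the text:

```python
# Rough estimate of the average electrical output implied by LIFE's targets.
# Assumed: ~2 MJ laser energy per shot, ~30% thermal-to-electric conversion.
def avg_electric_power_mw(laser_mj_per_shot, gain, shots_per_second,
                          thermal_to_electric=0.30):
    """Average electrical power (MW) from a pulsed inertial-fusion plant."""
    fusion_mj_per_shot = laser_mj_per_shot * gain       # energy yield per shot
    thermal_mw = fusion_mj_per_shot * shots_per_second  # MJ/s is the same as MW
    return thermal_mw * thermal_to_electric

# 2 MJ/shot x 50 gain x 15 Hz -> 1500 MW thermal -> ~450 MW electric
print(round(avg_electric_power_mw(2.0, 50, 15)))  # 450
```

Under these assumptions the plant would indeed deliver the “hundreds of megawatts” the LIFE team describes, which is why both the gain and the repetition rate have to rise so dramatically at once.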

In the quiet, cavernous NIF facility is a mock-up of one of the modular beam lines that would make up LIFE, small enough to fit in the back of a truck. Whereas NIF’s set-up uses thousands of bulky flashbulbs to pump energy into the glass lasers, LIFE would use small, transistor-powered light-emitting diodes. Moses dismisses the notion that it’s too early to commit to lasers as the drivers of a future power plant. Because of investment in lasers and transistors for consumer electronics, the world has already chosen, he says. Historians will look back, and “they’ll see the transistor and the laser as the turning point”.

LIFE director Mike Dunne says that the capital costs for the pilot plant would be about $4 billion, and it could be putting hundreds of megawatts into the grid by the early 2020s — at least a decade earlier than the magnetic-fusion community hopes to deliver a practical power plant. Recalling the first time he presented the LIFE concept to magnetic-fusion researchers at a conference a few years ago, Moses says, “The response to it was almost violent: ‘This cannot be.’ They were shocked at the ambition of it for sure. And they still are.”

Nature 483, 133–134 (08 March 2012) doi:10.1038/483133a

7. Opinion: GUEST COLUMN: Fusion research is a wise investment
The United States must not give up its place in the world fusion research program
By Geoff Olynyk
March 6, 2012

http://tech.mit.edu/V132/N9/olynyk.html

Course 22 senior Derek Sutherland’s article in last Friday’s Tech did a great job of describing why the Alcator C-Mod magnetic fusion experiment, the largest experiment at MIT, deserves to be funded in the fiscal year 2013 federal budget. But it is also imperative to note how magnetic fusion energy research in the United States as a whole is in serious danger at this time, and how the path proposed for fusion in the 2013 budget is harmful to the future of U.S. energy independence and U.S. scientific leadership.

The proposed budget ramps down the U.S. fusion program at a time when other countries are scaling up their efforts. In China, a new long-pulse tokamak called EAST is now producing scientific results, and the government has announced plans to train 2,000 fusion PhDs this decade. In Korea, fusion funding is guaranteed by law until 2040. Germany has a new stellarator (another type of magnetic fusion device) coming online next year. A consortium of six nations plus the EU is constructing the world’s first burning-plasma device, the ITER tokamak in France, which will produce 10 times more fusion power than external power put in to heat the plasma. The rest of the world sees the tremendous potential of magnetic fusion energy.
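That “10 times more” figure is the fusion gain factor Q. As a minimal sketch (the 500 MW and 50 MW values are ITER’s published design figures, not numbers stated in this column):

```python
# Fusion gain factor Q for a burning-plasma device.
# 500 MW fusion output / 50 MW external heating are ITER's design figures.
def fusion_gain(p_fusion_mw, p_heating_mw):
    """Q = fusion power produced / external heating power supplied."""
    return p_fusion_mw / p_heating_mw

print(fusion_gain(500, 50))  # 10.0
```

A Q of 10 means the plasma releases ten times the power injected to heat it; Q = 1 is scientific break-even, and a power plant would need substantially higher Q still.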

Meanwhile, in the United States, despite the recommendations of the National Academies of Science and Engineering and energy-aware think tanks like the American Security Project, the government is eviscerating the domestic research program, starting with Alcator C-Mod, to pay for its nine percent share of ITER construction. In effect, the United States will be subsidizing tomorrow’s foreign fusion industry using its fusion research budget. The U.S. won’t be able to reap the benefits of its ITER investment — research results and skills development — without a strong domestic program to capture those gains. It’s also important to note how modest the fusion research budget is: Alcator C-Mod employs 120 skilled staff and supports the jobs of 200 more, and trains 30 graduate students at a time, on an annual budget of $28 million. The entire domestic magnetic fusion program costs the taxpayer $298 million per year. This is a mere 0.03% of the U.S. defense budget, or about the cost of buying two of the new F-35 fighter jets.

Magnetic fusion research suffers from numerous misconceptions, dating back to the early years of the research program when, buoyed by the spectacular first results from the tokamak in the late 1960s, a few pundits made optimistic predictions about how long it would take to build an economical fusion reactor. Later, in the 1970s and ’80s, new phenomena were discovered that at first were mostly bad news, like turbulence that caused heat to leak out of the plasma much faster than originally predicted. But more recent discoveries have been hugely beneficial, and have propelled fusion research toward the goal of an economical reactor.

The past few decades have seen spectacular increases in fusion performance, due to discoveries like a region of parameter space called H-mode, which halves the energy leak rate for tokamaks and led to experiments in the U.S. and the U.K. that produced more than 16 MW of fusion power. A more recent development is the I-mode, which promises to keep the plasma clean and hot without edge instabilities that act like solar flares and damage wall components. It was discovered right here at MIT, on Alcator C-Mod, and is being actively studied as an operating scenario for ITER.

Furthermore, every time something new is discovered to better control fusion plasmas, our designs for fusion reactors drop in cost and size. The state-of-the-art ARIES-AT reactor study concludes that a fusion reactor is cost-competitive with a fission reactor, and has none of the proliferation or high-level waste issues. Further advances will continue this trend, but these advances will only come about with a strong experimental program in place.

The U.S. will only be poised to take advantage of the results from ITER and take the next step to build a real prototype electricity-producing magnetic fusion reactor if fusion researchers exist in the U.S. We do not know exactly how long it will take to reach an economical reactor — indeed, this uncertainty defines scientific research. But the progress that fusion research has made, as demonstrated by the ability to simulate and then build tokamaks like EAST and ITER, shows that this is one research risk that the U.S. would be foolish not to take. The potential reward is far too great to ignore.

The United States should fully fund the domestic fusion research program for fiscal 2013, including Alcator C-Mod at MIT, while simultaneously fulfilling its ITER obligation. The U.S. should support a fusion future.

Further information about Alcator C-Mod and the domestic fusion program, as well as a link to contact Congress, can be found at http://www.fusionfuture.org.

Geoff Olynyk is a graduate student in the Department of Nuclear Science and Engineering. Alcator C-Mod will host an open house for the MIT community on Wednesday, March 7, from 1 p.m. to 3 p.m., with tours every half-hour starting in NW17.

8. Small Nuclear Power Reactors (Updated 3 March 2012)

http://www.world-nuclear.org/info/inf33.html

There is a revival of interest in small and simpler units for generating electricity from nuclear power, and for process heat. This interest in small and medium nuclear power reactors is driven both by a desire to reduce capital costs and by the need to provide power away from large grid systems. The technologies involved are very diverse.

As nuclear power generation has become established since the 1950s, the size of reactor units has grown from 60 MWe to more than 1600 MWe, with corresponding economies of scale in operation. At the same time there have been many hundreds of smaller power reactors built, both for naval use (up to 190 MW thermal) and as neutron sources, yielding enormous expertise in the engineering of small units. The International Atomic Energy Agency (IAEA) defines 'small' as under 300 MWe, and up to 700 MWe as 'medium' – including many operational units from the 20th century. Together they are now referred to by the IAEA as small and medium reactors (SMRs). However, 'SMR' is more commonly used as an acronym for Small Modular Reactors.

Today, due partly to the high capital cost of large power reactors generating electricity via the steam cycle and partly to the need to service small electricity grids under about 4 GWe, there is a move to develop smaller units. These may be built independently or as modules in a larger complex, with capacity added incrementally as required (see section below on Modular construction using small reactor units). Economies of scale are provided by the numbers produced. There are also moves to develop small units for remote sites. Small units are seen as a much more manageable investment than big ones, whose cost rivals the capitalization of the utilities concerned.

This paper focuses on advanced designs in the small category, i.e. those now being built for the first time or still on the drawing board, and some larger ones which are outside the mainstream categories dealt with in the Advanced Reactors paper.   Note that many of the designs described here are not yet actually taking shape.  Three main options are being pursued: light water reactors, fast neutron reactors and also graphite-moderated high temperature reactors. The first has the lowest technological risk, but the second (FNR) can be smaller, simpler and with longer operation before refueling.

Generally, modern small reactors for power generation are expected to have greater simplicity of design, economy of mass production, and reduced siting costs. Most are also designed for a high level of passive or inherent safety in the event of malfunction. A 2010 report by a special committee convened by the American Nuclear Society showed that many safety provisions necessary, or at least prudent, in large reactors are not necessary in the small designs forthcoming.

A 2009 assessment by the IAEA under its Innovative Nuclear Power Reactors & Fuel Cycle (INPRO) program concluded that there could be 96 small modular reactors (SMRs) in operation around the world by 2030 in its 'high' case, and 43 units in the 'low' case, none of them in the USA.  (In 2009 there were 133 units up to 700 MWe in operation and 16 under construction, in 28 countries, totaling 60.3 GWe capacity.)

A 2011 report for US DOE by University of Chicago Energy Policy Institute says development of small reactors can create an opportunity for the United States to recapture a slice of the nuclear technology market that has eroded over the last several decades as companies in other countries have expanded into full‐scale reactors for domestic and export purposes. However, it points out that detailed engineering data for most small reactor designs are only 10 to 20 percent complete, only limited cost data are available, and no US factory has advanced beyond the planning stages. In general, however, the report says small reactors could significantly mitigate the financial risk associated with full‐scale plants, potentially allowing small reactors to compete effectively with other energy sources. In January 2012 the DOE called for applications from industry to support the development of one or two US light-water reactor designs, allocating $452 million over five years.   Other SMR designs will have modest support through the Reactor Concepts RD&D program.

In March 2012 the US DOE signed agreements with three companies interested in constructing demonstration SMRs at its Savannah River site in South Carolina. The three companies and reactors are: Hyperion with a 25 MWe fast reactor, Holtec with a 140 MWe PWR, and NuScale with a 45 MWe PWR. DOE is discussing similar arrangements with four further SMR developers, aiming to have a suite of SMRs providing power for the DOE complex within 10-15 years. DOE is committing land but not finance. (Over 1953-1991, Savannah River was where a number of production reactors for weapons plutonium and tritium were built and run.)

The most advanced modular project is in China, where Chinergy is starting to build the 210 MWe HTR-PM, which consists of twin 250 MWt reactors. In South Africa, Pebble Bed Modular Reactor (Pty) Limited and Eskom were developing the pebble bed modular reactor (PBMR) of 200 MWt (80 MWe), with similar fuel. A US group led by General Atomics is developing another design – the gas turbine modular helium reactor (GT-MHR) – with 600 MWt (285 MWe) modules driving a gas turbine directly, using helium as a coolant and operating at very high temperatures. All three are high-temperature gas-cooled reactors (HTRs) which build on the experience of several innovative reactors in the 1960s and 1970s.

Another significant line of development is in very small fast reactors of under 50 MWe. Some are conceived for areas away from transmission grids and with small loads; others are designed to operate in clusters in competition with large units.

Already operating in a remote corner of Siberia are four small units at the Bilibino co-generation plant. These four 62 MWt (thermal) units are an unusual graphite-moderated boiling water design with water/steam channels through the moderator. They produce steam for district heating and 11 MWe (net) electricity each. They have performed well since 1976, much more cheaply than fossil fuel alternatives in the Arctic region.

Also in the small reactor category are the Indian 220 MWe pressurised heavy water reactors (PHWRs) based on Canadian technology, and the Chinese 300-325 MWe PWR such as built at Qinshan Phase I and at Chashma in Pakistan, and now called CNP-300. These designs are not detailed in this paper simply because they are well-established. The Nuclear Power Corporation of India (NPCIL) is now focusing on 540 MWe and 700 MWe versions of its PHWR, and is offering both 220 and 540 MWe versions internationally. These small established designs are relevant to situations requiring small to medium units, though they are not state of the art technology.

Other, mostly larger new designs are described in the information page on Advanced Nuclear Power Reactors.

Medium and Small (25 MWe up) reactors with development well advanced

Name                 Capacity      Type  Developer
KLT-40S              35 MWe        PWR   OKBM, Russia
VK-300               300 MWe       BWR   Atomenergoproekt, Russia
CAREM                27-100 MWe    PWR   CNEA & INVAP, Argentina
IRIS                 100-335 MWe   PWR   Westinghouse-led, international
Westinghouse SMR     200 MWe       PWR   Westinghouse, USA
mPower               125-180 MWe   PWR   Babcock & Wilcox + Bechtel, USA
HI-SMUR              140 MWe       PWR   Holtec, USA
SMART                100 MWe       PWR   KAERI, South Korea
NuScale              45 MWe        PWR   NuScale Power + Fluor, USA
CAP-100/ACP100       100 MWe       PWR   CNNC & Guodian, China
HTR-PM               2x105 MWe     HTR   INET & Huaneng, China
PBMR                 80 MWe        HTR   Eskom, South Africa
GT-MHR               285 MWe       HTR   General Atomics (USA), Rosatom (Russia)
SC-HTGR (Antares)    250 MWe       HTR   Areva
BREST                300 MWe       FNR   RDIPE, Russia
SVBR-100             100 MWe       FNR   Rosatom/En+, Russia
Hyperion PM          25 MWe        FNR   Hyperion, USA
Prism                311 MWe       FNR   GE-Hitachi, USA
FUJI                 100 MWe       MSR   ITHMSO, Japan-Russia-USA

 9. U.S. SCIENCE BUDGET

Bigger Contribution to ITER Erodes Domestic Fusion Program
http://fire.pppl.gov/iter_erodes_USfusion_science_022412.pdf

The U.S. fusion program is in a bind. To remain at the cutting edge, U.S. fusion researchers must participate in the huge international experiment called ITER being built in Cadarache, France. But to pay for ITER—which aims to produce a self-sustaining fusion reaction, or “burning plasma,” and prove that fusion is a viable energy source—the United States may have to sacrifice the very community of researchers who would use the machine when it is ready.

That paradox hit home last week, when President Barack Obama submitted a 2013 budget request to Congress that would slash the nation’s already beleaguered domestic fusion program while boosting the U.S. contribution to ITER. Contributing to ITER “is reasonable only in the context of a domestic program,” says Martin Greenwald, a physicist at the Massachusetts Institute of Technology (MIT) in Cambridge and chair of the Department of Energy’s (DOE’s) Fusion Energy Sciences Advisory Committee (FESAC). “Otherwise, you’re just building a piece of equipment for other people to use.”

At first blush, the proposed 2013 budget for the fusion energy sciences program at DOE doesn’t look so bad. It would dip by less than 1% to $398 million. However, within that flat budget, spending on ITER construction would increase by 43% next year, from $105 million to $150 million. As a result, spending on fusion research at home would fall 16%, to $248 million.
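The figures quoted are mutually consistent, as a few lines of arithmetic show (all numbers below are taken from the article; nothing else is assumed):

```python
# Cross-check the reported FY2013 fusion budget split.
total = 398       # $M, proposed fusion energy sciences budget
iter_2012 = 105   # $M, ITER construction spending this year
iter_2013 = 150   # $M, ITER construction spending next year
domestic = total - iter_2013

print(domestic)                                  # 248 ($M left for domestic research)
print(round(100 * (iter_2013 / iter_2012 - 1)))  # 43 (% increase in ITER spending)
```

In other words, a nominally flat top line conceals a $45 million swing away from domestic research and toward ITER construction.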

The effects of the cut would be dramatic. DOE supports three large experimental devices called tokamaks—doughnut-shaped chambers in which ionized gas, or “plasma,” is confined by magnetic fields and heated and squeezed to the point at which atomic nuclei fuse and release energy. In the biggest blow, the tokamak at MIT, called the Alcator C-Mod, would shut down.

“I was shocked,” says Miklos Porkolab, director of MIT’s Plasma Science and Fusion Center. “I didn’t have the vaguest idea of what was coming.” C-Mod is the only U.S. tokamak that operates at magnetic fields as strong as ITER’s will be, Porkolab says. It supports 100 staff members and 30 graduate students.

The budget of the United States’s sole dedicated fusion lab, the Princeton Plasma Physics Laboratory (PPPL) in New Jersey, would drop by 16%, to $61.8 million. “If all the cuts go through, we would have to lay off about 100 of 435 staff,” says PPPL Director Stewart Prager, who notes that the lab has already shrunk by two-thirds since the 1990s. The proposed cut for 2013 would stretch by 6 months an ongoing upgrade of the lab’s National Spherical Torus Experiment, delaying the tokamak’s restart until 2015.

Obama’s budget request, if adopted by Congress, would leave the United States with only one tokamak operating next year, the DIII-D at General Atomics in San Diego, California. But its running time would be reduced to 10 weeks—3 weeks less than this year and a far cry from the 25 weeks that would constitute full utilization, says Tony Taylor, vice president of the magnetic fusion energy division at General Atomics. The cuts would also require axing 30 of 180 DIII-D staff and postponing key upgrades.

Even the proposed $150 million contribution to ITER in 2013 won’t keep the United States on pace to meet its commitment to the project, says Stephen Dean, a physicist and president of Fusion Power Associates, a non-profit research and education foundation in Gaithersburg, Maryland. That would require about $200 million, he says.

The budgetary train wreck is exactly what some researchers have long feared. When the United States signed on to ITER in 2003—as a junior partner alongside the European Union, China, India, Japan, Russia, and South Korea—its projected cost was $5 billion, making the U.S. share roughly $500 million. Now the price tag tops $20 billion, and the U.S. share has ballooned to $2.2 billion or more. At the same time, DOE’s fusion budget has been flat for the past decade when adjusted for inflation.

Oddly, although the proposed budget would provide just $40 million for running facilities, it still provides $154 million for research. A typical operations-to-research ratio for a DOE program is 1:1. Why the imbalance? One reason is that much of that research might be done overseas. Last summer William Brinkman, director of DOE’s Office of Science, asked FESAC to study the idea of sending legions of researchers to South Korea and China to work on new tokamaks that those countries have built. Its report is due this month, but scientists already have misgivings about that approach.

If nothing else, it will make it harder to attract younger scientists, researchers say. (Already, the 2013 budget would slash the number of student positions from 325 to 263.) The scheme also exports the country’s most valuable resource: knowledge. “It makes no sense for the United States to pay to ship our intellectual capital overseas and make ourselves less competitive,” Taylor says.

The proposed budget isn’t a done deal. “I can tell you that the community does not support this plan, does not support this budget, and is going to try to get Congress to overturn it,” Dean says. Brinkman agrees that “it makes no sense to invest in ITER if there isn’t a base program” and suggests that the department may also be looking for help from Congress.

“This [budget] has to go to the Hill, and we’ll see what the Hill does with it,” he says.

Even if Congress kicks in the $50 million needed to shore up the domestic program next year, 2014 could be far worse. The United States will have to pony up $2 billion for ITER over 8 years, so its annual contribution will likely shoot up to $300 million, potentially consuming the whole domestic program. Restructuring those payments may be the best short-term solution. Brinkman says that Office of Science staff members have begun talking to other Administration officials and ITER partners about such changes. “One thing you learn in this government town is you take it one year at a time,” he says. But fusion physicists worry that such a tactical approach will only delay the demise of their research program.

Greenwald and other fusion scientists would like greater support from the Obama Administration. But presidential science adviser John Holdren says the Administration is doing what it can. “The cutting edge of fusion is determining whether we can create a burning plasma, and the only machine in the world that has a prospect of doing that is [ITER],” Holdren said last week during a rollout of the new budget, when asked if ITER was being favored over the domestic program. But he added that “we are going to maintain a strong plasma science program and invest in ITER.” –ADRIAN CHO

10. BOOKENDS
Questions and answers with Francis F. Chen
February 15, 2012

http://www.physicstoday.org/daily_edition/bookends/questions_and_answers_with_francis_f_chen?type=PTPICKS

Plasma physicist Francis (Frank) Chen has spent more than five decades conducting theoretical and experimental research in magnetic fusion, laser fusion, plasma diagnostics, basic plasma physics, and low-temperature plasma physics. A graduate of Harvard University, Chen was hired in 1954 for Princeton University’s Project Matterhorn, which conducted fusion research for domestic energy production. Using a stellarator built by space physicist James van Allen, Chen was the first to show that electrons could be trapped by a magnetic field for millions of transits; that discovery advanced the field of plasma fusion.Though formally retired from UCLA, where he has been since 1969, Chen continues to manage an active low-temperature plasma research lab and occasionally advises graduate students. He is author of An Indispensable Truth: How Fusion Power Can Save the Planet (Springer, 2011). Physics Today recently caught up with him to discuss the book.PT: What motivated you to write this book?Chen: It has been clear for a long time that being a member of a large experimental team would not be the best way for me to contribute to fusion. What is needed is to get the public interested in supporting it. The media have been exceptionally bad in understanding fusion enough to treat it favorably. In astronomy, there was Carl Sagan; in cosmology, there is Steve Hawking; in string theory there is Brian Greene. I'm not such a personality, but I've had success in bringing plasma physics down to the undergraduate level: My textbook, Introduction to Plasma Physics, has been selling well since 1973. However, explaining plasma fusion to a lay audience is an entirely different matter, and I grossly underestimated the difficulty.PT: Obviously, the title of the book plays off the title of former vice president Al Gore’s documentary, An Inconvenient Truth. 
To what extent do you think your book can have a similarly successful impact on popularizing fusion power?Chen: I tried to write in a conversational style for an ordinary, non-scientific audience, but I have not reached a large readership. Book club readers are used to flipping the pages and devouring a 400-page book in a day. They can't get through two pages with real facts in them. In the climate change chapter, I thought I had used bar charts in a clever way, but my wife says she is used to xy-charts like the Dow-Jones index and can't understand bar charts. Al Gore got professional

help from the entertainment industry. Fusion needs such help. It needs grass-roots support. Fusion can solve Al Gore's problem of global warming.The second half of the book is intended to be a resource for physicists and policymakers. It gives the facts about fusion without overselling it. I am hoping that my book can garner support for fusion, perhaps not from the [Obama] administration, which is worried about jobs, but maybe from the Sierra Club, the Nature Conservancy, the Environmental Defense Fund, or EPRI [the Electric Power Research Institute].PT: What got you interested in fusion research? And are you still professionally engaged in the field?Chen: Fusion is in my blood, since I was one of the earliest workers in the field. In 1973 a marvelous collaboration with international visitors at UCLA, where I’ve been since 1969, launched the field of parametric instabilities in laser fusion. I then changed to that field for 10 years, after which it evolved into plasma accelerators. But for the last 20 years I have been in low-temperature plasma physics, four orders lower than fusion temperatures.PT: Cynics like to say that fusion power is always 20 years away. How do you address that sort of skepticism in your book?Chen: You have to understand how it was before 1990. We didn't know how to prevent instabilities from destroying plasma confinement, or even what the instabilities were. The skepticism stems from those early days when we were flying blind. But great changes came in the 1990s as large tokamaks were built, confinement got better, and a better understanding of scaling was possible. Supercomputers allowed three-dimensional simulations of toroidal plasmas. High-confinement modes were invented, and internal transport barriers could be made by tailoring the current distribution with radio-frequency power. The problems, which are still hard, are well enough understood now that the ITER machine in France can be planned with a definite schedule. 
The people who still retain their impressions from 1970 should realize that the situation has changed. That is the main point I was trying to explain in my book.

PT: What is your take on existing clean energy technologies, which aim to fill the gap until fusion power comes online?

Chen: Wind power is very efficient. Solar cells are now cheap. But they are not steady, and you can't store the electricity for base power. Adding more than 10% of temporary power would destabilize the grid. Carbon sequestration wastes 30% of the energy produced and is very expensive. To store the CO2, you have to compress it, which takes energy, and then inject it into a geological reservoir, which could be leak-proof only if you didn't drill holes into it for injection. Nuclear power is very clean and very safe. Public hysteria is its problem. It should be used until fusion comes online. The power plants and grids can remain the same; the power core simply changes from fission to fusion.

PT: What book are you reading at the moment?

Chen: I'm reading David MacKay's Sustainable Energy—Without the Hot Air (UIT Cambridge, 2009) just to learn how to write more simply.

11. After Fukushima, there is still only one clean energy option

BY:ZIGGY SWITKOWSKI From:The Australian March 12, 2012 12:00AM

http://www.theaustralian.com.au/national-affairs/opinion/after-fukushima-there-is-still-only-one-clean-energy-option/story-e6frgd0x-1226296364889

IT has taken a year to stabilise the Fukushima nuclear plant in Japan after it was hit by a tsunami. No one has died, and no worker has yet suffered ill health from excessive radiation exposure. But 100,000 people remain displaced from the 20km exclusion zone.

The long-term impact of this nuclear disaster won't be measured by the number of radiation-caused sicknesses - there are likely to be very few, if any - but by the anxiety and depression affecting people obsessing about potential consequences of radiation exposure.

In this past year, Germany, Switzerland and Belgium have decided to phase out their nuclear generators. Italy reversed a decision to restart building reactors and Thailand and Venezuela have cancelled plans to introduce nuclear power. But the US, Britain and France have completed careful inspections of their networks and continue nuclear operations. And China and India have reaffirmed their commitment to nuclear power, aiming to add more than 100 large reactors by 2030.

Insights have emerged from the many reviews of Fukushima. The headline conclusions concern the siting of reactors in geologically active zones, the design of hardened containment structures, better remote monitoring instrumentation, the nature and location of back-up power, and the details of crisis management and communication, together with the importance of strong, independent regulators - along with confirmation of the integrity of the nuclear fuel cycle as a safe, clean and cost-effective source of electricity.

Prior to Fukushima, Japan had 54 reactors generating about 30 per cent of its electricity. Today, only two reactors are operating, with the process of shutdown for inspection, refuelling, and local government and regulator approval for restart taking two years or more. So Japan is spending more than $50 billion annually importing coal, gas and oil to offset the loss of nuclear power. This will increase its greenhouse gas emissions by more than Australia proposes to reduce ours over the next 30 years.

The debate in Australia about nuclear energy was reshaped by Fukushima. Growing public interest in and support for nuclear power shifted back to opposition after March 11, 2011, albeit temporarily. Today, opinion remains evenly split but divided by gender: women against, men for.

But the nuclear option cannot be avoided by us and continues to influence key national policies. Last year Australia admitted India to its list of approved export countries for our uranium. Given we hold about 40 per cent of the known global uranium reserves, and India may be the second largest civilian nuclear state within a few decades, this decision is pragmatic.

The Gillard government has introduced a carbon tax with rhetoric condemning fossil fuels as pollutants. If a clean energy future is ahead, no practical scenario emerges without a nuclear contribution, notwithstanding earnest intentions to the contrary. Even so, the government's draft energy white paper had terms of reference explicitly excluding consideration of nuclear energy. This is like having a comprehensive review of tax and excluding the GST - politically expedient but producing suboptimal outcomes.

The draft white paper, which is a long overdue but fine piece of work, places unreasonable expectations on geothermal and solar energy and the presumed success of carbon capture and storage. The authors are sufficiently astute, however, to note that should these technologies fall short, then the nuclear alternative must come into play. I expect we'll reach that decision point in this decade.

Finally, nuclear submarines must be debated. Given confidence in our technological skills and a preparedness to plan 20 years out, the answer seems self-evident.

Ziggy Switkowski is chancellor of RMIT University and former chairman of the Australian Nuclear Science and Technology Organisation.