Tuesday, April 25, 2006

Compressed Air Hybrids On Instapundit

Instapundit has, in its telegraphic style, picked up on the compressed air hybrid meme. He's linked to a company called Scuderi Group that makes air compressors. They're hawking a split-cycle engine, where compression and expansion are performed in separate sets of cylinders. For just a few hundred dollars more you can interpose an air tank and make the engine capable of storing and reusing compression energy. They have a publicity blog here.

I've previously discussed this kind of technology. Let's hope something comes of it.

Wednesday, February 22, 2006

Reprocessing Nuclear Waste: A Solution in Search of a Problem?

Recently, President Bush announced $250 million for a pilot nuclear reprocessing facility. The plan would use a process, called UREX+, that separates uranium, technetium, trans-uranic elements (TRUs) and possibly iodine, cesium, and strontium. The goal appears to be to reduce the volume, heat output, and future escaping toxicity of waste destined for geologic disposal.

Schemes like this face serious economic problems.

Today, the spot price of U3O8 (yellowcake) is around $37/lb. At this price, reprocessing does not come close to paying for itself through avoided fuel costs. It's cheaper simply to fabricate new fuel elements from freshly mined uranium, especially when one considers the difficulty of fabricating MOX fuel elements, which requires a fabrication line able to cope with plutonium contamination.

But maybe uranium will get more expensive in the future, and reprocessing will make sense. How far in the future? Estimates of the cost of the Japanese amidoxime scheme for extracting uranium from seawater are in the range of $100 to $250/kg; the cost of extracting fissionables from spent fuel is at least three times this. The oceans contain 4.5 billion tons of uranium, and roughly 150 tons/year of natural uranium would be required to feed the enrichment plant fueling a 1 GW(e) powerplant on a once-through fuel cycle. This source will not run out soon, even if the world's nuclear capacity is greatly expanded.
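A quick sanity check on those seawater numbers. This sketch takes "tons" as metric tonnes and assumes every reactor is a once-through 1 GW(e) plant at the 150 t/yr figure above; the 370 GW(e) world-capacity figure is my own rough number for the mid-2000s, not from the post.

```python
# Rough check of the seawater-uranium numbers in the post.
# Assumptions (not from the post): "tons" means metric tonnes, and world
# nuclear capacity circa 2006 is taken as roughly 370 GW(e).

OCEAN_URANIUM_T = 4.5e9    # tonnes of uranium dissolved in the oceans
URANIUM_PER_GW_T = 150     # tonnes natural U per GW(e)-year, once-through

def years_of_supply(gw_capacity):
    """Years the ocean inventory would last at a given nuclear capacity."""
    return OCEAN_URANIUM_T / (URANIUM_PER_GW_T * gw_capacity)

print(round(years_of_supply(370)))    # tens of thousands of years
print(round(years_of_supply(3700)))   # still millennia after a 10x expansion
```

Even a tenfold expansion of nuclear capacity leaves thousands of years of fuel, which is the point: the resource ceiling is economic, not physical.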

So, why is Bush proposing this recycling research? My guess is he wants to close down the as-yet unused Yucca Mountain geological disposal facility, and needs to be able to present the nuclear waste problem as being solvable in the future. I personally have little doubt it is solvable, even without this program, but it does provide political cover.

I also think delaying geological disposal is the right thing to do. Spent fuel can be safely, securely and economically stored in above-ground facilities for decades or centuries. The storage scheme, generically called dry cask storage, involves placing cooled fuel elements in sealed armored steel containers filled with inert gas. The containers are further shielded with concrete or other materials; some double as transport containers for the spent fuel. These casks are rugged, resistant to diversion or attack, and require no active cooling systems.

At some point in the distant future, it might become economical to reprocess the accumulated spent fuel. At that point, I suggest it be done in space. Reprocessing inevitably leaks some material, contaminating the facility and, if you are not very careful, the surrounding land. If it were done on the moon (for example), this contamination would be unimportant. The cleanup of the Rocky Flats facility, where plutonium was handled, cost $7 billion, and it was held even to that only by cutting corners (saving an estimated $29 billion). The volume of material handled at Rocky Flats was a small fraction of what would flow through industrial reprocessing facilities.

Launch costs would have to come down a lot for this to be economical -- I'd guess a factor of 100 would be more than enough -- but it's a long term idea, so that's reasonable. PWR fuel elements would have to be armored against launch accidents (even on space elevators). However, the fuel elements from high temperature gas-cooled reactors could probably be launched without reentry armor, since they'd likely survive reentry largely unscathed, encased in rugged graphite/silicon carbide shells.

The proposed UREX+ process separates out technetium (which is important for long term radiotoxicity due to its solubility and the ease with which it escapes from the repository), so even if terrestrial reprocessing can be made to work, it's attractive to dispose of this element in space instead of, say, transmuting it in a reactor. The volume is much smaller than that of the unprocessed spent fuel, so this could be attractive even at today's launch costs.

Friday, February 17, 2006

Saving the World, Destroying the Rainforest

Soil scientists and organic gardening advocates, like those at the Rodale Institute, have for decades propounded the importance of organic matter in soils. Organic materials promote good soil structures and help retain nutrients that would otherwise be leached away.

It turns out, though, that the best organic soil component had not been properly appreciated. It's charcoal. Charcoal retains nutrients even better than the traditional organic materials derived from plant litter, manure, or compost. It binds phosphorus, which the others do not. And, best of all, charcoal is stable in soils -- even tropical soils -- for centuries or millennia. The other organic materials quickly decompose, especially under tropical conditions.

These properties are crucially important for the highly leached and oxidized soils of the tropics. Much of the Amazon basin had been thought useless for agriculture because its depleted, acidic soils lacked nutrients and would too quickly lose any that were applied. But within the past decade it has been increasingly realized that this need not be the case, and that pre-Columbian cultures had in fact successfully terraformed and farmed large areas of the Amazon. Their technique, which produced the soils known as Terra Preta de Indio, or 'Indian dark earths', involved the long-term addition and accumulation of charcoal and other organic materials. The resulting dark soils have remained fertile even after centuries of cultivation by farmers ignorant of their origin.

Addition of charcoal to soils is now being promoted as part of a carbon-negative biomass energy system. Biomass, when heated in anoxic conditions, pyrolyzes to make charcoal and a hydrogen-rich gas. About half the carbon ends up in the charcoal, some of which can then be added back to the soil. This carbon would accumulate over many cycles, making the soil better with time instead of depleting it.
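The carbon accounting above can be sketched in a few lines. The 50% char yield is from the post; the carbon fraction of dry biomass (~48%) is a typical figure I'm assuming, not something the post states.

```python
# Back-of-the-envelope carbon accounting for the pyrolysis cycle described
# above. Assumed figure (not from the post): dry biomass is ~48% carbon by
# mass. The post's "about half the carbon ends up in the charcoal" is 50%.

CARBON_FRACTION = 0.48   # carbon fraction of dry biomass (assumption)
CHAR_YIELD = 0.50        # fraction of feed carbon retained in charcoal

def carbon_split(dry_biomass_kg):
    """Split the carbon in a batch of dry biomass between charcoal and gas."""
    carbon = dry_biomass_kg * CARBON_FRACTION
    to_char = carbon * CHAR_YIELD   # sequestered if the char is soil-buried
    to_gas = carbon - to_char       # leaves with the hydrogen-rich gas
    return to_char, to_gas

char_c, gas_c = carbon_split(1000.0)   # one tonne of dry biomass
print(char_c, gas_c)                   # 240.0 240.0 (kg of carbon each)
```

So each tonne of dry biomass processed this way can bank on the order of a quarter-tonne of carbon in the soil while still yielding fuel gas, which is what makes the cycle carbon-negative rather than merely carbon-neutral.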

I do worry, however, that all this will have a side effect the advocates don't mention. If the Amazon can be made productive, there will be increasing incentive to exploit it. The tropics can be farmed year round and have plenty of water and sunlight. Already Brazil is becoming one of the lowest-cost producers of grain and soybeans in the world. If biofuels become globally important as oil substitutes, economic forces will drive the conversion of the rain forest to energy plantations. Sure, the Amazon could be farmed with small-scale, organic, ecologically sensitive methods. So could the US midwest -- but it largely isn't. And given how the forests of the temperate zones were converted wholesale to farmland, the already-developed countries won't be able to object without hypocrisy.

Sunday, February 12, 2006

Drinking the Sky from a Firehose

I just got back from Capricon, a Chicago-area SF convention. The science/space programming was excellent this year. I had to miss part of the talk on Pan STARRS, but what I did hear and what's online is extremely exciting for anyone interested in the solar system.

Pan STARRS is a survey system using advanced orthogonal-transfer CCD detectors. It exploits the increasing power and economy of electronics to do things that would have been inconceivable a few years ago. At full size it will use four 1.8 meter mirrors, each equipped with a 1.4 gigapixel camera. The imagers will be designed to do real-time active image correction electronically, by shifting the image back and forth in two dimensions. The system will scan the entire sky visible from its site in Hawaii three times per month, reaching magnitude 24 in each single 30-second integration. By stacking images over several years, the survey will be able to go beyond magnitude 29.

Among the scientific results expected:

  • In its first month of observing, Pan STARRS will more than double the number of known asteroids. After several years, the number of discovered asteroids will reach about 10 million.
  • All NEOs down to a few hundred meters in diameter will be found. If any are possibly going to hit Earth soon, we'll know.
  • Roughly 20,000 Kuiper Belt Objects are expected to be found (vs. less than 1000 today).
  • PS can detect a body like Pluto out to 300 AU, Earth out to 600 AU, Neptune out to 1200 AU and Jupiter (or heavier) out to 2000 AU. PS will resolve the question of whether there is a planet X anywhere near the known solar system (bodies like Sedna suggest there may be something heavy out there.)
  • PS should detect about 1 interstellar comet per year. These are comets that originated in some other solar system and were then ejected into interstellar space, a fate that befell about 90% of the comets in our early solar system as well.
  • Determine the position and distance of all stars within 100 parsecs of Earth visible from the site.
  • Find roughly 100 extrasolar planets by the transit method, from the slight dimming as a planet crosses its star.
  • Detect all Andromeda-size galaxies in the universe that are visible from Hawaii, all the way back to the start of galaxy formation more than 12 billion years ago.


I can't wait to start seeing the results from this. Their prototype telescope (with only one mirror and camera) should begin operating this year.

Saturday, December 10, 2005

Dealing With Climate Change

The big climate change circus in Montreal is winding to a close, with posturing on all sides. I can't help but be disgusted by the hypocrisy. Canada's PM attacked the US, even though Canada's performance under the Kyoto treaty (which does not bind the US, which never ratified it) is worse than the US's. Canada's CO2 emissions are up 24% since the start of the Kyoto period, meaning it would take a 30% cut from today's levels to meet its commitments. Eleven European countries are in similar situations.

What that means is they'll be buying emissions credits, mostly from Eastern Europe. As I understand Kyoto, if the promised emission reduction in the country selling the credit doesn't materialize, the buyer is not liable. So I expect lots of fake credits to be sold, and then, gosh, too bad so sad when the seller drops out of the treaty (not that the enforcement mechanisms in Kyoto have any teeth.)

The Europeans, who initially opposed emissions trading, could at least thank the US for getting it into the treaty in the first place -- even if, as predicted years ago, this scheme proves unworkable. At this point it provides them a useful fig leaf.

Kyoto's a failure, and the follow-on is not likely to be any less ridiculous, so what's to be done? My money at this point is on geoengineering, particularly schemes to increase the effective albedo of the Earth. Allowing CO2 to double (for example) while reducing insolation to compensate for the resulting warming would dramatically increase the biological productivity of the Earth's ecosystems. A Lawrence Livermore study of such mitigation found that, if optimized, the cost would be amazingly low -- just a billion dollars a year. That is a small fraction of the cost to the US alone of complying with Kyoto, never mind the rest of the world or the far more onerous emissions reductions that would be required to actually stabilize atmospheric CO2 levels.
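How big an albedo tweak are we talking about? A rough estimate, using standard textbook numbers that are not from the post (solar constant ~1361 W/m², planetary albedo ~0.30, radiative forcing of a CO2 doubling ~3.7 W/m²):

```python
# How much absorbed sunlight would geoengineering have to block to offset
# a CO2 doubling? Textbook figures (assumptions, not from the post):

SOLAR_CONSTANT = 1361.0   # W/m^2 at the top of the atmosphere
ALBEDO = 0.30             # Earth's planetary albedo
FORCING_2XCO2 = 3.7       # W/m^2 radiative forcing from doubled CO2

# Mean absorbed solar flux: divide by 4 for the sphere, scale by (1 - albedo)
absorbed = SOLAR_CONSTANT / 4 * (1 - ALBEDO)
fraction = FORCING_2XCO2 / absorbed

print(f"{fraction:.1%}")   # roughly 1.6% of absorbed sunlight
```

A reduction of well under two percent of absorbed sunlight, in this crude global-average picture, is why the estimated costs come out so low.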

I have a sneaking suspicion that the US government is, at some level, egging on the climate hysteria, encouraging greater and greater predictions of disaster and doom from a warming world. At some point, a geoengineering proposal will be brought out, and it will be difficult for the doomsayers to claim it shouldn't be done.

Wednesday, December 07, 2005

Science Makes Sex Obsolete ...

... or so the title of this article in Wired claims. Among the things on the horizon (within five years, according to an editor of the journal Fertility and Sterility) is the manufacture of eggs and sperm cells from other human cells, enabling two men to be the two genetic parents of a single child. Just imagine what this will do to the gay marriage debate: the argument that society should not support couples that are not potentially procreative is cut down at the premise.

Thursday, November 24, 2005

Direct Carbon Fuel Cells

Fuel cells, consuming hydrogen and air, are often touted as the ultimate chemical power source. Aside from the difficulty of storing hydrogen, however, they have a basic problem. The reaction H2 + 1/2 O2 → H2O reduces the number of gas molecules. This lowers the entropy of the reacting system, so, by the second law, entropy must increase elsewhere -- some of the reaction enthalpy must be rejected as heat. As a result, no hydrogen/oxygen fuel cell can convert 100% of the chemical energy of the fuel into electrical energy. Worse, the maximum efficiency declines with increasing temperature: for a solid oxide fuel cell (SOFC) burning hydrogen at 1000 C, it is only about 3/4 that of one operating at room temperature. This is a shame, since high temperature operation reduces the need for expensive electrocatalysts. You can get some of the waste heat back with a bottoming cycle, but that adds cost and complexity.

What's needed is a fuel that produces one gas molecule for each oxygen molecule consumed. And there is such a fuel -- carbon. The maximum theoretical efficiency of a fuel cell oxidizing carbon to CO2 is close to 100%, even at elevated temperature. You have to avoid the reaction of C and CO2 to form carbon monoxide, but that's not too hard below 1000 C.
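The contrast between the two fuels falls straight out of the reversible efficiency formula η = ΔG/ΔH = 1 − TΔS/ΔH. The sketch below uses approximate standard-state values and treats ΔH and ΔS as constant over temperature, an idealization; the numbers are standard thermochemical data, not from the post.

```python
# Maximum (reversible) fuel-cell efficiency: eta = dG/dH = 1 - T*dS/dH.
# Approximate standard-state values, assumed constant with temperature:
#   H2 + 1/2 O2 -> H2O(g):  dH = -241.8 kJ/mol, dS = -44.4 J/(mol K)
#   C  +     O2 -> CO2(g):  dH = -393.5 kJ/mol, dS =  +2.9 J/(mol K)

def max_efficiency(dH_kJ, dS_J, T):
    """Reversible efficiency dG/dH, with dH in kJ/mol and dS in J/(mol K)."""
    dG = dH_kJ * 1000 - T * dS_J
    return dG / (dH_kJ * 1000)

for T in (298.0, 1273.0):   # room temperature and 1000 C
    h2 = max_efficiency(-241.8, -44.4, T)
    c = max_efficiency(-393.5, 2.9, T)
    print(f"{T:6.0f} K  H2: {h2:.2f}  C: {c:.2f}")
```

For hydrogen the ceiling drops from roughly 95% at room temperature to roughly 77% at 1000 C (about the 3/4 ratio noted above), while for carbon it stays at essentially 100% at both temperatures. (It can even nudge slightly above unity, since the small positive ΔS lets the cell absorb a little heat from its surroundings.)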

The first such direct carbon fuel cell was built more than a century ago, in 1896, by William Jacques. No one could duplicate his results for four decades (possibly due to the role of titanium impurities in the steel he used for the cathode), but there is now an active research program exploring several different electrolyte chemistries. DOE has been conducting periodic workshops on DCFCs showing the progress in the field.

The results so far are promising. Maximum efficiencies demonstrated in the lab are in the mid-80-percent range, and practical efficiencies of 70-75% (about twice that of coal-fired steam plants) appear possible. Areal power densities good enough for practical application have already been achieved in the lab. A DCFC plant, unlike an IGCC or molten carbonate fuel cell plant, does not require bottoming cycles. Nor would it require precious metal electrocatalysts, as low temperature fuel cells do. The capital cost per kW of a DCFC may be less than that of a conventional steam plant of the same capacity. As an added bonus, the carbon dioxide from a DCFC comes off in a relatively pure stream, so it will be easier to sequester than the CO2 from a conventional plant, which is mixed with nitrogen, argon, and unconsumed oxygen. NOx and SOx emissions are negligible. Mercury emissions can be largely eliminated.

What's the catch? You'd want a DCFC to burn coal, but coal is loaded with non-combustible material. So it's necessary to clean the ash out of the coal before using it as fuel. This appears to be a solvable problem, with several different approaches (mechanical separation, digestion of silicates with fluorosilicic acid, cleaning with organic solvents at elevated temperature) being considered. The cleaner the carbon, the less often the electrolyte will have to be changed. Fortunately, cheap electrolytes like sodium/potassium carbonate appear to work.

I'm wondering if it would be possible to design a system where the electrolyte also acts as a CO2 sink. That is, add a component that irreversibly binds to CO2, and flow the electrolyte through the stack, disposing of the mineralized CO2 by burial. It might also be possible to combine coal cleaning with side processing steps to extract useful elements from the ash.