Tuesday, November 30, 2010

Philosophical and Psychological Underpinnings of Modern Peak Oil

Profile Books

The fear of resource scarcity is very old. Much of the modern fear of resource scarcity centers on "Peak Oil" -- a philosophy, a psychology, a way of life for many adherents. Author Duncan Clarke has forty years of experience looking at economics, geopolitics, and the oil industry. In the book "Battle for Barrels", Clarke looks at several aspects of the modern peak oil belief system -- including some of its forerunners:
During his term in office in the 1970s, President Jimmy Carter had come to the conclusion that all proven oil reserves would be used up by the end of the next decade. In 1920, the USGS had put world oil reserves at a mere 20 billion barrels. In the early 20th century there were regular predictions of oil famine: in 1914, the US Bureau of Mines said that America would run out of oil in ten years. And back in 1855 the rock oil found in Pennsylvania had been predicted to disappear, the victim of reserve depletion. The peaks have been shifting for a long time now.

The gloomy, pessimistic 1970s, in particular, proved fertile soil for the growth of doomsday scenarios, such as the Club of Rome and the Limits to Growth model, impending population bombs and the collapse of capitalism. The publication in 2005
of Limits to Growth: The 30-Year Update is a reminder that our era is in some ways comparable. It is unsurprising that in our troubled and uncertain times a theory such as Peak Oil should flourish. Angst feeds on angst. _Excerpted from Chapter 8, Battle for Barrels
The first part of Clarke's book deals largely with the philosophical and psychological underpinnings of the peak oil movement. In that sense, it differs from other critiques of peak oil written by economists, environmentalists, geologists, and oil company executives. It is quite useful in that regard, since people who choose to join movements-of-doom such as Peak Oil have an essential psychological need which is being met by their membership in the group. Authors who argue strictly on the level of reserves, technologies, and substitutions will not address this crucial underlying aspect of the mass movement.

Clarke does look at some of the numbers -- although not until the later chapters -- and his strongest insights into the phenomenon have to do with the psychological aspects.

In this regard, Clarke's book is an essential piece of the total puzzle as revealed by a number of Peak Oil critics and observers.

Here is another look at the book:
The Battle For Barrels provides fresh, comprehensive and seasoned analytical insights, with polemical reflections on the Peak Oil debates (including its architects, adherents, allies, activists, and even critics), to counterpoint these alarmist ideas.

The book deconstructs the Peak Oil model and its pre-ordained determinism, explaining why its morbid outlook on reserves and discovery potential does not reflect a world terrain still much unexplored and under-exploited. The real circumstances and complexities that shape the present and likely oil future are elaborated, with new insights on the world oil game. Hence the global angst aroused by Peak Oil, the passions evoked, plus the apocalyptic meanings held by its diverse constituencies, can be seen as excessive and built on shallow foundations. There is no “skeleton in the oil kitchen”, such that human ingenuity and imagination cannot resolve future dilemmas. The divergence between Peak Oil Theory and the real oil world is stark. The essential crisis inside Peak Oil reflects a condition alike some Twilight In The Mind. _Petro21

So you see, Battle for Barrels is not a technical numerical or technological argument, so much as it is an insightful look into the thinking of Peak Oil. I will continue to provide excerpts from various books dealing with this issue, and will try not to step on the toes of any authors or publishers in the process.


Monday, November 29, 2010

Coming Multi-Trillion-Dollar Industry to Disrupt Fossil Fuels In Time

Greg Mitchell, a researcher at the prestigious Scripps Institution of Oceanography, expects seaweed to become a multi-trillion-dollar industry -- sometime after a ten-year developmental period. This new industry should disrupt the use of fossil fuels, according to Mitchell.

According to the Biomass Handbook, cultivated seaweed can yield close to 130 tonnes per ha per year. Fast-growing willow may yield just above 10 tonnes per ha per year. And miscanthus grass can yield 15 tonnes dry mass per ha per year. Giant King Grass (PDF) may produce five times the yield of miscanthus, or more, in tropical climates.

Those are rough figures which are subject to change as faster-growing strains are developed via several means. Besides rapid growth and harvesting up to six times a year, seaweed takes advantage of large areas of the Earth's surface which cannot be used to grow trees or grasses for biomass. Seaweed essentially doubles the available biomass growing area -- or more -- which throws conventional calculations for biomass potential out the window. Some problems must be worked out, but by the time humans truly need the massive quantities of biomass they can get from seaweed (and special grasses and trees), the problems should have been solved.
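To make the comparison concrete, here is a minimal Python sketch lining up the yields quoted above. The Giant King Grass entry assumes the full "5 X miscanthus" multiplier; all figures are the rough quoted values, not measurements.

```python
# Rough annual dry-mass yields (tonnes per hectare per year) from the
# figures quoted above. The Giant King Grass entry applies the "5 X
# miscanthus" multiplier as an assumption.
yields_t_per_ha = {
    "cultivated seaweed": 130,
    "fast-growing willow": 10,
    "miscanthus": 15,
    "Giant King Grass (tropical)": 15 * 5,
}

# Print each crop's yield and its multiple of the willow baseline.
for crop, y in sorted(yields_t_per_ha.items(), key=lambda kv: -kv[1]):
    ratio = y / yields_t_per_ha["fast-growing willow"]
    print(f"{crop:28s} {y:4d} t/ha/yr  ({ratio:.0f}x willow)")
```

Seaweed's roughly thirteen-fold per-hectare advantage over willow is what makes the doubling-of-biomass-area argument more than a rounding error.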
Seaweeds, a macro form of algae, hold great promise because of their potential for very high yields and high oil production while thriving on non-arable land. Another benefit is that they grow well in saline water. Traditional crops will not excel in salt water, and in some areas of the country valuable agricultural land has been taken out of production due to high concentrations of salt.

But as all researchers know, not all algae is created equal. There are strains of seaweeds that hold great promise for bio-energy and others that hold great promise for producing other products such as high protein meals for replacing non-sustainable ocean-caught fishmeals in aquaculture and other animal diets.

In fact, many algae companies that began with the mission of producing algal fuels have now refocused on producing algae products for the pharmaceutical, plastics, health, and agricultural feed industries. For example, there are strains of seaweeds that UCSD-SIO has been studying that grow well inland and can be used to recycle artificial seawater and waste nutrients from chicken ranches or pig farms. Algae has also been used in farm fish operations from cleaning the ponds to providing feed.

...Yet with all the research focused on algae, there are still several major hurdles that need to be overcome before algal biofuels will become commercially viable. The cost of production must be significantly reduced, elite strains of algae and seaweeds optimized for fuel production need to be developed and test facilities need to scale up to large production areas of several hundred acres.

Mitchell believes the timeline for this to occur spans more than 10 years. To date, all research projects are small and need to be brought to commercial scale levels. “We need several hundred acre demos that would take three years to design, permit and build. Then we need at least two years to get data and improve design,” said Mitchell. “Then we’ll roll out commercial scale over the following five years. We can do all this now at pilot scale but it's not yet economically viable. So I see 10 years for this to be turned to economic viability.”

The result, Mitchell believes, will be a multi-trillion dollar industry that will disrupt the use of fossil fuels. _DomesticFuel


Nuclear Power Reliable, Cheap, Clean


One of nuclear power's most attractive features, for a large segment of the population, is that it does not emit CO2. Among all baseload energy sources, nuclear is the cheapest and lowest-carbon power source. With the coming of factory-made small modular nuclear plants, nuclear should get cheaper, more reliable, safer to operate, and quicker to install.
After analysing a wealth of peer-reviewed studies on market needs, technology
performance, life-cycle emissions and electricity costs, the researchers conclude that only five technologies currently qualify for low-emission baseload generation. Of these, nuclear power is the standout solution. Nuclear is the cheapest option at all carbon prices and the only one able to meet the stringent greenhouse gas emission targets envisaged for 2050.

Only one of these five qualifiers comes from the renewable energy category – solar thermal in combination with heat storage and gas backup. However, on a cost basis, it is uncompetitive, as are the carbon capture and storage technologies.

Professor Barry Brook, director of climate science at the University of Adelaide’s Environment Institute says: “I am committed to the environment, personally and professionally. The evidence is compelling that nuclear energy must play a central role in future electricity generation. No other technology can meet our demand for power while reducing carbon emissions to meet global targets”. _BraveNewClimate_via_NextBigFuture

More informed energy analysts understand that the true strengths of nuclear power far transcend the feature of no carbon emissions. But if low CO2 emissions can be used as a bargaining point with the carbon hysterics in government regulatory agencies, then by all means, let's do it.

CO2 is a well-mixed atmospheric gas. Plants and algae do not care where their CO2 comes from -- whether China or India -- and will not begrudge humans their more sustainable nuclear power, even if it means less CO2 for them.


Saturday, November 27, 2010

29th Carnival of Nuclear Energy Blogs

Here is the link to the full Carnival of the Nukes #29. H/T Brian Wang

Here is a short excerpt:
At Atomic Insights, Rod Adams writes that he is surprised just how long it takes the advertiser-supported media to recognize an important story. For instance, this morning MSNBC and Bloomberg had both noticed that Westinghouse had transferred 75,000 documents relating to the design and construction of AP1000 nuclear reactor plants to China. One of those sources linked to a November 23, 2010 Financial Times report titled US group gives China details of nuclear technology.

Neither one of them linked to a June 2007 article titled China may export technology learned by building modern reactors that warned about the implications of a signed technology transfer agreement that was an integral part of Westinghouse's sale of four AP1000s in March of 2007.

U.S. missing the boat on the nuclear renaissance

Areva North America: Next Energy Blog advises readers to turn their attention to a brilliant piece published in The American Spectator’s October issue, in which William Tucker synthesizes one of the biggest issues surrounding the Nuclear Revival -- the United States is not part of it.

Not only is the nation lagging behind on construction, training, and investments for a technology that provides huge amounts (1,000+ megawatts) of carbon-free energy, but the country has no clear outlook for when it will break out of this quagmire holding back the development of future energy security.

TVA could buy a six pack of nuclear power

While work on big reactors is lagging, at CoolHandNuke the story is that the Tennessee Valley Authority (TVA) could be the first customer for B&W's 125 MW small modular reactors (SMRs) – six of them. At an estimated $4,000/kW, the purchase price would be $500 million each or a total of $3 billion for all six. TVA will evaluate the SMRs for its Clinch River site in Tennessee. If TVA decides to go forward, the first two units could be delivered by 2020. _Carnival29
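The quoted purchase price follows directly from the capacity and the per-kilowatt estimate; a quick sketch of the arithmetic:

```python
# Check the SMR cost arithmetic quoted above: 125 MW at $4,000 per kW.
capacity_kw = 125 * 1_000          # 125 MW expressed in kW
cost_per_kw = 4_000                # estimated dollars per kW
unit_cost = capacity_kw * cost_per_kw
fleet_cost = 6 * unit_cost         # the TVA "six pack"
print(f"per unit:  ${unit_cost / 1e6:.0f} million")   # $500 million
print(f"six-pack:  ${fleet_cost / 1e9:.0f} billion")  # $3 billion
```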

Brian Wang at NextBigFuture takes a look at Chinese efforts to build better and cheaper nuclear reactors than the French or the Americans can do.

The Chinese engineering efforts to improve reactor size, efficiency, and costs are most commendable. It is not likely that those cost reductions could be transferred outside of China itself, except perhaps to third world countries. The building of nuclear reactors in third world countries, however, is not advisable unless the mean population IQ is at least 90. If China violates that proviso, it will be guilty of causing considerable death and hardship within those unlucky countries which purchase its nuclear product without also paying the Chinese to operate all phases of the plant's operation and security.


Sustainable Fossil Fuels?

We introduced Mark Jaccard in the previous posting. The Vancouver-based economist has written a book-length analysis of the global energy dilemma entitled Sustainable Fossil Fuels: The Unusual Suspect in the Quest for Clean and Enduring Energy. In the book, Jaccard takes a close look at the range of alternative energy sources for powering the world in the 21st century, and comes to some interesting and perhaps unconventional conclusions. A few paragraphs from Chapter 5 are excerpted below giving some assessments of some world energy resources.
The world coal resource is estimated at over 7 trillion tonnes or 200,000 EJ, of which 80% is hard coal. Over half of this is concentrated in the countries of the former Soviet Union, especially Russia. North America, Western Europe and China also have significant resources.

Coal reserves are substantial compared to our current use rate. If coal consumption continued at its current rate of 100 EJ per year, reserves might last 210 years and the estimated resource 2,000 years. These long timeframes would decline of course if the exploitation rate were to increase or if not all of the resource were ever to become technologically or economically accessible. In my current trends scenario (table 2.1 in chapter 2), annual coal production increases six-fold.

...The World Energy Assessment provides an estimate of unconventional oil reserves as 5,000 EJ and the total unconventional resource as 20,000 EJ. Together, the estimated conventional and unconventional oil resources are 11,000 EJ for reserves within a total resource base of 32,000 EJ. If global oil consumption continued at its current annual rate of 163 EJ, currently estimated reserves would last sixty-seven years and the estimated resource 200 years.

...Combining conventional and unconventional gas yields total gas reserves of 15,000 EJ and a total gas resource of 49,500 EJ. If global natural gas consumption continued at its current annual rate of 95 EJ, reserves would last 160 years and the resource 520 years. _SustainableFossilFuels Chap 5

There is a great deal more to the book than a few resource assessments -- which the author handles with considerably more sophistication than the few isolated excerpts above might suggest.
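Those reserves-to-production lifetimes follow from simple division, assuming constant consumption. In the sketch below, the coal reserve figure (21,000 EJ) is back-computed from the quoted 210 years at 100 EJ per year, since the excerpt gives the lifetime rather than the reserve:

```python
# Reserves-to-production lifetimes implied by the excerpted figures.
# All quantities in exajoules (EJ); consumption in EJ per year.
resources = {
    # fuel: (reserves_EJ, total_resource_EJ, annual_use_EJ)
    "coal":        (21_000, 200_000, 100),  # reserves back-computed from "210 years"
    "oil":         (11_000,  32_000, 163),
    "natural gas": (15_000,  49_500,  95),
}

for fuel, (reserves, resource, use) in resources.items():
    print(f"{fuel:12s} reserves last ~{reserves / use:.0f} yr, "
          f"resource ~{resource / use:.0f} yr")
```

The oil figure of roughly 67 years and the gas figure of roughly 160 years match the excerpt, which is a useful sanity check on the corrected EJ units.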

We are being threatened by doomers of all sorts with a near-term apocalypse via cataclysmic resource scarcity. Their pornographic portrayals of doom adorn most print and electronic media outlets on a regular basis. And yet these modern-day predictions of doom are not really different from the badly failed predictions of doom from the 1960s and 1970s. And those 20th century predictions of doom were little different from similar predictions from the 1800s and 1700s. And so it goes, backward through time. The apocalyptic instincts of humans-with-time-on-their-hands left their mark extending well back into the early historical periods.

Fortunately, thinkers such as Jaccard are willing to publish well-reasoned and documented assessments which contradict the messages of doom that buffet us about on a daily basis.

It is likely that even in a future of abundant fusion energy, home-scale molecular nano-fabs, and inter-stellar colonisation based upon hyper-space drives -- even then there will be doomsayers who will be virtually indistinguishable from the modern prophets of doom.

Those of us with things to accomplish will pay them all the attention they are due, in the course of doing what we must.


Friday, November 26, 2010

A Lot of Life Left in Fossil Fuels: Economist Mark Jaccard

Mark Jaccard is professor of resource and environmental management at Simon Fraser University, in Vancouver. He thinks that the planet's massive reserves of coal can be used cleanly and responsibly for a long, long time.
Today we burn coal. But we could gasify it instead, using decades-old technology deployed in South Africa, a legacy of apartheid-era restrictions on crude oil imports. Rather than making gasoline, however, we could add extra steam to produce a hydrogen-rich gas, and then scrub it with a solvent to extract its carbon dioxide, the greenhouse gas we do not want to enter the atmosphere.

The resulting hydrogen could be burned to produce electricity, or piped to industrial plants, buildings and vehicles for use in fuel cells. Sulphur, mercury and other coal residuals could be captured and converted into useful products. The carbon dioxide could be injected into old oil and gas reservoirs, enhancing their output by 30 per cent, or into deep saline aquifers for permanent storage.

...we are not about to run out of fossil fuels. While doomsayers decry the peaking production of ‘conventional’ crude oil, experienced energy experts calmly assess the technical and economic potential of substitutes. They note that when the price of crude oil is above $35 (£20) per barrel - and today it is $60 [Editor: As of November 2010, at $80 plus] - alternatives such as oil sands from Canada, natural gas from Qatar, coal from South Africa and biomass from Brazilian sugar cane can profitably produce oil products such as gasoline and diesel. Even with growing consumption, fossil fuels could last hundreds of years, given the global resources of coal and unconventional natural gas deep in the earth and frozen below the oceans. This evidence contradicts the claims of doomsayers that every spike in oil prices portends imminent resource exhaustion. _Spiked

As long as the price of oil is above $70 a barrel, a host of unconventional hydrocarbons will be affordable and attractive.

Venezuela's huge resources of heavy oils, for example, would be instantly and relatively cheaply accessible if you built some heavy-duty nuclear reactors nearby. The abundant heat and power from nuclear reactors are a perfect match to the challenge of breaking down and using heavy oils, oil sands, oil shales, and other stubbornly resistant hydrocarbons.

Heavy oils, oil shales, oil sands, and other unconventional hydrocarbons are present in abundant quantities the world over. We haven't begun to look for them because they have always been too expensive to develop and use. Soon, that will no longer be the case.

Of course, nuclear energy can also be used to make hydrocarbons from CO2 and H2O, or to facilitate the creation of fuels, chemicals, and plastics from biomass. The advantage of biomass over hydrocarbon deposits, is that biomass can be grown virtually anywhere on the surface of land or sea.


High Value Aromatics and Olefins from Tuneable Pyrolysis

UMass-Amherst has been a busy hive of research into the thermochemical production of biomass-derived fuels and chemicals. Here is a report on a "tuneable", high-yield approach to the production of bio-hydrocarbons -- including high value fuel additives and other possible high value chemicals.
In the new UMass approach, the hydroprocessing increases the intrinsic hydrogen content of the pyrolysis oil, producing polyols and alcohols. The zeolite catalyst then converts these hydrogenated products into light olefins and aromatic hydrocarbons in a yield as much as three times higher than that produced with the pure pyrolysis oil.

The yield of aromatic hydrocarbons and light olefins from the biomass conversion over zeolite is proportional to the intrinsic amount of hydrogen added to the biomass feedstock during hydroprocessing. The total product yield can be adjusted depending on market values of the chemical feedstocks and the relative prices of the hydrogen and biomass, the researchers said.

The integrated catalytic approach presented in this report can be tuned to produce different targeted distributions of organic small molecules that fit seamlessly into the existing petrochemical infrastructure. The products can be tuned to change with different market conditions. The C6 to C8 aromatic hydrocarbons can be high-octane gasoline additives or feedstocks for the chemical and polymer industries. The C2 to C4 olefins can also be used directly for polymer synthesis or can be modified to form other products, including alkylated aromatics and longer linear alpha olefins. The gasoline-range alcohols can be high-octane gasoline additives. The C2 to C6 diols can serve as feedstocks for the chemical and polymer industries. The chemical industry relies on seven primary building blocks that are all derived from petroleum-based processes: benzene, toluene, xylene, ethylene, propylene, 1,3-butadiene, and methanol. Our catalytic process produces five of these seven petrochemical feedstocks, which opens the door to a chemical industry based on renewable biomass feedstock.
—Vispute et al. _GCC
Anellotech is one startup spun off from UMass Amherst to produce catalytic pyrolysis fuels. There should be many others, as the chemical engineering program releases new batches of well-trained and ambitious chemical engineers every year.

Pyrolysis, gasification, and other thermochemical approaches to advanced biofuels and chemicals are likely to surpass fermentation fuels such as maize ethanol. But the ethanol industry is not standing still: it is developing ways to reduce energy requirements and to expand its feedstocks to include cheaper biomass. Butanol -- a superior biofuel -- can be fermented from the same feedstocks as ethanol, once researchers develop better microbes -- so don't count out fermentation just yet.

But the thermochemical approaches championed by UMass-Amherst have a big advantage. They are quick, easily scaled, and can provide a broad range of fuels, fuel additives, high value chemicals, soil additives, fertilisers, and animal feeds. They are expensive, of course, and will require high oil prices to make them profitable.

But let's be brutally honest. Humans do not actually need these biomass to fuels approaches right now -- and won't need them for a few decades yet. But it is better to develop the technologies and have them available for when they are needed, if you can.

There are plenty of reserves of coal, gas, crude oil, and other hydrocarbons to see us through a long time yet -- in a very clean and environmentally responsible manner. But carbon hysterics rule the land at this time, and what is worse, these carbon hysterics are also dragging their feet in approving new nuclear technologies. So that leaves us with only hydro, geothermal, and biomass / biofuels to oppose the energy starvationists and their suicidal wind / solar agendas.

As long as carbon hysterics, energy starvationists, anti-nukes, and dieoff.orgiasts control the governments and inter-governmental institutions, we had best go as fast as we can in developing fuels and energy from biomass.


Thursday, November 25, 2010

US Natural Gas Supplies Exploding Without Apparent End

Gas abundance is not limited to the U.S. The latest World Energy Outlook report by the International Energy Agency (IEA) released in early November basically threw the entire natural gas market under the bus by predicting the glut will worsen next year and last for 10 years, which will only fade gradually as demand rises strongly in China. _Chu
You can follow US shale plays at the Shale Blog and at Shale News. One interesting recent find that was mentioned in the recently televised documentary "Haynesville", is that the already huge Haynesville shale deposit was compounded by another huge deposit discovered in a slightly shallower layer of rock. Haynesville's portion of the graph above may continue to grow as a result.

Something has to be done with all of this gas production, so for now a move is on to create LNG facilities for export. Export of LNG to the UK has already begun.
While several import facilities were planned and built (before unconventional gas even came into the picture) in anticipation of high LNG imports in the coming decades, the U.S. has very limited LNG export capability. That could be about to change.

There are two LNG export facilities announced this year--Freeport LNG and Australia's Macquarie Bank have agreed to build one in Texas to export 1.4 billion cubic feet per day of gas, and Cheniere Energy's will be built at its Sabine Pass site to export 16 million metric tons per year. Both plan to produce and export LNG by 2015. _Chu
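The two projects are quoted in different units. Converting with the industry rule of thumb of roughly 48 billion cubic feet of gas per million tonnes of LNG (an assumed factor, not a figure from the article) puts them on a common footing:

```python
# Compare the two announced LNG export projects in common units.
# The conversion factor (~48 bcf of gas per million tonnes of LNG)
# is a rough industry rule of thumb, assumed here.
BCF_PER_MT_LNG = 48.0

freeport_bcf_per_day = 1.4     # quoted for Freeport LNG / Macquarie
cheniere_mt_per_year = 16.0    # quoted for Cheniere Sabine Pass

freeport_mt_per_year = freeport_bcf_per_day * 365 / BCF_PER_MT_LNG
cheniere_bcf_per_day = cheniere_mt_per_year * BCF_PER_MT_LNG / 365

print(f"Freeport: ~{freeport_mt_per_year:.1f} Mt/yr")
print(f"Cheniere Sabine Pass: ~{cheniere_bcf_per_day:.1f} bcf/day")
```

By that rough conversion, the Sabine Pass project is about half again the size of the Freeport project.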

Besides converting the unconventional gas bonanza to LNG for export, gas to liquids may eventually become more feasible on the US mainland, after the departure of Salazar, Obama, Boxer, and the rest of the energy starvation reich.


Wednesday, November 24, 2010

Perhaps the First Rolling Pebble in the Coming Avalanche

The Earth has enough hydrocarbon fossil fuels to carry it past the middle of the 21st century, even if the global economy does not suffer one (or several) large collapse. It is unlikely that we will ever reach peak hydrocarbon energy from a production standpoint, given the rapid development of alternative and sustainable fuels -- and given the rapid advance in nuclear fission technologies. [Editor: Contrary to what many peak oilers assert, nuclear power can be used to produce liquid fuels in several ways. Abundant, reliable, baseload nuclear power would spell the end of liquid fuels shortages, even without the microbial fuels revolution.]

This article is about a radical framework for microbial biofuels which is likely to displace the use of liquid petroleum in the advanced world, although liquid petroleum is likely to be utilised for several purposes in areas of large, easily accessible deposits well into the 22nd century.

The basis for a system of abundant microbial fuels is, of course, the energy from the sun. Solar energy facilitates the microbial conversion of CO2 and H2O into carbohydrates, which are further converted (probably by yet other microbes) into hydrocarbons -- using H2 derived from various sources including biomass. Here is one possible starting point for this incredibly inefficient -- yet incredibly lucrative -- coming system of liquid fuels:
Proterro is using a form of genetically modified cyanobacteria. Here’s what’s different – its system derived from the operations of the humble tree leaf.

The Proterro approach is not to produce sugars in an aqueous phase – in water – but in a thin film bioreactor where the cells are at the surface of a fabric that transports water and nutrients to the organism.

...Proterro borrows from the approach utilized by the leaf. Leaves can transport water and nutrients to a surface layer where CO2 and sunlight are being absorbed. The organism secretes, or “sweats” a sucrose solution, which is then collected using gravity.

This is the key difference between most algal-based technologies – which also aim to synthesize low cost sugars from sunlight, CO2 and water – and Proterro. The approach radically reduces the difficulties of getting the water out of the algae, or the algae out of the water. And by radically reducing the amount of water, it radically reduces the land footprint of the overall system.

... Proterro expects that the process will work in relatively northern climates well north of the sugar cane belt. That’s more significant than might be gleaned at first glance.

Think of all the CO2 at all those first-generation corn ethanol plants. All that can be utilized, along with water and sunlight, to generate simple sugars on site that can be fermented into ethanol. That’s without corn, without land use change, and reducing the potential impact of waste CO2. How much could an existing corn ethanol plant boost capacity by using its own CO2 and a Proterro-like microorganism? Somewhere in the range of 25-50 percent, according to the earliest, unconfirmed estimates by the company.

These are not insignificant opportunities — that could add 3-6 billion gallons to US corn ethanol production, without additional land or corn usage, and without expanding the current fleet of plants.

...How does it differ from Joule? In some ways, not much at all — the magic bug produces a simple sugar instead of a hydrocarbon, but otherwise would have some of the same elements of a modular production system that could be proved at extremely small scale, and uses a modified cyanobacteria.

To date, they’ve scaled up from the lab bench to a square meter system – and on their shopping list is a larger base module. From the business perspective, they are worlds apart – ultimately, Joule is a fuel and chemicals solution; while the Proterro approach works on the feedstock end, with a wide potential swath of partners, ranging from those seeking low-cost sugars for their own microbes, to companies that are looking to convert their waste CO2 from cost centers to revenue streams....

Proterro is at an early stage. In their development process, they have yet to optimize the flow rates of water – for the amount of hydrogen required to make sugars is relatively small. They’ve been able to prove that the magic bugs will do their magic, but the design of the reactor and the optimization of rate and yield is among the hard work ahead. _BiofuelsDigest
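The "3-6 billion gallons" figure in the excerpt is consistent with the quoted 25-50 percent boost if one assumes a baseline of roughly 12 billion gallons of US corn ethanol output; the baseline is our assumption, not a figure from the article:

```python
# Sanity-check the "3-6 billion gallons" claim against the quoted
# 25-50 percent capacity boost. The ~12 billion gallon baseline for
# 2010 US corn ethanol output is an assumption, not from the article.
baseline_bgal = 12.0
low, high = baseline_bgal * 0.25, baseline_bgal * 0.50
print(f"implied added output: {low:.0f} to {high:.0f} billion gallons")
```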

In other words, using very little water, the Proterro bacterium converts (concentrated) CO2 to sucrose -- the same sugar refined from cane. As efficiencies and yields improve, the idea is to feed the sucrose to fermentation micro-organisms capable of converting the sugar to fuels -- preferably butanol or hydrocarbons. The product could also be converted into plastics, lubricants, chemicals, etc.
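A simple stoichiometric check shows how much CO2 such a process could soak up at best: if every carbon atom in the sucrose (C12H22O11) comes from captured CO2, the mass balance works out as below. This is an upper bound from basic chemistry, not a process figure from Proterro:

```python
# Mass of CO2 fixed per tonne of sucrose (C12H22O11), assuming all
# twelve carbons come from captured CO2 -- a stoichiometric upper
# bound, not a measured process figure.
M_C, M_H, M_O = 12.011, 1.008, 15.999          # atomic masses, g/mol
m_sucrose = 12 * M_C + 22 * M_H + 11 * M_O     # ~342.3 g/mol
m_co2 = M_C + 2 * M_O                          # ~44.0 g/mol

co2_per_sucrose = 12 * m_co2 / m_sucrose       # mass ratio
print(f"~{co2_per_sucrose:.2f} tonnes CO2 per tonne of sucrose")
```

So every tonne of sucrose "sweated" out of the bioreactor could, at best, account for roughly a tonne and a half of waste CO2 from a corn ethanol plant.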

It is likely to become something of a multi-microbial assembly-line approach. The microbes must be both tough and prolific. The production system must be able to move reactants, intermediates, and products briskly along the line without wasting time at any particular stage of processing.

Yes, of course this will take time to put together so that it works economically and at scale. Thermochemical approaches are currently much closer to markets, and are likely to make a splash much sooner than microbial fuels.

But we are not talking about many decades here -- certainly not the 90 years or longer that mainstream analysts have casually tossed around in the press. Al Fin energy analysts predict 5 years to significant, small scale impact by thermochemical approaches (gasification, pyrolysis, etc). Scale-up of thermochemical fuels will be quite rapid after that, as long as governments do not step in to cause political energy starvation and political peak oil.

It will take roughly 10 years before early impact in the market by microbial approaches. Within 20 years, there will be no doubt as to the future of liquid fuels. But for now, it requires the ability to follow multiple lines of development simultaneously to see what is coming.

Microbial fuels will have higher efficiencies than thermochemical approaches, due to the lower energies required to drive the critical reactions. But microbes will not be the final word in liquid fuels. Acellular enzymes and non-biological catalysts that function efficiently at similarly low temperatures will probably be substituted for the microbial photosynthesis, fermentation, and hydrocarbon synthesis steps.

And then, finally, with abundant and reliable fission, direct electrolytic and catalytic conversion of sunlight, water, and carbon dioxide to whatever hydrocarbon you wish. That will not occur on a large scale for another 30 to 50 years.

Al Fin futurists predict that the global economy will go through multiple quasi-collapse experiences over the next half century. These economic catastrophes will be the direct result of government policies -- some of them guided by left-Luddite carbon hysteric energy starvation, and dieoff.orgiasm. As noted above, the hydrocarbons are present to carry civilisation for several decades -- even without the likely collapses to come.

Electrical ground vehicles will slowly -- over several decades -- displace the vast worldwide infrastructure of internal combustion vehicles, so that microbial and thermochemical fuels-from-biomass production can be shifted toward chemicals, materials such as plastics, animal feeds, fertilisers, etc.

The only shortages are the shortages of human ingenuity, rationality, and wisdom. Such shortages cannot be solved for humanity as a whole, regrettably. But they will be solved for enough.

Labels: ,

Tuesday, November 23, 2010

A Seep Here, A Seep There . . . Before You Know It, Your Entire Society Is Running on Gas

The world is swimming in hydrocarbons. One modest sign of the richness of shale gas deposits is this waterfall in western New York state, where a natural methane seep provides an eternal flame for hikers. US shale gas has become an amazing economic and geopolitical phenomenon, with money to be made across large swathes of the lower 48 states. Homeowners who heat their homes with gas have also had a lot of reasons to be happier about their energy bills over the past two years or so.
_Waterfall Photo: Jessica Ball

Just one of the amazing "gold rush" shale gas deposits in the US will be highlighted in a television documentary on CNBC on November 23, 2010.
Oil vs Gas: price per MMBTU

Natural gas is significantly cheaper than oil, per unit of heat energy, as seen in the graph above. Someone who could economically convert gas to liquids (GTL) might be able to take advantage of that price difference and make a lot of money.
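The arithmetic behind that price gap is straightforward: a barrel of crude contains roughly 5.8 MMBTU of heat energy, so dividing the oil price by that factor puts oil and Henry Hub gas on the same $/MMBTU footing. A minimal sketch -- the prices below are hypothetical round numbers for illustration, not readings from the graph:

```python
# Compare oil and natural gas on a common $/MMBTU basis.
# Prices are illustrative placeholders, not actual quotes.

OIL_MMBTU_PER_BARREL = 5.8  # approximate heat content of a barrel of crude

def oil_cost_per_mmbtu(usd_per_barrel):
    """Convert an oil price in $/bbl to $/MMBTU of heat energy."""
    return usd_per_barrel / OIL_MMBTU_PER_BARREL

oil = oil_cost_per_mmbtu(85.0)  # e.g. oil at $85/bbl
gas = 4.00                      # Henry Hub gas is already quoted in $/MMBTU
print(f"oil: ${oil:.2f}/MMBTU, gas: ${gas:.2f}/MMBTU, ratio: {oil / gas:.1f}x")
```

At those assumed prices, oil costs more than three times as much as gas per unit of heat -- which is exactly the spread a profitable GTL plant would arbitrage.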

Robert Rapier recently highlighted the Shell Oil GTL plant in Malaysia, and made reference to the larger Shell GTL plant to be completed in 2011 in Qatar. If the price of oil continues to be much higher than the price of gas -- in energy units -- such GTL conversion plants could well pay off.

Given the large amount of natural gas which is flared into the atmosphere every year, some intriguing new approaches to on-site gas-to-liquids conversion at the wellhead -- including offshore wells -- may offer a profitable income stream for smaller producers and individual wells.

There is evidence that natural gas is constantly being generated deep beneath the Earth's crust -- inside the hot mantle. We do not yet know how much of that gas penetrates into the crust to the point of economic extraction by humans, but it is likely to prove significant, in the opinion of Al Fin energy analysts.

Published earlier at Al Fin under a different title.


Monday, November 22, 2010

Safety and Security Advantages of Small Nuclear Reactors

Ever since Chernobyl, much of the public has been afraid of nuclear power plants. No nuclear plant in Western Europe, North America, or Oceania is designed as badly or operated as irresponsibly as Chernobyl was -- so nothing like the Chernobyl disaster is in the offing -- yet the public is still concerned.

Along come small modular nuclear fission reactors (SMRs), which are even safer and more secure than modern western designs. The public should feel better about nuclear power as it learns more about SMRs. And government regulatory agencies should feel much better about licensing the new SMR designs, given their improved safety and security. Here is more from John Wheeler:
small modular reactors offer several big advantages that make them safer:

They are smaller, so the amount of radioactivity contained in each reactor is less. So much less, in fact, that even if the worst-case reactor accident occurs, the amount of radioactive material released would not pose a risk to the public. In nuclear lingo we say SMRs have a smaller “source term.” This source term is so small we can design the plant and emergency systems to virtually eliminate the need for emergency actions beyond the physical site boundaries. Then, by controlling access to the site boundary, we can eliminate the need for off-site protective actions (like sheltering or evacuations).

These smaller reactors contain less nuclear fuel. This smaller amount of fuel (with passive cooling I’ll mention in a minute) slows down the progression of reactor accidents. This slower progression gives operators more time to take action to keep the reactor cool. Where operators in large reactors have minutes or hours to react to events, operators of SMRs may have hours or even days. This means the chance of a reactor damaging accident is very, very remote.

Even better, most SMRs are small enough that they cannot overheat and melt down. They get all the cooling they need from air circulating around the reactor. This is a big deal because if SMRs can’t melt down, then they can’t release radioactive gas that would pose a risk to the public. Again, this means the need for external emergency actions is virtually eliminated.

Also, some SMRs are not water cooled; they use gas, liquid salt, or liquid metal coolants that operate at low pressures. This lower operating pressure means that if radioactive gases build up inside the containment building there is less pressure to push the gas out and into the air. If there is no pressure to push radioactive gas into the environment and all of it stays inside the plant, then it poses no risk to the public.

SMRs are small enough to be built underground. This means they will have a smaller physical footprint that will be easier to defend against physical attacks. This provides additional benefits of lower construction costs because earth, concrete and steel are less costly than elaborate security systems in use today, and lower operating costs (a smaller footprint means a smaller security force).

In summary, small modular nuclear reactors offer potential safety and security advantages over larger commercial reactors because they can be designed (1) to have smaller source terms, (2) to have accident scenarios that progress more slowly, (3) to be meltdown proof, (4) to operate at lower pressures, and (5) to have smaller security footprints.

These safety and security advantages can result in considerable cost advantages. A large percentage of a nuclear plant’s operating expenses go into emergency planning and security. It is possible that four or five SMRs packaged together to provide the equivalent of a large nuclear unit could operate with a smaller staff size and lower costs. However, because existing rules were written for larger reactors, some changes to NRC regulations will be required for SMRs to take full advantage of their inherent safety and security features. There are groups already working on these changes.

These safety and security advantages offered by SMRs, when combined with lower initial capital costs, shorter construction times, and scalability, may tip the scales in favor of a new generation of small, factory built modular reactors. _ThisWeekinNuclear

SMRs are less expensive to build and install, with a much quicker installation period from start to finish. The main obstacle to SMRs -- besides public ignorance and left-Luddite opposition -- is the lazy laggardness of government regulatory agencies. US NRC bureaucrats are so slow to get off their fat asses that it may be ten more years before they license the first SMR -- despite a solid, decades-long safety record by SMR manufacturers who supply the US military.

Some SMRs are designed to go as long as 20 years between refueling. SMRs of the future are likely to be designed to go much longer.

Sometime between now and the next ice age, it would be nice if government bureaucrats would do their jobs so that we can at least stay warm and well-lit in our underground bunkers.


Saturday, November 20, 2010

Brian Wang Hosts Carnival of Nuclear Energy #28

The 28th edition of the Carnival of the Nukes is being hosted at NextBigFuture. Here is a small sample:

1. Canadian Energy Issues - Nuclear subsidizes gas and renewables in Ontario: an inquiry into the price of political correctness

Natural gas has been cheap so far this year, and the wind and sun are "free." In spite of this, Ontario gas-fired and wind/solar generators still can't operate without enormous cost recovery payments from Ontario electricity rate-payers. In this post, Steve Aplin demonstrates that the province's three nuclear plants are generating most of the electricity and hence most of the revenue that covers the cost recovery payments to gas and renewables.

2. Idaho Samizdat - Patrick Moore ratchets up the rhetoric

When Patrick Moore first got started with his Clean & Safe Energy Coalition to promote nuclear energy, his target was a nexus of green groups that opposed it. However, in an interview and in a recent speech to the nuclear industry in Cleveland, Moore came across as an astute analyst of the financial and technology issues which are emerging as far more formidable challenges to the nuclear renaissance.

3. ANS Nuclear Cafe submission to the carnival: Dan Yurman interviews Ambassador Hamad Al Kaabi, the United Arab Emirates (UAE) Permanent Representative to the International Atomic Energy Agency (IAEA).

In December 2009, the UAE awarded a $20 billion contract to a consortium of South Korean firms to build four nuclear reactors at a remote desert location along the Persian Gulf. The Ambassador, who has been personally involved in key milestones of the UAE's nuclear energy assessment, discusses the background of the UAE nuclear deal, the use of nuclear for desalinization, why nuclear was chosen for the energy path forward, why solar could not provide the necessary base load requirements, key factors in the contract award, and the 1-2-3 agreement with the United States.

The UAE new build is one of the fastest moving nuclear energy programs on the planet after China. Other countries will be following the UAE’s progress with interest to take home lessons learned from their experience.

4. Yes Vermont Yankee provides a post by guest blogger Cavan Stone, "Where Does Our Energy Come From?" He discusses the DOE Wind Integration Report and the lack of energy storage. After we get 20% of our electricity from intermittent renewables, where does the other 80% come from?

5. Nuclear Green has Solar Photovoltaics are not Competitive with Nuclear Power

An Energy from Thorium member, Cyril R, provided me with a link to a new web page that charts the performance of Germany's installed photovoltaic capacity. This link provides some measures of how well German PV is performing on a real-time and daily basis. For example, it is currently 2:28 PM on November 20, 2010. Germany's 15.17 GW of installed PV capacity is currently producing 1.8 GW of electricity, already well past its peak output for today. That 1.8 GW of electrical output is 12% of installed capacity, and it already represents a substantial drop from the system's maximum noon-time power output. Another graph on the same web page indicates that no power was generated before 7:30 AM German time this morning, and power output will be back to zero by 4:15 PM this afternoon. Thus electrical output from German PV is anticipated for less than a third of the day.
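The arithmetic in that report is easy to reproduce. A minimal sketch using only the figures quoted above (the code and framing are mine):

```python
# Reproduce the German PV figures quoted above: instantaneous output as a
# fraction of installed nameplate capacity, and the generation window length.

def output_fraction(output_gw, installed_gw):
    """Instantaneous output as a fraction of installed nameplate capacity."""
    return output_gw / installed_gw

frac = output_fraction(1.8, 15.17)  # about 0.12, i.e. ~12% of nameplate
print(f"{frac:.1%} of installed capacity")

# Quoted window: 7:30 AM to 4:15 PM. Output near the edges of the window is
# negligible, so useful generation occupies roughly a third of the day or less.
window_hours = (16 + 15 / 60) - (7 + 30 / 60)
print(f"generation window: {window_hours:.2f} hours")
```

And that 12% is the midafternoon figure; averaged over the full 24 hours, the effective capacity factor is far lower still.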

Wind and solar are almost total duds, in terms of "bang for the buck." Green advocates of wind and solar are making epochal fools of themselves by embracing carbon hysteria, and going so far out on the limb for wind and solar. Such ludicrous posturing as you see at their websites speaks ill of both their education and their breeding.... I could go on and on in that vein, but it simply feels too good to do so, and since I am currently observing the holy month of Lentadan (a convergence of Lent and Ramadan in my spiritual practise), I must desist.


Wednesday, November 17, 2010

Can Hydrocarbons Survive in the Hot, Pressured, Mantle?

The mantle is a dense, hot layer of semisolid rock approximately 2,900 kilometers thick. The mantle, which contains more iron, magnesium and calcium than the crust, is hotter and denser because temperature and pressure inside Earth increase with depth. Because of the firestorm-like temperatures and crushing pressure in Earth’s mantle, molecules behave very differently than they do on the surface. _Source
Earth's hydrocarbons are typically formed when organic matter is trapped in sediments on the bottom of Earth's oceans, seas, lakes, swamps, and bogs. Over a period of time, exposed to varying temperatures, pressures, and anaerobic conditions, the organic matter is transformed into hydrocarbons such as natural gas, peat, coal, and oil of various types.

Sediments trapped in oceanic crust (as opposed to continental crust) are subducted into the Earth's mantle over tens of millions of years -- and exposed to very high pressures and temperatures. Many geologists had presumed that any hydrocarbons that had not migrated out of these subducted sediments would be destroyed in the oxidising environment of the mantle. But a variety of research over the past several years suggests that not only can hydrocarbons survive the heat and pressure of the upper mantle -- new short-chain hydrocarbons may actually be created within the mantle.
... conventional geochemists argued that hydrocarbons could not possibly reside in Earth's mantle. They reasoned that at the mantle's depth—which begins between 7 and 70 kilometers below Earth's surface and extends down to 2,850 kilometers deep—hydrocarbons would react with other elements and oxidize into carbon dioxide. (Oil and gas wells are drilled between 5 and 10 kilometers deep.) However, more recent research using advanced high-pressure thermodynamics has shown that the pressure and temperature conditions of the mantle would allow hydrocarbon molecules to form and survive at depths of 100 to 300 kilometers. Because of the mantle's vast size, its hydrocarbon reserves could be much larger than those in Earth's crust. _LivermoreLabPDF

“The notion that hydrocarbons generated in the mantle migrate into the Earth's crust and contribute to oil-and-gas reservoirs was promoted in Russia and Ukraine many years ago. The synthesis and stability of the compounds studied here as well as heavier hydrocarbons over the full range of conditions within the Earth's mantle now need to be explored. In addition, the extent to which this 'reduced' carbon survives migration into the crust needs to be established (as in, without being oxidized to CO2). These and related questions demonstrate the need for a new experimental and theoretical program to study the fate of carbon in the deep Earth,” the expert adds. _Softpedia

Now for the first time, scientists have found that ethane and heavier hydrocarbons can be synthesized under the pressure-temperature conditions of the upper mantle -- the layer of Earth under the crust and on top of the core. The research was conducted by scientists at the Carnegie Institution's Geophysical Laboratory, with colleagues from Russia and Sweden, and is published in the July 26 advance online issue of Nature Geoscience. _Geology.com

So far there is no strong evidence that large quantities of economically important hydrocarbons are being generated within the mantle, with subsequent migration up into the crust -- where humans can access them. But it seems quite likely that new gaseous hydrocarbons do migrate from the mantle into the crust -- in some quantities -- and contribute to gas deposits of various types, including methane clathrates.

What is more interesting to me than the abiotic generation of hydrocarbons is the fate of billion-year-old hydrocarbons of biological origin which find their way into the upper mantle through geologic upheaval. No doubt some of these hydrocarbons will survive as medium-chain alkanes, although I suspect most will end up as methane or ethane. Some will get caught up in volcanic activity and be converted to CO2 -- or get ejected into the atmosphere or ocean as CH4. But what is the proportion of each product? How much will end up in a typical oil & gas "trap" in the crust where it can be economically extracted?

We will learn more about that over time. But between the abiotic gases and the truly ancient hydrocarbons that have survived the eons, it is likely that there is far more hydrocarbon in the deep Earth than geologists typically allow themselves to dream.

More: A rare, optimistic view of energy from the NYTimes

Labels: , , , , ,

Gasoline from Corn Cobs for $1 a Gallon? GE Energy Wants In


GE Energy, a GE subsidiary, has jumped into the advanced biofuels race by throwing in $8 million with the startup CoolPlanetBiofuels. The startup claims to be able to produce a bio-gasoline from rough biomass for about $1 a gallon. Here is more from GreenCarCongress:
$8-million funding round for CoolPlanetBioFuels, a start-up company developing a technology that converts low-grade biomass into high-grade fuels, including gasoline, and carbon that can be sequestered. This venture capital investment was led by North Bridge Venture Partners, which had also led CoolPlanet’s financing round last year. Additional financial details were not disclosed. CoolPlanet’s research and development facilities are located in Camarillo, CA.

CoolPlanetBioFuels is developing modular thermal/mechanical processors which directly input raw biomass such as woodchips, crop residue, and algae and produces multiple distinct gas streams for catalytic upgrading to conventional fuel components.

In support of the biomass fractionator, the company is also developing a range of one-step catalytic conversion processes which mate with the fractionator’s output gas streams to produce products such as eBTX (high octane gasoline), synthetic diesel and proprietary ultra-high crop yield “super” fuels.

At the GoingGreen Silicon Valley 2010 conference in October, Mike Rocke, CoolPlanetBiofuels VP Business Development, said that the startup could produce carbon-neutral gasoline from biomass for less than $1.00/gallon US.

Biomass throughput time in the biomass fractionator is minutes, Rocke said earlier at a conference at Stanford. Two fractionators in a module can produce one million gallons of gasoline per year, with capex of $0.50/gallon to install—i.e., $0.10/gallon over a five year life. _GCC
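The capex claim in that last paragraph checks out arithmetically. A quick back-of-envelope sketch using only the figures quoted (the code is mine):

```python
# Check the CoolPlanetBioFuels capex figures quoted above: a two-fractionator
# module producing 1M gal/yr at $0.50 per gallon of annual capacity installed,
# amortized over a five-year life.

gallons_per_year = 1_000_000        # per two-fractionator module (quoted)
capex_per_gallon_capacity = 0.50    # $ per gallon of annual capacity (quoted)
life_years = 5                      # quoted amortization period

module_capex = gallons_per_year * capex_per_gallon_capacity  # total $ per module
amortized = capex_per_gallon_capacity / life_years           # $ per gallon produced

print(f"module capex: ${module_capex:,.0f}")
print(f"amortized capital cost: ${amortized:.2f}/gal")
```

That is, a module costs about half a million dollars installed, and the capital component of the fuel cost is roughly a dime per gallon -- which is why the sub-$1.00/gallon claim hinges almost entirely on feedstock and operating costs, not capex.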

The image above shows a gas chromatograph comparison of conventional Shell 87-octane gasoline with the CoolPlanetBioFuels drop-in product made from biomass.

Whether or not the information provided to investors proves accurate, if the company can produce high-quality drop-in bio-gasoline with technology already developed, then improving efficiencies and yields and falling costs may make the product competitive within 5 or 10 years.

The problem with biomass is its low energy density and diffuse nature. It takes a lot of energy to densify biomass for transport, and to transport large amounts to a central processing facility such as CoolPlanetBioFuels'. It is not clear that those energy costs were figured into the amounts quoted to investors.

Thermochemical production of biofuels via pyrolysis and gasification has a natural head start on microbial fuels -- due to prior work done on other feedstocks. But if the thermochemical approach is to achieve a foothold -- and critical scale-up -- it cannot dally about while people such as Craig Venter work feverishly to genetically engineer microbes to achieve the same thing at far lower energy cost.

Labels: , ,

Tuesday, November 16, 2010

Another CO2 to Fuels Approach, Using SOEC Electrolysis

In a paper published in the journal Renewable and Sustainable Energy Reviews, researchers from Columbia University and the Risø National Laboratory for Sustainable Energy (Denmark) review the possible technological pathways for recycling CO2 into fuels using renewable or nuclear energy, considering three stages: CO2 capture; H2O and CO2 dissociation, and fuel synthesis.

The new review paper analyzes dissociation methods including thermolysis, thermochemical cycles, electrolysis, and photoelectrolysis of CO2 and/or H2O, and then identifies co-electrolyzing H2O and CO2 in high temperature solid oxide cells to yield syngas, and then producing gasoline or diesel from the syngas in a catalytic reactor (e.g. Fischer–Tropsch) as one of the most promising, feasible routes. _GCC
The authors claim that the process can be competitive with gasoline sold between US $2 and US $3 a gallon, depending upon the cost per kWh of electricity. In reality, for this process to be competitive, electricity would have to be extremely cheap -- which rules out wind and solar as power sources.
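To see why the electricity price dominates, consider the energy balance. A gallon of gasoline holds about 33.7 kWh of heat energy, so at any plausible power-to-fuel efficiency the electricity bill alone sets a floor under the fuel price. The 50% round-trip efficiency below is an assumed round number for illustration, not a figure from the Graves et al. review:

```python
# Sensitivity of synthetic-fuel cost to electricity price. The 50%
# power-to-fuel efficiency is an illustrative assumption, not a paper figure.

GASOLINE_KWH_PER_GAL = 33.7  # approximate heat content of a gallon of gasoline

def electricity_cost_per_gallon(usd_per_kwh, efficiency=0.5):
    """Electricity cost alone to synthesize one gallon-equivalent of fuel."""
    return usd_per_kwh * GASOLINE_KWH_PER_GAL / efficiency

for price in (0.02, 0.03, 0.05, 0.10):
    cost = electricity_cost_per_gallon(price)
    print(f"${price:.2f}/kWh -> ${cost:.2f}/gal (electricity alone)")
```

Under these assumptions, only power in the $0.02-0.03/kWh range keeps the electricity input near $1.50-2.00 per gallon -- cheap baseload fission territory, not intermittent wind or solar.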
In the review, Graves et al. examine the status of the enabling technologies for each stage, with special focus on the various thermochemical, electrochemical and photochemical energy conversion technologies that could be used for dissociation of H2O and CO2, the stage with the highest energy consumption. They noted that combining more than one stage into a single unit is possible, but there may be benefits to optimizing each stage separately.
With feasible technology development and mass production of the process components, CO2-recycled hydrocarbon fuels can be produced at the scale needed to replace transportation fuels at a price competitive with more conventional fossil-derived hydrocarbons, especially if oil and CO2 sequestration costs are high. The potentially greater sustainability of CO2-recycled fuels over fossil or biomass derived fuels, as well as independence from the geographic and supply related issues of conventional fuels, could also give CO2-recycled fuels a market advantage.
—Graves et al _GCC
The image below is a fanciful representation of some EU bureaucrat's reality-disconnected daydream, but it provides a simplified and readily grasped summary of the underlying idea. As long as you know that wind power is much too unreliable and expensive to work in this scheme, and that atmospheric capture of CO2 is completely unworkable economically, you will be prepared to look at the images above for slightly more realistic approaches.
Al Fin analysts have concluded that the only way such a scheme could work in the near to mid-term future, would be to combine a plant that creates significant concentrated CO2 with a dedicated modular fission plant for electric power, adding the components for SOEC electrolysis and F-T synthesis. Each step must be proven individually, then in an ensemble pilot project.

For this type of project, theory alone is not a sufficient basis for a significant investment. All of the concepts must be proven individually, and then together, before attempting to scale up.

Labels: ,

Monday, November 15, 2010

Who Needs Rare Earth Magnets? Not NovaTorque


Sunnyvale, California's NovaTorque has introduced three permanent magnet motors that use magnets costing only one fifteenth as much as neodymium magnets.
So how does NovaTorque do it? Take a look at the photo. The motor (foreground) consists of conical hubs (background) containing magnets separated by a tapered motor shaft (left). The hub is the component that looks like a space capsule and the shaft has the band of copper. The key here is that the interface between the magnetic surface of the hub and the shaft is diagonal, not flat like in most motors. A diagonal interface dramatically increases the surface area between the two, thereby increasing magnetic flux transmission (good) and reducing materials (also good). The magnets are the raised surfaces on the side of the conical hub.

Think of how baguettes get cut in restaurants: you can put more butter on diagonally cut bread than slices lopped off the top. The expanded surface area permits NovaTorque to switch to ferrite magnets.

NovaTorque also manages to reduce the amount of copper needed for the coils in the motor. Less copper, less cost.

Efficiency in the motor is greater than average due to the fact that the magnetic field is axial, i.e., it runs in an oval around the axle, instead of being radial, i.e., circumnavigating it. One advantage: the axial field means NovaTorque can use grain-oriented transformer-grade steel, which lowers eddy current losses and boosts efficiency. Higher efficiency should also result in fewer breakdowns: a large percentage of mechanical failures can be traced to ambient waste heat generated by motor inefficiencies.

The motor also pairs well with variable drives, and variable speed air conditioners are one of the top priorities in data center retrofits.

The company's Premium Plus+ motors are currently spec'd for refrigerators, HVAC systems, vacuum pumps and industrial equipment. But if the technology scales well to larger motors, other markets could open. _greentechmedia
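The geometry behind the diagonal-interface (and baguette) claim is simple cone arithmetic: a conical face of radius r and axial depth h has lateral area pi*r*sqrt(r^2 + h^2), versus pi*r^2 for a flat face, so the same rotor diameter presents more magnet surface. A sketch with made-up dimensions -- NovaTorque's actual hub geometry is not published in the excerpt:

```python
# Why a conical (diagonal) magnet interface beats a flat one: more surface
# area at the same radius. Dimensions below are hypothetical illustrations.
import math

def flat_face_area(r):
    """Area of a flat circular face of radius r."""
    return math.pi * r ** 2

def conical_face_area(r, h):
    """Lateral surface of a cone: pi * r * slant, slant = sqrt(r^2 + h^2)."""
    return math.pi * r * math.sqrt(r ** 2 + h ** 2)

r, h = 0.05, 0.04  # 5 cm radius, 4 cm axial depth (made-up numbers)
gain = conical_face_area(r, h) / flat_face_area(r)
print(f"area gain over a flat face: {gain:.2f}x")
```

Even at this modest taper the conical face offers roughly 28% more area, which is the extra "butter on the diagonal slice" that lets cheaper, weaker ferrite magnets deliver the flux of a flat neodymium interface.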

In other news from GreenTechMedia, Massachusetts company Premium Power is introducing a zinc bromide flow battery system scaled for a large home. It is clear from the comments following the article that most readers have no idea what a flow battery is. Still, the announcement is good news for anyone looking for advances in large scale power backup systems.

Also from GreenTechMedia, an announcement that Sapphire Energy is beginning work on the first of a series of three 100-acre algae biofuel pilot plants in New Mexico. Ignore the facetious comment left by yours truly after the article. The website will probably alter the article by the time you read it to make the comment seem superfluous, or may delete the comment altogether. Website administrators tend to take themselves and their sites altogether too seriously! ;-)

Brian Wang describes research from Montana State University, which demonstrates that algae are capable of thriving on added HCO3 -- producing significantly more oil as a result of the extra carbon. This is actually good news for those worried about ocean acidification from atmospheric CO2, since most dissolved CO2 turns promptly into HCO3 (over 90% I believe). In other words, sea creatures will take this extra bicarbonate and turn it into more plankton, sponges, coral reefs, and shelled sea creatures.
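The "over 90%" figure can be checked against textbook carbonate equilibria. A minimal sketch using freshwater dissociation constants (pKa1 = 6.35, pKa2 = 10.33; seawater constants differ somewhat, so treat this as illustrative rather than an ocean model):

```python
# Carbonate speciation sketch: fraction of dissolved inorganic carbon (DIC)
# present as bicarbonate (HCO3-) at a given pH, from the two dissociation
# constants of carbonic acid. Freshwater pKa values; illustrative only.

def bicarbonate_fraction(ph, pka1=6.35, pka2=10.33):
    """Fraction of DIC as HCO3- at the given pH (Bjerrum speciation)."""
    h = 10 ** -ph
    k1, k2 = 10 ** -pka1, 10 ** -pka2
    # Relative concentrations of CO2(aq), HCO3-, and CO3^2- at this pH:
    co2, hco3, co3 = h * h, k1 * h, k1 * k2
    return hco3 / (co2 + hco3 + co3)

print(f"at pH 8.1: {bicarbonate_fraction(8.1):.1%} of DIC is HCO3-")
```

At ocean-surface pH (~8.1) the bicarbonate share works out to well over 90%, consistent with the claim above; at acidic pH the balance swings back toward dissolved CO2.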

Robert Rapier describes a visit to a Shell Oil Gas-to-liquids plant in Malaysia. The plant utilises gasification plus F-T synthesis of liquids. Very enlightening, and most optimistic toward the future of GTL.

CTL and BTL are still considerably more expensive, but given time we are likely to see a lot of progress on those fronts as well.

Labels: , , , ,

Saturday, November 13, 2010

Bakken and Eagle Ford are Just the Beginning


The de facto Obama moratorium on oil well drilling in the Gulf of Mexico has spurred new exploration and production on the US mainland. Local economies are beginning to experience boom times never seen before. It is a phenomenon that may well spread across the US, as the oil & gas mania locates useful hydrocarbons wherever they are to be found.
For much of this decade, energy companies pioneered new drilling technologies that allowed them to recover natural gas from a subterranean rock called shale. By drilling down and then out laterally, companies were able to exploit greater areas of the shale. And by injecting massive doses of water, sand and chemicals into the ground, they could crack open the gas-bearing rocks, allowing gas to flow to the surface.

...The shale boom won't begin to end American dependence on imported oil, but industry experts say it is driving a significant and potentially enduring shift in the way oil is produced domestically.

"It's a game-changer for U.S. oil production," said Bill Durbin, head of global markets research at Wood Mackenzie. "The U.S. has always been perceived to be a very mature oil province with relatively little prospect for growth. Now we're seeing the declines in production being arrested by the increase in unconventional oil."

Nationally, the balance between oil and gas exploration onshore has tilted heavily toward oil. The number of oil-seeking rigs has nearly tripled since June 2009, and now makes up 42% of all rigs in use, a prevalence not seen since 1997, according to data compiled by oilfield-services company Baker Hughes Inc.

Among states, Texas has seen the greatest increase of rigs in the past year, adding 300, a 73% increase. North Dakota added 83 rigs in the last year, Oklahoma gained 71, and Colorado picked up 30. Analysts at IHS Cambridge Energy Research Associates have identified 20 significant shale prospects across North America.

Industry executives and analysts say the growth is likely to continue, at least as long as oil prices remain over $70 a barrel. _WSJ

Yes, this boom is likely to continue for as long as oil prices remain over $70 a barrel. In other words, as long as demand continues, the supplies will be located -- sometimes where you least expect them.

The Obama - Holdren - Salazar - Boxer coalition of energy starvation will not control the US government indefinitely. When the left-Luddite Malthusians are finally swept from power, a wider array of energy options will be placed upon the table for consideration.

When the energy markets are opened up, it is possible that demand will be insufficient to maintain oil prices at current inflated levels. Sure, the value of the dollar will continue to decline as long as US debt expands exponentially -- which drives the price of commodities higher, when priced in US dollars.

But other, more stable measures of value will come into more widespread use. When priced in more stable currencies, the price of oil is likely to fall in the long run, rather than rise.

Labels: ,

Carnival of Nuclear Energy #27 at NextBigFuture

Brian Wang hosts the 27th incarnation of his creation, the Blog Carnival of Nuclear Energy. Here is a short excerpt:
2. Nuclear green has Charles Forsberg's views on Generation IV nuclear costs." At the beginning of the 21st century, Charles Forsberg of ORNL and MIT proposed a hybrid molten salt cooled reactor that borrowed features from gas cooled reactors. The resultant reactor, Forsberg argued, would have a significantly lower cost than other Generation IV reactor designs.

3. Rod Adams at Atomic Insights has What keeps you up late at night? For me, tonight, it is the unholy alliance of natural gas, environmentalists and renewable energy advocates that are working hard to capture more market share from nuclear energy. They have lowered prices enough - for a while - to convince high level decision makers to slow down their nuclear ambitions.

This post provides a host of links that tie together a rather strange set of bedfellows who are excited about the often repeated story of a new abundance in natural gas. One aspect of the story of cheap gas prices into the distant future is often overlooked - Exxon Mobil and Chevron are investing tens of billions to purchase large blocks of natural gas production capacity. Do they know something about supply and demand?

4. A new twice a month energy blog about the making of PANDORA'S PROMISE has been launched on National Geographic. The first piece is on the IFR (Integral fast reactor).

5. Yes Vermont Yankee has Without Vermont Yankee, ISO-NE Predicts Possible Transmission Line Melting

The grid operator for New England earlier announced that "Vermont Yankee must stay in the 2013 forward energy auction." They recently translated this into English: the results of a recent study show transmission lines would be badly overloaded without Vermont Yankee. _NBF
Follow the link above for more carnival plus links to original articles.

Meanwhile in India, one of the country's top scientists declared that cold fusion is the best way to produce electrical power.

He will have the opportunity to prove his assertion over the next several years, no doubt. In science and industry -- unlike in politics, environmentalism, philosophy, and most of academia -- assertions are either proven or disproven by reality, eventually.


McKnight's Lawrence Wackett Looks at Advanced Biofuels

Distinguished McKnight University Professor of biochemistry Larry Wackett, of the University of Minnesota, is deeply involved in the search for better microbial enzymes that could be used to produce advanced biofuels. His research involves the genetic engineering of microbes, advanced methods for biodegrading materials such as cellulose, and several other significant areas of microbial enzyme development and analysis.

Wackett has an article in press at the journal Current Opinion in Biotechnology, which reviews current approaches to engineer microbes in the quest to create advanced biofuels:
The current biofuels landscape is chaotic. It is controlled by the rules imposed by economic forces and driven by the necessity of finding new sources of energy, particularly motor fuels. The need is bringing forth great creativity in uncovering new candidate fuel molecules that can be made via metabolic engineering. These next generation fuels include long-chain alcohols, terpenoid hydrocarbons, and diesel-length alkanes.

Renewable fuels contain carbon derived from carbon dioxide. The carbon dioxide is derived directly by a photosynthetic fuel-producing organism(s) or via intermediary biomass polymers that were previously derived from carbon dioxide. To use the latter economically, biomass depolymerization processes must improve and this is a very active area of research. There are competitive approaches with some groups using enzyme based methods and others using chemical catalysts.

With the former, feedstock and end-product toxicity loom as major problems. Advances chiefly rest on the ability to manipulate biological systems. Computational and modular construction approaches are key. For example, novel metabolic networks have been constructed to make long-chain alcohols and hydrocarbons that have superior fuel properties over ethanol. A particularly exciting approach is to implement a direct utilization of solar energy to make a usable fuel. A number of approaches use the components of current biological systems, but re-engineer them for more direct, efficient production of fuels.
—Wackett 2010

Lawrence P. Wackett (2010) "Engineering microbes to produce biofuels." Current Opinion in Biotechnology, Article in Press. doi: 10.1016/j.copbio.2010.10.010 _GCC

Ethanol is an imperfect biofuel, and maize ethanol is a less than ideal approach to making ethanol fuels when compared to cane ethanol. Making better biofuels from cellulosic biomass will require better ways of converting biological polymers to useful, superior next generation biofuels. Better enzymes from hardier and more prolific microbes will be helpful in this regard.


Thursday, November 11, 2010

Bunge Teams with SG Biofuels for Jatropha Seed Biofuels

Jatropha curcas is a non-edible shrub native to Central America. Its seeds have high oil content, and can be processed to produce a high-quality energy feedstock. It can be effectively grown on marginal lands that are considered undesirable for food crops. _GCC

Jatropha curcas is a tropical oilseed shrub whose oil is non-edible. Jatropha oil is a high-quality oil for many uses -- including consumer products such as cosmetics, and industrial applications. To process the seed into fuel in North America, large quantities of jatropha seed would need to be supplied to processing and refining plants. This will require a massive, multi-disciplinary effort to ramp up the growing and production of jatropha seed for a North American market.
SG Biofuels, a bioenergy crop company developing and producing elite seeds of Jatropha, has established a strategic partnership with Bunge North America, the North American operating arm of Bunge Limited to research and develop a model to process Jatropha seeds into a biofuel feedstock.

Bunge, a global leader in oilseed processing, joins an industry-leading team of partners, including Flint Hills Resources, a refining and petrochemical company and wholly-owned subsidiary of Koch Industries; Life Technologies Corporation, a global biotechnology tools company; and others that are collaborating with SG Biofuels to develop Jatropha as a viable source for cost-effective, sustainable crude plant oil.

The development of a successful market for Jatropha requires that all aspects of the value chain are met, from crop science and proper agronomics all the way through processing and refining. Bunge, together with Life Technologies and Flint Hills Resources, provides a fully integrated platform from which our customers can develop, produce and profit from large volumes of crude Jatropha oil.
—SG Biofuels President and Chief Executive Officer Kirk Haney


The company’s integrated breeding and biotechnology approach forms the foundation for its JMax Jatropha Optimization Platform, providing research agencies, growers and plantation developers with access to the company’s germplasm library, Jatropha curcas genome sequence, molecular markers and advanced biotech and synthetic biology tools to optimize elite Jatropha cultivars for unique growing conditions around the world. _GCC


Images from Steven Gorelick's "Oil Panic & the Global Crisis"

M. King Hubbert was not only the "King" of Peak Oil (DOOM!) -- to many true believers he is a god. Stanford Professor Steven Gorelick examined Hubbert's assumptions and predictions. In Chapter 4 of Gorelick's fine book "Oil Panic & The Global Crisis: Predictions and Myths", many of Hubbert's ideas and predictions are pinned to the dissection table and examined. I recommend a close reading of Gorelick's book, as well as a careful reading of Leonardo Maugeri's "The Age of Oil."

I have posted a few images from Chapter 4 of Gorelick's book, to give you a flavour of some of the information you can find there. One of the many bonuses of "Oil Panic" is that it presents both sides of the story of peak oil, and allows you to decide which is the more credible.

Where Is The Bell Curve?

Peak oil doctrine assumes that oil production will follow a roughly symmetrical, logistic bell curve. Logically, there is no reason for this assumption, but simple logic does not always come into play in the real world. What actually happens with production, vis-a-vis the bell curve?
Hubbert also applied his logistic model to US gas production. Real world results are juxtaposed with Hubbert's prediction for natural gas production below.
Hubbert came closest to achieving predictive "bell curve" success with his prediction for the US lower-48 oil production peak. Hubbert's "best prediction" for US peak oil was 1965, although one of his prediction scenarios (which he himself considered unlikely) put the peak at around 1970. This date was quite close to the actual US production peak, although it received a good deal of assistance from the tsunami of oil regulations that followed the 1969 Santa Barbara offshore oil spill.

So, score a tentative point for Hubbert, against all his demonstrated errors. Still, watch below, as US oil production takes a decided departure from its Hubbertian bell curve trajectory.
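Hubbert's bell curve is simply the derivative of a logistic cumulative-production curve. A minimal numerical sketch -- with purely illustrative parameters, not Hubbert's actual fitted values:

```python
import math

def hubbert_rate(t, urr, peak_year, k):
    """Annual production rate from a logistic cumulative-production curve:
    P(t) = URR * k * e^(-k(t - tm)) / (1 + e^(-k(t - tm)))^2
    This is symmetric about the peak year by construction -- the shape is
    an assumption of the model, not a consequence of geology."""
    x = math.exp(-k * (t - peak_year))
    return urr * k * x / (1.0 + x) ** 2

# Illustrative parameters only (not Hubbert's fitted numbers):
URR = 200.0    # assumed ultimate recoverable resource, billions of barrels
PEAK = 1970    # assumed peak year
K = 0.06       # logistic growth-rate constant, per year

for year in (1950, 1970, 1990):
    print(year, round(hubbert_rate(year, URR, PEAK, K), 2))
```

Note that the peak rate works out to URR*k/4, and production in 1950 equals production in 1990 exactly -- the model cannot represent the asymmetric decline that real production histories, assisted or hindered by regulation, technology, and price, so often show.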
The human brain craves knowledge of the future, and prediction is one of the central functions of the brain. Hubbert was one of many analysts seeking to predict the future of the oil resource, and honestly believed that he had discovered a secret to predicting resource peaks.

But Hubbert was far more intelligent and open to contrary data than most of his modern-day followers. No doubt he would have juxtaposed his predictions with the data and modified his approach accordingly.

Peak Oil DOOM(!)ers have a tendency to deify their prophets, such as Hubbert -- although Hubbert is more of an arch-prophet or god in the peak oil world. When one deifies a person, one is unable to objectively examine data that contradicts the words of the deity. Mass religious and quasi-religious movements -- such as peak oil DOOM! or carbon hysteria DOOM! -- comprise large numbers of persons who have suspended rational judgment in this way, and are thus unable to critically examine the evidence.

But the world doesn't stop for anyone -- not even a demi-god. Our rational examination of the ongoing data should not stop either.

Cross-posted to Al Fin


Wednesday, November 10, 2010

The World Converts to Gasification


Gasification of coal, biomass, or other carbonaceous materials yields a syngas which can be utilised for electric power generation, fuel synthesis, or the synthesis of chemicals and materials. Gasification lends itself to much cleaner use of coal -- even the "dirtiest" coals -- than current combustion processes. While nuclear generation of electrical power is superior in many ways to the use of coal, coal is more versatile in terms of the broad array of uses to which syngas can be put. And there is a lot of coal in the ground, waiting to be used.
The 2010 Worldwide Gasification Database, a collection of gasification plant data, describes the current world gasification industry and identifies near-term planned capacity additions. The database reveals that the worldwide gasification capacity has continued to grow for the past several decades and is now at 70,817 megawatts thermal (MWth) of syngas output at 144 operating plants with a total of 412 gasifiers.

Gasification is a technological process that uses heat, pressure, and steam to convert any carbon-based raw material into synthesis gas (syngas). Gasification is in use in more than 27 industrialized countries. _GCC

As the gasification industry develops and grows, economies of scale will be added to the other advantages of this cleaner approach to energy, fuels, and chemicals.
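The value of a given gasifier's output depends heavily on syngas composition. A rough sketch of how volumetric heating value follows from composition -- the LHV figures are rounded textbook values and the composition is an illustrative air-blown example, not data from any particular plant:

```python
# Approximate lower heating values of combustible syngas components,
# in MJ per normal cubic metre (rounded textbook figures; exact values
# vary slightly with reference conditions)
LHV = {"H2": 10.8, "CO": 12.6, "CH4": 35.8}

def syngas_lhv(fractions):
    """Volumetric LHV of a syngas blend from component mole fractions.
    Inert diluents (N2, CO2) contribute nothing, which is why air-blown
    syngas is so much leaner than oxygen-blown syngas."""
    return sum(frac * LHV.get(gas, 0.0) for gas, frac in fractions.items())

# Illustrative air-blown coal gasifier output (assumed composition):
gas = {"H2": 0.15, "CO": 0.25, "CH4": 0.03, "N2": 0.45, "CO2": 0.12}
print(round(syngas_lhv(gas), 2), "MJ/Nm^3")
```

The nitrogen dilution in this assumed blend drags the heating value down to a few MJ per cubic metre, versus roughly 36 for natural gas -- one reason oxygen-blown gasifiers are preferred when the syngas is destined for fuel or chemical synthesis rather than on-site power.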


Big Wind Bubble Nears Bursting Point

From Spain to Denmark to the UK to the US, wind developments are being canceled and wind turbine factories are downsizing or closing altogether. The best intentions of politicians and faux environmentalists cannot make big wind into a reliable or affordable source of needed electrical power. Taxpayer subsidies are being squandered on a form of energy which can only fail, in the foreseeable future.
New US wind turbine installations have slowed significantly this year, compared to 2009, and the decline is having consequences. Among other fallout, Suzlon is mothballing a four-year-old wind turbine factory in Minnesota and laying off the remaining 110 workers, due to a lack of new orders. While the industry pins most of the blame for the slowdown on insufficiently aggressive federal energy policies, it suddenly occurred to me to wonder whether wind power, like housing, might have been caught up in an investment bubble that has finally popped, somewhat belatedly.

... _EnergyTribune
If not for massive government subsidies, grants, mandates, counter-productive regulations, and other promotions, big wind power would be in a world of hurt. But as already debt-ridden governments pour more and more good money into bad energy policy, the end-game is coming into view -- and it does not look very pretty. Much of the money squandered on big wind projects is borrowed money, which may never be paid back. As unredeemable bonds and loans accumulate, powerful submerged stresses and undercurrents build, which threaten to break out into open conflict.

Ironically, in the end the "Green Industry" may end up promoting the "War Industry", which is just the kind of paradox that good intentions typically generate.


Tuesday, November 09, 2010

Big Wind Crash & Burn: Bad for the Grid, Bad for the Economy

Orders for new wind turbines to fall 93% in UK for 2013

The UK was to have been the largest offshore wind producer. Now it looks as though resources will have to be put into more reliable forms of power production. Wind power is apparently an expensive luxury for brainless ne'er-do-wells.

World's biggest maker of wind turbines must lay off 3,000 workers due to low demand

In fact, "Green Jobs" are in jeopardy all over, due to many factors -- including the belated realisation that big wind power sucks an economy dry, and wreaks havoc on a power grid.

The high cost of wind energy threatens the existence of many wind developers who jumped into the business with the best of intentions, and highest of hopes.

But some folks never seem to learn . . .

Big wind is a big sideshow of the great carbon hysterics' carnival and crusade. If people catch on to the hollow nature of carbon hysteria, the reasons for wasting scarce resources on big wind just crash and burn.


Distributed Pyrolysis Units Could Form Regional Economic Cores


Pyrolysis is the heating of carbonaceous material in the absence of oxygen. When biomass is pyrolysed, the resulting products include bio-char (solid carbon), pyrolysis gases, and pyrolysis liquids (or pyrolysis oil).

Locating pyrolysis units in areas of high biomass production allows for the early conversion of high-bulk, low-density biomass into a lower-bulk, higher-density pyrolysis oil which is easier to transport, plus a bio-char which can be used to help fertilise local and regional crops and replenish soil carbon.
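The densification benefit can be roughed out as energy per unit of transport volume. The densities and heating values below are round, literature-range assumptions for loose straw and crude pyrolysis oil, not measured figures for any particular feedstock:

```python
def volumetric_energy(bulk_density_kg_m3, lhv_mj_kg):
    """Energy carried per cubic metre of transport volume, in MJ/m^3.
    Transport costs scale largely with volume, so this is the figure
    that matters for hauling fuel out of an agricultural region."""
    return bulk_density_kg_m3 * lhv_mj_kg

# Round literature-range assumptions (illustrative, not measured):
straw = volumetric_energy(150, 15)     # loose baled straw, ~150 kg/m^3 at ~15 MJ/kg
pyoil = volumetric_energy(1200, 17)    # crude pyrolysis oil, ~1200 kg/m^3 at ~17 MJ/kg

print(f"densification factor ~ {pyoil / straw:.1f}x")  # → densification factor ~ 9.1x
```

Under these assumptions a tanker of pyrolysis oil carries roughly nine times the energy of the same volume of baled straw -- which is the whole economic argument for converting near the field rather than trucking raw biomass to a distant central plant.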
Agri-Therm Portable Pyrolysis Unit

Agri-Therm is developing a portable pyrolysis unit that could be transported to local areas of high biomass production, for local and regional conversion, pre-processing, and energy densification (by volume).

Pyrolysis oil can be fired in fuel oil furnaces, or co-fired with coal. It can also be further refined to high value fuels, chemicals, and more.

Some of the densification benefits of pyrolysis can also be obtained by solid biomass compression, torrefaction, and gasification -- although handling pressurised gas is often more difficult than handling liquids.

Local creation of valuable fertiliser and soil supplement is a decided plus for pyrolysis in agricultural regions.

