Alt Energy Redundancy and the First KW Cost

Each KW of solar or wind generation you add to the grid costs more than the last. The reason is that the grid demands a certain level of reliability. Reliability must come from redundancy, and redundancy costs a lot.

Assume you had a source of alt energy (wind, solar, etc.) that could produce 1 KW of power and that it was controlled by the roll of a die. If you rolled a "1" or a "2", the source of power would turn on for a period of time. And if you rolled a "3" through "6", it would turn off for a period of time. The period of time could be anything you want: seconds, minutes, hours, or days--it just had to be constant. And after the period elapsed, you could roll again.

Overall, your source of power would be available 1/3 of the time and unavailable 2/3 of the time. The 1/3 represents the overall availability of the technology--solar (and wind) works really well about 8 hours of the day, but the other 2/3 of the time it's not so useful.
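
As a sanity check on that 1/3 figure, here is a minimal Monte Carlo sketch of the die-roll model (hypothetical Python; the period length doesn't matter, only the on/off odds):

```python
import random

# Each period, the 1 KW generator is "on" if the die shows 1 or 2
# (probability 1/3) and "off" otherwise, exactly as described above.
def single_generator_availability(periods=1_000_000, seed=42):
    rng = random.Random(seed)
    on = sum(1 for _ in range(periods) if rng.randint(1, 6) <= 2)
    return on / periods

print(f"{single_generator_availability():.3f}")  # ~0.333
```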

But what if you had two identical generators, located in different places, each controlled by the roll of a separate die? In that case, you'd have at least one generator active 1 - 0.66² of the time (rounding 2/3 down to 0.66), which would be 56.4% of the time. And similarly, if you had 10 generators, you could count on having at least one of them active 1 - 0.66¹⁰ = 98.4% of the time.
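
The pattern generalizes: with n independent generators, the chance that at least one is on is 1 - 0.66ⁿ. A short sketch of that arithmetic (independence is the key assumption, as noted below):

```python
# Chance that at least one of n independent generators is on, when
# each is off ~0.66 of the time (the post's rounding of 2/3).
for n in (1, 2, 10, 17):
    print(f"{n:>2} generators: {1 - 0.66**n:.1%}")
# 1: 34.0%   2: 56.4%   10: 98.4%   17: 99.9%
```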

In more concrete terms, that means you'd need to install 10 KW of panels or turbines to get a very reliable 1 KW of generation. And of course, the core assumption is that there is zero correlation between the sources, which means you'd need the panels and turbines spread throughout the world to feed your house. That's not practical for a house, but geographic diversity is possible for utility-scale generation.

The grid in the US delivers about "three 9's", or 99.9% uptime per year. This means that 0.1% of the time the grid is down--about 8.8 hours a year (0.001 × 8,760 hours).

To achieve that 99.9% level of reliability from a random set of generators, you'd need at least 17 of them working together to deliver the 1 KW. That means the costs previously discussed in the Solar Benchmark Costs post would be roughly 17X higher if that technology were used to deliver three 9's of reliability. In other words, instead of the $0.129/kwh figure you see for Phoenix, AZ, the cost to deliver grid energy purely from solar in Phoenix would be 17 times that, or $2.19/kwh. The additional cost is there to provide the redundancy needed.
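
To see where 17 comes from, solve 0.66ⁿ ≤ 0.001 for the smallest whole n. A sketch of that calculation (the $0.129/kwh figure is the Phoenix benchmark cited above; note that using 2/3 exactly instead of 0.66 would push the answer to 18):

```python
import math

p_off = 0.66        # per-period outage probability (2/3 rounded down)
target = 0.001      # "three 9's": all generators dark at most 0.1% of the time

# Smallest n with p_off**n <= target
n = math.ceil(math.log(target) / math.log(p_off))
print(n, f"{p_off**n:.5f}")         # 17 0.00086 -- just under the target

base = 0.129        # $/kwh, the Phoenix, AZ benchmark
print(f"${n * base:.2f}/kwh")       # $2.19/kwh for three 9's from solar alone
```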

Storage can help a lot with the math. If we have a battery that can provide 1 unit of energy for the unit of time in question, and it takes another unit of time to recharge, then the problem gets a little different. In the game of craps, many of the bets depend on the order of events over several rolls; adding a battery creates a similar puzzle. If you roll a "1" or "2", you succeed. But if you fail a roll and the next roll succeeds, the failed period could still be considered a success--it would be backed by the battery, and the successful period that follows recharges it. The battery is sized to cover one period of whatever unit of time you are contemplating.
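
Here is a minimal Monte Carlo sketch of that idea, under two stated assumptions: the battery stores exactly one period of energy, and any on-period both serves the load and recharges the battery. (The model is hypothetical--a forward-in-time version of the roll-ordering rule above.)

```python
import random

def availability_with_battery(periods=1_000_000, seed=1):
    """One die-controlled 1 KW generator plus a one-period battery.
    Assumes an on-period serves the load and recharges the battery."""
    rng = random.Random(seed)
    charged = True              # start with a full battery
    served = 0
    for _ in range(periods):
        if rng.randint(1, 6) <= 2:   # rolled a 1 or 2: generator on
            served += 1
            charged = True           # surplus tops the battery back up
        elif charged:                # generator off, battery covers it
            served += 1
            charged = False
        # else: outage -- generator off and battery already spent
    return served / periods

print(f"{availability_with_battery():.3f}")   # ~0.556, up from ~0.333
```

Under those assumptions, one battery lifts availability from about 1/3 to about 5/9--roughly the same boost as adding a second generator, but without a second set of panels.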

We'll save the full exercise for later.

Summary

It can be puzzling to look at the raw costs of solar, see that they are competitive with retail rates for electricity, and then wonder why everything isn't going solar (or wind). Reliability is the big reason. When you see the retail price for coal-generated electricity sitting at $0.12/kwh, understand that it includes the 99.9% reliability already. And understand too that the $0.12 is the price the utility is asking YOU to pay. They are producing it for much less than that--probably around $0.04 to $0.06 or so--and that figure already includes the requisite 99.9% reliability.

But this should also make it very clear why the first KW of solar generation added to the grid is so cheap, and why each subsequent KW gets more and more expensive. Reliability requires redundancy, and redundancy costs, and it costs a lot.

The early adopters of solar and wind are, in a perverse way, riding for free. They enjoy the low cost that comes from unreliable generation, while freely using the grid to cover any shortfalls they might encounter. If a cloud passes overhead while they want to run their AC, no problem--they fall back to the utility to make up the difference. But if everyone were using solar, that cloud passing over a city would be devastating unless the 17X factor discussed above had been paid.

All up, a 99.9% reliable source of alt energy might be $2/kwh, while conventional energy is $0.05/kwh. That's a 40X gap. And that is why large-scale wind/solar won't make sense for a long, long time. It can help at the edges. But the bigger challenge is scale.

There's no free lunch. Ever. 
