I live in the northeastern US, which influences my perspective, but here is why I think large grids will become obsolete.
I am not saying that you would want FF devices deployed in ones and twos, but the way the grid is currently structured, electric power is frequently sent hundreds of miles over high-voltage lines. My guess is that if focus fusion takes off, most generating stations would be local, with 20 to 100 FF devices, or 100 to 500 MW. This would vastly simplify the grid and boost its reliability, since all but sparsely populated areas would need only very short transmission lines.
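As a rough sanity check on that sizing (a minimal sketch; the 5 MW per device is just the figure implied by the numbers above, not an official rating):

```python
# Sanity check on the station sizing above. The 5 MW per device is just
# the figure implied by the 20-100 device / 100-500 MW numbers.
MW_PER_DEVICE = 5

for n_devices in (20, 100):
    print(f"{n_devices} devices -> {n_devices * MW_PER_DEVICE} MW station")
# 20 devices -> 100 MW station
# 100 devices -> 500 MW station
```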
My power bill itemizes transmission, distribution, and the actual cost of the generated electricity separately. (All numbers are dollars per kWh.)
0.02240 Transmission
0.06169 Distribution
0.09379 Generation (the thing that Focus Fusion does)
0.00853 Other
If the generation cost drops to $0.005/kWh, my bill is only cut roughly in half. You would have to radically change the grid to make a significant impact on the rest of my power bill.
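For concreteness, here is the arithmetic on the itemized rates above:

```python
# Itemized rates from my bill, in dollars per kWh.
transmission = 0.02240
distribution = 0.06169
generation   = 0.09379
other        = 0.00853

today = transmission + distribution + generation + other
with_ff = transmission + distribution + 0.005 + other  # generation at $0.005/kWh

print(f"today:   ${today:.5f}/kWh")       # today:   $0.18641/kWh
print(f"with FF: ${with_ff:.5f}/kWh")     # with FF: $0.09762/kWh
print(f"ratio:   {with_ff / today:.2f}")  # ratio:   0.52
```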
Cities like New York and areas like Long Island draw power from hundreds of miles away, from as far as eastern Canada and the midwestern US. This has contributed to several major blackouts. The New York City / Long Island area depends on about a dozen 345 kV power lines for much of its electricity. Each line can carry over 600 MW, and most run for hundreds of miles at a cost of about a million dollars per mile to build. The lines both raise the cost of power and create reliability problems, including the prospect of more blackouts. At a minimum, most if not all of these lines would be replaced with local generation. Although New York is an extreme example, most large US cities depend on remotely generated electricity and would abandon long transmission lines in favor of locally generated electricity if Focus Fusion were available.
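Using the round numbers above (the 200-mile average length is a hypothetical stand-in for "hundreds of miles"), the scale of what local generation would displace:

```python
# Rough scale of what local generation would replace, using the round
# numbers above. The 200-mile average length is a hypothetical figure.
n_lines = 12            # ~a dozen 345 kV lines
mw_per_line = 600       # each capable of over 600 MW
miles_per_line = 200    # hypothetical average length
cost_per_mile = 1e6     # ~$1 million per mile to build

print(f"capacity:  {n_lines * mw_per_line} MW")  # capacity:  7200 MW
print(f"line cost: ${n_lines * miles_per_line * cost_per_mile / 1e9:.1f} billion")
# line cost: $2.4 billion
```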
In addition to the grid evolving to city scale rather than regional scale, most institutions big enough to have a campus rather than a single building would probably be able to operate a few Focus Fusion devices, eliminating the distribution and transmission charges from their electricity bills while also using the waste heat for heating and cooling. These institutions include universities, hospitals, and manufacturers, all of which already have significant staff managing their heating, cooling, and electrical systems.
Actually, the main impact Focus Fusion will have on large grids is to make them obsolete. Since FF devices don't have much environmental impact, they will be located near the point of use. (Waste heat should be the major environmental impact; maybe district heating and adsorption chillers will become popular.)
It can be very complex and cumbersome to turn off the FF device and allow decaborane to precipitate onto the beryllium electrodes, or costly to keep the device idling to prevent this.
I agree that the first use of a Focus Fusion device would be for base-load power, but I don't think it would be hard to power-cycle the machine. Even the prototype will need heaters to start the machine. (If I remember correctly, they have already been specified.) Preventing the decaborane from precipitating on the electrodes should not be a problem: just use a cold trap to condense the decaborane before you let the electrodes cool down. You could probably also run the Focus Fusion device decaborane-lean as you power it down.
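To make that order of operations concrete, here is a purely illustrative shutdown sketch in Python; every function name, threshold, and the toy pump-down model are my assumptions, not LPP's actual design:

```python
# Purely illustrative shutdown sketch. All names, thresholds, and the toy
# pump-down model are assumptions, not LPP's actual procedure.
def shutdown(fuel_pressure_torr: float, electrode_temp_c: float) -> list[str]:
    log = []
    # 1. Run decaborane-lean: stop the fuel feed and pump down the chamber.
    while fuel_pressure_torr > 1e-3:   # hypothetical residual-fuel threshold
        fuel_pressure_torr *= 0.5      # toy model of each pump-down cycle
    log.append("fuel pumped down")
    # 2. Chill the cold trap well below the electrodes, so any remaining
    #    decaborane condenses there instead of on the beryllium.
    trap_temp_c = -40.0                # hypothetical trap setpoint
    assert trap_temp_c < electrode_temp_c, "trap must be the coldest surface"
    log.append("cold trap engaged")
    # 3. Only now let the electrodes cool.
    electrode_temp_c = 25.0
    log.append("electrodes cooled")
    return log

print(shutdown(fuel_pressure_torr=0.5, electrode_temp_c=300.0))
# ['fuel pumped down', 'cold trap engaged', 'electrodes cooled']
```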
It used to be that a significant overpotential was required to generate hydrogen from water; recent research has lowered the required overpotential quite a bit. And the cost of the electricity itself would drop by over an order of magnitude. You can't buy much electricity today at $0.05/kWh, but if a Focus Fusion device can produce it for $0.005/kWh while selling at peak prices for a few hours per day, the cost of the electricity used to generate hydrogen would be very low. The Focus Fusion devices could be located where the hydrogen is needed, and the electricity could be shipped over the existing transmission lines.
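As a rough illustration, assuming a typical electrolyzer consumption of about 50 kWh per kg of hydrogen (the theoretical minimum is around 39 kWh/kg HHV):

```python
# Electricity cost per kg of hydrogen, assuming a typical electrolyzer
# consumption of ~50 kWh/kg (theoretical minimum ~39 kWh/kg HHV).
KWH_PER_KG_H2 = 50

for price in (0.05, 0.005):  # $/kWh: today's grid vs the Focus Fusion estimate
    print(f"${price}/kWh -> ${price * KWH_PER_KG_H2:.2f} per kg H2")
# $0.05/kWh -> $2.50 per kg H2
# $0.005/kWh -> $0.25 per kg H2
```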
I agree with Impaler. There is no reason to convert the grid to DC.
However, I doubt that batteries will be used for peak-demand periods. It would be cheaper to just use Focus Fusion devices intermittently.
Edit: Since the incremental cost of running a Focus Fusion device is rather low, it could be used to generate hydrogen from water for industrial processes when not providing peaking power.
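A back-of-envelope for one device, assuming a 5 MW rating and a four-hour daily peak window (both illustrative numbers):

```python
# Off-peak hydrogen output for one device. The 5 MW rating and the
# 4-hour peak window are illustrative assumptions.
MW = 5
PEAK_HOURS = 4
KWH_PER_KG_H2 = 50  # typical electrolyzer consumption

offpeak_kwh = MW * 1000 * (24 - PEAK_HOURS)
print(f"{offpeak_kwh / KWH_PER_KG_H2:.0f} kg H2 per device per day")  # 2000 kg
```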
It looks like they know about us now. From http://www.nature.com/news/fusion-furore-1.15596
And among the small fusion start-up companies worth considering for a federal small-business grant is Lawrenceville Plasma Physics in Middlesex, New Jersey, which is trying to exploit a configuration known as a dense plasma focus to build an extremely compact reactor that does not emit neutrons.
I know it negates a lot of the advantages of a focus fusion device, but does anyone have any idea what kind of Q you could get out of a focus fusion device if you used D-T fuel? And how much of the power would come out as ions?
Lerner wrote: Beryllium dust is very toxic. Beryllium in bulk metal form is harmless. We will not be manufacturing the beryllium electrodes–where careful safety precautions are required to prevent exposure to the dust–we’ll just be using them.
I am under the impression that in a production machine the electrodes would have to be replaced every few weeks because of erosion. Could this produce dust, or would the beryllium be vaporized and re-deposited on a cooler surface? Not a show-stopper, but it complicates operating a production machine.
nemmart wrote: We know that a heat engine can achieve 40% efficiency
… errrrrrrrrrr… not quite. Those kinds of thermal efficiencies are achieved at the expense of very high operating temperatures. Temperatures in the thousands of degrees C.
The most anyone has ever postulated for the output of an FF cooling stream is at ~700-800 degC… i.e. maybe 20-25% thermal-to-electrical conversion efficiency. That’s game over for a thermo-electric FF, I’m afraid.
… still would be a great x-ray source, though.
Actually, ultra-supercritical steam turbines already run at ~600 degC, and supercritical CO2 turbines could reach ~50% efficiency running at ~550 degC. Their main problem is cost, and sCO2 turbines are still in the research phase.
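The Carnot limit puts those numbers in context (the 300 K heat-rejection temperature is an assumption):

```python
# Carnot limit for the temperatures under discussion: eta = 1 - Tc/Th.
T_COLD_K = 300.0  # ~27 degC heat rejection, an assumed value

for t_hot_c, claimed in ((550, 0.50), (750, None)):
    t_hot_k = t_hot_c + 273.15
    carnot = 1 - T_COLD_K / t_hot_k
    note = f" (a 50% turbine is {claimed / carnot:.0%} of Carnot)" if claimed else ""
    print(f"{t_hot_c} degC: Carnot limit {carnot:.0%}{note}")
# 550 degC: Carnot limit 64% (a 50% turbine is 79% of Carnot)
# 750 degC: Carnot limit 71%
```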
Breakable wrote: There is no 90% efficiency requirement. This is the energy flow diagram:
http://fusionenergyleague.org/assets/device/lpp/fofu_flow.jpg
Nice chart. What are the prospects of getting more than 66 kJ of gross fusion energy per shot? That would make conversion efficiency less of an issue.
Even if both the ion-beam and X-ray conversion efficiencies are 80%, you still have 10 MW of heat produced. You might be able to capture a few MW of that by running it through a heat engine. Not so elegant, but it should work. Remember, you have to dissipate the heat anyway.
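Rough arithmetic, with the 50 MW gross figure chosen only to reproduce the 10 MW of heat quoted above:

```python
# Rough heat-recovery arithmetic for the scenario above. The 50 MW gross
# figure is hypothetical, chosen only to reproduce the 10 MW of heat quoted.
gross_mw = 50          # hypothetical total power handled by the converters
capture_eff = 0.80     # ion-beam and X-ray conversion efficiency
bottoming_eff = 0.25   # heat engine, per the ~20-25% figure discussed above

heat_mw = gross_mw * (1 - capture_eff)
print(f"waste heat: {heat_mw:.0f} MW")                 # waste heat: 10 MW
print(f"recovered:  {heat_mw * bottoming_eff:.1f} MW") # recovered:  2.5 MW
```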