#9891
redsnapper
Participant

(My apologies if this is a sort-of duplicate – something weird happened when I tried to do the spell check!)

Well, yes, the theoretical total efficiency is very competitive, but let’s face it, until there’s a working core, and a working X-ray converter, and a working Rogowski coil, all playing together with a humongous capacitor bank and switches, that efficiency remains theoretical. Assuming the efficiency numbers pan out, I agree that the problem is the (small) size of the FF rig. I’m not quite sure what you mean by the high electrode temperature being a problem (don’t confuse heat with temperature) – indeed, the theoretical Carnot efficiency of a heat engine only improves as the high-side temperature goes up, so in that sense, the higher the electrode temperature, the better. (Part of the reason the theoretical efficiency of the FF device is so high is precisely because its energy originates at 1 billion degC – somewhat higher than the hottest fossil-fuel-fired power plant. :-)) Further, the higher the permissible electrode temperature, the more feasible it becomes to remove that excess heat using real materials between the electrode and the outside environment. Remember, the waste heat isn’t generated because the electrodes are hot; the electrodes are hot because they’re adjacent to a billion-degree plasma, and because there’s a phenomenal amount of energy passing through them.
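To put rough numbers on the Carnot point, here’s a quick sketch (the hot-side temperatures below are illustrative assumptions on my part, not actual FF design figures):

```python
# Ideal Carnot ceiling vs. hot-side temperature.
# Temperatures are illustrative assumptions, not FF design numbers.

def carnot_efficiency(t_hot_c, t_cold_c=25.0):
    """Carnot limit for reservoir temperatures given in degrees Celsius."""
    return 1.0 - (t_cold_c + 273.15) / (t_hot_c + 273.15)

for label, t_hot in [("fossil steam plant (~600 degC)", 600.0),
                     ("hot electrode (~2000 degC)", 2000.0),
                     ("DPF plasma (~1e9 degC)", 1.0e9)]:
    print(f"{label}: {carnot_efficiency(t_hot):.3f}")

# fossil steam plant (~600 degC): 0.659
# hot electrode (~2000 degC): 0.869
# DPF plasma (~1e9 degC): 1.000
```

The point is simply that the Carnot ceiling climbs toward 100% as the hot side gets hotter; any real converter would of course sit well below that ceiling.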

As for high-efficiency thermocouples (or any other device that depends on the thermoelectric effect), I wouldn’t hold my breath. Current state of the art is less than 10% efficiency. My guess is that once you’ve got a working reactor, you’ll get a far better bang for the buck from incremental increases in X-ray conversion, or the Rogowski coil efficiency, or incremental improvements in the electrodes and FF device itself such that more energy comes out in the ion beam or X-rays. One day there will probably be a breakthrough design for the electrodes themselves (for example: highly conductive, X-ray-transparent ceramic electrodes that can handle 2000 degC; better yet, virtual electrodes that can handle 500MdegC). Scavenging waste heat (excluding the obviously direct applications of waste heat, e.g. heating a hotel or a swimming pool) probably adds cost faster than it adds saleable energy output. As you point out, FF already promises to be a very efficient process by fossil-fuel standards. (BTW – we have trouble building conventional photovoltaic devices with efficiencies higher than 25% – and we’ve been working at this, perhaps only semi-seriously, for the last 40 years. It does seem to me we might be ridiculously optimistic to think we can convert 80% of X-rays into electricity as early as three years from now. Is the argument that the entropy of X-rays is so much lower than that of visible light that there’s reason for optimism?)

I think it’s going to come down to permissible “waste heat density” – so to speak. If a breadbox-sized 5 MW-net-output DPF core simply can’t be maintained (or can’t simply be maintained) cool enough for the electrodes to survive, just divide that 5 MW across however many cores you can keep cool. Or if the core is already far and away the most expensive part of the powerplant (it might not be, once you’ve factored in the cost of the “minimal” cooling system), just scale down the power by reducing the pulse rate (cycles/second of operation). Yes, that raises the cost/MW, but I assume that the aforementioned incremental improvements in efficiency will eventually allow you to run the power back up. You’d already be so far ahead of conventional power costs that you could afford to miss the original target by a substantial margin. So what if FF1.0 can only generate enough power for 200 houses instead of 2000? Somewhere in the world, that’s just the right size. Or a yacht instead of a cruise ship? There’ll be plenty of market for fusion power, in whatever increment is available at any given time. It’ll only improve with FF2.0.
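Just to make that derating arithmetic concrete (every number below is a placeholder I made up, not an FF projection): if cooling limits force you to run at a fraction of the design pulse rate, net output drops roughly in proportion while the core cost stays put, so the cost per MW rises by the same factor.

```python
# Toy derating arithmetic for a single DPF core: placeholder numbers only.

core_cost_usd   = 500_000   # assumed fixed cost of one core plus its cooling
design_net_mw   = 5.0       # assumed net electrical output at full pulse rate
design_pulse_hz = 200.0     # assumed design repetition rate
kw_per_house    = 2.5       # assumed average household demand

for pulse_hz in (200.0, 100.0, 20.0):   # full rate, half rate, 1/10 rate
    net_mw = design_net_mw * pulse_hz / design_pulse_hz
    cost_per_mw = core_cost_usd / net_mw
    houses = int(net_mw * 1000.0 / kw_per_house)
    print(f"{pulse_hz:.0f} Hz: {net_mw:.2f} MW net, "
          f"${cost_per_mw:,.0f}/MW, ~{houses} houses")

# 200 Hz: 5.00 MW net, $100,000/MW, ~2000 houses
# 100 Hz: 2.50 MW net, $200,000/MW, ~1000 houses
# 20 Hz: 0.50 MW net, $1,000,000/MW, ~200 houses
```

The last line is the “200 houses instead of 2000” case: cost per MW is ten times worse, but if you started out well below conventional generation costs, there may still be buyers at that size.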