The Focus Fusion Society › Forums › Noise, ZPE, AGW (capped*) etc. › GW Skeptics vs Scientific Consensus
Brian H wrote: Irresponsible people with overtly destructive goals are the risk there, not the plants. There are, in any case, plant types that are very hard to use as the bases for atomic weapons.
The plants don’t exist in a vacuum. They exist in a world of irresponsible people.
Rezwan wrote:
Irresponsible people with overtly destructive goals are the risk there, not the plants. There are, in any case, plant types that are very hard to use as the bases for atomic weapons.
The plants don’t exist in a vacuum. They exist in a world of irresponsible people.
Silly principle. By that reasoning, nothing dangerous should ever be built anywhere by anyone. Responsible people are actually the only assurance of safety, survival, and progress there is.
Brian H wrote: The oil rig disaster is almost a pure case of “black swan”; in fact, it may be a red or purple swan. It was apparently caused by a huge surge of gas pressure blowing the valves, sending flammable gas to the surface. It was so large that it is unique in drilling history: there has never been a blowout like it. Safety measures in place saved over 100 of the workers; the ones who died were almost certainly at “ground zero” in the initial explosion, and probably vaporized by it.
Such events must be treated with care in formulating standards and assigning “blame”. “Hard cases make bad law.”
That’s the point of the black swan. You’ve got a lot of people trying to downplay risks by throwing out the “hard cases” data, (aka “outliers”, “black swans”), and looking at “most probable” events. The “most probable” approach is delusional.
Because you’ll be getting hard cases from time to time. And then, who does absorb the cost of those hard cases? Shouldn’t it be those who have been benefiting most by ignoring or not anticipating the hard case?
Ignoring hard cases hides true risks and costs. Is that rational?
Banks appeared “conservative” for decades, but they were sitting on a huge problem and simply ignoring/hiding the risk. It only takes a year to wipe out all the “profit” they made over the past x decades.
So – the public bails them out, or takes the fall in terms of job loss and houses lost and abusive changes to credit card rates and so forth. That means, effectively, you’ve “blamed” (outsourced costs to) the public.
Meanwhile, all those years, the folks in finance have been giving themselves huge bonuses. They take credit for the booms, but outsource the busts.
They say it’s because they work better on “incentives”. Meanwhile it’s all gambling and they hope we don’t notice. If things go well, their bonuses are huge. If things go less well, their bonuses are a bit smaller. If there are losses, they pretend the losses would have been bigger without them and still get a bonus. Or a golden parachute. If things are catastrophic, they get a bail out. God bless ’em.
Brian H wrote:
Irresponsible people with overtly destructive goals are the risk there, not the plants. There are, in any case, plant types that are very hard to use as the bases for atomic weapons.
The plants don’t exist in a vacuum. They exist in a world of irresponsible people.
Silly principle. By that reasoning, nothing dangerous should ever be built anywhere by anyone. Responsible people are actually the only assurance of safety, survival, and progress there is.
Finally! You’re admitting it’s dangerous.
Rezwan wrote:
Irresponsible people with overtly destructive goals are the risk there, not the plants. There are, in any case, plant types that are very hard to use as the bases for atomic weapons.
The plants don’t exist in a vacuum. They exist in a world of irresponsible people.
Silly principle. By that reasoning, nothing dangerous should ever be built anywhere by anyone. Responsible people are actually the only assurance of safety, survival, and progress there is.
Finally! You’re admitting it’s dangerous.
Not particularly. Just taking the reasoning to its conclusion. A car is dangerous in the hands of a fool or a texter or cell-phoner. Is the solution to get rid of cars?
Rezwan wrote:
…Ignoring hard cases hides true risks and costs. Is that rational?
…
Straw man arguments. Did I say “ignoring”? Being careful means using judgment about where particular precautions are appropriate. Attempting to provide perfect protection in all cases can end much activity which can’t be avoided. E.g.: perfect protection for high steel workers would prevent them from doing their jobs. Training and talent permit construction to proceed with few problems.
[The banking debacle was ordained by the Democratic Party when the Frank-Dodds cabal ordered Fannie and Freddie to back mortgages that no sane banker ever wanted to touch–despite all Administration and opposition efforts to audit and restrain them (about 17 times, I believe). The subsequent effort to bury the risk in speculative bundles was foolish, but the core was legislatively mandated junk paper creation in the first place. Not relevant; not so much a black swan as a rampage by an elephant in the living room.]
Brian H wrote:
Not particularly. Just taking the reasoning to its conclusion. A car is dangerous in the hands of a fool or a texter or cell-phoner. Is the solution to get rid of cars?
“Taking the reasoning to its conclusion” – Isn’t that the definition of “reductio ad absurdo” or something like that? I always mix those things up.
I didn’t say anything about getting rid of cars. I used cars as an example of tradeoffs people make. But the risk is very real. The danger is there. Have you ever been in a car accident? Lost anyone to an accident? This is why they spend billions on highway safety, car safety, licensing drivers. It’s dangerous.
If you don’t acknowledge the danger, if you don’t know what it is, you aren’t consciously taking a risk, you’re just acting out of ignorance and denial. It’s once you understand the danger that you can calculate the tradeoffs and make an informed choice.
Denying the dangers is counterproductive. You started this cycle by saying:
Brian H wrote: Fission is nothing to fear. It is expensive, but not dangerous–aside from decrepit plants run by decrepit countries, like the Chernobyl disaster.
I pointed out that statement was not accurate. There be risks and dangers. You may feel they are worth the gain, but being worth it is not the same as not existing at all.
“Nothing to fear.” You said. “Not dangerous”.
Then…finally, you acknowledged the dangers.
Although, now, you’re backtracking.
What was that word you used?
Silly.
Rezwan wrote:
…I pointed out that statement was not accurate. There be risks and dangers. You may feel they are worth the gain, but being worth it is not the same as not existing at all.
“Nothing to fear.” You said. “Not dangerous”.
Then…finally, you acknowledged the dangers.
Although, now, you’re backtracking.
What was that word you used?
Silly.
That’s either deliberate misunderstanding, or non-deliberate misunderstanding.
Dangers are uncontrolled and unpredictable risks. Aside from Chernobyl, which was state-bungled design and management, name another nuclear plant disaster. (Three Mile Island harmed no one; even the workers inside were fine. http://www.world-nuclear.org/info/inf36.html) .
So the “danger” is pure extrapolation with no stats or history. I.e., fear-mongering.
Climate warming forecasts grotesquely incompetent according to actual professional forecasters:
Those who were responsible for making the forecasts had no training or experience in the proper use of scientific forecasting methods. Furthermore, we were unable to find any indication that they made an effort to look for evidence from scientific research on forecasting. It is perhaps not surprising then that their implementation of their forecasting method was inappropriate.
But even the best forecasters are out of their depth with climate:
In other words, if one were to recruit the cleverest climate scientists in the world and give them access to all of the available facts about climate, and ensured that all facts were true and all data were valid and accurate, the experts could do no better at forecasting climate than people with only minimal expertise. And their forecasts would even be less accurate than those from a simple heuristic. This finding is astonishing to those who are not familiar with the eight decades of evidence in the peer-reviewed research literature, and nearly all who learn of it believe that while the finding might apply to others, it does not apply to them.
The IPCC seems uniquely lousy at it, however:
The forecasting procedures used by global warming alarmists were not validated for the situation. To address this oversight, we conducted an ex ante forecasting simulation of the IPCC forecasts (from the organization’s 1992 report) of a 0.03°C per year increase in global average temperature.
We used the period from 1850 through 2007, a period of industrialization and exponential growth in human emissions of carbon dioxide. In a head-to-head competition involving 10,750 forecasts, the forecast errors from the IPCC model were more than 7 times larger than the errors from a model more appropriate to the situation, the aforementioned naïve extrapolation. More importantly, the errors were 12.6 times larger for the long-term (91 to 100-year forecast horizons).
Governments love this kind of incompetence, unfortunately:
We are conducting an on-going study to examine earlier forecasts of manmade disasters such as the global cooling movement in the 1970s, and the environmental movement’s campaign to ban DDT. We have been actively seeking such analogous situations, especially from the people responsible for promulgating alarming forecasts of manmade global warming, to see if there have been any widely accepted forecasts of manmade disasters that proved to be accurate or where the forecasted disaster was successfully prevented by government actions.
In all, we have identified 72 analogous situations, and we judge 26 of them to be relevant. Based on an analysis of these 26 similar alarms with known outcomes, we found that none were based on forecasts derived from scientific forecasting procedures, and all were false alarms. Government actions were sought in 96% of the cases and, in the 92% of cases where government action was taken, the actions caused harm in 87%. (“Effects and outcomes of the global warming alarm: A forecasting project using the structured analogies method”).
The actual best way of projecting climate is to extrapolate last year’s climate, whatever it was, for the next 50 years:
In the case of global climate change over policy-relevant time scales, there is little uncertainty. Proper scientific forecasts provide extremely accurate forecasts. [i.e. –] Climate varies, but our validation study showed that simply extrapolating last year’s global mean temperature resulted in a mean absolute error of only 0.24°C for fifty-year ahead forecasts. It is difficult to imagine how policy makers would benefit if this error were reduced further, even to 0.0°C.
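The “persistence” benchmark described above (forecast the next years as simply equal to the last observed year) is easy to sketch. The sketch below uses invented synthetic data, not the study’s actual temperature record, so the printed error values are illustrative only:

```python
# Illustrative sketch (synthetic data, NOT the study's record): the MAE of a
# "persistence" forecast, i.e., predicting a future year's value as equal to
# the last observed value.
import random

random.seed(1)

# Invented "global mean temperature anomaly" series: slow drift plus noise.
years = list(range(1850, 2008))
temps = [0.005 * i + random.gauss(0, 0.1) for i, _ in enumerate(years)]

def persistence_mae(series, horizon):
    """Mean absolute error of forecasting series[t + horizon] as series[t]."""
    errors = [abs(series[t + horizon] - series[t])
              for t in range(len(series) - horizon)]
    return sum(errors) / len(errors)

print(f"1-year-ahead MAE:  {persistence_mae(temps, 1):.3f} C")
print(f"50-year-ahead MAE: {persistence_mae(temps, 50):.3f} C")
```

The point of the benchmark is that any proposed model must beat this trivial forecast before its added complexity is justified.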
Brian H wrote:
I pointed out that statement was not accurate. There be risks and dangers. You may feel they are worth the gain, but being worth it is not the same as not existing at all.
That’s either deliberate misunderstanding, or non-deliberate misunderstanding.
Dangers are uncontrolled and unpredictable risks. Aside from Chernobyl, which was state-bungled design and management, name another nuclear plant disaster.
You have an interesting dictionary.
If a “risk” is “predictable”, then it’s not much of a risk, is it? If you know where the anvil is going to fall, you can step aside, at your leisure.
“Risk” is more like, “there’s a chance that one of these anvils will fall, we don’t know which one, or when, or even if.” Or, “if you run through this field of anvils, there’s a good chance you’ll bump your shin on one of them, and that won’t be fun.”
You also have a habit of changing definitions in mid stream:
You go from “Dangers are…risks” to “name another…disaster”. So, only “disasters” are “dangers”?
Again, a simple trip to the dictionary will clear things up:
danger |ˈdānjər|
noun
• the possibility of suffering harm or injury
• the possibility of something unwelcome or unpleasant
Brian H wrote:
Dangers are uncontrolled and unpredictable risks.
Wow. You either,
a) really don’t like to concede a point, or
b) are a bit delusional, thinking that most of the things in the world are under control.
There’s a certainty/uncertainty dynamic going on here.
Rezwan wrote:
Dangers are uncontrolled and unpredictable risks.
Wow. You either,
a) really don’t like to concede a point, or
b) are a bit delusional, thinking that most of the things in the world are under control.
There’s a certainty/uncertainty dynamic going on here.
Control varies; the control of nuclear power in fission reactors is actually very tight. Far better, say, than the control exercised over road or air traffic in general or by individuals using it. Both kill more people every year than all the nuclear plants in the world have done in all their history.
While certainty is unobtainable, those stats should satisfy the rational. Obviously many people are not rational on this subject.
To quote from my last year’s Masters notes on Risk Analysis and Nuclear Safety:
– Risk is defined as
Probability of a detriment occurring * Consequence of that detriment
– Risk can be reduced by engineered safeguards
– Hazard is an intrinsic property of an object
– Hazard can be reduced by reducing the quantity of hazardous material or modifying its form.
“While there is significant work on numerical calculations of risk, the question of developing a Safety Culture is one that involves whole questions of person-to-person interactions and organisational management. These may well have overtones which are not obvious.” – Rochlin on “The Social Construction of Safety”
– Radiological protection principles require justification of any practice leading to exposure to radiation
– Nuclear Power generation is justified by the net benefit from the electricity produced
– After shut-down of a plant the benefit ceases, hence decommissioning policy is based on a systematic and progressive reduction of hazard
Risk is framed in three bands of ‘Tolerability of Risk’:
– Highest Risk: Intolerable region – unacceptable save in extraordinary circumstances
– Intermediate: Risk must be managed as shown to be As Low As Reasonably Practicable (ALARP)
– Lowest Risk: Broadly acceptable region
This debate could go on, endlessly. Let’s set it aside and place some bets.
Would you wager $100 on a prediction with a 1-2 year time frame, and a set margin of error?
Would you lay odds?
Would you set that money aside, in escrow, to be adjudicated by an impartial other party?
This has been an interesting conversation. Let me add my 2 cents.
Everything that is or happens in the universe can be roughly categorized into hardware, software, and people. Hardware is the physical world of materials, machines, resources, infrastructure, etc. Software is the collection of ideas, rules, laws, contracts, customs, beliefs, fears, values, policies, etc. People are the ones who connect the hardware and software. In the example of driving, we have roads, street signs, and cars. We also have rules of the road, speed limits, etc. Finally, we have people who are supposed to follow the rules. In a perfect world, there would be no car wrecks because the roads would be in perfect condition, signs would be properly places, cars would never blow tires, and people would never be distracted or drunk while driving. However, even the most ideal, fool-proof hardware setup can be defeated by the die-hard fool, the black swan.
Risks are all the possible things that can go wrong. While all may be possible, most will not be probable. With limited resources, engineers and planners usually try to mitigate (not eliminate) the worst risks while not blowing their budget. The probable consequences can lead to degradation, but not failure. That’s acceptable risk. When building homes, planners take into account 25, 50, and 100-year floods. They can’t control the floods, or even predict when they will hit, but it quantifies the risk and time-averages the chance of complete wipeout. But, however the risks are quantified, people will blow right through warning signs, restrictions, safety protocols, and basic common sense.
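The flood figures above can be made concrete. A “100-year flood” has a 1% chance in any given year, but over a 30-year mortgage the chance of seeing at least one is far from negligible (assuming independent years):

```python
# Worked example of the "N-year flood" point above: probability of at least
# one event over a 30-year horizon, assuming independent years.

def prob_at_least_one(annual_prob: float, years: int) -> float:
    """P(at least one event in `years` years) = 1 - (1 - p)^years."""
    return 1.0 - (1.0 - annual_prob) ** years

for return_period in (25, 50, 100):
    p = prob_at_least_one(1.0 / return_period, 30)
    print(f"{return_period}-year flood over 30 years: {p:.0%}")
```

For the 100-year flood this works out to 1 − 0.99³⁰ ≈ 26%: roughly a one-in-four chance over the life of the mortgage, which is exactly why planners time-average the risk rather than treating rare events as ignorable.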
So, how does that relate to global warming or climate change? To me, climate change is something that is happening and always has happened. Sort of like turns in the road happen. Scientists try to quantify the turns in the road to best prepare to compensate. It’s really hard to guess how the road will turn based on past turns in the road. It is harder still to predict the weather and climate. Hurricane tracking still uses spaghetti models whose member tracks differ and lead to different predictions. The more factors we can see and understand, the better we can predict and take steps to mitigate problems.
So, do I believe in climate change? Sure. Do I know and can I accurately quantify the many factors? Nope. Can I predict how and why it will change? Nope. Can I take reasonable steps to protect myself in case of sea level rise, desertification, etc? Absolutely. Will climate change be a bad thing? For some. Will it be a good thing? For some. Are there other risks in my life that are more immediate, quantifiable, and potentially catastrophic? Yes. Personally, I’m more worried about global energy supplies running low within my lifetime, the fractional-reserve money base and 60 trillion dollars in derivatives collapsing in the next decade, and the widening division of classes and have/have-nots causing riots and wars in the next 5 years. But that’s just me.