Shouldn't....

Shouldn't all lying scientists be rounded up and tried before the International Court of Justice?

I'd include Slowman in the round-up for his extraordinary mouthing.

Even Gore runs for cover...

formatting link

But he'll probably get another Nobel Prize... "Most Profound Scientific Meddling" ;-) ...Jim Thompson

--
James E. Thompson, CTO  |  mens et manus  |  Brass Rat 1962
Analog Innovations, Inc.
Analog/Mixed-Signal ASIC's and Discrete Systems
Phoenix, Arizona 85048      Skype: Contacts Only
Voice: (480) 460-2350       Fax: Available upon request
E-mail Icon at

formatting link

I love to cook with wine. Sometimes I even put it in the food.

Reply to
Jim Thompson

formatting link

Yes, everyone but the idiots already knows this. So, we have increased atmospheric CO2 by almost 40% - how much new water vapor are we making? lol

Proof abounds - republicans are betas.

Reply to
Ouroboros Rex

Jim-out-of-touch-with-reality-Thompson strikes again.

What he probably doesn't realise is that if the legislation to enable this round-up were framed in terms of misleading the public, Exxon-Mobil and the front organisations they fund would be the ones ending up in the dock.

And if the legislation failed to provide a defence of not intentionally misleading the public, Jim's habit of picking up and re-broadcasting nonsense that he doesn't understand would put him in the dock with them, with the URL he posted as evidence.

That pile of rubbish includes the line

"Gore is not backing away from his support for the theory of man-made climate change, but his concession that carbon dioxide only accounted for 40% of warming according to new studies"

which is a trifle comical.

CO2 has never accounted for 40% of warming. The Wikipedia article on greenhouse gases has carbon dioxide contributing between 9% and 26% of greenhouse warming (depending on where you are in the atmosphere), with water vapour contributing between 36% and 72%.

I'm not aware of any new work that is likely to change this much, though there has been some work on pressure broadening that might let us figure in the effect of the transient population of carbonic acid that forms when CO2 and H2O collide and don't fly apart as quickly as they would if they formed no chemical bond at all.

Al Gore has known these figures for a very long time - though he may not have a deep understanding of why they have the values they do - and he won't have told the reporter that CO2 only accounted for 40% of global warming.

We can only guess what he might have been trying to tell the reporter, who doesn't seem to have even a shallow understanding of the physics, but we can be pretty sure that the reporter failed to understand what he was actually being told.

All Jim understood was that the article was being rude about Al Gore, which prompted him to post it to sci.electronics.design rather than to alt.political_abuse.inept, where it would belong.

-- Bill Sloman, Nijmegen

Reply to
Bill Sloman

So, *water vapour* is the culprit of the month now, is it? I thought we didn't know very much about clouds yet. It sort of begs the question as to why there is so much fuss about reducing CO2 emissions.

Perhaps you can explain it to me in simple, layman's terms. I find all this stuff increasingly confusing, as the goalposts appear to be moving this way and that...

Regards,

Chris

Reply to
ChrisQ

It always has been.

Clouds aren't water vapour - they are droplets of water, which is to say condensed water vapour (the vapour itself being a gas).

I can't explain it all to you - life's too short. I can explain why we worry about CO2 rather than water vapour.

The partial pressure of water vapour in the atmosphere is determined by the temperatures of the surface of the oceans, which are extensive. If we injected water vapour into the atmosphere, the vapour pressure would be back in equilibrium within a couple of weeks at most.

So the amount of greenhouse warming we get from water vapour is strictly determined by the surface temperature of the earth; the warmer it is, the more greenhouse warming we get. This is one of the positive feedbacks that amplified the rather small forcing from the Milankovitch Effect to give us alternating ice ages and interglacials for the past few million years.
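If you want to put rough numbers on that, here is a minimal sketch (nothing from any climate model, just the standard Magnus approximation for saturation vapour pressure over water) showing the roughly 7% extra water vapour per degree C that drives the feedback:

/* Rough illustration: saturation vapour pressure over water via the
   Magnus approximation, showing the ~7% increase per degree C that
   drives the water-vapour feedback. */
#include <stdio.h>
#include <math.h>

static double e_sat_hPa(double t_celsius)
{
    /* Alduchov-Eskridge Magnus coefficients */
    return 6.1094 * exp(17.625 * t_celsius / (t_celsius + 243.04));
}

int main(void)
{
    for (double t = 0.0; t <= 30.0; t += 10.0) {
        double e0 = e_sat_hPa(t);
        double e1 = e_sat_hPa(t + 1.0);
        printf("%5.1f C  e_sat = %6.2f hPa  (+%.1f%% per degree)\n",
               t, e0, 100.0 * (e1 - e0) / e0);
    }
    return 0;
}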

CO2 also dissolves in the oceans, but nowhere near as fast - it takes about 800 years for an injection of CO2 into the atmosphere to get into equilibrium with the oceans.

About half the CO2 we have been injecting into the atmosphere for the past 250 years is still there, generating extra greenhouse warming directly, by forcing the effective emitting altitude for the infra-red radiation it absorbs higher up in the atmosphere, and indirectly, by raising the surface temperatures of the oceans enough to evaporate a bit more water, which also contributes extra greenhouse warming.
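For the direct part you can do the back-of-envelope version yourself with the commonly quoted simplified forcing expression, dF = 5.35 ln(C/C0) W/m^2; taking the roughly 40% concentration rise mentioned upthread as the input (a sketch, not a climate calculation):

/* Back-of-envelope sketch: the widely quoted simplified expression
   for CO2 radiative forcing, dF = 5.35 * ln(C/C0) W/m^2.
   The ~40% concentration rise is the figure mentioned upthread. */
#include <stdio.h>
#include <math.h>

int main(void)
{
    double ratios[] = { 1.4, 2.0 };   /* +40% and a doubling */
    for (int i = 0; i < 2; i++)
        printf("C/C0 = %.1f  ->  forcing ~ %.2f W/m^2\n",
               ratios[i], 5.35 * log(ratios[i]));
    return 0;
}

That comes out at roughly 1.8 W/m^2 for a 40% rise and about 3.7 W/m^2 for a doubling, before any of the feedbacks kick in.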

One of the many worrying aspects of the current level of anthropogenic global warming is that the positive feedback effect from CO2 that we see in the ice age to interglacial oscillation hasn't had a chance to kick in yet.

As the oceans warm up, their capacity to dissolve CO2 decreases, and it starts coming out of solution. We've been injecting more CO2 into the atmosphere much faster than the oceans are warming up, so - at the moment - some 30% of the CO2 we emit is still being taken up by the oceans.

Once the oceans have had their 800 years to warm up and turn over, we will get it all back, and more besides.
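If you want a feel for why warmer water holds less CO2, here is a toy Henry's-law calculation with textbook fresh-water constants; real seawater carbonate chemistry is much more involved, so treat it purely as illustration:

/* Very rough sketch of why warmer water holds less CO2: Henry's law
   solubility with a van 't Hoff temperature dependence. Constants are
   textbook fresh-water values (kH ~ 0.034 mol/(L atm) at 25 C, ~2400 K
   temperature coefficient); seawater carbonate chemistry is far more
   involved, so this is illustration only. */
#include <stdio.h>
#include <math.h>

static double kH_CO2(double t_celsius)
{
    double T = t_celsius + 273.15;
    return 0.034 * exp(2400.0 * (1.0 / T - 1.0 / 298.15));
}

int main(void)
{
    double cold = kH_CO2(5.0), warm = kH_CO2(20.0);
    printf("kH at  5 C: %.4f mol/(L atm)\n", cold);
    printf("kH at 20 C: %.4f mol/(L atm)\n", warm);
    printf("warming 5 -> 20 C cuts solubility by ~%.0f%%\n",
           100.0 * (1.0 - warm / cold));
    return 0;
}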

But you can take comfort from the fact that the third positive feedback that amplified the Milankovitch Effect enough to give us ice ages and interglacials was changing snow cover across the more northern parts of the northern hemisphere. There's not a lot of snow cover left up there, and once it has all melted that particular positive feedback won't be a worry.

This is old stuff, and these particular goal posts haven't moved for many years.

formatting link

-- Bill Sloman, Nijmegen

Reply to
Bill Sloman

Thanks for that. The paper this morning reported that all the original data from Jones et al. has been dumped and only the interpreted results are now available. Apparently all the original data was on 9-track magtape and was dumped when they moved office. Unbelievable, but true. If they can no longer show proof of their results, at best it's sloppy science. At worst, what?

A cynic might say that that is "convenient", in terms of providing the ability to claim that they are unable to release the original measurement data. However, knowing the state of the art of computing in UK academia in the late 80s / early 90s, it was probably a VAX/VMS site, with the magtapes holding around 45 megabytes per tape. There would probably be a lot of them, perhaps hundreds, but it does seem a bit of a poor excuse. I would assume that the original data must still be available somewhere, though, perhaps on microfiche or even written records.
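Back of the envelope (the tape count is pure guesswork on my part), the whole archive wouldn't amount to much by today's standards:

/* Quick arithmetic on the archive size being discussed: "hundreds"
   of 45 MB 9-track tapes. The tape count is a guess, purely for
   illustration. */
#include <stdio.h>

int main(void)
{
    const double tape_mb = 45.0;     /* capacity quoted above */
    const int    tapes   = 300;      /* assumed "hundreds" */
    printf("~%d tapes x %.0f MB = %.1f GB total\n",
           tapes, tape_mb, tapes * tape_mb / 1024.0);
    return 0;
}

A dozen or so gigabytes would fit on a single modern disk, which is what makes the storage argument look like a poor excuse.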

After the recent leaks of emails, it would seem all the more important to allow independent scrutiny of *all* relevant data. I downloaded the 60 MB file this afternoon from wikileaks and there's stuff going back to 1993 or so, together with quite a lot of programming material in the form of Fortran 90 source. The emails do make interesting reading and you can get a lot of insight into the internal culture from reading them...

Regards,

Chris

Reply to
ChrisQ

The raw data was meaningless without the metadata that told you what corrections to apply to each site. Only the corrected data - which was preserved - can be used as it is. If the corrections had been built into the hardware at each site, nobody would have been in the least worried that we hadn't preserved the raw data.
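A toy illustration of the point (station names, readings and offsets all invented here): without the per-site correction from the metadata, the raw numbers don't mean anything on their own.

/* Toy sketch (hypothetical stations and offsets) of why raw readings
   alone are useless: each site needs its own correction from the
   metadata before the numbers are comparable. */
#include <stdio.h>

struct station { const char *name; double raw_c; double correction_c; };

int main(void)
{
    /* made-up readings and per-site corrections, for illustration */
    struct station s[] = {
        { "Site A (moved 1987, +0.3 C step)", 11.2, -0.3 },
        { "Site B (screen change, -0.1 C)",   10.8, +0.1 },
        { "Site C (no adjustment needed)",     9.9,  0.0 },
    };
    for (int i = 0; i < 3; i++)
        printf("%-36s raw %5.1f C -> corrected %5.1f C\n",
               s[i].name, s[i].raw_c, s[i].raw_c + s[i].correction_c);
    return 0;
}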

Proponents of the great global warming swindle, who are prepared to believe that almost all the world's climatologists are conspiring to frighten the world into paying them their regular academic salaries by faking evidence of global warming (systematically fudging the data), seem willing to believe that dumping the raw data was part of the grand conspiracy to mislead the public. The rest of us find this a crazy idea.

Magnetic tapes take up space, and it is hard (and getting harder) to get hold of a magnetic tape reader hooked up to a modern computer and the web. The temptation to squeeze a few more graduate students into the space then being occupied by an archive of useless magnetic tapes must have been overwhelming.

Why? Conspiracy theory fans, like Ravinghorde, see evidence of all kinds of criminal activity in the e-mails, but they'd see the same sort of "evidence" in a laundry list. The only stuff that has worried me in the slightest in the leaked e-mails I've seen are the excerpts about trying to extract information from old data (of the kind that has been lost), which are totally amateur-night. But that's what you'd expect with physicists: their approach to electronics is to reinvent it from scratch, which led me to send a series of rude comments to Rev. Sci. Instrum. during the 1990s, complaining about various dumb newbie mistakes in the electronic circuits that they had published. About a third of the comments got published. I presume that their approach to data processing is equally slipshod.

Only if you are a computer freak of the deepest dye.

-- Bill Sloman, Nijmegen

Reply to
Bill Sloman

What I don't understand is the need for post-processing. If it's a series of temperature or rainfall records, for example, they should need no "processing" at all, assuming that the measuring systems were accurate. 10 degrees C is the same anywhere in the world.

Perhaps you could explain this a bit more?

I still have a 9-track drive, should you ever need to read tape, but I digress :-). The data could have been transcribed to DAT or other more compact media. The truth is probably more mundane, though, in that the systems used to write the tapes had been disposed of, or some individual decided that they were no longer important. Media archiving and portability is becoming a serious issue in all kinds of fields now.

The most serious thing about the loss of data is that it is now impossible to repeat the work that resulted in the conclusions and the subsequent advice to government. This is very serious and can only strengthen the sceptics' hand, even if it turns out that they are wrong. If the original science can no longer be repeated and verified, what does it mean?

Because science is about the search for truth above all else, and the only way to get at it is to have as many peer-review eyes examining the methods and procedures as possible. Real science has no problem with this, just as I have no problem with the code review process used in my work. A surprising number of errors bubble to the surface, and although it is irritating and inconvenient at times, the result is nearly always a more accurate conclusion, in my experience.

Well, I have to confess that computing is a bit of an anorak interest here :-). I have been programming in C for 30+ years, but never Fortran, so it would take considerable effort to work out what their code was doing, even assuming that I understood their models well enough to use them as a reference. I was thinking more of the tribal professional culture that comes through in reading the emails...

Regards,

Chris

Reply to
ChrisQ

As far as I know, rainfall records don't need post-processing. Temperature records are corrected for the height of the measuring point, using the environmental lapse rate

formatting link

Weather stations get moved from time to time, and need a different correction at each location. This came up in relation to some New Zealand temperature records not too long ago.
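A minimal sketch of what that correction looks like (the station heights and readings are invented): readings are reduced to a common reference level using the standard environmental lapse rate of about 6.5 C per kilometre.

/* Sketch of the height correction mentioned above: reduce a station
   reading to a common reference level using the standard environmental
   lapse rate of ~6.5 C per km. Station figures are invented. */
#include <stdio.h>

#define LAPSE_C_PER_M 0.0065   /* ~6.5 C per 1000 m */

static double to_sea_level(double t_measured_c, double elevation_m)
{
    return t_measured_c + LAPSE_C_PER_M * elevation_m;
}

int main(void)
{
    /* the same station before and after a hypothetical move uphill */
    printf("old site, 120 m: %.2f C -> %.2f C at sea level\n",
           12.4, to_sea_level(12.4, 120.0));
    printf("new site, 480 m: %.2f C -> %.2f C at sea level\n",
           10.1, to_sea_level(10.1, 480.0));
    return 0;
}

Once both sites are reduced to the same reference level, the before-and-after records can be spliced into one continuous series, which is the whole point of the correction.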

It is impossible to repeat and difficult to verify historical data. Data on magnetic tape isn't the original record in any event, and could be faked without any great difficulty, so it isn't all that obvious why you are getting excited about it.

Peer review is one thing. Harassment by the denial industry is quite another. Other scientists with a legitimate interest in the results aren't going to spend much of their time going through other people's data - it takes time and is unlikely to generate a widely cited scientific paper.

The denial industry has other priorities. They are subsidised to find flaws in the work of climate scientists, and if they can't find genuine flaws they are encouraged to invent a few of their own. There's no limit to the time they could spend demanding access to raw data, and to the metadata required to make sense of the raw data, and - since their main job is to attack the credibility of the scientific basis for anthropogenic global warming - they are motivated to do what they can to slow down and prevent useful research in the area.

This isn't exactly specific to climate science.

-- Bill Sloman, Nijmegen

Reply to
Bill Sloman
