http://web.archive.org/web/19990220235952/http://www.giss.nasa.gov/data/gistemp/GLB.Ts.txt
So. What’s up?
The Web archive data for 1997 have been disappeared. But those from 1999 remain? Does the web archive have a use-by date?
Looks like they missed a few. I don’t expect the remaining ones to last long.
Paradoxically, anything of value found in the Wayback Machine gets disappeared from it.
I wonder what excuse the climate frauds are using to “cleanse” history.
Government data cannot be copyrighted. Thus they almost have to be declaring the data “secret” or something.
Of course they’re just following the lead of the Liar-in-Chief……
Someone is bound to have saved them. But that might add to the data integrity confusion.
Well, I found data, and it said 1998.
I show everything up to 1998.
Wow, what a HUGE difference between 1997 and 1999! The 1999 version shows the early record near zero; the 1997 version shows it much colder (the first ever recorded instance of data tampering going in a direction other than UP?)
Surely the heading – “GLOBAL Temperature Anomalies in .01 C ” – exposes the whole scam.
I was a Health Inspector, and part of my job was to ensure that perishable food for sale was stored at the required temperatures. To do this my employer provided a “state of the art” digital thermometer. This was regularly certified as accurate, because the readings could be used in legal proceedings.
This was a $1500 thermometer – a precision instrument.
It had a guaranteed accuracy of ±0.1 °C – ten times coarser than the “GLOBAL Temperature Anomalies in .01 C” ??
How the hell do you obtain 0.01 °C accuracy from a data set with at best ±0.5 °C accuracy??
If a 20th century precision instrument is only certified to ±0.1 °C, the old mercury-in-glass max/min thermometers aren’t going to be any more accurate!!
It is fraud anyway without the tampering !
All the industrial temperature measurement equipment I use and calibrate is typically rated to ±1 °F (about ±0.5 °C). A well calibrated instrument (using 7–9 calibration points) can easily do ±0.1 °C.
My old instruments and measurements prof would be jumping up and down and turning purple at the thought of taking coarse measurements and massaging out of them numbers of much higher apparent precision. At the very least, if these “scientists” are going to create “fake precision” data, they should always state the accuracy of the raw input data, and thus, by extension, declare the uncertainty of their “massaged” data.
You can take a stack of 4000 boards measured to the nearest 0.1 inch and compute an average with an apparent precision of 0.0001 inch. Yet that apparently precise average carries an inherent uncertainty of ±0.05 inch, because that is the accuracy of every single raw data point.
Calculators and computers allow the generation of junk data at lightning speed.
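The boards example above can be sketched in a few lines of Python. This is a minimal illustration, not anyone’s actual workflow; the board count, nominal 96-inch length, and spread are made-up values for demonstration. Each simulated reading is rounded to the nearest 0.1 inch (so each raw datum is only good to ±0.05 inch), yet the mean happily prints with four decimal places of apparent precision.

```python
import random

# Illustration only: 4000 hypothetical boards around a nominal 96 in,
# each "measured" by rounding to the nearest 0.1 in (resolution +/- 0.05 in).
random.seed(42)

N = 4000
true_lengths = [96.0 + random.uniform(-0.5, 0.5) for _ in range(N)]

# Simulated coarse measurement: quantize to one decimal place.
measured = [round(x, 1) for x in true_lengths]

mean_measured = sum(measured) / N

# Four decimals of *apparent* precision from data only good to 0.1 in.
print(f"mean of measurements: {mean_measured:.4f} in")
print("resolution of each raw reading: +/- 0.05 in")
```

The extra digits in the printed mean come from the arithmetic of averaging, not from the instrument; whether they carry any information depends entirely on the error model of the raw readings, which is exactly the point the comment is making.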
Agreed !!!
The About page for the Internet Archive says among other things:
“The Internet Archive is working to prevent the Internet – a new medium with major historical significance – and other “born-digital” materials from disappearing into the past. Collaborating with institutions including the Library of Congress and the Smithsonian, we are working to preserve a record for generations to come.”
So who has control to “disappear” things that were once stored in the wayback machine? Or were they ever stored there? I know some of the old data pages come up with a “You don’t have permission to access /data/update/gistemp on this server.” message. Example
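Whether a URL was ever captured can be checked against the Wayback Machine’s public availability endpoint (`https://archive.org/wayback/available`), which returns JSON describing the closest stored snapshot. Below is a minimal sketch: the `availability_query` helper is my own illustrative name, and the network call is left commented out since it needs internet access.

```python
import json
import urllib.parse
import urllib.request

WAYBACK_API = "https://archive.org/wayback/available"

def availability_query(url, timestamp=""):
    """Build an availability-API query for `url`.

    `timestamp` is an optional YYYYMMDD[hhmmss] string requesting the
    snapshot closest to that moment.
    """
    params = {"url": url}
    if timestamp:
        params["timestamp"] = timestamp
    return WAYBACK_API + "?" + urllib.parse.urlencode(params)

# Example: ask whether a snapshot of the GISS file exists near Feb 1999.
query = availability_query(
    "http://www.giss.nasa.gov/data/gistemp/GLB.Ts.txt", "19990220"
)
print(query)

# Uncomment to actually query the archive (requires network access):
# with urllib.request.urlopen(query) as resp:
#     print(json.loads(resp.read()).get("archived_snapshots"))
```

An empty `archived_snapshots` object in the response means the archive reports no stored capture for that URL, which at least distinguishes “never crawled” from “crawled but now inaccessible.”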
http://www.giss.nasa.gov/data/gistemp/GLB.Ts.txt%20and%20found | 16:20:54 Jan 5, 2014
Got an HTTP 302 response at crawl time
Redirecting to…
http://data.giss.nasa.gov/gistemp/GLB.Ts.txt%20and%20found
Red,
I got the same thing!
Notice it says “Data were checked and adjusted for urban warming.”
Yeah, that is a good reason to disappear it!!
http://data.giss.nasa.gov/gistemp/GLB.Ts.txt
Not found.
The web.archive.org link has data
Thanks for finally talking about “Important Reader Survey | Real Science”. Loved it!