There are lies, darn lies, and statistics, which begs the question: can we trust the data we've been getting? Recent announcements from the CDC let the world know that some of their data was not accurate. While the fix may be a simple correction, look at how many bad decisions have already been made based on that bad data. As we say in Information Technology: garbage in, garbage out. What can the American people do to protect themselves from bad data like this? We'll look at some examples today.
The Constitution Study with Paul Engel on America Out Loud Talk Radio can be heard weekdays at 4 pm ET. Listen on iHeart Radio, our world-class media player, or our free apps on Apple, Android, or Alexa. Other episodes of The Constitution Study are available as podcasts.
Garbage in, garbage out used to be the law of data processing in the 1970s. Since then, we have evolved to REXX execs, which let us sanitize data automatically before it is handled by application programs, correcting artifacts and sequencing errors with one-line execs. For example, the universal REXX ?#!@>5&**^(? should fix almost anything, even unGodly vaccines. Of course, the compiler to run this universal REXX will cost about five billion dollars after it is reverse engineered (disassembled) into existence, and it will only run on the z/OS 3.0 operating system, to be made public a few years from now when Big Blue and its Chinese counterpart merge. Don't blame the CDC; the technology is on the way. Just make weekly backup copies of your bad data until then, using DFSMS.
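Satire aside, the idea of sanitizing records before an application program ever sees them is real. Below is a minimal sketch in REXX (the language the comment invokes), assuming Regina/ooRexx-style stream I/O; the file names, the two-field record layout, and the validity rule are all hypothetical illustrations, not the "universal" exec above.

/* REXX -- minimal data-sanitizing sketch (hypothetical example,  */
/* not the "universal" exec above). Assumes Regina/ooRexx-style   */
/* stream I/O; file names and the id/count layout are made up.    */
infile  = 'raw.data'                  /* assumed input file       */
outfile = 'clean.data'                /* assumed output file      */
do while lines(infile) > 0
  rec = space(linein(infile))         /* collapse stray blanks    */
  parse var rec id count .            /* assumed two-field layout */
  if id \= '' & datatype(count, 'N') then
    call lineout outfile, rec         /* pass clean records along */
  else
    say 'Rejected bad record:' rec    /* keep the garbage out     */
end
call lineout outfile                  /* close the output stream  */

On z/OS proper, the same filtering would more typically be done with EXECIO against allocated DD names rather than stream I/O, but the principle is the same: reject or repair malformed records before any application program trusts them.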