Evolution of the Data Ecosystem

by James Warner, 16 Jan 2023
Moore's Law, Big Data and Now, the Trusted System

Peter Sondergaard, Senior Vice President at Gartner, once said, "Information is the oil of the 21st century, and analytics is the combustion engine", and it is a quote that holds profoundly true. We live in an age where a simple measurement of your heart rate is collected by your smartwatch and sent to the cloud. That data can save a person's life, but it can also save a million lives, and that is the power of data.

The data out there is a potent tool, and anyone with enough information (mined from that data) is just as powerful. So we know that data, and the information mined from it, can change the way the world spins, but what is Big Data? It quite literally means a large amount of data. Let's go back to the previous example: one heart-rate reading per minute for one person adds up over a day, let alone a year, but imagine a billion people owned a watch that recorded their heart rate. Now multiply that by a year's worth of minutes.
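To put a rough number on that, here is a back-of-the-envelope sketch in Python; the bytes-per-reading and user count are illustrative assumptions, not figures from any real device or vendor.

```python
# Back-of-the-envelope estimate of a year of heart-rate data.
# The bytes-per-reading and user count are illustrative assumptions,
# not figures from any real device or vendor.

BYTES_PER_READING = 16           # assumed: timestamp + value + a little metadata
READINGS_PER_MINUTE = 1          # one reading every minute
USERS = 1_000_000_000            # a billion watch owners
MINUTES_PER_YEAR = 60 * 24 * 365

total_bytes = BYTES_PER_READING * READINGS_PER_MINUTE * MINUTES_PER_YEAR * USERS
print(f"~{total_bytes / 1e15:.1f} PB of heart-rate data per year")
# ~8.4 PB a year, from a single sensor on a single kind of device.
```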


This is a simple example of Big Data, but it's still not big enough, don't you think? Okay, how about this: the CERN Large Hadron Collider produces about 40TB of computable data every second. You read that right; that's the equivalent of about ten 4K-resolution videos, each 52 minutes long, arriving every second. Now that is "Big Data", self-explanatory and everything.
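For the curious, here is a quick sense-check of where a comparison like that comes from; the resolution, colour depth and frame rate are assumptions for uncompressed 4K video, so the exact count shifts with the assumptions but stays in the same ballpark.

```python
# Rough sense-check of the "ten 4K videos per second" comparison.
# Resolution, colour depth and frame rate below are assumptions for
# uncompressed 4K video; the result shifts with them but stays in
# the same ballpark as the article's figure.

WIDTH, HEIGHT = 3840, 2160       # 4K UHD
BYTES_PER_PIXEL = 3              # 24-bit colour, uncompressed
FPS = 30
VIDEO_MINUTES = 52

video_bytes = WIDTH * HEIGHT * BYTES_PER_PIXEL * FPS * VIDEO_MINUTES * 60
print(f"One uncompressed 52-minute 4K video ~ {video_bytes / 1e12:.1f} TB")
print(f"40 TB/s ~ {40e12 / video_bytes:.0f} such videos every second")
# Roughly 2.3 TB per video, so 40 TB/s is on the order of ten-plus
# such videos' worth of data arriving every single second.
```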


It is also true that we have produced more data in the last two years than in the entirety of known human history. This exponential growth in data is not something that had been lurking in the bushes, waiting to take the engineers of the world by surprise. It was predicted by Gordon Moore way back in 1965. He wasn't the Nostradamus of our time but a man of logic who theorized that the number of transistors on an integrated circuit doubled approximately every two years, and he expected the trend to continue for at least 10 years. It actually continued for 45 years! Over time, ICs became smaller and smaller because the transistors performing those impeccable calculations were shrinking at an exponential rate, a consequence of what came to be called Moore's law. Time and research at chip companies like Intel and AMD went into shrinking chips further and further while making processing more efficient, quicker and sleeker.
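Moore's observation is easy to express as a simple doubling formula. The sketch below projects transistor counts forward from the Intel 4004 (about 2,300 transistors in 1971); treat it as an illustration of the trend, not a precise history of any product line.

```python
# Moore's law as a simple doubling formula: transistor counts double
# roughly every two years. The 1971 starting point (Intel 4004,
# about 2,300 transistors) is a well-known reference chip; treat the
# projection as an illustration of the trend, not a precise history.

START_YEAR, START_TRANSISTORS = 1971, 2_300
DOUBLING_PERIOD_YEARS = 2

def projected_transistors(year: int) -> float:
    """Transistor count predicted by doubling every two years."""
    doublings = (year - START_YEAR) / DOUBLING_PERIOD_YEARS
    return START_TRANSISTORS * 2 ** doublings

for year in (1971, 1981, 1991, 2001, 2011):
    print(year, f"{projected_transistors(year):,.0f}")
# The projection climbs from thousands to billions of transistors,
# roughly what real chips did over those four decades.
```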


Relax, we aren't swerving off topic; this has a lot to do with Big Data solutions. As a consequence of Moore's law, the memory in our devices grew and, fortunately, became lighter on the wallet. Nowadays, a 128GB memory card costs about the same, and is the same physical size, as the old 512MB one. As things got smaller and more accessible to the crowd, the data produced doubled each year (hey, it's Moore's law again!). Every Facebook post, Instagram post, share, comment and every video you watch on YouTube is precious data that's collected, and with a billion of us out there, data grew exponentially.


Wait, is Moore's law still around to keep up with all this? Unfortunately no; our guiding light in making the nano-sized chips that power everything up to supercomputers has cracked. Moore's law is dead (cue funeral music). While the quality of the images and videos we produce is higher than ever, computing speed is sadly no longer keeping pace with the data coming in. Ideally, information needs to be extracted at the same pace it arrives, so everything can be mined before the avalanche of data overwhelms us and becomes gibberish, balderdash, poppycock, twaddle.


So, with Moore's law's death, the immediate effect is that supercomputers will be hit first. Studies on climate change, superconductors and the like will take a hit. Mobile phones and tablet PCs will be affected much, much later, for the simple reason that they use far less powerful chips (relatively speaking). Pushing everything into the cloud is out of the question too, because the incredible power demands would keep doubling each year, eventually putting the city in darkness.


The R&D departments of many companies are working round the clock to figure out effective ways to process data. The criteria are simple: extract the data, process it quickly and avoid melting down the entire city; but getting there is harder than it seems. One particular approach is photonics. The Intel labs in Texas are using light to process data quickly without the resistive losses that cause heat.

The impressive part of this idea is that it processes data at the speed of light and doesn't generate as much heat. It's a start, and we'll eventually get there. The other solution is, of course, collectively mining data and using a trusted system to keep that data safe with secure keys. Corrupting the data is a lot more difficult when everyone holds the correct version. It's an efficient system and was in the news all the time last year (2017). The overused term that had every cabbie talking about it could be our next solution to this data-processing crisis. You guessed right: it's the blockchain.
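To see why corruption is hard when everyone holds the correct version, here is a minimal sketch of the hash-chaining idea at the heart of a blockchain; it is an illustration of the concept, not any particular platform's implementation.

```python
# Minimal sketch of the hash-chaining idea behind a blockchain-style
# "trusted system": each block stores the hash of the previous block,
# so tampering with any record breaks every later link. Illustration
# only, not a production design.

import hashlib
import json

def block_hash(block: dict) -> str:
    """Deterministic SHA-256 hash of a block's contents."""
    return hashlib.sha256(json.dumps(block, sort_keys=True).encode()).hexdigest()

def add_block(chain: list, data: str) -> None:
    prev = block_hash(chain[-1]) if chain else "0" * 64
    chain.append({"index": len(chain), "data": data, "prev_hash": prev})

def is_valid(chain: list) -> bool:
    """Every block must point at the true hash of its predecessor."""
    return all(chain[i]["prev_hash"] == block_hash(chain[i - 1])
               for i in range(1, len(chain)))

chain: list = []
add_block(chain, "heart-rate reading: 72 bpm")
add_block(chain, "heart-rate reading: 75 bpm")
print(is_valid(chain))          # True

chain[0]["data"] = "tampered"   # corrupt an early record...
print(is_valid(chain))          # ...and every honest copy detects it: False
```

Because each participant can recompute the hashes independently, a tampered copy immediately disagrees with everyone else's, which is what makes holding the correct version collectively so powerful.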

About Author: James Warner is a business intelligence analyst and cloud application developer at offshore software development company NexSoftSys.