Faced with a media interview before Christmas about technology changes over the past decade, I did some research, thought about all the new products released in that time, and arrived at an observation that, in retrospect, was obvious.
Beneath everything that has happened in the last decade, or even since the dawn of the computer age, lie advances at the microprocessor level.
Without new processors, there would be no iPhone or iPod, no social networking, no digital video editing, no digital media.
Ever-increasing processing power drives our industry and makes the previously impossible possible, and the previously imaginable real.
As part of my rather basic research I looked up an old edition of PC World to check processor speeds in 2000. The latest and greatest PC processors then delivered around 600MHz; today's run beyond 2GHz, with multiple cores instead of one.
Moore’s Law, formulated by and named after Intel co-founder Gordon E Moore, says the number of transistors that can be placed on an integrated circuit doubles every 18 months to two years. Ever since, people have been predicting its demise.
Even Gordon Moore agrees that this doubling can’t be sustained forever, and, as you would expect, some are predicting those limits will become apparent in the coming decade. That will be true only if some very clever physicists and engineers fail to deliver new ways of developing processing capacity, possibly in areas such as quantum computing.
To some extent, the shift to multicore processors reflects the end of clock-speed scaling: transistor counts have kept rising roughly on schedule, but those extra transistors now go into additional cores rather than faster single ones. Could this be the beginning of the end for Moore’s Law?
I’m going to assume it’s not. If that assumption holds, it will be almost impossible to predict what our world will look like in 2020. A continuing doubling of processor power every two years from where we are now will totally transform our lives and our capabilities.
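The arithmetic behind that claim is worth spelling out: doubling every two years from 2010 to 2020 is five doublings, a 32-fold increase; on the more aggressive 18-month cadence it is closer to a hundredfold. A minimal sketch (the function name and the baseline of 1 are my own, purely for illustration):

```python
def doublings(start_year: int, end_year: int, period_years: float = 2.0) -> float:
    """Multiplier from repeated doubling over the span, relative to a baseline of 1."""
    return 2 ** ((end_year - start_year) / period_years)

# Two-year cadence: five doublings between 2010 and 2020.
print(doublings(2010, 2020))                      # 32.0
# 18-month cadence over the same decade: roughly a hundredfold.
print(round(doublings(2010, 2020, 1.5)))          # ~102
```

It also shows why the 600MHz-to-2GHz clock-speed comparison above understates the decade: raw clock speed grew only about threefold, with the rest of the transistor budget going into extra cores and other on-chip features.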
Whether that transformation is positive or negative depends on us. It could enable vastly improved surveillance, data matching and intrusion by government in our lives. It could also deliver medical and scientific advances that will help us answer questions and deliver cures that are beyond our capabilities now.
One thing is certain, though. It will continue to drive computing power into smaller mobile devices and create a wired population top to bottom, at least in the developed world.
For IT there is a downside. Computer technology is already being consumerised: people no longer need to know about technology to use it.
That success could turn amazing advances into mere background noise, assumed rather than relished, applied but not applauded.