The End of Computing

Today marks the 50th anniversary of manned space flight, as noted in the current Google Doodle.

I’ve written enough about my hero, Yuri Gagarin (here, here, here and here), so I won’t bore you anymore.  I will say three things, though:

While Gagarin was undoubtedly the first human being to go into outer space and return safely to Earth, some conspiracy theorists insist that the Soviet Union had launched other cosmonauts into space and had failed to recover them.  This seems unlikely to me.

We celebrate Gagarin today.  But my thoughts are often of the dogs that the USSR had launched prior to any human being.  They died the most horrible deaths imaginable, alone and terrified in Earth orbit.

Today is also the 30th anniversary of the first Space Shuttle launch.  I remember that day quite clearly.  The crew, as I recall, consisted of John Young (a seasoned moonwalker) and Robert Crippen, the latter on his very first space mission.

But…

Today’s real topic, though, is the future of the world of computers.  Now, I’m not a computer expert.  However, computers (their history, construction and programming) are one of my serious hobbies.  Not surprisingly, my Gwitter project is a marriage of my career (global health) and my love (computing).  You can still vote for Gwitter until April 29 by clicking on the “thumbs up” icon in the lower right of the linked page, so go forth and do so!

Ever heard of Moore’s Law?  First articulated by Gordon Moore in 1965, it essentially states that the density of transistors on an integrated circuit doubles roughly every 1.5 years.  Translated from Nerdese, it says that the raw power of computers doubles in that time period, driven almost entirely by technological advances in making smaller and smaller components.
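
To put rough numbers on that doubling, here is a back-of-the-envelope sketch (in Python, purely for illustration, using the 1.5-year figure quoted above):

    # Moore's Law as quoted above: transistor density doubles every 1.5 years.
    DOUBLING_PERIOD_YEARS = 1.5

    def density_growth_factor(years):
        """How many times denser chips become after `years` of doubling."""
        return 2 ** (years / DOUBLING_PERIOD_YEARS)

    # Over the 45 years mentioned below: 2**(45/1.5) = 2**30,
    # i.e. roughly a billion-fold increase in transistor density.
    print(density_growth_factor(45))  # ~1.07e9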

Moore’s Law has actually held true for the past 45 years, and is expected to remain true for another 10-15 years.  A corollary states that the computing power available to us will always exceed the amount of computation we actually need to perform.  At the turn of the century, there was some concern that this corollary would no longer hold, as the Human Genome Project was about to dump ungodly amounts of data into our laps.

However, to my knowledge, this did not happen.  The rate of computing advancement continued to outstrip society’s demand for raw computational power.  This had the added economic advantage of allowing the computer industry to put out a new generation of machines every 1-2 years.  (Ever wonder why your laptop from 5 years ago doesn’t cut the mustard anymore?  It still works fine.  But transistor density has marched on since then, and software has grown to match the new machines’ abilities, so your old laptop can’t keep up with the young’uns anymore.)

In my opinion, this has been the major force in maintaining the personal computer industry as a multibillion-dollar affair.

Now experts are predicting that Moore’s Law will cease to apply in 1-2 decades, due to absolute physical and quantum limits on how small one can make transistors.  Some manufacturers are already feeling the crunch, hence the proliferation of multi-core processors: a way of increasing computing power without having to shrink transistors any further.
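
As a toy illustration of that multi-core workaround (a minimal Python sketch, not drawn from any particular product): the same batch of work can be spread across however many cores a machine has, so total throughput rises even though no individual core gets any faster.

    import math
    from concurrent.futures import ProcessPoolExecutor

    def crunch(n):
        """Stand-in for some CPU-heavy task."""
        return sum(math.sqrt(i) for i in range(n))

    if __name__ == "__main__":
        jobs = [2_000_000] * 8  # eight identical chunks of work

        # Serial: a single core grinds through everything.
        serial = [crunch(n) for n in jobs]

        # Parallel: the same chunks spread across all available cores.
        # No core is any faster, but wall-clock time drops roughly in
        # proportion to the number of cores.
        with ProcessPoolExecutor() as pool:
            parallel = list(pool.map(crunch, jobs))

        assert serial == parallel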

So, given that transistor technology development will slow in the coming years, here are my non-expert predictions (informed in part by a recent article in University of Toronto Magazine) for what this will mean for the computing universe:

1. The current trend will continue: how many cores can one shove into a single computer?

2. The industry focus will shift away from developing more powerful computers and toward the development of personalized devices (smart phones, book readers, wearable computers, etc.) in an attempt to increase market share without dramatically pushing technological advancement.

3. Since the raw guts of the machine will plateau in power, greater reliance will be placed on software development for improving speed and efficiency.  Perhaps this will mean a reduction in software bloat.

4. Cloud computing will continue to expand and evolve.  Since the rate-limiting step in this process is network speed, more focus will be placed on vectorizing devices’ access to networks.  (Yes, “vectorize” is a real word; I borrowed it from my old FORTRAN programming days.)

4(a). New modalities for cloud computing will evolve.  A variety of smaller computing “farms” will arise to compete with Google for supremacy in the cloud market.

5. The IT industry will experience a crisis in 1.5-2 decades, as workers and companies must re-task themselves away from chip development and toward application development.

6. A push will be on to develop the first consumer-ready quantum computers.  Such devices would be paradigm-shifting arrivals that might do away with the need to keep increasing transistor density altogether.

And there you have it.  I’d welcome any comments from people who are actually experts in this field.
