I was thinking last night about how technology has changed over the years, and how Moore's Law has apparently been broken, as computer CPUs are no longer doubling computational power every 18 months.
We have not seen transistor density and computing power doubling for a couple of years now. I understand it is a feat of engineering and physics to keep that pace, and I'm not certain why chip advancements have slowed in recent years, but does it matter anymore?
I believe that software and cloud computing have taken up the baton in the relay race for computing performance. We can do more, faster, by utilizing the cloud, which is essentially a pool of hosted processors that can handle intensive computational tasks in parallel, making the need to double performance on your desktop or laptop far less important.
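To make that parallelism point concrete, here is a minimal sketch in plain Python on a single machine, not any particular cloud service: the hypothetical render_frame function stands in for an expensive task, and the work is fanned out across workers instead of waiting on one ever-faster processor. A cloud service does essentially the same thing, just spread across far more machines.

```python
from concurrent.futures import ProcessPoolExecutor

def render_frame(frame_number: int) -> str:
    """Stand-in for an expensive task (rendering, reality capture, analysis)."""
    # Pretend this burns a lot of CPU; a cloud service would run many of these
    # on separate machines instead of on your laptop.
    total = sum(i * i for i in range(1_000_000))
    return f"frame {frame_number} done ({total % 97})"

if __name__ == "__main__":
    # Fan the work out across parallel workers rather than waiting on
    # one faster local core to chew through the frames one at a time.
    with ProcessPoolExecutor(max_workers=8) as pool:
        for result in pool.map(render_frame, range(32)):
            print(result)
```

The speedup here comes from doing many tasks at once, not from any one processor getting faster, which is exactly the trade the cloud makes.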
Autodesk provides many technologies that take advantage of the cloud. Many of our desktop tools today have hooks into the cloud to process reality capture or rendering tasks, and our cloud products like BIM 360 and Forge use the cloud for collaboration, data translation, design visualization, data management, design analysis, and more. These tasks are difficult and computationally expensive on a local computer, but the shared resources of the cloud are ideal for them.
So is Moore's Law really relevant in technology discussions today, or has the cloud muted the need for such rapid CPU and GPU development in personal computers? I think the push for performance and lower energy use should now center on CPUs and GPUs in the cloud, since that gives us a way to do more, better, with less: more computations, better data results, and less wasted energy and fewer idle CPU/GPU cycles.