Sunday 17 January 2016

The end of Moore's Law

The so-called Moore's Law stated that "every 18 months, the number of transistors on a chip doubles".
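
Just to give a sense of the scale that rule implies, here is a minimal Python sketch of the compounding, taking the 18-month figure at face value; the 1971 Intel 4004 starting point of roughly 2,300 transistors is my own reference figure, not part of the rule itself.

```python
# A minimal sketch of the compounding implied by an 18-month doubling period.
# Assumed starting point: the Intel 4004 (1971), roughly 2,300 transistors.

def projected_transistors(years_elapsed, start_count=2_300, doubling_period=1.5):
    """Project a transistor count under a fixed doubling period (in years)."""
    return start_count * 2 ** (years_elapsed / doubling_period)

if __name__ == "__main__":
    for year in (1971, 1981, 1991, 2001, 2011):
        print(f"{year}: ~{projected_transistors(year - 1971):,.0f} transistors")
```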

By the way, it is not a law at all.

It is, rather, a business and market strategy shared by most of what is called the "tech industry".

A way to force consumers to buy a new PC - now, a new smartphone - every three years, when it could last three times as long.

In this, the hardware industry was aided and abetted by the software community, which came to see hardware improvements as a given, until a somewhat more obscure but equally inflexible law was derived: "Software is like gas: it expands to fill every byte of RAM and every instruction cycle available".

This latter law is the reason why a '486 with 4 MB of RAM takes the same time to boot as a dual-core i3 with 4 GB, and why both computers run their respective versions of Word well enough that one can write the same books on the former as on the latter.

Moore's Law, in its original form, hit the wall of the physical limits of the materials in use when it was stated, some years ago.

The hardware industry maintained the pace anyway, but had to resort to something more than just printing the same circuit on an ever-smaller die.

They started by using copper in the circuitry instead of aluminium, then moved to "tri-gate" transistors, which are in effect transistors that rise from the silicon substrate so that their active area is bigger than their footprint, and then they developed multiple patterning, to create circuits with details smaller than the wavelength of the light used in the photolithographic processes that make chips.

These new processes have increased circuit density, but ultimately resulted in reduced yields - the sweet spot, with the lowest cost per transistor, seems to be the so-called 22 nm node.

Depending on who you read, or when the piece was written, the definitive end of Moore's Law is expected in either 2016 or 2020.

People will likely not realize it for a couple of years, as by then smartphones will be in the middle of the switch from LCD to AMOLED screens, which will probably mask the stall in processors and memory; but chances are a computer from 2025 will be only marginally better than one from 2020 - smartphones and tablets included.

There is a set of "impending" technologies that may prove viable and keep the IT sector from following the path of many other technologies that crashed into "maturity", but one thing is almost certain.

The days of "picking the low-hanging fruit" of computing technology are soon to be over.

What will it mean for the average person who - ensnared by the brilliant light of his iPhone - has so far believed he was living in a time of thriving technological development?

Will he recognize the stasis?

Will the enthusiasm for our electronic gizmos die out the same way the enthusiasm for cars has died out since the '90s, when people saw that, year after year, it's always the same crap with a better paint job?
