The integrated circuit was invented in 1958 by Jack Kilby, who was then working at Texas Instruments. Before this breakthrough, the world of computing was ruled by vacuum tubes (known in Britain as valves) and bulky discrete components that looked like crushed cockroaches.
Kilby, who was awarded the Nobel Prize in Physics in 2000 and died in 2005, devised what was in effect a roadmap for five decades of electronics revolution, making electronic devices not only cheaper but also more efficient in terms of both the power they consume and the space they occupy.
From Intel's billion-transistor Tukwila microprocessor to the chip in your pocket calculator, they all descend from Kilby's vision of transistors squeezed onto a single piece of silicon.
There are now roughly 50 integrated circuits produced each year for every human being, and the industry, which is growing at more than 10 percent per annum, is forecast to reach GBP 200 billion by 2011.
But while the template of the microchip will remain the same, the material used will almost certainly change. Germanium was used originally and was replaced by silicon to keep up with Moore's law, which stipulates that the number of transistors on a chip doubles roughly every two years.
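Moore's law compounds quickly, and a rough back-of-the-envelope sketch makes the scale of that growth concrete. The starting figure below (2,300 transistors on Intel's 1971 4004) is a well-known historical data point, but the projection itself is purely illustrative, assuming a strict two-year doubling throughout:

```python
def transistors(year, base_year=1971, base_count=2300):
    """Projected transistor count under a strict two-year doubling.

    base_count of 2,300 is the transistor count of the Intel 4004 (1971);
    real chips only loosely track this idealised curve.
    """
    return base_count * 2 ** ((year - base_year) / 2)

# Illustrative projection across the decades
for year in (1971, 1981, 1991, 2001, 2008):
    print(year, f"{transistors(year):,.0f}")
```

Run forward to 2008, the curve lands in the region of a billion transistors, which is broadly the scale of the chips mentioned above.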
Graphene, a form of carbon just one atom thick, has been widely tipped to replace silicon because of its exotic characteristics and versatility. Indium gallium arsenide is another potential candidate, as electrons travel through it much faster than through silicon.