A central processing unit (CPU) is a logic machine that can execute computer programs. This broad definition can easily be applied to many early computers that existed long before the term "CPU" ever came into widespread usage.
The term itself and its initialism have been in use in the computer industry at least since the early 1960s (Weik 1961).
The form, design and implementation of CPUs have changed dramatically since the earliest examples, but their fundamental operation has remained much the same.
Early CPUs were custom-designed as a part of a larger, sometimes one-of-a-kind, computer.
However, this costly method of designing custom CPUs for a particular application has largely given way to the development of mass-produced processors suited to one or many purposes.
This standardization trend generally began in the era of discrete transistor mainframes and minicomputers and has rapidly accelerated with the popularization of the integrated circuit (IC).
The IC has allowed increasingly complex CPUs to be designed and manufactured to tolerances on the order of nanometers.
Both the miniaturization and standardization of CPUs have increased the presence of these digital devices in modern life far beyond the limited application of dedicated computing machines.
Modern microprocessors appear in everything from automobiles to cell phones to children's toys.