5 Semiconductors, ICs, PCs

A semiconductor, as its name suggests, conducts electricity less well than a full conductor like copper or gold, but better than an insulator like glass. Understanding the properties of semiconductors relies on quantum mechanics, but experiment and observation even before the advent of quantum theory led to early semiconductor diodes like the cat's-whisker detector used in crystal radios from about 1904 to the 1920s. With the understanding of the behavior of electrons in crystal lattices afforded by quantum mechanics, transistors and integrated circuits (ICs) became possible.

John Bardeen, William Shockley and Walter Brattain at Bell Labs, 1948

A transistor is a semiconductor device used to switch or amplify an electronic signal. The first working transistor was produced at Bell Labs in 1947 by William Shockley, John Bardeen, and Walter Brattain. They were looking to replace vacuum tubes with a smaller device that consumed less power. This was followed in 1951 by the bipolar junction transistor, and in 1959 by the metal-oxide-semiconductor field-effect transistor (MOSFET), the basic building block of modern electronics.

Mohamed Atalla in 1963

Mohamed Atalla (1924-2009) was an early researcher into silicon semiconductors and co-inventor of the MOSFET. Atalla proposed a MOS integrated circuit chip in 1960, but Bell Labs ignored MOS technology. RCA and Fairchild, however, saw its promise, and by 1963 Fairchild scientists had developed complementary metal-oxide-semiconductor (CMOS) technology, which opened the door for low-power integrated circuits.

Photolithography diagram

Integrated circuits (ICs or chips) are miniaturized through photolithography rather than being constructed one transistor at a time. Photolithography uses light and photosensitive chemicals to etch patterns onto a semiconductor substrate called a wafer. The light-exposed wafer is then treated with chemicals to remove the unwanted material. The basic patents were awarded in 1959, just in time to be used to etch ICs. A typical wafer can go through dozens or even hundreds of cycles of exposure and chemical treatment – and all of this happens at microscopic scales requiring elaborate clean rooms to prevent contamination.
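
As a rough illustration of the expose-and-etch cycle just described, here is a toy model in Python. The wafer grid and mask pattern are invented purely for this sketch; real photolithography works through photoresist chemistry at microscopic scale, not arrays.

```python
# Toy model of one photolithography cycle: a mask determines which cells of a
# wafer grid are exposed to light; exposed material is then chemically removed.
# The 4x4 grid and the mask pattern are invented purely for illustration.

wafer = [["material"] * 4 for _ in range(4)]

# The mask: 1 means light passes through (cell exposed), 0 means light is blocked.
mask = [
    [1, 0, 0, 1],
    [0, 1, 1, 0],
    [0, 1, 1, 0],
    [1, 0, 0, 1],
]

def expose_and_etch(wafer, mask):
    """One cycle: remove material wherever the mask let light through."""
    for row in range(len(wafer)):
        for col in range(len(wafer[row])):
            if mask[row][col]:
                wafer[row][col] = "etched"

expose_and_etch(wafer, mask)  # a real wafer may go through hundreds of such cycles
for row in wafer:
    print(row)
```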

In 1956, William Shockley opened Shockley Semiconductor Laboratory in Mountain View, California and invented a four-layer diode now called the Shockley diode. But he had a paranoid streak and worried that someone was going to undermine him or give away his secrets. His employees put up with threats and even forced lie-detector tests, until eight engineers decided enough was enough and opened a lab of their own in 1957 as a division of Fairchild Camera and Instrument Company.

2N296 ad

Shockley dubbed the founders of Fairchild Semiconductor the “traitorous eight” and predicted they and their company would never amount to anything. In 1958 Fairchild began producing the 2N696/7, a transistor designed by Gordon Moore. The first batch of 100 was sold to IBM for $200 each, to build the computer in the prototype XB-70 Valkyrie supersonic strategic bomber. In 1960, Fairchild built the first silicon IC, with four transistors on a single wafer of silicon. When Mohamed Atalla demonstrated his MOSFET and Bell Labs yawned at it, Fairchild recognized its significance and began developing digital circuits.

Intel logo

Credit for the invention of the IC is shared by Robert Noyce (then at Fairchild) and Jack Kilby at Texas Instruments. TI had developed the first transistor radio in 1954 (using germanium transistors because silicon was still too expensive). Kilby’s IC was made of germanium, while Noyce’s was made of silicon. In 1960, Noyce invented the planar IC, which the industry preferred to TI’s because its components were interconnected by a thin-film metal deposit rather than by bonded wires like TI’s. Fairchild’s sales doubled yearly until they were second only to TI and ahead of Motorola. Noyce, a corporate vice-president and head of the semiconductor division, was passed over for the CEO spot. In 1968, he and head of R&D Gordon Moore left Fairchild to found Intel.

Intel 4004

The market Noyce and Moore set their sights on was semiconductor memory, which they believed could replace the expensive magnetic-core memory used in mainframe computers. Intel produced a static random-access memory (SRAM) chip in 1969 and a dynamic RAM (DRAM) chip in 1970. By 1972 the Intel 1103 was the bestselling memory chip in the world and had replaced core memory in many applications. But that was just the warm-up. Intel produced its first commercial microprocessor, the 4004, in 1971. The 4004 was originally designed for a Japanese calculator company, Busicom, but was released for general applications a few months later in a 16-pin dual in-line package (DIP).

Intel 8008

In 1972, Intel introduced its first 8-bit processor, the 8008. It used a ten-micron process like the 4004, and although a little slower than the 4004, it was able to process data 8 bits at a time and address significantly more RAM. It was packaged in an 18-pin DIP and used primarily in CRT terminals. The 8080 followed in 1974 in a 40-pin DIP package that allowed it to be used in early microcomputers like the Altair 8800. It was the original target processor for the CP/M (Control Program/Monitor) operating system designed by Gary Kildall of Digital Research (DR).

Wang 2200

Machines that would later be recognized as early personal computers (PCs) began to appear in the 1970s. In 1973, minicomputer maker Wang Labs introduced a $7,400 machine called the 2200 that integrated a keyboard and CPU with a CRT (cathode ray tube) screen and a cassette tape drive for data storage. Hewlett-Packard produced a family of desktop computers beginning in 1974 called the 98 series, but continued to describe them as programmable calculators. And by the late 1970s, Commodore, Atari, Tandy, and Apple were shipping affordable home computers on which hobbyists could learn programming languages like BASIC. The Apple II became the market leader in 1979 when the first “killer app”, VisiCalc, transformed it from a hobbyist machine into a business computer.

Intel 8086

Intel developed an improved processor called the 8085 in 1976 and a 16-bit microprocessor called the 8086 in 1978. The 8086 was designed as an answer to the Zilog Z-80, an 8-bit CPU introduced in mid-1976 by former Intel employees. The 5 MHz 8086 was the first of Intel’s x86 microprocessor family, with 29,000 transistors fabricated in a 3-micron process. But it was a cheaper variant, the 8088, that IBM chose for the original IBM PC (the 8086 would later appear in entry-level PS/2 models).

Apple ad

IBM’s Personal Computer was a response to a $150 million market that was projected to grow by 40% annually. Big Blue controlled nearly two thirds of the mainframe computer market and was one of the world’s biggest corporations. IBM announced the PC in August 1981 at a starting price of $1,565. BYTE Magazine reported that 40,000 were ordered on the day of the announcement, mostly from software developers. Apple bought a full-page ad in the Wall Street Journal that said “Welcome, IBM. Seriously.” One year after its release IBM had not yet sold 100,000 units, but over 750 applications had been written for it.

Intel 80286 die

Intel’s next CPU, the 80286, was introduced in 1982. With 134,000 transistors in a 1.5-micron process, the processor shipped in a 68-pin PGA (pin grid array) package and was designed for multi-user systems with multi-tasking applications like automated PBXs (private branch exchanges) and real-time process control. The 286 offered roughly double the performance of the 8086 at the same clock speed, and it could run at 8 or 12.5 MHz. The 286 was used in the IBM PC/AT and was the first CPU used by most PC clone systems.

IBM had asked Digital Research, the makers of CP/M, to provide a version for its original PC, but DR had objected to the terms of the deal and refused. Microsoft took advantage of the opportunity to provide an OEM version of MS-DOS called PC-DOS, which it bundled with GW-BASIC (the GW probably stands for Gates-Whitten, the two designers of the program, but it was always pronounced “Gee-Whiz” by Gates and users). DR offered a competing operating system in 1981 called CP/M-86, and in 1985 offered 286 users a real-time multi-user system called Concurrent DOS 286. Digital Research also wrote a Concurrent DOS for the Motorola 68000 processor and collaborated with AT&T to develop software for UNIX System V.

Intel released the 32-bit 80386 microprocessor in 1985, and volume shipments began in 1986. The first personal computer to incorporate the 386 was Compaq’s Deskpro 386, the first time a clone company had incorporated a new processor before IBM. Intel refused to license the design to other manufacturers for nearly five years, which gave it much greater control over the chip’s use. The 386 had 275,000 transistors in a 1-micron process and ran at clock speeds from 12 to 40 MHz. The next processor, the 80486, was introduced in 1989 and included a level-one cache and an integrated math coprocessor.

The original Macintosh used the 68000 CPU

I’ve already mentioned a couple of Intel’s competitors. First on the list in the early CPU days was Motorola, which had developed the 6800 processor in the 1970s, only to see it quickly eclipsed by the Zilog and Intel CPUs. In 1980, Motorola began shipping 68000 chips, which were quickly adopted in Unix-based systems like Sun and Apollo workstations, Digital’s VAXstation, and SGI’s IRIS 1000, as well as in the Apple Lisa and Macintosh personal computers. Motorola also made inroads in the gaming industry, and many arcade games and consoles were designed around the 68000.

When Seymour Cray designed the CDC 6600, he used a load/store architecture that divided instructions into two categories: memory access and arithmetic logic unit (ALU) operations. This instruction set design was not copied in early microprocessors, which were based on complex instruction set computing (CISC), in which a single instruction could carry out several low-level operations. As process improvements allowed semiconductor dies to carry hundreds of thousands and then millions of transistors, a new architecture called reduced instruction set computing (RISC) gained popularity, as sketched below.
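
To make the distinction concrete, here is a minimal sketch in Python of a toy machine. The registers, memory addresses, and instruction names are invented purely for illustration and do not correspond to any real instruction set: a CISC-style instruction touches memory and does ALU work in a single step, while a RISC load/store machine splits the same work into separate instructions.

```python
# Toy illustration of CISC vs. RISC (load/store) instruction styles.
# The machine model below is invented for this sketch, not a real ISA.

memory = {0x100: 7}                      # one memory cell holding the value 7
registers = {"r1": 5, "r2": 5, "tmp": 0}

# CISC style: a single instruction both reads memory and performs ALU work.
def add_mem(reg, addr):
    registers[reg] += memory[addr]       # memory access + add in one step

# RISC / load-store style: memory access and ALU work are separate instructions.
def load(reg, addr):
    registers[reg] = memory[addr]        # memory access only

def add(dst, src):
    registers[dst] += registers[src]     # ALU operation only, register to register

add_mem("r1", 0x100)                     # CISC: r1 = r1 + mem[0x100]
load("tmp", 0x100)                       # RISC step 1: tmp = mem[0x100]
add("r2", "tmp")                         # RISC step 2: r2 = r2 + tmp

print(registers["r1"], registers["r2"])  # both paths compute 12
```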

IBM began experimenting with RISC between 1975 and 1980, with designs that eventually led to its PowerPC line of processors. The Berkeley RISC project and the Stanford MIPS project both began showing results in the early 1980s, and MIPS Computer Systems was established in 1984 as a fabless semiconductor design company. A fabless company uses contract semiconductor manufacturers to produce its designs. The world’s largest semiconductor foundry is Taiwan Semiconductor Manufacturing Company (TSMC), founded in 1987. MIPS produced the first commercial RISC CPU, the R2000, in 1986. It competed with Intel’s 80386 and Motorola’s 68000, and was used in Silicon Graphics, Ardent, and Northern Telecom systems as well as MIPS’s own line of Unix workstations.

SGI O2

The R3000 was released in 1988 and used in systems from SGI, Ardent, and MIPS, but also by DEC, NEC, Prime, Tandem, and Seiko Epson. It was also used in the Sony PlayStation and PlayStation 2, and in NASA’s New Horizons probe, which flew by Jupiter in 2007 and Pluto in 2015. The R4000 was announced in 1991 and began shipping in volume in 1993, after SGI acquired MIPS. The company then released the R8000 and R10000 CPUs in 1994 and 1996, which were the centerpieces of SGI’s workstations and servers, while versions of the R4000 were redesigned for use in later Sony PlayStations and in the Nintendo 64.

The Berkeley RISC architecture became the basis for Sun Microsystems’ SPARC processors in 1986-87. Sun had been established in 1982 by several Stanford graduate students and Bill Joy, a Berkeley grad student who had been instrumental in creating Berkeley’s BSD Unix, and had used Motorola 68000s for its first three workstations. The Sun-4 used a 32-bit SPARC CPU in 1987; a 64-bit UltraSPARC design was implemented in 1995.

After releasing workstations based on MIPS R2000 processors in the late 1980s, Digital created the Alpha processor in 1992. Alpha was successful for a brief period due to its superior design, and was the first processor to implement a secondary cache onboard. Compaq acquired DEC in 1998 and, already an Intel house, phased Alpha out when Intel’s Itanium CPUs became available.

IBM PowerPC processors ironically found their way into Apple’s Macintosh G3, G4, and G5 systems, thanks to the AIM alliance among Apple, IBM, and Motorola to develop and market the chip. The CPU also powered the Wii, Xbox 360, and PlayStation 3 game consoles, Cisco edge routers, and the Mars rover Curiosity.

Graph of Moore's Law
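
The chart above illustrates Moore’s Law: transistor counts doubling roughly every two years. As a back-of-the-envelope check, here is a minimal Python sketch that derives the doubling period implied by the transistor counts quoted earlier in this chapter (the figures are the ones given above; the two-year period is the prediction being tested, not an input).

```python
import math

# Transistor counts and introduction years quoted earlier in this chapter.
chips = [
    ("8086", 1978, 29_000),
    ("80286", 1982, 134_000),
    ("80386", 1985, 275_000),
]

# Moore's Law as a formula: count(t) = count(t0) * 2 ** ((t - t0) / T),
# where T is the doubling period. Solving for T between successive chips:
for (name1, year1, count1), (name2, year2, count2) in zip(chips, chips[1:]):
    T = (year2 - year1) / math.log2(count2 / count1)
    print(f"{name1} -> {name2}: transistor count doubled every {T:.1f} years")

# Output: roughly 1.8 and 2.9 years, bracketing the famous two-year figure.
```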

The power of CPU chips, the cheap availability of memory and mass storage, and data transfer rates across networks have all increased at a relatively steady rate, doubling every two years or so for the past fifty years, following the predictions of Moore’s Law. Software, on the other hand, has not visibly improved at the same rate. How much more powerful is MS Word or Excel, really, than the versions that shipped with a Macintosh Plus in 1986? Features have been added, but the functions of many applications have remained more or less the same. It is now possible to emulate nearly any computing platform on any other, which has allowed user interfaces and experiences to become similar or identical across a wide variety of devices. In the next chapter, we’ll look at some of the software and networking advances that have aided in the creation of a globally networked society.

License

History of High Tech Copyright © by Dan Allosso is licensed under a Creative Commons Attribution-NonCommercial-ShareAlike 4.0 International License, except where otherwise noted.
