
Home Computers: 100 Icons that Defined a Digital Generation (excerpt)

Whether or not you remember waiting for dial-up access, squinting at tiny screens, and reading green lines of text, you'll get a kick out of Alex Wiltshire's trip back in time to when computers came with wires. Enjoy this excerpt of Home Computers, courtesy of MIT Press, with nostalgic photography by John Short.


Excerpted from the Introduction to Home Computers: 100 Icons that Defined a Digital Generation by Alex Wiltshire, with photographs by John Short. Reprinted with permission from The MIT Press. Copyright © 2020. All rights reserved. No part of this excerpt may be reproduced or reprinted without permission in writing from the publisher.

The computer has taken many forms over its long history. Antikythera mechanism; planimeter; arithmometer; Old Brass Brains. Each of these devices took in information, manipulated it according to a series of instructions, and then spat out results, helping to build structures, navigate space, reveal natural phenomena, and express concepts that would otherwise be beyond their creators’ minds. They have quietly extended the limits of what humanity can do for centuries. But over a few short years at the end of the twentieth century, the computer experienced a revolution.

This was the moment in which the computer evolved from a tool designed to perform particular tasks for specialists into a general machine for all. It began taking part in everyday life, playing an essential role in homes and offices and changing the nature of work and leisure. It inspired generations of artists, engineers, and designers and helped form new fields of creativity, entertainment, and production. It made fortunes and took them, and it carved the essential foundation for an even wider digital revolution that was still to come.

During these tumultuous and defining years, the computer became electronic and digital, a humming, beige case that barged its way onto desks and trailed wires across rooms. It took over TV screens, presenting its users with the steady blink of an idle cursor and introducing them to arcane new languages that acted as an interface between human and machine.

The microcomputer was another step along the computer’s journey from clanking calculating mechanism to ubiquitous digital device, the result of a series of technological advances which brought about a crucial fusion of miniaturization and mass production. But it was also the child of many steps of theoretical development which established ways of representing and transfiguring abstract numbers with minute pulses of electricity.

Apple II (Photography by ©John Short / courtesy of MIT Press)

Apple II (Photography by ©John Short / courtesy of MIT Press)

Apple II (Photography by ©John Short / courtesy of MIT Press)

Apple iMac (Photography by ©John Short / courtesy of MIT Press)

Apple iMac (Photography by ©John Short / courtesy of MIT Press)

Apple iMac (Photography by ©John Short / courtesy of MIT Press)

The key work was done in the first half of the twentieth century by mathematicians such as Alan Turing, Walther Bothe, Akira Nakashima, and Claude Shannon. They began to expand on older theories, such as that of the seventeenth-century polymath Gottfried Wilhelm Leibniz, who, inspired by the I Ching, showed how binary numbers could perform logical and arithmetical functions. They went back to the papers of Charles Sanders Peirce, who had realized at the end of the nineteenth century that electrical circuits could perform logical functions. They explored ways in which logic gates could take binary inputs and produce outputs, and how that information could be carried within the circuits of a machine.
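To make that principle concrete, here is a minimal illustrative sketch (ours, not the book's) of how logic gates acting on binary inputs can already perform arithmetic: an XOR gate and an AND gate together form a half adder, producing a sum bit and a carry bit for two one-bit numbers.

```python
# Illustrative sketch only (not from the book): a half adder shows how logic
# gates acting on binary digits perform arithmetic. XOR yields the sum bit,
# AND yields the carry bit.

def half_adder(a, b):
    """Add two single bits and return (sum_bit, carry_bit)."""
    return a ^ b, a & b

# Enumerate every combination of one-bit inputs: 0+0, 0+1, 1+0, 1+1.
for a in (0, 1):
    for b in (0, 1):
        s, c = half_adder(a, b)
        print(f"{a} + {b} -> sum={s}, carry={c}")
```

Chain such adders together and you arrive at the arithmetic circuits at the heart of every machine described in this history.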

The first culmination of their theory was realized by Max Newman and Tommy Flowers in 1943 as they completed Colossus, the first digital electronic computer, and in 1948 when the Manchester Baby became the first electronic computer that could store programs. But many more projects were developing across Europe and North America, frequently built on the mechanical computers that directed weaponry and decoded communications during the Second World War. Among them were MIT’s Whirlwind I, one of the first computers that could calculate in parallel, and ENIAC, the first general-purpose electronic computer, made for the US Army’s Ballistic Research Laboratory.

It seems probable that once the machine thinking method had started, it would not take long to outstrip our feeble powers… They would be able to converse with each other to sharpen their wits. At some stage therefore, we should have to expect the machines to take control.

– Alan Turing

I visualize a time when we will be to robots what dogs are to humans. And I am rooting for the machines.

– Claude Shannon

En masse, these machines inspired a wave of new invention which further accelerated their development. In 1947, physicists working at Bell Labs invented the transistor, a tiny semiconductor which could control electronic signals. Replacing big, hot, and unreliable vacuum tubes, the transistor allowed electronics engineers to build ever more intricate circuits, packing more components closer together and raising their computational power.

By the 1960s, large companies such as IBM and Control Data Corporation had grown to design and build mainframes. Wardrobe-sized and stratospherically expensive, these large and powerful computers were capable of storing and processing vast sets of data such as population statistics and industrial outputs, but they were confined to corporate headquarters and university campuses. It would take another vital advance before the computer could make the jump to the human scale of the garage workbench, office desk, or the floor in front of the living-room TV.

Apple iMac (Photography by ©John Short / courtesy of MIT Press)

Apple Macintosh (Photography by ©John Short / courtesy of MIT Press)

Apple Macintosh (Photography by ©John Short / courtesy of MIT Press)

Apple Macintosh (Photography by ©John Short / courtesy of MIT Press)

Apple Macintosh (Photography by ©John Short / courtesy of MIT Press)

That jump was called the silicon gate. In the late 1960s, Federico Faggin at Fairchild Semiconductor in Palo Alto, California, tried exchanging aluminium control gates in transistors for ones made of polycrystalline silicon, and found they leaked less current, required less space, and worked more reliably. Suddenly, the multiple boards of components that comprised the innards of the previous generations of computers could be compressed into tiny integrated circuits. The microprocessor was born: a single chip which could perform multiple functions at a far lower cost of production.

The microprocessor enabled mass production for the mass market. The first to be commercially available was Intel’s 4-bit 4004 in 1971. It held 2,300 transistors, its circuit lines were 10 microns wide, and it could perform 92,600 instructions per second. Two other microprocessors appeared around the same time: Garrett AiResearch’s MP944, which was first used as part of the Central Air Data Computer for F-14 fighter planes, and Texas Instruments’ TMS 1000. None of them was powerful – the 4004 could only really drive a calculator – and they couldn’t remotely compete in pure performance with mainframes.

But they were just the vanguard. Three years later, Intel shipped the 8-bit 8080, which was much quicker, supported a greater variety of instructions, and could interface with other components more flexibly, and it powered the very first generation of kit microcomputers.

Kits comprised circuit designs, build instructions, and the components to make them, and they were the first computers that made their way into family homes. Requiring soldering skills and an understanding of electronics, not to mention a good deal of money, kits such as the Altair 8800 were very much the domain of hobbyists, enthusiasts who tinkered in their garages to explore what a computer could do. The act of building them lent insights into how they worked and gave opportunities to customize and augment them with better parts.

That self-built nature naturally led to dreams of running businesses: if you could make one for yourself, perhaps you could make more to sell? Especially in places like Silicon Valley, where so much computer research and development was going on, a cottage industry of manufacturers designing new kits and components sprang up. Magazines such as Popular Electronics rushed to support it, sharing circuit designs and program listings, reviewing products, and selling advertising.

Commodore (Photography by ©John Short / courtesy of MIT Press)

Commodore (Photography by ©John Short / courtesy of MIT Press)

Commodore (Photography by ©John Short / courtesy of MIT Press)

Clubs started up, the most emblematic being the Homebrew Computer Club, which was founded by Gordon French and Fred Moore in Menlo Park, California, in March 1975. Their mission was to help make computers accessible to anyone, exchanging ideas and know-how over beers. They also published a newsletter that became the voice of Silicon Valley, with such material as a February 1976 letter from Microsoft cofounder Bill Gates, ‘An Open Letter to Hobbyists’, which took the scene to task over fair pay for his new company’s software:

As the majority of hobbyists must be aware, most of you steal your software. Hardware must be paid for, but software is something to share. Who cares if the people who worked on it get paid?

Is this fair? … The royalty paid to us, the manual, the tape and the overhead make it a break-even operation. One thing you do is prevent good software from being written. Who can afford to do professional work for nothing? What hobbyist can put 3-man years into programming, finding all bugs, documenting his product and distribute for free? The fact is, no one besides us has invested a lot of money in hobby software. We have written 6800 BASIC, and are writing 8080 APL and 6800 APL, but there is very little incentive to make this software available to hobbyists. Most directly, the thing you do is theft.

The Homebrew Computer Club achieved its aim. Inspired by being one of the thirty-two people who attended its first meeting, Steve Wozniak designed and built the Apple I. While showing it off at a later meeting he met Steve Jobs, and together they founded the American company which arguably did the most to explore and expand the microcomputer’s potential.

Outside meetings, the newsletter helped to establish the language and shape of this new category of computers, defining the concept of the ‘personal computer’: a machine designed for a person’s everyday individual use. After all, for most of the 1970s, there was no agreed form to the microcomputer. It wasn’t until a more formalized commercial industry began to grow that it started to come in cases or be supplied with integrated keyboards, speakers, or displays. Computers weren’t designed, in the sense of being intended for a particular use. They were, more or less, just computers for computers’ sake.

Until, that is, 1977. That year, three companies introduced new computers which were very much designed along the lines of discussions at the Homebrew Computer Club. Two out of the three – the Apple II and the PET 2001 – were specifically marketed as ‘personal computers’; in other words, they were aimed at a new market of buyers who weren’t looking to self-build or gain great insights into electronic circuitry. This new market, it was hoped, wanted off-the-shelf machines that came with every necessary component and only needed to be plugged into the wall before they worked.

In other words, it was time to popularize the microcomputer. As Jack Tramiel, CEO of Commodore, makers of the PET 2001, once put it: ‘Computers for the masses, not the classes.’

Science of Cambridge (Photography by ©John Short / courtesy of MIT Press)

Science of Cambridge (Photography by ©John Short / courtesy of MIT Press)

Science of Cambridge (Photography by ©John Short / courtesy of MIT Press)

SDC Minivac (Photography by ©John Short / courtesy of MIT Press)

SDC Minivac (Photography by ©John Short / courtesy of MIT Press)

SDC Minivac (Photography by ©John Short / courtesy of MIT Press)

Sinclair Spectrum (Photography by ©John Short / courtesy of MIT Press)

Sinclair Spectrum (Photography by ©John Short / courtesy of MIT Press)

Sinclair Spectrum (Photography by ©John Short / courtesy of MIT Press)
