other things – electricity, transistors, keyboards and Bill Gates. Yet in the 1830s he came astonishingly close to producing something very much like the computers that would be celebrated decades after he died. Unfortunately, his skill at innovation was not matched by an ability to generate venture capital, and his plans were tossed into the unforgiving core dump of history.
The idea of a programmable machine that performed humanity's mental labors reappeared in the 1930s.
Specifically, the breakthrough came at the hands of another eccentric English mathematician, Alan Turing, who
outlined how it was possible to build something that could perform virtually any mathematical task that one
could describe. His proof involved an ingenious imaginary device that would be known as the Universal Turing Machine – essentially, a machine that could duplicate the work of any other machine, even if the "machine" were a human calculator. Turing knew what the rest of us are still trying to wrap our minds around: such a contraption – a computer – can do anything. It's an invention that breeds invention itself.
But it took a war to bring about the physical devices that would be known as the first real computers. (A
small but noisy controversy among computer historians involves whether a device constructed in 1939 by John
Atanasoff and his student at Iowa State University, Clifford Berry, deserves the true mantle of First Electronic
Computer.) In England Turing himself worked on machines that helped crack the secret codes used by the
Germans. In Germany itself, a wizard named Konrad Zuse was working on that country's computing effort but
never fully realized his ideas. And in America, a Hungarian genius named John von Neumann – perhaps the premier mathematician of this century – was pondering mechanical devices to help perform the calculations required for the Manhattan Project. A chance meeting at a train platform in 1944 led him to a team of scientists working at the University of Pennsylvania to create ENIAC (Electronic Numerical Integrator and Computer), which many people consider the true Adam of computers. Designed by J. Presper Eckert and John Mauchly to help crunch numbers for artillery-target estimates, this device used 18,000 vacuum tubes and cost $400,000.
Von Neumann was fascinated, and he worked with the ENIAC people to take computing to the next level:
EDVAC, which was essentially a blueprint for the machines that followed: memory, stored programs and a central processor for number crunching. This scheme was sufficiently versatile to launch computers into the commercial realm. But even then, underestimation was as thick as in Babbage's day. Thomas Watson Sr., the head of the company that was perhaps most prescient of all in embracing the idea – IBM – thought it unimaginable that there would ever be a worldwide need for the machine. "I think there is a world market," said Watson, "for maybe five computers."
As we know, IBM sold a lot more than five computers. During the '50s and '60s big institutions and businesses used these expensive devices to perform complicated tasks, churning out responses to programs fed into the machine on manila cards. But while a quasi-priesthood of caretakers controlled access to the rooms that held these beasts, a small underground proto-hacker culture also emerged. These adventuresome super nerds used the computer to process words, to draw pictures and even to play chess. (Naysayers predicted that a computer would never master this purely human intellectual pursuit. Garry Kasparov probably wishes they were right.)
What finally bound those two cultures together was the development of the personal computer. This was made possible by the invention of the microprocessor – a computer on a chip – by Intel Corp.'s Ted Hoff in 1971. Essentially, what once filled a room and cost as much as a mansion had been shrunk down to the size of a postage stamp and the cost of a dinner. By 1975, the PC was just waiting to be born, and the obstetrician was Ed Roberts, a Florida-born engineer who dreamed of a machine that would deliver to the ordinary man the mental equivalent of what the pharaohs had in Egypt: thousands of workers to do one's bidding. His Altair microcomputer was announced in January of that year, and though it had limited practical value (the only way to put a program in was to painstakingly flick little switches), it caused a sensation among a small cult of tweak-heads and engineers.
Like who? A Harvard student named Gates, for one, who instantly began writing Altair software. Another
acolyte was Stephen Wozniak, who quickly designed his own machine, the Apple II.
Even then, people still kept underestimating. Consider what Ken Olsen, head of the then powerful Digital
Equipment Corp., had to say when asked about the idea of the computer's becoming a common device: "There
is no reason for any individual to have a computer in his home." What proved him wrong was the grass-roots
development of software for these small devices: word processing, games and, perhaps the most crucial of all, a
program called VisiCalc that not only automated the previously tedious task of calculating financial spreadsheets, but made modeling of business plans as easy as sneezing. Electronic spreadsheets were the tool that per-