T a s k 4. Memorize the following vocabulary:
to obsess, to captivate, to infuriate, to dominate, millennium, underestimation, eventually, ultimately, dazzling, amazing,
to launch a quest, information-processing machines, a calculator, elaborate, to dub, to generate venture capital, to toss,
a programmable machine/device, breakthrough, to outline, ingenious, contraption, controversy, to crack secret codes,
to ponder, versatile, prescient, acolyte, feasibility.
T a s k 5. Read and translate the text from the magazine "Newsweek Extra", Winter 1997-98.
The Computer
By Steven Levy
As the century comes to a close, the technology that obsesses us, captivates us, infuriates us and dominates us is
the computer. But ultimately, this most amazing of inventions won't be seen as an artifact of the old millennium but the
defining force of the one just dawning. Do you really think that we're already into the computer age? That's a gross
underestimation of what the computer will eventually do to change our world, our lives and perhaps the nature of reality
itself.
Underestimation, as it turns out, has been a constant in the brief but dazzling history of this amazing machine.
Surprisingly, the tale begins in the 19th century, when Charles Babbage, an English mathematician born in 1791,
launched a lifelong quest to build information-processing machines–first a calculator called the Difference Engine and
then a more elaborate programmable device dubbed the Analytical Engine. He lacked–among other things–electricity,
transistors, keyboards and Bill Gates. Yet in the 1830s he came astonishingly close to producing something very much
like the computers that would be celebrated decades after he died. Unfortunately, his skill at innovation was not
matched by an ability to generate venture capital, and his plans were tossed into the unforgiving core dump of history.
The idea of a programmable machine that performed humanity's mental labors reappeared in the 1930s. Specifically,
the breakthrough came at the hands of another eccentric English mathematician, Alan Turing, who outlined how it was
possible to build something that could perform virtually any mathematical task that one could describe. His proof
involved an ingenious imaginary device that would be known as the Universal Turing Machine – essentially, a machine
that could duplicate the work of any other machine, even if the "machine" were a human calculator. Turing knew what
the rest of us are still trying to wrap our minds around – such a contraption, a computer, can do anything. It's an
invention that breeds invention itself.
But it took a war to bring about the physical devices that would be known as the first real computers. (A small but
noisy controversy among computer historians involves whether a device constructed in 1939 by John Atanasoff and his
student at Iowa State University, Clifford Berry, deserves the true mantle of First Electronic Computer.) In England
Turing himself worked on machines that helped crack the secret codes used by the Germans. In Germany itself, a wizard
named Konrad Zuse was working on that country's computing effort but never fully realized his ideas. And in America,
a Hungarian genius named John von Neumann–perhaps the premier mathematician of this century–was pondering
mechanical devices to help perform the calculations required for the Manhattan Project. A chance meeting at a train
platform in 1944 led him to a team of scientists working at the University of Pennsylvania to create ENIAC (Electronic
Numerical Integrator and Computer), which many people consider the true Adam of computers. Designed by J. Presper
Eckert and John Mauchly to help crunch numbers for artillery-target estimates, this device used 18,000 vacuum tubes
and cost $400,000.
Von Neumann was fascinated, and he worked with the ENIAC people to take computing to the next level: EDVAC,
which was essentially a blueprint for the machines that followed: memory, stored programs and a central processor
for number crunching. This scheme was sufficiently versatile to launch computers into the commercial realm. But
even then, underestimation was as thick as in Babbage's day. Thomas Watson Sr., the head of the company that was
perhaps most prescient of all in embracing the idea–IBM–thought it unimaginable that there would ever be a worldwide
need for the machine. "I think there is a world market," said Watson, "for maybe five computers."
As we know, IBM sold a lot more than five computers. During the '50s and '60s big institutions and businesses
used these expensive devices to perform complicated tasks, churning out responses to programs fed into the machine on
manila cards. But while a quasi-priesthood of caretakers controlled access to the rooms that held these beasts, a small
underground proto-hacker culture also emerged. These adventuresome super nerds used the computer to process words,
to draw pictures and even to play chess. (Naysayers predicted that a computer would never master this purely human
intellectual pursuit. Garry Kasparov probably wishes they were right.)
What finally bound those two cultures together was the development of the personal computer. This was made
possible by the invention of the microprocessor – a computer on a chip – by Intel Corp.'s Ted Hoff in 1971. Essentially,
what once filled a room and cost as much as a mansion had been shrunk down to the size of a postage stamp and the
cost of a dinner. By 1975, the PC was just waiting to be born, and the obstetrician was Ed Roberts, a Florida-born
engineer who dreamed of a machine that would deliver to the ordinary man the mental equivalent of what the pharaohs
had in Egypt: thousands of workers to do one's bidding. His Altair microcomputer was announced in
January of that year, and though it had limited practical value (the only way to put a program in was to painstakingly
flick little switches), it caused a sensation among a small cult of tweak-heads and engineers.
Like who? A Harvard student named Gates, for one, who instantly began writing Altair software. Another acolyte
was Stephen Wozniak, who quickly designed his own machine, the Apple II.