By the millions, it
is beeping its way into offices, schools and homes
By Otto Friedrich. Reported by Michael Moritz/San Francisco, J. Madeleine
Nash/Chicago and Peter Stoler/New York
WILL SOMEONE PLEASE TELL ME, the bright red advertisement asks in mock
irritation, WHAT A PERSONAL COMPUTER CAN DO? The ad provides not merely
an answer, but 100 of them. A personal computer, it says, can send letters
at the speed of light, diagnose a sick poodle, custom-tailor an insurance
program in minutes, test recipes for beer. Testimonials abound. Michael
Lamb of Tucson figured out how a personal computer could monitor anesthesia
during surgery; the rock group Earth, Wind and Fire uses one to explode
smoke bombs onstage during concerts; the Rev. Ron Jaenisch of Sunnyvale,
Calif., programmed his machine so it can recite an entire wedding ceremony.
In the cavernous Las Vegas Convention Center a month ago, more than 1,000
computer companies large and small were showing off their wares, their
floppy discs and disc drives, joy sticks and modems, to a mob of some
50,000 buyers, middlemen and assorted technology buffs. Look! Here is
Hewlett-Packard's HP9000, on which you can sketch a new airplane, say,
and immediately see the results in 3-D through holograph imaging; here
is how the Votan can answer and act on a telephone call in the middle
of the night from a salesman on the other side of the country; here is
the Olivetti M20 that entertains bystanders by drawing garishly colored
pictures of Marilyn Monroe; here is a program designed by The Alien Group
that enables an Atari computer to say aloud anything typed on its keyboard
in any language. It also sings, in a buzzing humanoid voice, Amazing Grace
and When I'm 64 or anything else that anyone wants to teach it.
As both the Apple Computer advertisement and the Las Vegas circus indicate,
the enduring American love affairs with the automobile and the television
set are now being transformed into a giddy passion for the personal computer.
This passion is partly fad, partly a sense of how life could be made better,
partly a gigantic sales campaign. Above all, it is the end result of a
technological revolution that has been in the making for four decades
and is now, quite literally, hitting home.
Americans are receptive to the revolution and optimistic about its impact.
A new poll* for TIME by Yankelovich, Skelly and White indicates that nearly
80% of Americans expect that in the fairly near future, home computers
will be as commonplace as television sets or dishwashers. Although they
see dangers of unemployment and dehumanization, solid majorities feel
that the computer revolution will ultimately raise production and therefore
living standards (67%), and that it will improve the quality of their
children's education (68%).
[*The telephone survey of 1,019 registered voters was conducted on Dec.
8 and 9. The margin of sampling error is plus or minus 3%.]
The sales figures are awesome and will become more so. In 1980 some two
dozen firms sold 724,000 personal computers for $1.8 billion. The following
year 20 more companies joined the stampede, including giant IBM, and sales
doubled to 1.4 million units at just under $3 billion. When the final
figures are in for 1982, according to Dataquest, a California research
firm, more than 100 companies will probably have sold 2.8 million units
for $4.9 billion.
To be sure, the big, complex, costly "mainframe" computer has
been playing an increasingly important role in practically everyone's
life for the past quarter-century. It predicts the weather, processes
checks, scrutinizes tax returns, guides intercontinental missiles and
performs innumerable other operations for governments and corporations.
The computer has made possible the exploration of space. It has changed
the way wars are fought, as the Exocet missile proved in the South Atlantic
and Israel's electronically sophisticated forces did in Lebanon.
Despite its size, however, the mainframe does its work all but invisibly,
behind the closed doors of a special, climate-controlled room. Now, thanks
to the transistor and the silicon chip, the computer has been reduced
so dramatically in both bulk and price that it is accessible to millions.
In 1982 a cascade of computers beeped and blipped their way into the American
office, the American school, the American home. The "information
revolution" that futurists have long predicted has arrived, bringing
with it the promise of dramatic changes in the way people live and work,
perhaps even in the way they think. America will never be the same.
In a larger perspective, the entire world will never be the same. The
industrialized nations of the West are already scrambling to computerize
(1982 sales: 435,000 in Japan, 392,000 in Western Europe). The effect
of the machines on the Third World is more uncertain. Some experts argue
that computers will, if anything, widen the gap between haves and have-nots.
But the prophets of high technology believe the computer is so cheap and
so powerful that it could enable underdeveloped nations to bypass the
whole industrial revolution. While robot factories could fill the need
for manufactured goods, the microprocessor would create myriad new industries,
and an international computer network could bring important agricultural
and medical information to even the most remote villages. "What networks
of railroads, highways and canals were in another age, networks of telecommunications,
information and computerization...are today," says Austrian Chancellor
Bruno Kreisky. Says French Editor Jean-Jacques Servan-Schreiber, who believes
that the computer's teaching capability can conquer the Third World's
illiteracy and even its tradition of high birth rates: "It is the
source of new life that has been delivered to us."
The year 1982 was filled with notable events around the globe. It was
a year in which death finally pried loose Leonid Brezhnev's frozen grip
on the Soviet Union, and Yuri Andropov, the cold-eyed ex-chief of the
KGB, took command. It was a year in which Israel's truculent Prime Minister
Menachem Begin completely redrew the power map of the Middle East by invading
neighboring Lebanon and smashing the Palestinian guerrilla forces there.
The military campaign was a success, but all the world looked with dismay
at the thunder of Israeli bombs on Beirut's civilians and at the massacres
in the Palestinian refugee camps. It was a year in which Argentina tested
the decline of European power by seizing the Falkland Islands, only to
see Britain, led by doughty Margaret Thatcher, meet the test by taking
them back again.
Nor did all of the year's major news derive from wars or the threat of
international violence. Even as Ronald Reagan cheered the sharpest decline
in the U.S. inflation rate in ten years, 1982 brought the worst unemployment
since the Great Depression (12 million jobless) as well as budget deficits
that may reach an unprecedented $180 billion in fiscal 1983. High unemployment
plagued Western Europe as well, and the multibillion-dollar debts of more
than two dozen nations gave international financiers a severe fright.
It was also a year in which the first artificial heart began pumping life
inside a dying man's chest, a year in which millions cheered the birth
of cherubic Prince William Arthur Philip Louis of Britain, and millions
more rooted for a wrinkled, turtle-like figure struggling to find its
way home to outer space.
There are some occasions, though, when the most significant force in a
year's news is not a single individual but a process, and a widespread
recognition by a whole society that this process is changing the course
of all other processes. That is why, after weighing the ebb and flow of
events around the world, TIME has decided that 1982 is the year of the
computer. It would have been possible to single out as Man of the Year
one of the engineers or entrepreneurs who masterminded this technological
revolution, but no one person has clearly dominated those turbulent events.
More important, such a selection would obscure the main point. TIME's
Man of the Year for 1982, the greatest influence for good or evil, is
not a man at all. It is a machine: the computer.
It is easy enough to look at the world around us and conclude that the
computer has not changed things all that drastically. But one can conclude
from similar observations that the earth is flat, and that the sun circles
it every 24 hours. Although everything seems much the same from one day
to the next, changes under the surface of life's routines are actually
occurring at almost unimaginable speed. Just 100 years ago, parts of New
York City were lighted for the first time by a strange new force called
electricity; just 100 years ago, the German Engineer Gottlieb Daimler
began building a gasoline-fueled internal combustion engine (three more
years passed before he fitted it to a bicycle). So it is with the computer.
The first fully electronic digital computer built in the U.S. dates back
only to the end of World War II. Created at the University of Pennsylvania,
ENIAC weighed 30 tons and contained 18,000 vacuum tubes, which failed
at an average of one every seven minutes. The arrival of the transistor
and miniaturized circuit in the 1950s made it possible to reduce a room-size
computer to a silicon chip the size of a pea. And prices kept dropping.
In contrast to the $487,000 paid for ENIAC, a top IBM personal computer
today costs about $4,000, and some discounters offer a basic Timex-Sinclair
1000 for $77.95. One computer expert illustrates the trend by estimating
that if the automobile business had developed like the computer business,
a Rolls-Royce would now cost $2.75 and run 3 million miles on a gallon
of gas.
Looking ahead, the computer industry sees pure gold. There are 83 million
U.S. homes with TV sets, 54 million white-collar workers, 26 million professionals,
4 million small businesses. Computer salesmen are hungrily eyeing every
one of them. Estimates for the number of personal computers in use by
the end of the century run as high as 80 million. Then there are all the
auxiliary industries: desks to hold computers, luggage to carry them,
cleansers to polish them. "The surface is barely scratched,"
says Ulric Weil, an analyst for Morgan Stanley.
Beyond the computer hardware lies the virtually limitless market for software,
all those prerecorded programs that tell the willing but mindless computer
what to do. These discs and cassettes range from John Wiley & Sons'
investment analysis program for $59.95 (some run as high as $5,000) to
Control Data's PLATO programs that teach Spanish or physics ($45 for the
first lesson, $35 for succeeding ones) to a profusion of space wars, treasure
hunts and other electronic games.
This most visible aspect of the computer revolution, the video game, is
its least significant. But even if the buzz and clang of the arcades is
largely a teen-age fad, doomed to go the way of Rubik's Cube and the Hula
Hoop, it is nonetheless a remarkable phenomenon. About 20 corporations
are selling some 250 different game cassettes for roughly $2 billion this
year. According to some estimates, more than half of all the personal
computers bought for home use are devoted mainly to games.
Computer enthusiasts argue that these games have educational value, by
teaching logic, or vocabulary, or something. Some are even used for medical
therapy. Probably the most important effect of these games, however, is
that they have brought a form of the computer into millions of homes and
convinced millions of people that it is both pleasant and easy to operate,
what computer buffs call "user friendly." Games, says Philip
D. Estridge, head of IBM's personal computer operations, "aid in
the discovery process."
Apart from games, the two things that the computer does best have wide
implications but are quite basic. One is simply computation, manipulating
thousands of numbers per second. The other is the ability to store, sort
through and rapidly retrieve immense amounts of information. More than
half of all employed Americans now earn their living not by producing
things but as "knowledge workers," exchanging various kinds
of information, and the personal computer stands ready to change how all
of them do their jobs.
Frank Herringer, a group vice president of Transamerica Corp., installed
an Apple in his suburban home in Lafayette, Calif., and spent a weekend
analyzing various proposals for Transamerica's $300 million takeover of
the New York insurance brokerage firm of Fred S. James Co. Inc. "It
allowed me to get a good feel for the critical numbers," says Herringer.
"I could work through alternative options, and there were no leaks."
Terry Howard, 44, used to have a long commute to his job at a San Francisco
stock brokerage, where all his work involved computer data and telephoning.
With a personal computer, he set up his own firm at home in San Rafael.
Instead of rising at 6 a.m. to drive to the city, he runs five miles before
settling down to work. Says he: "It didn't make sense to spend two
hours of every day burning up gas, when my customers on the telephone
don't care whether I'm sitting at home or in a high rise in San Francisco."
John Watkins, safety director at Harriet & Henderson Yarns, in Henderson,
N.C., is one of 20 key employees whom the company helped to buy home computers
and paid to get trained this year. Watkins is trying to design a program
that will record and analyze all mill accidents: who was injured, how,
when, why. Says he: "I keep track of all the cases that are referred
to a doctor, but for every doctor case, there are 25 times as many first-aid
cases that should be recorded." Meantime, he has designed a math
program for his son Brent and is shopping for a word-processing program
to help his wife Mary Edith write her master's thesis in psychology. Says
he: "I don't know what it can't do. It's like asking yourself, 'What's
the most exciting thing you've ever done?' Well, I don't know because
I haven't done it yet."
Aaron Brown, a former defensive end for the Kansas City Chiefs and now
an office-furniture salesman in Minneapolis, was converted to the computer
by his son Sean, 15, who was converted at a summer course in computer
math. "I thought of computers very much as toys," says Brown,
"but Sean started telling me, 'You could use a computer in your work.'
I said, 'Yeah, yeah, yeah.'" Three years ago, the family took a vote
on whether to go to California for a vacation or to buy an Apple. The
Apple won, 3 to 1, and to prove its value, Sean wrote his father a program
that computes gross profits and commissions on any sale.
Brown started with "simple things," like filing the names and
telephone numbers of potential customers. "Say I was going to a particular
area of the city," Brown says. "I would ask the computer to
pull up the accounts in a certain zip-code area, or if I wanted all the
customers who were interested in whole office systems, I could pull that
up too." The payoff: since he started using the computer, he has
doubled his annual sales to more than $1 million.
Brown has spent about $1,500 on software, all bound in vinyl notebooks
along a wall of his home in Golden Valley, Minn., but Sean still does
a lot of programming on his own. He likes to demonstrate one that he designed
to teach French. "Vive la France!" it says, and then starts
beeping the first notes of La Marseillaise. His mother Reatha uses the
computer to help her manage a gourmet cookware store, and even Sister
Terri, who originally cast the family's lone vote against the computer,
uses it to store her high school class notes. Says Brown: "It's become
kind of like the bathroom. If someone is using it, you wait your turn."
Reatha Brown has been lobbying for a new carpet, but she is becoming resigned
to the prospect that the family will acquire a new hard-disc drive instead.
"The video-cassette recorder," she sighs, pointing across the
room, "that was my other carpet." Replies her husband, setting
forth an argument that is likely to be replayed in millions of households
in the years just ahead: "We make money with the computer, but all
we can do with a new carpet is walk on it. Somebody once said there were
five reasons to spend money: on necessities, on investments, on self-improvement,
on memories and to impress your friends. The carpet falls in that last
category, but the computer falls in all five."
By itself, the personal computer is a machine with formidable capabilities
for tabulating, modeling or recording. Those capabilities can be multiplied
almost indefinitely by plugging it into a network of other computers.
This is generally done by attaching a desk-top model to a telephone line
(two-way cables and earth satellites are coming increasingly into use).
One can then dial an electronic data base, which not only provides all
manner of information but also collects and transmits messages: electronic
mail.
The 1,450 data bases that now exist in the U.S. range from general information
services like the Source, a Reader's Digest subsidiary in McLean, Va.,
which can provide stock prices, airline schedules or movie reviews, to
more specialized services like the American Medical Association's AMA/NET,
to real esoterica like the Hughes Rotary Rig Report. Fees vary from $300
an hour to less than $10.
Just as the term personal computer can apply to both a home machine and
an office machine (and indeed blurs the distinction between the two places)
many of the first enthusiastic users of these devices have been people
who do much of their work at home: doctors, lawyers, small businessmen,
writers, engineers. Such people also have special needs for the networks
of specialized data.
Orthopedic Surgeon Jon Love, of Madisonville, Ky., connects the Apple
in his home to both the AMA/NET, which offers, among other things, information
on 1,500 different drugs, and Medline, a compendium of all medical articles
published in the U.S. "One day I accessed the computer three times
in twelve minutes," he says. "I needed information on arthritis
and cancer in the leg. It saved me an hour and a half of reading time.
I want it to pay me back every time I sit down at it."
Charles Manly III practices law in Grinnell, Iowa (pop. 8,700), a town
without a law library, so he pays $425 a month to connect his CPT word
processor to Westlaw, a legal data base in St. Paul. Just now he needs
precedents in an auto insurance case. He dials the Westlaw telephone number,
identifies himself by code, then types: "Courts (Iowa) underinsurance."
The computer promptly tells him there is only one such Iowa case, and
it is 14 years old. Manly asks for a check on other Midwestern states,
and it gives him a long list of precedents in Michigan and Minnesota.
"I'm not a chiphead," he says, "but if you don't keep up with
the new developments, even in a rural general practice, you're not going
to have the competitive edge."
The personal computer and its networks are even changing that oldest of
all home businesses, the family farm. Though only about 3% of commercial
farmers and ranchers now have computers, that number is expected to rise
to nearly 20% within the next five years. One who has grasped the true
faith is Bob Johnson, who helps run his family's 2,800-acre pig farm near
De Kalb, Ill. Outside, the winter's first snowflakes have dusted the low-slung
roofs of the six red-and-white barns and the brown fields specked with
corn stubble. Inside the two-room office building, Johnson slips a disc
into his computer and types "D" (for dial) and a telephone number.
He is immediately connected to the Illinois farm bureau's newly computerized
AgriVisor service. It not only gives him weather conditions to the west
and the latest hog prices on the Chicago commodities exchange, but also
offers advice. Should farmers continue to postpone the sale of their newly
harvested corn? "Remember," the computer counsels, "that
holding on for a dime or a nickel may not be worth the long-term wait."
Johnson started out playing computer games on an Apple II, but then "those
got shoved in the file cabinet." He began computerizing all his farm
records, which was not easy. "We could keep track of the hogs we
sold in dollars, but we couldn't keep track of them by pounds and numbers
at the same time." He started shopping around and finally acquired
a $12,000 combination at a shop in Lafayette, Ind.: a microcomputer from
California Computer Systems, a video screen from Ampex, a Diablo word
printer and an array of agricultural programs.
Johnson's computer now knows the yields on 35 test plots of corn, the
breeding records of his 300 sows, how much feed his hogs have eaten (2,787,260
lbs.) and at what cost ($166,047.73). "This way, you can charge your
hogs the cost of the feed when you sell them and figure out if you're
making any money," says Johnson. "We never had this kind of
information before. It would have taken too long to calculate. But we
knew we needed it."
Just as the computer is changing the way work is done in home offices,
so it is revolutionizing the office. Routine tasks like managing payrolls
and checking inventories have long since been turned over to computers,
but now the typewriter is giving way to the word processor, and every
office thus becomes part of a network. This change has barely begun: about
10% of the typewriters in the 500 largest industrial corporations have
so far been replaced. But the economic imperatives are inescapable. All
told, office professionals could save about 15% of their time if they
used the technology now available, says a study by Booz, Allen & Hamilton,
and that technology is constantly improving. In one survey of corporations,
55% said they were planning to acquire the latest equipment. This technology
involves not just word processors but computerized electronic message
systems that could eventually make paper obsolete, and wall-size, two-way
TV teleconference screens that will obviate traveling to meetings.
The standard home computer is sold only to somebody who wants one, but
the same machine can seem menacing when it appears in an office. Secretaries
are often suspicious of new equipment, particularly if it appears to threaten
their jobs, and so are executives. Some senior officials resist using
a keyboard on the ground that such work is demeaning. Two executives in
a large firm reportedly refuse to read any computer print-out until their
secretaries have retyped it into the form of a standard memo. "The
biggest problem in introducing computers into an office is management
itself," says Ted Stout of National Systems Inc., an office design
firm in Atlanta. "They don't understand it, and they are scared to
death of it."
But there is an opposite fear that drives anxious executives toward the
machines: the worry that younger and more sophisticated rivals will push
ahead of them. "All you have to do," says Alexander Horniman,
an industrial psychologist at the University of Virginia's Darden School
of Business, "is walk down the hall and see people using the computer
and imagine they have access to all sorts of information you don't."
Argues Harold Todd, executive vice president at First Atlanta Bank: "Managers
who do not have the ability to use a terminal within three to five years
may become organizationally dysfunctional." That is to say, useless.
If more and more offices do most of their work on computers, and if a
personal computer can be put in a living room, why should anyone have
to go to work in an office at all? The question can bring a stab of hope
to anybody who spends hours every day on the San Diego Freeway or the
Long Island Rail Road. Nor is "telecommuting" as unrealistic
as it sounds. Futurist Jack Nilles of the University of Southern California
has estimated that a home computer would soon pay for itself from savings
in commuting expenses and in city office rentals.
Is the great megalopolis, the marketplace of information, about to be
doomed by the new technology? Another futurist, Alvin Toffler, suggests
at least a trend in that direction. In his 1980 book, The Third Wave,
he portrays a 21st century world in which the computer revolution has
canceled out many of the fundamental changes wrought by the Industrial
Revolution: the centralization and standardization of work in the factory,
the office, the assembly line. These changes may seem eternal, but they
are less than two centuries old. Instead, Toffler imagines a revived version
of pre-industrial life in what he has named "the electronic cottage,"
a utopian abode where all members of the family work, learn and enjoy
their leisure around the electronic hearth, the computer. Says Vice President
Louis H. Mertes of the Continental Illinois Bank and Trust Co. of Chicago,
who is such a computer enthusiast that he allows no paper to be seen in
his office (though he does admit to keeping a few files in the drawer
of an end table): "We're talking when--not if--the electronic cottage
will emerge."
Continental Illinois has experimented with such electronic cottages by
providing half a dozen workers with word processors so they could stay
at home. Control Data tried a similar experiment and ran into a problem:
some of its 50 "alternate site workers" felt isolated, deprived
of their social life around the water cooler. The company decided to ask
them to the office for lunch and meetings every week. "People are
like ants, they're communal creatures," says Dean Scheff, chairman
and founder of CPT Corp., a word-processing firm near Minneapolis. "They
need to interact to get the creative juices flowing. Very few of us are
hermits."
TIME's Yankelovich poll underlines the point. Some 73% of the respondents
believed that the computer revolution would enable more people to work
at home. But only 31% said they would prefer to do so themselves. Most
work no longer involves a hayfield, a coal mine or a sweatshop, but a
field for social intercourse. Psychologist Abraham Maslow defined work
as a hierarchy of functions: it first provides food and shelter, the basics,
but then it offers security, friendship, "belongingness." This
is not just a matter of trading gossip in the corridors; work itself,
particularly in the information industries, requires the stimulation of
personal contact in the exchange of ideas: sometimes organized conferences,
sometimes simply what is called "the schmooze factor." Says
Sociologist Robert Schrank: "The workplace performs the function
of community."
But is this a basic psychological reality or simply another rut dug by
the Industrial Revolution? Put another way, why do so many people make
friends at the office rather than among their neighbors? Prophets of the
electronic cottage predict that it will once again enable people to find
community where they once did: in their neighborhoods. Continental Illinois
Bank, for one, has opened a suburban "satellite work station"
that gets employees out of the house but not all the way downtown. Ford,
Atlantic Richfield and Merrill Lynch have found that teleconferencing
can reach far more people for far less money than traditional sales conferences.
Whatever the obstacles, telecommuting seems particularly rich with promise
for millions of women who feel tied to the home because of young children.
Sarah Sue Hardinger has a son, 3, and a daughter three months old; the
computer in her cream-colored stucco house in South Minneapolis is surrounded
by children's books, laundry, a jar of Dippity Do. An experienced programmer
at Control Data before she decided to have children, she now settles in
at the computer right after breakfast, sometimes holding the baby in a
sling. She starts by reading her computer mail, then sets to work converting
a PLATO grammar program to a disc that will be compatible with Texas Instruments
machines. "Mid-morning I have to start paying attention to the three-year-old,
because he gets antsy," says Hardinger. "Then at 11:30
comes Sesame Street and Mr. Rogers, so that's when I usually get a whole
lot done." When her husband, a building contractor, comes home and
takes over the children, she returns to the computer. "I use part
of my house time for work, part of my work time for the house," she
says. "The baby has demand feeding, I have demand working."
To the nation's 10 million physically handicapped, telecommuting encourages
new hopes of earning a livelihood. A Chicago-area organization called
Lift has taught computer programming to 50 people with such devastating
afflictions as polio, cerebral palsy and spinal damage. Lift President
Charles Schmidt cites a 46-year-old man paralyzed by polio: "He never
held a job in his life until he entered our program three years ago, and
now he's a programmer for Walgreens."
Just as the vast powers of the personal computer can be vastly multiplied
by plugging it into an information network, they can be extended in all
directions by attaching the mechanical brain to sensors, mechanical arms
and other robotic devices. Robots are already at work in a large variety
of dull, dirty or dangerous jobs: painting automobiles on assembly lines
and transporting containers of plutonium without being harmed by radiation.
Because a computerized robot is so easy to reprogram, some experts foresee
drastic changes in the way manufacturing work is done: toward customization,
away from assembly-line standards. When the citizen of tomorrow wants
a new suit, one futurist scenario suggests, his personal computer will
take his measurements and pass them on to a robot that will cut his choice
of cloth with a laser beam and provide him with a perfectly tailored garment.
In the home too, computer enthusiasts delight in imagining machines performing
the domestic chores. A little of that fantasy is already reality. New
York City Real Estate Executive David Rose, for example, uses his Apple
in business deals, to catalogue his 4,000 books and to write fund-raising
letters to his Yale classmates. But he also uses it to wake him in the
morning with soft music, turn on the TV, adjust the lights and make the
coffee.
In medicine, the computer, which started by keeping records and sending
bills, now suggests diagnoses. CADUCEUS knows some 4,000 symptoms of more
than 500 diseases; MYCIN specializes in infectious diseases; PUFF measures
lung functions. All can be plugged into a master network called SUMEX-AIM,
with headquarters at Stanford in the West and Rutgers in the East. This
may all sound like another step toward the disappearance of the friendly
neighborhood G.P., but while it is possible that a family doctor would
recognize 4,000 different symptoms, CADUCEUS is more likely to see patterns
in what patients report and can then suggest a diagnosis. The process
may sound dehumanized, but in one hospital where the computer specializes
in peptic ulcers, a survey of patients showed that they found the machine
"more friendly, polite, relaxing and comprehensible" than the
average physician.
The microcomputer is achieving dramatic effects on the ailing human body.
These devices control the pacemakers implanted in victims of heart disease;
they pump carefully measured quantities of insulin into the bodies of
diabetics; they test blood samples for hundreds of different allergies;
they translate sounds into vibrations that the deaf can "hear";
they stimulate deadened muscles with electric impulses that may eventually
enable the paralyzed to walk.
In all the technologists' images of the future, however, there are elements
of exaggeration and wishful thinking. Though the speed of change is extraordinary,
so is the vastness of the landscape to be changed. New technologies have
generally taken at least 20 years to establish themselves, which implies
that a computer salesman's dream of a micro on every desk will not be
fulfilled in the very near future. If ever.
Certainly the personal computer is not without its flaws. As most new
buyers soon learn, it is not that easy for a novice to use, particularly
when the manuals contain instructions like this specimen from Apple: "This
character prevents script from terminating the currently forming output
line when it encounters the script command in the input stream."
Another problem is that most personal computers end up costing considerably
more than the ads imply. The $100 model does not really do very much,
and the $1,000 version usually requires additional payments for the disc
drive or the printer or the modem. Since there is very little standardization
of parts among the dozens of new competitors, a buyer who has not done
considerable homework is apt to find that the parts he needs do not fit
the machine he bought.
Software can be a major difficulty. The first computer buyers tended to
be people who enjoyed playing with their machines and designing their
own programs. But the more widely the computer spreads, the more it will
have to be used by people who know no more about its inner workings than
they do about the insides of their TV sets--and do not want to. They will
depend entirely on the commercial programmers. Good programs are expensive
both to make and to buy. Control Data has invested $900 million in its
PLATO educational series and has not yet turned a profit, though its hopes
run into the billions. A number of firms have marketed plenty of shoddy
programs, but they are not cheap either. "Software is the new bandwagon,
but only 20% of it is any good," says Diana Hestwood, a Minneapolis-based
educational consultant. She inserts a math program and deliberately makes
ten mistakes. The machine gives its illiterate verdict: "You taken
ten guesses." Says Atari's chief scientist, Alan Kay: "Software
is getting to be embarrassing."
Many of the programs now being touted are hardly worth the cost, or hardly
worth doing at all. Why should a computer be needed to balance a checkbook
or to turn off the living-room lights? Or to recommend a dinner menu, particularly
when it can consider (as did a $34 item called the Pizza Program) ice
cream as an appetizer? Indeed, there are many people who may quite reasonably
decide that they can get along very nicely without a computer. Even the
most impressive information networks may provide the customer with nothing
but a large telephone bill. "You cannot rely on being able to find
what you want," says Atari's Kay. "It's really more useful to go to
a library."
It is becoming increasingly evident that a fool assigned to work with
a computer can conceal his own foolishness in the guise of high-tech authority.
Lives there a single citizen who has not been commanded by a misguided
computer to pay an income tax installment or department store bill that
he has already paid?
What is true for fools is no less true for criminals, who are now able
to commit electronic larceny from the comfort of their living rooms. The
probable champion is Stanley Mark Rifkin, a computer analyst in Los Angeles,
who tricked the machines at the Security Pacific National Bank into giving
him $10 million. While free on bail for that in 1979 (he was eventually
sentenced to eight years), he was arrested for trying to steal $50 million
from Union Bank (the charges were eventually dropped). According to Donn
Parker, a specialist in computer abuse at SRI International (formerly
the Stanford Research Institute), "Nobody seems to know exactly what
computer crime is, how much of it there is, and whether it is increasing
or decreasing. We do know that computers are changing the nature of business
crime significantly."
Even if all the technical and intellectual problems can be solved, there
are major social problems inherent in the computer revolution. The most
obvious is unemployment, since the basic purpose of commercial computerization
is to get more work done by fewer people. One British study predicts that
"automation-induced unemployment" in Western Europe could reach
16% in the next decade, but most analyses are more optimistic. The general
rule seems to be that new technology eventually creates as many jobs as
it destroys, and often more. "People who put in computers usually
increase their staffs as well," says CPT's Scheff. "Of course,"
he adds, "one industry may kill another industry. That's tough on
some people."
Theoretically, all unemployed workers can be retrained, but retraining
programs are not high on the nation's agenda. Many new jobs, moreover,
will require an aptitude in using computers, and the retraining needed
to use them will have to be repeated as the technology keeps improving.
Says a chilling report by the Congressional Office of Technology Assessment:
"Lifelong retraining is expected to become the norm for many people."
There is already considerable evidence that the school children now being
educated in the use of computers are generally the children of the white
middle class. Young blacks, whose unemployment rate stands today at 50%,
will find another barrier in front of them.
Such social problems are not the fault of the computer, of course, but
a consequence of the way American society might use the computer.
"Even in the days of the big mainframe computers, they were a machine
for the few," says Katherine Davis Fishman, author of The Computer
Establishment. "It was a tool to help the rich get richer. It still
is to a large extent. One of the great values of the personal computer
is that smaller concerns, smaller organizations can now have some of the
advantages of the bigger organizations."
How society uses its computers depends greatly on what kind of computers
are made and sold, and that depends, in turn, on an industry in a state
of chaotic growth. Even the name of the product is a matter of debate:
"microcomputer" sounds too technical, but "home computer"
does not fit an office machine. "Desktop" sounds awkward, and
"personal computer" is at best a compromise. Innovators are
pushing off in different directions. Hewlett-Packard is experimenting
with machines that respond to vocal commands; Osborne is leading a rush
toward portable computers, ideally no larger than a book. And for every
innovator, there are at least five imitators selling copies.
There is much talk of a coming shakeout, and California Consultant David
E. Gold predicts that perhaps no more than a dozen vendors will survive
the next five years. At the moment, Dataquest estimates that Texas Instruments
leads the low-price parade with a 35% share of the market in computers
selling for less than $1,000. Next come Timex (26%), Commodore (15%) and
Atari (13%). In the race among machines priced between $1,000 and $5,000,
Apple still commands 26%, followed by IBM (17%) and Tandy/Radio Shack (10%).
But IBM, which has dominated the mainframe computer market for decades,
is coming on very strong. Apple, fighting back, will unveil its new Lisa
model in January, putting great emphasis on user friendliness. The user
will be able to carry out many functions simply by pointing to a picture
of what he wants done rather than typing instructions. IBM is also reported
to be planning to introduce new machines in 1983, as are Osborne and others.
Just across the horizon, as usual, lurk the Japanese. During the 1970s,
U.S. computer manufacturers complacently felt that they were somehow immune
from the Japanese combination of engineering and salesmanship that kept
gnawing at the U.S. auto, steel and appliance industries. One reason was that
the Japanese were developing their large domestic market. When they belatedly
entered the U.S. battlefield, they concentrated not on selling whole systems
but on particular sectors--with dramatic results. In low-speed printers
using what is known as the dot-matrix method, the Japanese had only a
6% share of the market in 1980; in 1982, they provided half the 500,000
such printers sold in the U.S. Says Computerland President Ed Faber: "About
75% of the dot-matrix printers we sell are Japanese, and almost all the
monitors. There is no better quality electronics than what we see coming
from Japan."
Whatever its variations, there is an inevitability about the computerization
of America. Commercial efficiency requires it, Big Government requires
it, modern life requires it, and so it is coming to pass. But the essential
element in this sense of inevitability is the way in which the young take
to computers: not as just another obligation imposed by adult society
but as a game, a pleasure, a tool, a system that fits naturally into their
lives. Unlike anyone over 40, these children have grown up with TV screens;
the computer is a screen that responds to them, hooked to a machine that
can be programmed to respond the way they want it to. That is power.
There are now more than 100,000 computers in U.S. schools, compared with
52,000 only 18 months ago. This is roughly one for every 400 pupils. The
richer and more progressive states do better. Minnesota leads with one
computer for every 50 children and a locally produced collection of 700
software programs. To spread this development more evenly and open new
doors for business, Apple has offered to donate one computer to every
public school in the U.S.--a total of 80,000 computers worth $200 million
retail--if Washington will authorize a 25% tax write-off (as is done for
donations of scientific equipment to colleges). Congress has so far failed
to approve the idea, but California has agreed to a similar proposal.
Many Americans concerned about the erosion of the schools put faith in
the computer as a possible savior of their children's education, at school
and at home. The Yankelovich poll showed that 57% thought personal computers
would enable children to read and to do arithmetic better. Claims William
Ridley, Control Data's vice president for education strategy: "If
you want to improve youngsters one grade level in reading, our PLATO program
with teacher supervision can do it up to four times faster and for 40%
less expense than teachers alone."
No less important than this kind of drill, which some critics compare
with the old-fashioned flash cards, is the use of computers to teach children
about computers. They like to learn programming, and they are good at
it, often better than their teachers, even in the early grades. They treat
it as play, a secret skill, unknown among many of their parents. They
delight in cracking corporate security and filching financial secrets,
inventing new games and playing them on military networks, inserting obscene
jokes into other people's programs. In soberer versions, that sort of skill
will become a necessity in thousands of jobs opening up in the future.
Beginning in 1986, Carnegie-Mellon University expects to require all of
its students to have their own personal computers. "People are willing
to spend a large amount of money to educate their children," says
Author Fishman. "So they're all buying computers for Johnny to get
a head start (though I have not heard anyone say, `I am buying a computer
for Susie')."
This transformation of the young raises a fundamental and sometimes menacing
question: Will the computer change the very nature of human thought? And
if so, for better or worse? There has been much time wasted on the debate
over whether computers can be made to think, as HAL seemed to be doing
in 2001, when it murdered the astronauts who might challenge its command
of the spaceflight. The answer is simple: computers do not think, but
they do simulate many of the processes of the human brain: remembering,
comparing, analyzing. And as people rely on the computer to do things
that they used to do inside their heads, what happens to their heads?
Will the computer's ability to do routine work mean that human thinking
will shift to a higher level? Will IQs rise? Will there be more intellectuals?
The computer may make a lot of learning as unnecessary as memorizing the
multiplication tables. But if a dictionary stored in the computer's memory
can easily correct any spelling mistakes, what is the point of learning
to spell? And if the mind is freed from intellectual routine, will it
race off in pursuit of important ideas or lazily spend its time on more
video games?
Too little is known about how the mind works, and less about how the computer
might change that process. The neurological researches of Mark Rosenzweig
and his colleagues at Berkeley indicate that animals trained to learn
and assimilate information develop heavier cerebral cortices, more glial
cells and bigger nerve cells. But does the computer really stimulate the
brain's activity or, by doing so much of its work, permit it to go slack?
Some educators do believe they see the outlines of change. Seymour Papert,
professor of mathematics and education at M.I.T. and author of Mindstorms:
Children, Computers and Powerful Ideas, invented the computer language
named Logo, with which children as young as six can program computers
to design mathematical figures. Before they can do that, however, they
must learn how to analyze a problem logically, step by step. "Getting
a computer to do something," says Papert, "requires the underlying
process to be described, on some level, with enough precision to be carried
out by the machine." Charles P. Lecht, president of the New York
consulting firm Lecht Scientific, argues that "what the lever was
to the body, the computer system is to the mind." Says he: "Computers
help teach kids to think. Beyond that, they motivate people to think.
There is a great difference between intelligence and manipulative capacity.
Computers help us to realize that difference."
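The discipline Papert describes is easy to see in the classic first exercise of his Logo language: steering an on-screen "turtle" through a square. As a hedged illustration only (sketched here in modern Python rather than in Logo itself, with position tracked as plain coordinates instead of drawn on a screen), the child's figure becomes explicit, machine-checkable arithmetic:

```python
import math

# A minimal Logo-style "turtle": it tracks a position and a heading,
# so commands like FORWARD and RIGHT become explicit arithmetic.
class Turtle:
    def __init__(self):
        self.x, self.y = 0.0, 0.0
        self.heading = 0.0  # in degrees; 0 points east
        self.path = [(self.x, self.y)]

    def forward(self, distance):
        # Move along the current heading and record the new point.
        rad = math.radians(self.heading)
        self.x += distance * math.cos(rad)
        self.y += distance * math.sin(rad)
        self.path.append((round(self.x, 6), round(self.y, 6)))

    def right(self, angle):
        # Turn clockwise by the given angle.
        self.heading -= angle

# The classic first Logo exercise, a square: four equal sides,
# four right turns, described precisely enough for the machine.
t = Turtle()
for _ in range(4):
    t.forward(100)
    t.right(90)
```

After the loop the turtle has traced four corners and returned (within rounding) to its starting point; getting that to happen is exactly the step-by-step analysis Papert says the child must master first.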
The argument that computers train minds to be logical makes some experts
want to reach for the computer key that says ERASE. "The last thing
you want to do is think more logically," says Atari's Kay. "The
great thing about computers is that they have no gravity. The
logical system is one that you make up. Computers are a wonderful way
of being bizarre."
Sherry Turkle, a sociologist now finishing a book titled The Intimate
Machine: Social and Cultural Studies of Computers and People, sees the
prospect of change in terms of perceptions and feelings. Says she: "Children
define what's special about people by contrasting them with their nearest
neighbors, which have always been the animals. People are special because
they know how to think. Now children who work with computers see the computer
as their nearest neighbor, so they see that people are special because
they feel. This may become much more central to the way people think about
themselves. We may be moving toward a re-evaluation of what makes us human."
For all such prophecies, M.I.T. Computer Professor Joseph Weizenbaum has
answers ranging from disapproval to scorn. He has insisted that "giving
children computers to play with...cannot touch...any real problem,"
and he has described the new computer generation as "bright young
men of disheveled appearance [playing out] megalomaniacal fantasies of
omnipotence."
Weizenbaum's basic objection to the computer enthusiasts is that they
have no sense of limits. Says he: "The assertion that all human knowledge
is encodable in streams of zeros and ones--philosophically, that's very
hard to swallow. In effect, the whole world is made to seem computable.
This generates a kind of tunnel vision, where the only problems that seem
legitimate are problems that can be put on a computer. There is a whole
world of real problems, of human problems, which is essentially ignored."
So the revolution has begun, and as usually happens with revolutions,
nobody can agree on where it is going or how it will end. Nils Nilsson,
director of the Artificial Intelligence Center at SRI International, believes
the personal computer, like television, can "greatly increase the
forces of both good and evil." Marvin Minsky, another of M.I.T.'s
computer experts, believes the key significance of the personal computer
is not the establishment of an intellectual ruling class, as some fear,
but rather a kind of democratization of the new technology. Says he: "The
desktop revolution has brought the tools that only professionals have
had into the hands of the public. God knows what will happen now."
Perhaps the revolution
will fulfill itself only when people no longer see anything unusual in
the brave new world, when they see their computer not as a fearsome challenger
to their intelligence but as a useful linkup of some everyday gadgets:
the calculator, the TV and the typewriter. Or as Osborne's Adam Osborne
puts it: "The future lies in designing and selling computers that
people don't realize are computers at all."