Or maybe it came a few weeks ago, when the United States, angered at factories within China that churned out unauthorized versions of movies, records and computer programs, declared the most populous country in the world a software pirate. The United States unleashed harsh trade sanctions, a step never taken when China’s only misdeed was the torture and suppression of human beings.
Or perhaps it was the news that one of the capers pulled off by Kevin Mitnick, the uber-hacker who was nabbed last week in a North Carolina garden apartment, included a raid on a California computer that netted him 20,000 live credit-card numbers.
Twenty thousand at one blow, and he never bothered to use any of them. And then, possibly due to a typing error, he wiped out the accounting records of a computer-conferencing center called the WELL, almost destroying one of the Net’s most endearing outposts. Is cyberspace that fragile?
Maybe the moment came when one stressed-out journalist on deadline (don’t ask who) checked his Mosaic hot list and realized he had frittered away the last four hours Net-surfing. While his article lay buried somewhere on his electronic desktop, he logged in visits to a coffee room in Oxford, a newsgroup discussing extraterrestrial visitors and a web site devoted to scientific articles listing foreign objects found inside people’s bodies.
No question about it–the information revolution is here, at last. All those ones and zeros we’ve been passing around–the fuel that feeds the digital fire–have reached critical mass and ignited, big time. There may still be plenty of stragglers who have yet to nuzzle up to computers, but there is no one unaffected by the explosion of computer technology. Everything from media to medicine, from data to dating, has been radically transformed by a tool invented barely 50 years ago. It’s the Big Bang of our time–we might even call it the Bit Bang.
The revolution has only just begun, but already it’s starting to overwhelm us. It’s outstripping our capacity to cope, antiquating our laws, transforming our mores, reshuffling our economy, reordering our priorities, redefining our workplaces, putting our Constitution to the fire, shifting our concept of reality and making us sit for long periods in front of computer screens while CD-ROM drives grind out another video clip.
It’s time to take a deep breath and examine where the revolution might be headed and what we might do to ease the transitions and ensure that its benefits will be broad and benign. This special issue of NEWSWEEK is our attempt to grapple with the challenges and conundrums presented to us at the crossroads of the Bit Bang.
WHY IS THIS REVOLUTION DIFFERENT from all other revolutions? Because of the nature of the computational beast. Ed Roberts, who 20 years ago created the very first personal computer, the Altair, understood this. “When you talk about power,” he once said, “what you’re really saying is ‘How many people do you control?’ If I were to give you an army of 10,000 people, could you build a pyramid? A computer gives the average person, a high-school freshman, the power to do things in a week that all the mathematicians who ever lived until 30 years ago couldn’t do.” By now, that high-school freshman could perform the pyramid-level calculations Roberts was talking about not in a week, but in a few minutes. And he or she could also publish and distribute a magazine, create a sophisticated 3-D drawing, churn out a business proposal, send a letter to a million friends or conduct frighteningly realistic maneuvers of a simulated F-16.
This sounds marvelous. But when everyone has the equivalent of an army of cognitive pyramid builders at his or her command, things can get pretty cluttered at Giza. There’s a real question as to whether our current social structures can accommodate such empowerment. As this century closes and we enter the first computational millennium, one of the great conflicts in civilization will be the attempt to reorder society, culture and government in a manner that exploits this digital bonanza yet prevents it from running roughshod over the checks and balances so delicately constructed in those palmy precomputer years.
Our first instinct, of course, is to deny the enormity of the transition. Even as we trash our typewriters and sail off to the skies with laptops in tow, we cling to our habits, understandably reluctant to face the unsettling reality: almost nothing is the same. But we can’t avoid it. You may not agree with everything that Newt Gingrich says, but in joining Al Gore on the ramparts of info-revolution flackery, he is among the few who really Get It: “You’re talking about transformations on such a scale that everything changes,” he lectured a largely military crowd at the Armed Forces Communications and Electronics Association conclave earlier this month. Then he described the electronic battlefield, where every soldier will be equipped with cell telephone, computer and fax, so that “during lulls they can arrange a date, they can settle on what they want to have fixed for dinner, and they can remind their home computer that it’s time to water the plants.” And, predicted Gingrich, they can take direction from armchair warhorses on the home front. “CNN will be in your living room. . .” said the Speaker, “and you will be able to see the battle in real time. You’ll then be able to pick up your telephone and call your son or daughter who you are watching real time in a firefight. You will chat with them about your view of how they are conducting their squad operations.”
It sounds weird, but the weird will soon be commonplace. This territory is far off the map. The only guarantee is that the transitions will be jarring, especially if the changes arrive pell-mell, and we fail to take measures to shape the direction of this revolution.
But there is a bright side to this uncertainty: the current crossroads provides an opportunity to rethink civilization at the dawn of the new millennium. The road forks at every turn, presenting signposts that require tough decisions and new sorts of solutions to implement them. Here are some of the alternatives:
For many years computers were thought to be a centralizing force–those in the upper levels of a hierarchy could access up-to-date files on millions of people and keep an Orwellian eye on their domains. But since the advent of personal computers and distributed networks like the Internet, we now understand that the essential character of the computer is decentralizing. The symbol for this is the lone teenage hacker, sitting at a cheap computer in his bedroom, wreaking havoc on powerful institutions. But an equally valid symbol might be the low-level white-collar employee who can access mountains of business data to get a high-level picture of the company’s operations and send her evaluation via e-mail to the CEO.
The two modes of computer-think–one a relic of the mainframe days, the other stemming from the antiestablishment spirit of the microcomputer industry–are in constant opposition. But in the battle between control and decentralization, the pressure is really on those who hold power. They somehow have to make sure that everyone in the organization gets the benefits of new technology, while maintaining their position in a structure threatened by that same technology. Obviously, the decentralizing nature of the computer poses a threat to dictators, who have to choose between keeping their countries in the digital dark ages (and suffering dire economic consequences) or liberating a technology that might dangerously open up the entire society. But the same dynamic confounds managers everywhere, as computers and networks amplify the powers of individuals and twist the corporate organizational charts into spaghettilike tangles.
Whitfield Diffie, an influential cryptographer, likes to compare the current state of privacy with the conditions existing around the time of our Founding Fathers. Back then, two people venturing from a public thoroughfare could speak with total confidence that their conversation would be private. Today, even the most seemingly remote conversation can be monitored by way of bug or shotgun mike.
Phones can be wiretapped; cell phones can be tracked by cheap scanners; e-mail can be “packet-sniffed” by Internet hackers. Then there are the mountains of information stored about us on databases of every stripe. In the age of information, we’re open books. But the same forces of technology that have robbed us of our privacy can restore it. Due to the work of cryptographers like Diffie, it is now possible to imagine a system whereby high-tech encryption techniques can keep conversations and electronic messages snoop-proof. Other forms of encryption could make the information on databases unavailable to digital trespassers.
There are also cryptographic protocols that would enable a system of anonymity to further protect us from privacy violators. All of these involve some risks to society, however–risks that the FBI and national-security agencies have been quick to note.
What happens to our wiretaps if the eavesdroppers can’t make out the encrypted conversations? Kidnappers, child pornographers and terrorists would benefit, claims the FBI. One proposed compromise has been the Clipper Chip, which provides the ability to conduct private conversations but provides the government with the “keys” to decode the conversation. To the relief of privacy activists, the Clipper Chip has been a flop in the marketplace, but the government is seeking other ways to implement key escrow. In the meantime, the government discourages the use of cryptography by strict enforcement of export laws, a practice that delays the adoption of user-friendly global standards that will ensure privacy for all. From the other side, crypto activists are distributing rebelware products like the notorious PGP, which are fairly hard to use but encode your e-mail so not even your government can read it.
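The public-key idea behind systems like PGP can be sketched in a few lines. The snippet below is a toy RSA demonstration, not anything a real product would ship: real keys are hundreds of digits long, and the tiny primes here are purely illustrative. The point is the split between a public key anyone can use to scramble a message and a private key only the recipient holds.

```python
# Toy RSA sketch of public-key encryption (illustrative only --
# real systems like PGP use enormous primes, not these).

p, q = 61, 53                   # two secret primes
n = p * q                       # public modulus
phi = (p - 1) * (q - 1)
e = 17                          # public exponent (published openly)
d = pow(e, -1, phi)             # private exponent (kept secret)

def encrypt(message, public_key=(e, n)):
    # anyone with the public key can scramble a message
    exp, mod = public_key
    return pow(message, exp, mod)

def decrypt(ciphertext, private_key=(d, n)):
    # only the holder of the private key can unscramble it
    exp, mod = private_key
    return pow(ciphertext, exp, mod)

secret = 42
scrambled = encrypt(secret)
assert decrypt(scrambled) == secret   # round-trips for the key holder
assert scrambled != secret            # unreadable in transit
```

A key-escrow scheme like the Clipper Chip amounts to the government holding a copy of everyone's private exponent; the dispute in the article is over whether anyone but the user should have it.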
The unanswered question: What will happen to society if we freely utilize the tools of privacy? What will happen to individuals if we attempt to suppress those tools?
The Internet, forerunner of a ubiquitous global web of digital communications, combines aspects of the telephone and broadcasting. Like the phone, it makes use of a dazzlingly intricate web capable of connecting literally billions of people. But like broadcasting, a single source can get a message across to millions. The Net has no prime locations on its dial–the CBS home page has no inherent advantage over Joe SixChip’s web site. A single person can send e-mail to millions. And there’s no control over the content of these messages.
Some people get very nervous about this. They point to the proliferation of pornography on the Net, where anyone can open a virtual dirty-books store from a desktop, and opportunities for libel and harassment abound. Case in point: a short story created and published, via Internet, by Jake Baker, a University of Michigan student. The story itself was a brief first-person fictional account, rendered in Penthouse Forum-style prose, of an episode of sexual torture and murder.
But when a Michigan alumnus Net-surfing on a Moscow computer spotted the story while accessing the alt.sex.stories electronic bulletin board, he noted that Baker’s fictional victim shared the name of a Michigan classmate, and notified the campus authorities. They in turn investigated, discovering that Baker had written some e-mail to a friend indicating he might have an interest in actually committing such a crime. Baker now faces five years in prison.
Should electronic publishing be regarded more harshly than printed matter? Nebraska Sen. James Exon thinks so, and he is sponsoring a bill that would ban indecent or harassing matter from cyberspace. In a digital age, though, there’s a high cost from attempts to silence the pornographers and other speech offenders. Requiring network providers to monitor what goes out over their systems is unworkable–it’s like asking the phone companies to monitor what’s uttered in billions of conversations. The only way that you can really control content is to cripple the whole network.
Like other dilemmas and unanswered questions of the digital age, traditional approaches simply won’t work. We’re going to have to accept less intrusive, probably more exotic solutions, like providing intelligent software filters to those who want a kind of Internet Lite. Before these sorts of tools arrive–if they ever do get here–the First Amendment may experience its toughest test to date.
It’s remarkable to contemplate: all of the information in a multivolume encyclopedia compressed into a shiny five-inch disc, with relevant entries “hotlinked” to each other, so you can switch from “lion” to “Africa,” say, with the click of a mouse. This hypertext model is the basis of multimedia, the educational format of choice among software purveyors. (It’s also a potential gold mine for them, since CD-ROMs generally cost more than books.) Multimedia is also associated with flashy video clips, musical sound bytes and colorful animations; the theory seems to be that in order to compete with television, education must outdazzle the boob tube. But while multimedia may appeal to the MTV-fueled rhythms of a hot-wired generation, some critics believe that all that hot-linking is an educational detriment. Considering the sorry state of literacy, there’s real danger in even a partial abandonment of narrative forms and rigorous modes of thought associated with logical arguments, where A leads to B. Multimedia’s forte is not reason, but hot emotional impact–the same ingredients that make local TV news compelling yet less filling. Will the level of discourse in this country, already fuzzied up by television, sink to that of videogames? Or will the proliferation of information and new techniques to impart it (like the vast libraries linked into the Net) initiate a new Renaissance?
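Underneath the flash, the hypertext model described above is a simple structure: entries are nodes, and “hotlinks” are pointers you follow with a click. Here is a minimal sketch, with hypothetical entry text, showing how a “lion” entry can jump straight to “Africa”:

```python
# Minimal sketch of hypertext: each entry carries its text plus the
# names of the entries it links to. Entry contents are made up here.

encyclopedia = {
    "lion":   {"text": "Large predatory cat of the African savanna.",
               "links": ["Africa"]},
    "Africa": {"text": "The second-largest continent on Earth.",
               "links": ["lion"]},
}

def follow_link(current_entry, link_name):
    # "clicking" a hotlink is just a lookup -- no page-flipping,
    # no alphabetical order, no fixed reading sequence
    if link_name not in encyclopedia[current_entry]["links"]:
        raise KeyError(f"{current_entry!r} has no link to {link_name!r}")
    return encyclopedia[link_name]

entry = follow_link("lion", "Africa")
print(entry["text"])   # the reader is now at "Africa"
```

The critics’ worry quoted above falls out of the structure itself: nothing in the graph enforces the A-leads-to-B order of a narrative, so a reader can wander link to link indefinitely.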
As Nicholas Negroponte, of MIT’s Media Lab, puts it, the essence of the information revolution is the difference between atoms and bits. The former are the building blocks for physical stuff, which until now has formed the basis of our economy as well as our consciousness. Bits, however, are ephemeral–they are simply ones and zeros. From that slight scaffolding, we have the bounty of the information age: all the documents, spreadsheets, audio CDs, multimedia CD-ROMs, movie special effects and virtual-reality environments. As more of our experience comes to us by way of bits, reality itself gradually changes. Literally out of nothing, a new dimension emerges: cyberspace, a place made out of bits, whose intangible nature does not prevent it from becoming a second home, or a primary workplace, for masses of infonauts.
But we’re still trying to sort out the shift from atoms to bits. Can a photograph or a tape recording ever be certified as evidence, when bits can be flipped at any keyboard? Can copyright be protected when the bits that form intellectual property can be copied by the millions at any desktop? Can the integrity of international borders be protected when bits flow across them as freely as the wind, and governments cannot tax their value or limit their content?
Taken together, the weight of the decisions we must make can seem terrifying, especially since we’re bound to make mistakes. Just as those who tried to imagine the effect of the automobile failed to visualize that it would create the suburbs, a phenomenon that would shift the political and economic landscape, we are clueless as to the eventual results of the Bit Bang. But ultimately, the process may prove exhilarating, as we learn to exploit our newly augmented ability to remake the world with the products of mind and the tools of collaboration. As we grapple with the unanswered questions, we’re in for the ride of a lifetime.