Related: 2 June 1994, NYT: Flaw Discovered in Federal Plan for Wiretapping
http://www.nytimes.com/1994/06/12/magazine/battle-of-the-clipper-chip.html
June 12, 1994
Battle of the Clipper Chip
By Steven Levy
On a sunny spring day in Mountain View, Calif., 50 angry activists are plotting against the United States Government. They may not look subversive sitting around a conference table dressed in T-shirts and jeans and eating burritos, but they are self-proclaimed saboteurs. They are the Cypherpunks, a loose confederation of computer hackers, hardware engineers and high-tech rabble-rousers.
The precise object of their rage is the Clipper chip, officially known as the MYK-78 and not much bigger than a tooth. Just another tiny square of plastic covering a silicon thicket. A computer chip, from the outside indistinguishable from thousands of others. It seems improbable that this black Chiclet is the focal point of a battle that may determine the degree to which our civil liberties survive in the next century. But that is the shared belief in this room.
The Clipper chip has prompted what might be considered the first holy war of the information highway. Two weeks ago, the war got bloodier, as a researcher circulated a report that the chip might have a serious technical flaw. But at its heart, the issue is political, not technical. The Cypherpunks consider the Clipper the lever that Big Brother is using to pry into the conversations, messages and transactions of the computer age. These high-tech Paul Reveres are trying to mobilize America against the evil portent of a "cyberspace police state," as one of their Internet jeremiads put it. Joining them in the battle is a formidable force, including almost all of the communications and computer industries, many members of Congress and political columnists of all stripes. The anti-Clipper aggregation is an equal-opportunity club, uniting the American Civil Liberties Union and Rush Limbaugh.
The Clipper's defenders, who are largely in the Government, believe it represents the last chance to protect personal safety and national security against a developing information anarchy that fosters criminals, terrorists and foreign foes. Its adherents pose it as the answer, or at least part of the answer, to a problem created by an increasingly sophisticated application of an age-old technology: cryptography, the use of secret codes.
For centuries, cryptography was the domain of armies and diplomatic corps. Now it has a second purpose: protecting personal and corporate privacy. Computer technology and advanced telecommunications equipment have drawn precious business information and intimate personal communications out into the open. This phenomenon is well known to the current Prince of Wales, whose intimate cellular phone conversations were intercepted, recorded and broadcast worldwide. And corporations realize that competitors can easily intercept their telephone conversations, electronic messages and faxes. High tech has created a huge privacy gap. But miraculously, a fix has emerged: cheap, easy-to-use, virtually unbreakable encryption. Cryptography is the silver bullet by which we can hope to reclaim our privacy.
The solution, however, has one drawback: cryptography shields the law-abiding and the lawless equally. Law-enforcement and intelligence agencies contend that if strong codes are widely available, their efforts to protect the public would be paralyzed. So they have come up with a compromise, a way to neutralize such encryption. That's the Clipper chip and that compromise is what the war is about.
The idea is to give the Government means to override other people's codes, according to a concept called "key escrow." Employing normal cryptography, two parties can communicate in total privacy, with both of them using a digital "key" to encrypt and decipher the conversation or message. A potential eavesdropper has no key and therefore cannot understand the conversation or read the data transmission. But with Clipper, an additional key -- created at the time the equipment is manufactured -- is held by the Government in escrow. With a court-approved wiretap, an agency like the F.B.I. could listen in. By adding Clipper chips to telephones, we could have a system that assures communications will be private -- from everybody but the Government.
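A minimal sketch of the escrow idea, in Python: because the cipher inside Clipper is classified, a toy XOR construction stands in for it here, and names such as escrow_key and session_key are illustrative rather than part of the real design.

    # Conceptual sketch of key escrow, not the real Clipper design. A toy XOR
    # cipher stands in for the classified Skipjack algorithm; the names here
    # (escrow_key, session_key) are illustrative only.
    import hashlib
    import secrets

    def toy_cipher(key: bytes, data: bytes) -> bytes:
        """XOR data with a keystream derived from the key. Encrypts and decrypts."""
        stream = b""
        counter = 0
        while len(stream) < len(data):
            stream += hashlib.sha256(key + counter.to_bytes(4, "big")).digest()
            counter += 1
        return bytes(a ^ b for a, b in zip(data, stream))

    # Two parties share a fresh session key and talk in private.
    session_key = secrets.token_bytes(16)
    ciphertext = toy_cipher(session_key, b"meet me at noon")

    # With escrow, a copy of the session key travels wrapped under a key the
    # Government holds; with a court order, a wiretapper can unwrap it and listen.
    escrow_key = secrets.token_bytes(16)                # held by the escrow agents
    wrapped_key = toy_cipher(escrow_key, session_key)   # sent alongside the call

    recovered = toy_cipher(escrow_key, wrapped_key)     # court-ordered wiretap
    assert toy_cipher(recovered, ciphertext) == b"meet me at noon"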
And that's what rankles Clipper's many critics. Why, they ask, should people accused of no crime have to give Government the keys to their private communications? Why shouldn't the market rather than Government determine what sort of cryptosystem wins favor? And isn't it true that the use of key escrow will make our technology so unattractive to the international marketplace that the United States will lose its edge in the lucrative telecommunications and computer fields? Clipper might clip the entire economy.
Nonetheless, on Feb. 4 the White House announced its approval of the Clipper chip, which had been under study as a Government standard since last April, and the Crypto War broke out in full force. Within a month, one civil liberties group, Computer Professionals for Social Responsibility, received 47,000 electronic missives urging a stop to Clipper. "The war is upon us," wrote Tim May, co-founder of the Cypherpunks, in an urgent electronic dispatch soon after the announcement. "Clinton and Gore folks have shown themselves to be enthusiastic supporters of Big Brother."
And though the Clinton Administration's endorsement of Clipper as a Government standard required no Congressional approval, rumblings of discontent came from both sides of the Capitol. Senator Patrick J. Leahy, the Vermont Democrat whose subcommittee has held contentious hearings on the matter, has called the plan a "misstep," charging that "the Government should not be in the business of mandating particular technologies."
Two weeks ago, an AT&T Bell Laboratories researcher revealed that he had found a serious flaw in the Clipper technology itself, enabling techno-savvy lawbreakers to bypass the security function of the chip in some applications. Besides being a bad idea, Clipper's foes now say, it doesn't even work properly.
Yet the defenders of Clipper have refused to back down, claiming that the scheme -- which is, they often note, voluntary -- is an essential means of stemming an increasing threat to public safety and security by strong encryption in everyday use. Even if Clipper itself has to go back to the drawing board, its Government designers will come up with something quite similar. The underlying issue remains unchanged: If something like Clipper is not implemented, writes Dorothy E. Denning, a Georgetown University computer scientist, "All communications on the information highway would be immune from lawful interception. In a world threatened by international organized crime, terrorism and rogue governments, this would be folly."
The claims from both sides sound wild, almost apocalyptic. The passion blurs the problem: Can we protect our privacy in an age of computers -- without also protecting the dark forces in society?
The crypto war is the inevitable consequence of a remarkable discovery made almost 20 years ago, a breakthrough that combined with the microelectronics revolution to thrust the once-obscure field of cryptography into the mainstream of communications policy.
It began with Whitfield Diffie, a young computer scientist and cryptographer. He did not work for the Government, which was strange because in the 1960's almost all serious crypto in this country was done under Federal auspices, specifically at the Fort Meade, Md., headquarters of the supersecret National Security Agency. Though it became bigger than the C.I.A., the N.S.A. was for years unknown to Americans; the Washington Beltway joke was that the initials stood for "No Such Agency." Its working premise has always been that no information about its activities should ever be revealed. Its main mission involved cryptography, and the security agency so dominated the field that it had the power to rein in even those few experts in the field who were not on its payroll.
But Whitfield Diffie never got that message. He had been bitten by the cryptography bug at age 10 when his father, a professor, brought home the entire crypto shelf of the City College library in New York. Then he lost interest, until he arrived at M.I.T.'s Artificial Intelligence Laboratory in 1966. Two things rekindled his passion. Now trained as a mathematician, he had an affinity for the particular challenges of sophisticated crypto. Just as important, he says, "I was always concerned about individuals, an individual's privacy as opposed to Government secrecy."
Diffie, now 50, is still committed to those beliefs. When asked about his politics, he says, "I like to describe myself as an iconoclast." He is a computer security specialist for Sun Microsystems, a celebrated cryptographer and an experienced hand at Congressional testimony. But he looks like he stumbled out of a Tom Robbins novel -- with blond hair that falls to his shoulders and a longish beard that seems a virtual trademark among code makers. At a Palo Alto, Calif., coffeehouse one morning, he describes, in clipped, precise cadence, how he and Martin E. Hellman, an electrical engineering professor at Stanford University, created a crypto revolution.
Diffie was dissatisfied with the security on a new time-sharing computer system being developed by M.I.T. in the 1960's. Files would be protected by passwords, but he felt that was insufficient. The system had a generic flaw. A system manager had access to all passwords. "If a subpoena was served against the system managers, they would sell you out, because they had no interest in going to jail," Diffie says. A perfect system would eliminate the need for a trusted third party.
This led Diffie to think about a more general problem in cryptography: key management. Even before Julius Caesar devised a simple cipher to encode his military messages, cryptography worked by means of keys. That is, an original message (what is now called "plaintext") was encrypted by the sender into seeming gibberish (known as "ciphertext"). The receiver, using the same key, decrypted the message back into the original plaintext. For instance, the Caesar key was the simple replacement of each letter by the letter three places down in the alphabet. If you knew the key, you could encrypt the word help into the nonsense word khos; the recipient of the message would decrypt the message back to help.
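The help-to-khos transformation takes only a few lines of Python; a minimal sketch:

    def caesar(text: str, shift: int) -> str:
        """Replace each letter with the letter `shift` places down the alphabet."""
        return "".join(
            chr((ord(c) - ord("a") + shift) % 26 + ord("a")) if c.isalpha() else c
            for c in text.lower()
        )

    print(caesar("help", 3))    # khos (encrypted with the Caesar key)
    print(caesar("khos", -3))   # help (decrypted by shifting back)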
The problem came with protecting the key. Since anyone who knew the Caesar key would be able to understand the encoded message, it behooved the Romans to change that key as often as possible. But if you change the key, how do you inform your spies behind enemy lines? (If you tell them using the old code, which may have already been cracked, your enemies will then learn the new code.) For centuries, generals and diplomats have faced that predicament. But a few years ago, it took on added urgency.
With computers and advanced telecommunications, customers outside Government were discovering a need for information security. Cryptography was the answer, but how could it be applied widely, considering the problem of keys? The best answer to date was something called a key-management repository, where two parties who wanted secrecy would go to a trusted third party who would generate a new key for the private session. But that required just what Diffie deplored -- an unwanted third wheel.
"The virtue of cryptography should be that you don't have to trust anybody not directly involved with your communication," Diffie says. "Without conventional key distribution centers, which involved trusting third parties, I couldn't figure how you could build a system to secure, for instance, all the phones in the country."
When Diffie moved to Stanford University in 1969, he foresaw the rise of home computer terminals and began pondering the problem of how to use them to make transactions. "I got to thinking how you could possibly have electronic business, because signed letters of intent, contracts and all seemed so critical," he says. He devoured what literature he could find outside the National Security Agency. And in the mid-1970's, Diffie and Hellman achieved a stunning breakthrough that changed cryptography forever. They split the cryptographic key.
In their system, every user has two keys, a public one and a private one, that are unique to their owner. Whatever is scrambled by one key can be unscrambled by the other. It works like this: If I want to send a message to Whit Diffie, I first obtain his public key. (For complicated mathematical reasons, it is possible to distribute one's public key freely without compromising security; a potential enemy will have no advantage in code-cracking if he holds your public key alone.) Then I use that key to encode the message. Now it's gobbledygook and only one person in the world can decode it -- Whit Diffie, who holds the other, private, key. If he wants to respond to me with a secret message, he uses my public key to encode his answer. And I decode it, using my private key.
It was an amazing solution, but even more remarkable was that this split-key system solved both of Diffie's problems, the desire to shield communications from eavesdroppers and also to provide a secure electronic identification for contracts and financial transactions done by computer. It provided the identification by the use of "digital signatures" that verify the sender much the same way that a real signature validates a check or contract.
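A toy example with very small numbers makes the two-key mechanics concrete. It borrows the RSA construction described a few paragraphs below; real keys run to hundreds of digits, so this is a sketch of the idea, not a usable system.

    # Toy public-key demo (RSA construction, tiny numbers; for illustration only).
    p, q = 61, 53
    n = p * q              # 3233, part of both keys
    e = 17                 # public exponent:  public key  = (n, e), given out freely
    d = 2753               # private exponent: private key = (n, d), kept secret
                           # (e * d leaves remainder 1 when divided by (p-1)*(q-1))

    def apply_key(message: int, exponent: int) -> int:
        return pow(message, exponent, n)

    # Secrecy: scramble with Diffie's public key; only his private key unscrambles it.
    m = 65
    c = apply_key(m, e)
    assert apply_key(c, d) == m

    # Digital signature: scramble with the private key; anyone holding the
    # public key can unscramble it and confirm who sent the message.
    sig = apply_key(m, d)
    assert apply_key(sig, e) == m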
Suddenly, the ancient limitations on cryptography had vanished. Now, perhaps before the millennium, strong cryptography could find its way to every telephone, computer and fax machine -- if users wanted it. Subsequent variations on the Diffie-Hellman scheme focused on using crypto algorithms to insure the anonymity of transactions. Using these advances, it is now possible to think of replacing money with digital cash -- while maintaining the comforting untraceability of bills and coins. The dark art of cryptography has become a tool of liberation.
From the moment Diffie and Hellman published their findings in 1976, the National Security Agency's crypto monopoly was effectively terminated. In short order, three M.I.T. mathematicians -- Ronald L. Rivest, Adi Shamir and Leonard M. Adleman -- developed a system with which to put the Diffie and Hellman findings into practice. It was known by their initials, RSA. It seemed capable of creating codes that even the N.S.A. could not break. They formed a company to sell their new system; it was only a matter of time before thousands and then millions of people began using strong encryption.
That was the National Security Agency's greatest nightmare. Every company, every citizen now had routine access to the sorts of cryptographic technology that not many years ago ranked alongside the atom bomb as a source of power. Every call, every computer message, every fax in the world could be harder to decipher than the famous German "Enigma" machine of World War II. Maybe even impossible to decipher!
The genie was out of the bottle. Next question: Could the genie be made to wear a leash and collar? Enter the Clipper chip.
When illustrating the Government's need to control crypto, Jim Kallstrom, the agent in charge of the special operations division of the New York office of the F.B.I., quickly shifts the discussion to the personal: "Are you married? Do you have a child? O.K., someone kidnaps one of your kids and they are holding your kid in this fortress up in the Bronx. Now, we have probable cause that your child is inside this fortress. We have a search warrant. But for some reason, we cannot get in there. They made it out of some new metal, or something, right? Nothing'll cut it, right? And there are guys in there, laughing at us. That's what the basis of this issue really is -- we've got a situation now where a technology has become so sophisticated that the whole notion of a legal process is at stake here!"
Kallstrom is a former head of the Bureau Tech Squad, involved in the bugging operation that brought John Gotti to justice. Some have described him as the F.B.I.'s answer to "Q," the gadget wizard of the James Bond tales.
"From the standpoint of law enforcement, there's a super big threat out there -- this guy is gonna build this domain in the Bronx now, because he's got a new steel door and none of the welding torches, none of the boomerangs, nothing we have is gonna blast our way in there. Sure, we want those new steel doors ourselves, to protect our banks, to protect the American corporation trade secrets, patent rights, technology. But people operating in legitimate business are not violating the laws -- it becomes a different ball of wax when we have probable cause and we have to get into that domain. Do we want a digital superhighway where not only the commerce of the nation can take place but where major criminals can operate impervious to the legal process? If we don't want that, then we have to look at Clipper."
Wiretapping is among law enforcement's most cherished weapons. Only 919 Federal, state and local taps were authorized last year, but police agencies consider them essential to fighting crime. Obviously if criminals communicate using military-grade cryptosystems, wiretapping them becomes impossible.
For two years, the F.B.I. has been urging Congress to pass the proposed Digital Telephony and Communications Privacy Act, which would in essence require that new communications technologies be designed to facilitate wiretapping. Even if the bill should somehow pass, overcoming the opposition of the communications industry and civil libertarians, the extra effort and expense will be wasted if the only thing the wiretappers can hear is the hissy white noise of encrypted phone conversations and faxes. If cryptography is not controlled, wiretapping could be rendered obsolete. Louis J. Freeh, the Director of the F.B.I., surely fears that prospect. He has told Congress that preserving the ability to intercept communications legally, in the face of these technological advances, is "the No. 1 law enforcement, public safety and national security issue facing us today."
Some people criticize Clipper on the basis that truly sophisticated criminals would never use it, preferring other easily obtained systems that use high-grade cryptography. Despite Clipper, kidnappers and drug kingpins may construct Kallstrom's virtual fort in the Bronx with impunity, laughing at potential wiretappers.
The Government understands the impossibility of eradicating strong crypto. Its objective is instead to prevent unbreakable encryption from becoming routine. If that happens, even the stupidest criminal would be liberated from the threat of surveillance. But by making Clipper the standard, the Government is betting that only a tiny percentage of users would use other encryption or try to defeat the Clipper.
At a rare public appearance in March at a conference on computers and privacy, Stewart A. Baker, then general counsel of the National Security Agency, tried to explain. "The concern is not so much what happens today when people go in and buy voice scramblers," said Baker, a dapper, mustached lawyer who worked as an Education Department lawyer in the Carter Administration. "It is the prospect that in 5 years or 10 years every phone you buy that costs $75 or more will have an encrypt button on it that will interoperate with every other phone in the country and suddenly we will discover that our entire communications network is being used in ways that are profoundly antisocial. That's the real concern, I think, that Clipper addresses. If we are going to have a standardized form of encryption that is going to change the world, we should think seriously about what we are going to do when it is misused."
Not all law-enforcement experts believe that cryptography will unleash a riot of lawlessness. William R. Spernow, a Sacramento, Calif., computer crime specialist who works on a grant from the Federal Bureau of Justice Assistance, has encountered a few cases in which criminals have encrypted information unbreakably, including one involving a pedophile who encrypted the identities of his young victims. Yet Spernow sees no reason to panic. "In cases where there's encryption, the officers have been able to make the case through other investigative means," he says. "If we hustle, we can still make our cases through other kinds of police work."
But crime is only part of the problem. What happens to national security if cryptography runs free? Those who know best, officials of the National Security Agency, won't say. When the agency's director, Vice Adm. John M. McConnell, testified before a Senate subcommittee on May 3, he withheld comment on this question until the public hearing was terminated and a second, classified session convened in a secure room.
Still, the effect of strong crypto on N.S.A. operations is not difficult to imagine. The agency is charged with signals intelligence, and it is widely assumed that it monitors all the communications crossing American borders and probably much of the traffic within foreign countries. (It is barred from intercepting domestic communications.) If the crypto revolution crippled N.S.A.'s ability to listen in on the world, the agency might miss out on something vital -- for instance, portents of a major terrorist attack.
No compelling case has been made, however, that the key-escrow system would make it easier for authorities to learn of such an attack. The National Security Agency would take the legal steps to seek the telltale keys after it had first identified those potential terrorists and wiretapped their calls, then discovered the impenetrable hiss of encryption. Even then, the keys would be useful only if the terrorists were encoding conversations with Clipper technology, the one kind the Government had the capability to decode instantly. What sort of nuclear terrorist would choose Clipper?
The Government response has been to say that potential terrorists might indeed use alternative crypto methods to converse among themselves. But if Clipper were the accepted standard, the terrorists would have to use it to communicate with outsiders -- banks, suppliers and other contacts. The Government could listen in on those calls. However, the work of the Bell Labs researcher, Matthew Blaze, casts serious doubt on that contention. Blaze has uncovered a flaw in Clipper that would allow a user to bypass the security function of the chip. Anyone who tinkered with Clipper in this way could communicate in privacy with anyone else with a Clipper phone and Government wiretappers would be unable to locate the key to unscramble the conversations.
Nonetheless, it was the terrorist threat, along with national security concerns, that moved the Clinton Administration to support the key-escrow initiative. White House high-tech policy makers share a recurrent fear: one day they might be sitting before an emergency Congressional investigation after the destruction of half of Manhattan by a stolen nuclear weapon planted in the World Trade towers and trying to explain that the Government had intercepted the communications of the terrorists but could not understand them because they used strong encryption. If Clipper were enacted, they could at least say, "We tried."
Obviously the Government views the Crypto revolution with alarm and wants to contain it. For years, much of its efforts have focused on the use of stringent export controls. While cryptography within the United States is unrestricted, the country's export laws treat any sort of encryption as munitions, like howitzers or nuclear triggers. The National Security Agency is the final arbiter and it will approve exports of cryptosystems in computer software and electronic hardware only if the protective codes are significantly weakened.
The N.S.A. stance is under attack from American businesses losing sales to foreign competitors. Listen to D. James Bidzos, the 39-year-old president of RSA Data Security, the Redwood City, Calif., company that controls the patents for public-key cryptography: "For almost 10 years, I've been going toe to toe with these people at Fort Meade. The success of this company is the worst thing that can happen to them. To them, we're the real enemy, we're the real target."
RSA is making a pitch to become the standard in encryption; its technology has been adopted by Apple, AT&T, Lotus, Microsoft, Novell and other major manufacturers. So imagine its unhappiness that its main rival is not another private company, but the National Security Agency, designer of the key-escrow cryptosystems. The agency is a powerful and dedicated competitor.
"We have the system that they're most afraid of," Bidzos says. "If the U.S. adopted RSA as a standard, you would have a truly international, interoperable, unbreakable, easy-to-use encryption technology. And all those things together are so synergistically threatening to the N.S.A.'s interests that it's driving them into a frenzy."
The export laws put shackles on Bidzos's company while his overseas competitors have no such restraints. Cryptographic algorithms that the N.S.A. bans for export are widely published and are literally being sold on the streets of Moscow. "We did a study on the problem and located 340 foreign cryptographic products sold by foreign countries," says Douglas R. Miller, government affairs manager of the Software Publishers Association. "The only effect of export controls is to cripple our ability to compete."
The real potential losses, though, come not in the stand-alone encryption category, but in broader applications. Companies like Microsoft, Apple and Lotus want to put strong encryption into their products but cannot get licenses to export them. Often, software companies wind up installing a weaker brand of crypto in all their products so that they can sell a single version worldwide. This seems to be the Government's intent -- to encourage "crypto lite," strong enough to protect communications from casual intruders but not from Government itself.
In the long run, however, export regulation will not solve the National Security Agency's problem. The crypto business is exploding. People are becoming more aware of the vulnerability of phone conversations, particularly wireless ones. Even the National Football League is adopting crypto technology; it will try out encrypted radio communication between coaches and quarterbacks, so rivals can't intercept last-minute audibles.
Anticipating such a boom, the N.S.A. devised a strategy for the 90's. It would concede the need for strong encryption but encourage a system with a key-escrow "back door" that provides access to communications for itself and law enforcement. The security agency had already developed a strong cryptosystem based on an algorithm called Skipjack, supposedly 16 million times stronger than the previous standard, D.E.S. (Data Encryption Standard). Now the agency's designers integrated Skipjack into a new system built around a Law Enforcement Access Field (LEAF), a signal added to each message that directs a potential wiretapper to the appropriate key for deciphering it. These features were included in a chip called Capstone, which could handle not only telephone communications but computer data transfers and digital signatures.
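How the LEAF points a wiretapper to the right key can be sketched roughly as follows. The exact field layout is not public; this version follows published descriptions (a chip identifier, the session key wrapped under the chip's escrowed key, and a short checksum, all wrapped under a "family key" built into authorized wiretap equipment) and again uses a toy cipher in place of the classified Skipjack.

    # Simplified sketch of a LEAF-style field. Field layout and sizes are
    # assumptions drawn from public descriptions; a toy XOR cipher stands in
    # for the classified Skipjack algorithm.
    import hashlib
    import secrets

    def toy_cipher(key: bytes, data: bytes) -> bytes:
        stream = hashlib.sha256(key).digest() * (len(data) // 32 + 1)
        return bytes(a ^ b for a, b in zip(data, stream))

    family_key = secrets.token_bytes(16)     # built into all authorized wiretap gear
    unit_key   = secrets.token_bytes(16)     # this chip's key, held in escrow
    unit_id    = (1234).to_bytes(4, "big")   # tells the wiretapper whose key to request

    def make_leaf(session_key: bytes) -> bytes:
        """Field sent with each call so an authorized wiretapper can recover the key."""
        wrapped = toy_cipher(unit_key, session_key)        # opens only with the escrowed key
        check = hashlib.sha256(session_key).digest()[:2]   # short integrity check (assumed)
        return toy_cipher(family_key, unit_id + wrapped + check)

    session_key = secrets.token_bytes(10)    # an 80-bit call key, as Skipjack uses
    leaf = make_leaf(session_key)            # transmitted along with the encrypted call

    # A wiretapper with the family key reads the unit ID from the LEAF, obtains
    # the escrowed unit key under court order, and unwraps the session key.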
Supposedly, this technology was designed for Government use, but in 1993 the National Security Agency had a sudden opportunity to thrust it into the marketplace. AT&T had come to the agency with a new, relatively low-cost secure-phone device called the Surity 3600 that was designed to use the nonexportable DES encryption algorithm. The N.S.A. suggested that perhaps AT&T could try something else: a stripped-down version of Capstone for telephone communications. This was the Clipper chip. As a result, AT&T got two things: an agreement that Uncle Sam would buy thousands of phones for its own use (the initial commitment was 9,000, from the F.B.I.) and the prospect that the phone would not suffer the unhappy fate of some other secure devices when considered for export. There was also the expectation that AT&T would sell a lot more phones, since private companies would need to buy Clipper-equipped devices to communicate with the Government's Clipper phones.
It was an ingenious plan for several reasons. By agreeing to buy thousands of phones, and holding out the promise that thousands, or even millions more might be sold, AT&T phones gained a price advantage that comes with volume. (The original price of the Surity 3600 was $1,195, considerably less than the previous generation of secure phones; Mykotronx, the company making the Clipper chip, says that each chip now costs $30, but in large orders could quickly go as low as $10.) That would give the phones a big push in the marketplace. But by saturating the market, Clipper had a chance to become the standard for encryption, depending on whether businesses and individuals would be willing to accept a device that had the compromise of a government-controlled back door.
This compromise, of course, is the essence of Clipper. The Government recognizes the importance of keeping business secrets, intimate information and personal data hidden from most eyes and ears. But it also preserves a means of getting hold of that information after obtaining "legal authorization, normally a court order," according to a White House description.
The N.S.A. presented the idea to the Bush Administration, which took no action before the election. Then it had to convince a Democratic Administration to adopt the scheme, and started briefing the Clinton people during the transition. Many in the computer industry figured that with Vice President Al Gore's enthusiastic endorsement of the high-frontier virtues of the information highway, the Administration would never adopt any proposal so tilted in favor of law enforcement and away from his allies in the information industries. They figured wrong. A little more than two months after taking office, the Clinton Administration announced the existence of the Clipper chip and directed the National Institute of Standards and Technology to consider it as a Government standard.
Clipper was something the Administration -- starting with the Vice President -- felt compelled to adopt, and key escrow was considered an honorable attempt to balance two painfully contradictory interests, privacy and safety.
The reaction was instant, bitter and ceaseless. The most pervasive criticisms challenged the idea that Clipper would be, as the standard said, "voluntary." The Government's stated intent is to manipulate the marketplace so that it will adopt an otherwise unpalatable scheme and make it the standard. Existing systems have to cope with export regulations and, now, incompatibility with the new Government Clipper standard. Is it fair to call a system voluntary if the Government puts all sorts of obstacles in the way of its competitors?
Others felt that it was only a matter of time before the National Security Agency pressured the Government to require key escrow of all cryptographic devices -- that Clipper was only the first step in a master plan to give Uncle Sam a key to everyone's cyberspace back door.
"That's a real fear," says Stephen T. Walker, a former N.S.A. employee who is now president of Trusted Information Systems, a company specializing in computer security products. "I don't think the Government could pull it off -- it would be like prohibition, only worse. But I think they might try it."
But mostly, people were unhappy with the essence of Clipper, that the Government would escrow their keys. As Diffie notes, key escrow reintroduces the vulnerability that led him to invent public key cryptography -- any system that relies on trusted third parties is, by definition, weaker than one that does not. Almost no one outside the Government likes the key-escrow idea. "We published the standard for 60 days of public comments," says F. Lynn McNulty, associate director for computer security at the National Institute of Standards and Technology. "We received 320 comments, only 2 of which were supportive."
Many people thought that in the face of such opposition, the Administration would quietly drop the Clipper proposal. They were dismayed by the Feb. 4 announcement of the adoption of Clipper as a Government standard. Administration officials knew they were alienating their natural allies in the construction of the information superhighway but felt they had no alternative. "This," said Michael R. Nelson, a White House technology official, "is the Bosnia of telecommunications."
If Clipper is the Administration's techno-Bosnia, the crypto equivalents of snipers are everywhere -- in industry, among privacy lobbyists and even among Christian Fundamentalists. But the most passionate foes are the Cypherpunks. They have been meeting on the second Saturday of every month at the offices of Cygnus, a Silicon Valley company, assessing new ways they might sabotage Clipper. The group was co-founded in September 1992 by Eric Hughes, a 29-year-old freelance cryptographer, and Tim May, a 42-year-old physicist who retired early and rich from the Intel company. Other Cypherpunk cells often meet simultaneously in six or seven locations around the world, but the main gathering place for Cypherpunks is the Internet, by means of an active mailing list in which members post as many as 100 electronic messages a day.
Cypherpunks share a few common premises. They assume that cryptography is a liberating tool, one that empowers individuals. They think that one of the most important uses of cryptography is to protect communications from the Government. Many of them believe that the Clipper is part of an overall initiative against cryptography that will culminate in Draconian control of the technology. And they consider it worth their time to fight, educating the general public and distributing cryptographic tools to obstruct such control.
Both Hughes and May have composed manifestos. Hughes's call to arms proclaims: "Cypherpunks write code. We know that someone has to write software to defend privacy, and since we can't get privacy unless we all do, we're going to write it."
May's document envisions a golden age in which strong cryptography belongs to all -- an era of "crypto anarchism" that governments cannot contain. To May, cryptography is a tool that will not only bestow privacy on people but help rearrange the economic underpinnings of society.
"Combined with emerging information markets, cryptography will create a liquid market for any and all material that can be put into words and pictures," May's document says. "And just as a seemingly minor invention like barbed wire made possible the fencing-off of vast ranches and farms, thus altering forever the concepts of land and property rights in the frontier West, so too will the seemingly minor discovery out of an arcane branch of mathematics come to be the wire clippers which dismantle the barbed wire around intellectual property."
At a recent meeting, about 50 Cypherpunks packed into the Cygnus conference room, with dozens of others participating electronically from sites as distant as Cambridge, Mass., and San Diego. The meeting stretched for six hours, with discussions of hardware encryption schemes, methods to fight an electronic technique of identity forgery called "spoofing," the operation of "remailing" services, which allow people to post electronic messages anonymously -- and various ways to fight Clipper.
While the Cypherpunks came up with possible anti-Clipper slogans for posters and buttons, a bearded crypto activist in wire-rim glasses named John Gilmore was outside the conference room, showing the latest sheaf of cryptography-related Freedom of Information documents he'd dragged out of Government files. Unearthing and circulating the hidden crypto treasures of the National Security Agency is a passion of Gilmore, an early employee of Sun Microsystems who left the company a multimillionaire. The Government once threatened to charge him with a felony for copying some unclassified-and-later-reclassified N.S.A. documents from a university library. After the story hit the newspapers, the Government once again declassified the documents.
"This country was founded as an open society, and we still have the remnants of that society," Gilmore says. "Will crypto tend to open it or close it? Our Government is building some of these tools for its own use, but they are unavailable -- we have paid for cryptographic breakthroughs but they're classified. I wish I could hire 10 guys -- cryptographers, librarians -- to try to pry cryptography out of the dark ages."
Perhaps the most admired Cypherpunk is someone who says he is ineligible because he often wears a suit. He is Philip R. Zimmermann, a 40-year-old software engineer and cryptographic consultant from Boulder, Colo., who in 1991 cobbled together a cryptography program for computer data and electronic mail. "PGP," he called it, meaning Pretty Good Privacy, and he decided to give it away. Anticipating the Cypherpunk credo, Zimmermann hoped that the appearance of free cryptography would guarantee its continued use after a possible Government ban. One of the first people receiving the program placed it on a computer attached to the Internet and within days thousands of people had PGP. Now the program has been through several updates and is becoming sort of a people's standard for public key cryptography. So far, it appears that no one has been able to crack information encoded with PGP.
Like Diffie, Zimmermann developed a boyhood interest in crypto. "When I was a kid growing up in Miami, it was just kind of cool -- secret messages and all," he says. Later, "computers made it possible to do ciphers in a practical manner." He was fascinated to hear of public key cryptography and during the mid-1980's he began experimenting with a system that would work on personal computers. With the help of some colleagues, he finally devised a strong system, albeit one that used some patented material from RSA Data Security. And then he heard about the Senate bill that proposed to limit a citizen's right to use strong encryption by requiring manufacturers to include back doors in their products. Zimmermann, formerly a nuclear freeze activist, felt that one of the most valuable potential uses of cryptography was to keep messages secret from the Government.
Zimmermann has put some political content into the documentation for his program: "If privacy is outlawed, only outlaws will have privacy. Intelligence agencies have access to good cryptographic technology. So do the big arms and drug traffickers. So do defense contractors, oil companies, and other corporate giants. But ordinary people and grassroots political organizations mostly have not had access to affordable 'military grade' public-key cryptographic technology. Until now."
He has been told that Burmese freedom fighters learn PGP in jungle training camps on portable computers, using it to keep documents hidden from their oppressive Government. But his favorite letter comes from a person in Latvia, who informed him that his program was a favorite among one-time refuseniks in that former Soviet republic. "Let it never be," wrote his correspondent, "but if dictatorship takes over Russia, your PGP is widespread from Baltic to Far East now and will help democratic people if necessary."
Early last year, Zimmermann received a visit from two United States Customs Service agents. They wanted to know how it was that the strong encryption program PGP had found its way overseas with no export license. In the fall, he learned from his lawyer that he was a target of a grand jury investigation in San Jose, Calif. But even if the Feds should try to prosecute, they are likely to face a tough legal issue: Can it be a crime, in the process of legally distributing information in this country, to place it on an Internet computer site that is incidentally accessible to network users in other countries? There may well be a First Amendment issue here: Americans prize the right to circulate ideas, including those on software disks.
John Gilmore has discovered that Government lawyers have their own doubts about these issues. In some documents he sued to get, there are mid-1980's warnings by the Justice Department that the export controls on cryptography presented "sensitive constitutional issues." In one letter, an assistant attorney general warns that "the regulatory scheme extends too broadly into an area of protected First Amendment speech."
Perhaps taking Phil Zimmermann to court would not be the Government's best method for keeping the genie in the bottle.
The Clipper program has already begun. About once a month, four couriers with security clearances travel from Washington to the Torrance, Calif., headquarters of Mykotronx, which holds the contract to make Clipper chips. They travel in pairs, two from each escrow agency: the National Institute of Standards and Technology and the Treasury Department. The redundancy is a requirement of a protocol known as Two-Person Integrity, used in situations like nuclear missile launches, where the stakes are too high to rely on one person.
The couriers wait while a Sun work station performs the calculations to generate the digital cryptographic keys that will be imprinted in the Clipper chips. Then it splits the keys into two pieces, separate number chains, and writes them on two floppy disks, each holding lists of "key splits." To reconstruct the keys imprinted on the chip, and thereby decode private conversations, you would need both sets of disks.
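A simple two-of-two split shows why both sets of disks are needed: each share by itself is statistically random, and only the combination reconstructs the key. Whether the escrow program uses exactly this XOR construction is not spelled out here, so read it as a sketch of the general technique.

    # Two-of-two key splitting: neither share alone reveals anything about the
    # key; XOR-ing both shares recovers it. (Sketch of the general technique,
    # not necessarily the escrow program's exact construction.)
    import secrets

    def split_key(key: bytes) -> tuple[bytes, bytes]:
        share1 = secrets.token_bytes(len(key))                # random pad, to the first escrow agent
        share2 = bytes(a ^ b for a, b in zip(key, share1))    # key XOR pad, to the second escrow agent
        return share1, share2

    def recombine(share1: bytes, share2: bytes) -> bytes:
        return bytes(a ^ b for a, b in zip(share1, share2))

    unit_key = secrets.token_bytes(10)            # e.g. an 80-bit chip key
    s1, s2 = split_key(unit_key)
    assert recombine(s1, s2) == unit_key          # both disks are required to rebuild the key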
After being backed up, the sets of disks are separated, each one going with a pair of couriers. When the couriers return to their respective agencies, each set of disks is placed in a double-walled safe. The backup copies are placed in similar safes. There they wait, two stacks of floppy disks that grow each month, now holding about 20,000 key splits, the so-called back doors.
Will this number grow into the millions as the Government hopes? Ultimately the answer lies with the American public. Administration officials are confident that when the public contemplates scenarios like the Fortress in the Bronx or the Mushroom Cloud in Lower Manhattan, it will realize that allowing the Government to hold the keys is a relatively painless price to pay for safety and national security. They believe the public will eventually accept it in the same way it now views limited legal wiretapping. But so far the Administration hasn't recruited many prominent supporters. The main one is Dorothy Denning, a crypto expert who heads the computer science department at Georgetown University.
Since endorsing Clipper (and advocating passage of the Digital Telephony initiative) Denning has been savagely attacked on the computer nets. Some of the language would wither a professional wrestler. "I've seen horrible things written about me," Denning says with a nervous smile. "I try to actually now avoid looking at them, because that's not what's important to me. What's important is that we end up doing the right thing with this. It was an accumulation of factors that led me to agree with Clipper, and the two most important areas, to me, are organized crime and terrorism. I was exposed to cases where wiretaps had actually stopped crimes in the making, and I started thinking, 'If they didn't have this tool, some of these things might have happened.' You know, I hate to use the word responsibility, but I actually feel some sense of responsibility to at least state my position to the extent so that people will understand it."
The opponents of Clipper are confident that the marketplace will vote against it. "The idea that the Government holds the keys to all our locks, before anyone has even been accused of committing a crime, doesn't parse with the public," says Jerry Berman, executive director of the Electronic Frontier Foundation. "It's not America."
Senator Leahy hints that Congress might not stand for the Clinton Administration's attempt to construct the key-escrow system, at an estimated cost of $14 million initially and $16 million annually. "If the Administration wants the money to set up and run the key-escrow facilities," he says, "it will need Congressional approval." Despite claims by the National Institute of Standards and Technology deputy director, Raymond G. Kammer, that some foreign governments have shown interest in the scheme, Leahy seems to agree with most American telecommunications and computer manufacturers that Clipper and subsequent escrow schemes will find no favor in the vast international marketplace, turning the United States into a cryptographic island and crippling important industries.
Leahy is also concerned about the Administration's haste. "The Administration is rushing to implement the Clipper chip program without thinking through crucial details," he says. Indeed, although the Government has been buying and using Clipper encryption devices, the process of actually getting the keys out of escrow and using them to decipher scrambled conversations has never been field tested. And there exists only a single uncompleted prototype of the device intended to do the deciphering.
Leahy is also among those who worry that, all policy issues aside, the Government's key escrow scheme might fail solely on technical issues. The Clipper and Capstone chips, while powerful enough to use on today's equipment, have not been engineered for the high speeds of the coming information highway; updates will be required. Even more serious are the potential design flaws in the unproved key-escrow scheme. Matthew Blaze's discovery that wrongdoers could foil wiretappers may be only the first indication that Clipper is unable to do the job for which it was designed. In his paper revealing the glitch, he writes, "It is not clear that it is possible to construct EES (Escrowed Encryption Standard) that is both completely invulnerable to all kinds of exploitation as well as generally useful."
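Blaze's paper pins the glitch on the LEAF's short checksum (16 bits, by his account), which means a rogue device can generate random LEAF-like fields until one happens to pass the receiving equipment's check, roughly 65,000 tries on average, leaving wiretappers with a field that contains nothing useful. The checksum construction itself is classified; the sketch below models only its length.

    # Why a 16-bit check is forgeable: a random candidate passes with
    # probability 1/65,536, so brute force succeeds quickly. The real LEAF
    # checksum construction is classified; this models only its short length.
    import secrets

    def passes_check(candidate: bytes) -> bool:
        """Stand-in for the receiving chip's LEAF validity test (1 in 2**16 accepted)."""
        return candidate[0] == 0 and candidate[1] == 0

    tries = 0
    while True:
        tries += 1
        bogus_leaf = secrets.token_bytes(16)      # random bytes, no usable escrow data
        if passes_check(bogus_leaf):
            break
    print(f"bogus LEAF accepted after {tries} tries")   # about 65,536 on average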
At bottom, many opponents of Clipper do not trust the Government. They are unimpressed by the elaborate key-escrow security arrangements outlined for Clipper. Instead, they ask questions about the process by which the Clipper was devised -- how is it that the N.S.A., an intelligence agency whose mission does not ordinarily include consumer electronics design, has suddenly seized a central role in creating a national information matrix? They also complain that the Skipjack cryptographic algorithm is a classified secret, one that cryptographic professionals cannot subject to the rigorous, extended testing that has previously been used to gain universal trust for such a standard.
"You don't want to buy a set of car keys from a guy who specializes in stealing cars," says Marc Rotenberg, director of the Electronic Privacy Information Center. "The N.S.A.'s specialty is the ability to break codes, and they are saying, 'Here, take our keys, we promise you they'll work.' "
At the March conference on computers and privacy, Stewart Baker responded to this sort of criticism. "This is the revenge of people who couldn't go to Woodstock because they had too much trig homework," he said, evoking some catcalls. "It's a kind of romanticism about privacy. The problem with it is that the beneficiaries of that sort of romanticism are going to be predators. PGP, they say, is out there to protect freedom fighters in Latvia. But the fact is, the only use that has come to the attention of law enforcement agencies is a guy who was using PGP so the police could not tell what little boys he had seduced over the net. Now that's what people will use this for -- it's not the only thing people will use it for, but they will use it for that -- and by insisting on having a claim to privacy that is beyond social regulation, we are creating a world in which people like that will flourish and be able to do more than they can do today."
Even if Clipper flops, the Crypto War will continue. The Administration remains committed to limiting the spread of strong cryptography unless there's a back door. Recently, it has taken to asking opponents for alternatives to Clipper. One suggestion it will not embrace is inaction. "Deciding that the genie is out of the bottle and throwing our arms up is not where we're at," says a White House official.
The National Security Agency will certainly not go away. "The agency is really worried about its screens going blank" due to unbreakable encryption, says Lance J. Hoffman, a professor of computer science at George Washington University. "When that happens, the N.S.A. -- said to be the largest employer in Maryland -- goes belly-up. A way to prevent this is to expand its mission and to become, effectively, the one-stop shop for encryption for Government and those that do business with the Government."
Sure enough, the security agency is cooking up an entire product line of new key-escrow chips. At Fort Meade, it has already created a high-speed version of the Skipjack algorithm that outperforms both Clipper and Capstone. There is also another, more powerful, encryption device in the works named Baton. As far as the agency is concerned, these developments are no more than common sense. "To say that N.S.A. shouldn't be involved in this issue is to say that Government should try to solve this difficult technical and social problem with both hands tied behind its back," Stewart Baker says.
But Phil Zimmermann and the Cypherpunks aren't going away, either. Zimmermann is, among other things, soliciting funds for a PGP phone that will allow users the same sort of voice encryption provided by the Clipper chip. The difference, of course, is that in his phone there is no key escrow, no back door. If the F.B.I. initiated a wiretap on someone using Zimmermann's proposed phone, all the investigators would hear is static that they could never restore to orderly language.
What if that static shielded the murderous plans of a terrorist or kidnapper? Phil Zimmermann would feel terrible. Ultimately he has no answer. "I am worried about what might happen if unlimited security communications come about," he admits. "But I also think there are tremendous benefits. Some bad things would happen, but the trade-off would be worth it. You have to look at the big picture."
Steven Levy is the author of "Hackers: Heroes of the Computer Revolution" and a columnist for Macworld Magazine.