Encryption technology comes in two basic flavors: weak encryption and strong encryption. Weak encryption systems provide a degree of protection but can be cracked with enough effort. Strong encryption systems are, by their very nature, virtually uncrackable. In the 1990s, strong encryption became the focus of intense political debate.
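To see what "virtually uncrackable" means in practice, consider a brute-force attack that simply tries every possible key. The short Python sketch below estimates how long an exhaustive search would take for a 40-bit key versus a 128-bit key; the attack speed of one billion guesses per second is an assumption chosen purely for illustration.

```python
# Rough brute-force estimate: how long to try every possible key?
GUESSES_PER_SECOND = 1_000_000_000      # assumed attack speed (illustration only)
SECONDS_PER_YEAR = 60 * 60 * 24 * 365

for key_bits in (40, 128):
    keyspace = 2 ** key_bits                     # total number of possible keys
    seconds = keyspace / GUESSES_PER_SECOND      # worst-case exhaustive search
    years = seconds / SECONDS_PER_YEAR
    print(f"{key_bits}-bit key: about {seconds:.3g} seconds (~{years:.3g} years)")
```

At that rate a 40-bit key falls in under twenty minutes, while a 128-bit key would take on the order of 10^22 years, vastly longer than the age of the universe. That gulf is the practical difference between weak and strong encryption.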
Just as passwords can be encrypted, data files, email, and web traffic can also be encrypted.[3] As access to strong encryption technologies, such as Phil Zimmermann’s PGP (Pretty Good Privacy) software, increased throughout the 1990s, government agencies such as the National Security Agency (NSA) and the Federal Bureau of Investigation (FBI) became concerned that strong encryption would hamper their ability to gather information on suspected illegal activities. How, they asked, would the government be able to prevent terrorist bombings or drug smuggling if criminals could communicate with one another in codes that are unbreakable? Civil libertarians responded that the Constitution guarantees citizens a right to privacy in their personal affairs.
In 1994, the U.S. government tried to address both concerns with the “Clipper” chip. Clipper was to be a strong encryption device with a catch: while it would normally encrypt messages securely, each Clipper device would also have a special “master key” that would allow intercepted messages to be decoded. The government proposed that each master key be divided into two parts, with each half held by a different branch of government. According to the Clipper proposal, only under a court order could the two halves of a key be reunited, allowing the government to decrypt the encoded messages of suspected wrongdoers.
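The split-key idea itself is straightforward. The Python sketch below illustrates one common way to divide a key into two shares, by XOR-ing it with a random value, so that neither share alone reveals anything about the key but the two together reconstruct it exactly. This is a simplified illustration of the escrow concept, not a description of the actual Clipper mechanism, whose details (built around the classified Skipjack cipher and a "Law Enforcement Access Field") were more involved; the escrow-agent names here are hypothetical.

```python
import secrets

def split_key(master_key: bytes) -> tuple[bytes, bytes]:
    """Split a key into two shares; neither share alone reveals the key."""
    share_a = secrets.token_bytes(len(master_key))                # random pad
    share_b = bytes(x ^ y for x, y in zip(share_a, master_key))   # pad XOR key
    return share_a, share_b

def recombine(share_a: bytes, share_b: bytes) -> bytes:
    """XOR the two escrowed shares together to recover the original key."""
    return bytes(x ^ y for x, y in zip(share_a, share_b))

# Hypothetical 80-bit master key (Clipper's Skipjack cipher used 80-bit keys).
master_key = secrets.token_bytes(10)
half_for_escrow_agent_1, half_for_escrow_agent_2 = split_key(master_key)

# Only when a court order brings the two halves back together can the
# key be reconstructed and intercepted messages decrypted.
assert recombine(half_for_escrow_agent_1, half_for_escrow_agent_2) == master_key
```

The point of splitting the key is that no single escrow agent, acting alone, can decrypt anyone's traffic; decryption requires both halves and, under the proposal, a court order.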
Even with the added protection of dividing the master keys, many people felt uncomfortable with the government holding keys that would enable it to decode private communications. Other objections to Clipper arose from the fact that the encryption algorithm used by the system was never made public. In other words, computer scientists outside the NSA, where Clipper was developed, could not verify that its encryption really was as secure as the government claimed.
The original 1994 Clipper proposal did not prove popular and was not adopted. In late 1996, the Clinton Administration proposed a follow-on to Clipper that would have placed more control in the hands of industry. The proposal would have allowed companies to develop strong encryption technology for both domestic and foreign use, as long as they maintained backup access keys and were willing to turn those keys over to the government under court order. Like the original Clipper proposal, this effort failed to attract popular support.
One reason for the failure of Clipper and related proposals was that strong encryption techniques without “government-approved” back doors already existed and were fairly widespread by the mid-1990s. Why, opponents asked, would anyone use a product that allowed the government to listen in (under certain circumstances) when completely secure alternatives were already available?
The encryption debate was complicated by the fact that the U.S. government has historically considered encryption technology (both encryption software and hardware) to be munitions (i.e., weapons). Though restrictions have eased in recent years, in the past it was illegal to export products that included strong encryption technology. Many groups pointed out the futility of such export controls, since encryption algorithms had been widely published in the scientific literature throughout the world and strong encryption programs were freely available, at no cost, over the Internet. They argued that the only result of export restrictions on strong encryption technology was to place U.S. firms at a competitive disadvantage in foreign markets.
By the turn of the century, the encryption debate appeared to be over, with industry and government agreeing that the advantages of strong encryption outweighed its negative aspects. Efforts to place restrictions on domestic use of strong encryption were abandoned, and export controls on encryption technology were eased.
Immediately following the September 2001 terrorist attacks on the Pentagon and World Trade Center, however, there were renewed calls by some in Congress to reopen the encryption debate. While these efforts failed to gain widespread support, many civil libertarians continue to worry that pressure will eventually build to outlaw the use of strong encryption technology by private citizens.
Whether or not our elected representatives and the courts eventually decide that the negative aspects of unbreakable encryption outweigh our right to privacy, we should recognize that a law against strong encryption would be a fundamentally new type of law. The legality of a communication would be based on the form of the message (the code it was written in), not the content of the message (what it actually says). Hence, an encrypted diary or series of letters to your fiancée might be deemed just as illegal as an encrypted plan to blow up the White House.
If strong encryption for domestic use is ever made illegal, what would that say about our fundamental right to privacy? If strong encryption is not outlawed, how will law enforcement agencies protect the citizenry from terrorists and other unsavory individuals?
Many computer scientists contend that the strong encryption genie has already been released and is so widespread that governments must simply learn to live with it. Only time will tell what the ultimate effects of strong encryption on law enforcement will be.
Footnotes
[3] For example, programs such as “Hotspot Shield” use encryption and “redirection servers” to let you securely surf the web over public Wi-Fi. Free version available at http://download.cnet.com/hotspot-shield/