In California, where I live, it takes the signatures of just less than 700,000 voters to get an amendment to our state constitution on the ballot and a simple majority vote will get it adopted. Oddly enough, this process of amendment by initiative has saddled us with a constitutional requirement that practically all local tax increases be approved by a two-thirds vote--what's known as a "super majority"--via a series of amendments that were themselves voted in by a simple majority.
That's a little too undemocratic for me, so I'm trying to gather support for another amendment called "The Fairer Voting Act" that would reduce the super majority vote to three-fifths of votes cast. (It reduces the super majority instead of abolishing it, because I'm afraid the California Supreme Court would rule that eliminating the super majority was a "revision" rather than an amendment, and throw it out. California constitutional law can be extremely complex--and often counter-intuitive.)
In doing the research that led me to write The Fairer Voting Act, I've spent a lot of time studying the California Constitution. I disapprove of many provisions of that document, but I'm extremely glad that the framers had the good sense to spell out the right to privacy explicitly in Article I.
Most Americans mistakenly believe that the U.S. Constitution includes a specific guarantee of the right of personal privacy. It doesn't. The Supreme Court has ruled that a combination of the Fourth Amendment (which safeguards against "unreasonable searches and seizures"), the Fifth Amendment (which protects individuals from being forced to testify against themselves in criminal cases) and the Ninth Amendment (which states that "The enumeration in the Constitution, of certain rights, shall not be construed to deny or disparage others retained by the people") creates an implied right to privacy, but that right isn't formally established anywhere in the Constitution.
That's why police and intelligence agencies can tap telephones, open mail and generally spy on people. It's also why paparazzi can hire helicopters to buzz celebrity weddings and why tabloid TV shows and "legitimate" TV newsmagazines alike can get away with using hidden cameras and ambush interviews.
It's also a big part of the reason digital encryption has become such a hot issue.
History Is Made at Night
Cryptography--which combines the Greek words KRYPTOS and GRAPHOS and literally means "secret writing"--has played a part in human commerce for thousands of years. Some 3,500 years ago, Mesopotamian scribes used basic ciphers to guard their secret pottery-glaze formulas. Not surprisingly, the military has used encryption since history began. In his battlefield messages to the Roman Senate, for instance, Julius Caesar regularly used simple letter-substitution ciphers and other basic cryptographic techniques. Like their military counterparts, diplomats and spies have also always used "secret writing". In the 1790s, during the time he served as George Washington's Secretary of State, Thomas Jefferson invented a wheel cipher machine and used it to encrypt his correspondence with the American ambassador to France--a country whose postal inspectors regularly steamed open foreigners' mail. 150 years later, during WWII, the U.S. Navy fielded the M-138-A, which was basically a variation on Jefferson's device.
In 1926, the German Navy began purchasing an electro-mechanical encryption device manufactured by one Arthur Scherbius. This famous machine, based on a 1919 patent by Dutch inventor Hugo Alexander Koch, was known as Enigma. It used a set of three rotors to apply a progressive letter-substitution algorithm. The German Navy modified Scherbius' Enigma by adding a plugboard, which greatly increased the number of possible letter substitutions--and thus the strength of its ciphers. Enigma was used to encrypt and decrypt virtually all German military communications throughout WWII. The Allied effort to break both Enigma and Purple--its Japanese counterpart--sparked the era of computer-assisted cryptanalysis. At England's Bletchley Park, Alan Turing and his colleagues used an increasingly sophisticated set of electro-mechanical computers they called "bombes" to test solutions to Enigma, while their counterparts at the American Office of Special Intelligence began developing purely electronic decryption engines to assist in cracking the 4-, 5- and 6-rotor Enigmas which came into use later in the war.
Then, in 1947, the VENONA project of the U.S. Army's Signal Intelligence Service--a forerunner of the National Security Agency (NSA)--decrypted messages which proved that traitors working on the Manhattan Project had passed the secret of the atomic bomb to the Soviet Union. Based in part on VENONA's revelations, the United States Senate then approved the Export Control Act of 1949, allowing the State Department to classify all but the very weakest cryptographic technology and products as munitions subject to stringent export restrictions--a policy which continues in force to this day.
The U.S. is far from alone in regulating cryptographic exports. While Russia has gone from an absolute ban on the domestic use and export of crypto under the former Soviet Union to a seeming lack of any controls today, France and Australia both ban the export and use of products which make use of encryption strengths greater than 40 bits--a key strength that a moderately powerful microcomputer can easily defeat. Britain's Official Secrets Act forbids even revealing the strength of encryption which Britons may export. Meanwhile, China--while it has no published laws on the subject--exacts the death penalty for the export of either cryptographic technology or products.
Stop Making Sense
In 1975, under contract with the U.S. government, IBM developed an encryption technology which came to be known as the Data Encryption Standard (DES). DES was adopted as Federal Information Processing Standard (FIPS) PUB-46 only after its key length was reduced from the original 128 bits to 56 bits at the NSA's insistence. Despite 20 years of progress in digital cryptography, DES--which is a symmetric encryption scheme--has remained the U.S. standard ever since. The problem with symmetric-key encryption is that every key holder has a copy of the same encryption key. Although shared-secret schemes permit faster encryption and decryption, the fact that all copies of the shared key are identical means that, if any one copy is compromised, every other copy is automatically compromised as well. And--since all parties to the secret share the same key--no digital signature to an encrypted transmission is possible. That means a message's author can't be definitively established, since any of the common keyholders could have encrypted the message. Likewise, a keyholder can't definitively repudiate authorship of any message encrypted with the shared key. And that means a third party could forge what appears to be an authentic message from one of the authorized keyholders, especially if none of those "official" keyholders knows their shared key has been compromised.
Those problems--and especially the problems of digital signing, authentication and repudiation--required a completely different approach, one that took five years to appear.
In April, 1980, Martin Hellman, Whitfield Diffie, and Ralph Merkle were issued U.S. Patent 4,200,770 and in August, 1980, Hellman and Merkle were issued Patent 4,218,582. Those patents, both of which expired--and thus entered the public domain--in 1997, form the basis of asymmetric or (as it is better-known) public-key digital cryptography. Patent 4,405,829, issued in 1983 to Ronald Rivest, Adi Shamir, and Leonard Adleman, outlines the more widely-known RSA public-key algorithm, whose security rests on the difficulty of factoring very large numbers that are the product of two primes--while the Diffie-Hellman model rests on the related difficulty of computing discrete logarithms.
In the public-key model, every user has two keys. The first is a private key, which is never revealed. The second is a public key, which is made freely available to anyone with whom the keyholder wishes to exchange encrypted messages. Data encrypted with a user's public key can be decrypted only with the matching private key, so a message intended for a particular recipient is safe from outsiders unless the encryption is broken or that recipient's private key is revealed. Data destined for other parties is thus encrypted using both the author's private key (to sign it) and the recipients' public keys. Those parties whose public keys were used to encrypt the data can then use their respective private keys to decrypt it and use the author's public key to establish its authenticity.
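The mechanics of the public-key model can be sketched with the standard textbook RSA example. The primes below are absurdly small--real keys use primes hundreds of digits long--so this is purely an illustration, not a usable cipher:

```python
# Toy RSA, illustrating the public-key model described above.
p, q = 61, 53              # two secret primes (toy-sized!)
n = p * q                  # public modulus (3233)
phi = (p - 1) * (q - 1)    # 3120
e = 17                     # public exponent, coprime to phi
d = pow(e, -1, phi)        # private exponent: e*d = 1 (mod phi)

public_key = (n, e)        # published freely
private_key = (n, d)       # never revealed

message = 65               # a message, encoded as a number < n

# Anyone can encrypt with the recipient's public key...
ciphertext = pow(message, e, n)

# ...but only the private-key holder can decrypt.
decrypted = pow(ciphertext, d, n)
assert decrypted == message

# Signing works in reverse: "encrypt" with the private key,
# and anyone can verify using the public key.
signature = pow(message, d, n)
assert pow(signature, e, n) == message
```

Note that the same pair of operations provides both secrecy (public key in, private key out) and authenticity (private key in, public key out)--which is exactly the symmetry symmetric-key schemes lack.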
While public-key encryption and decryption are slower than symmetric-key encryption (the underlying mathematics is far more computationally expensive), the technology has capabilities which make it a superior choice for use in online commerce. Public-key cryptography offers advantages such as digital signatures (where the author's private key is used to produce a unique signature block based on a checksum of the document's original contents), authentication (since no one else has access to an author's private key, no one else could have signed a document which carries his/her signature) and the ability to repudiate forged documents (because a public key cannot be used to forge a digital signature). Public-key users can also "countersign" each other's public keys, thereby vouching for each other.
Today, public-key technology is usually used for authentication, signing and repudiation, as well as to encrypt the keys of symmetric encryption algorithms (such as IDEA, DES or RC4) that are used for bulk data encryption. That dual approach allows the safety and authentication capabilities of public-key encryption for the key-exchange process to be combined with the speed of symmetric-key algorithms for encrypting and decrypting the data in message bodies.
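That hybrid approach can be sketched in a few lines. Here the textbook toy RSA key pair (3233/17/2753) wraps a session key, and a simple hash-based XOR keystream stands in for a real bulk cipher like DES or IDEA--both substitutions are assumptions for illustration only:

```python
import hashlib

n, e, d = 3233, 17, 2753   # textbook toy RSA key pair

def keystream_xor(key: bytes, data: bytes) -> bytes:
    # Toy stream cipher: XOR data with a hash-derived keystream.
    # Stands in for a real symmetric cipher such as DES or IDEA.
    out = bytearray()
    counter = 0
    while len(out) < len(data):
        out += hashlib.sha1(key + counter.to_bytes(4, "big")).digest()
        counter += 1
    return bytes(b ^ k for b, k in zip(data, out))

# Sender: pick a session key and wrap it with the slow public-key step.
session_key = 42                         # would be random in practice
wrapped_key = pow(session_key, e, n)     # recipient's public key
body = keystream_xor(session_key.to_bytes(2, "big"), b"Attack at dawn")

# Recipient: unwrap the session key, then do the fast bulk decryption.
recovered = pow(wrapped_key, d, n)
plaintext = keystream_xor(recovered.to_bytes(2, "big"), body)
assert plaintext == b"Attack at dawn"
```

Only the tiny session key pays the public-key performance penalty; the message body, however long, gets the fast symmetric treatment.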
The Secret of My Success
The use of public-key cryptography for digital signatures, authentication and repudiation requires some mechanism to ensure that any given user's public key is, itself, authentic, of course. When crypto users are all known to each other, they can simply authenticate each other's public keys by "countersigning" them. A global public-key environment, however, requires a more secure and less cumbersome solution. The need for trusted third parties to vouch for the authenticity of keys whose holders are not personally known to each other has therefore given rise to commercial Certificate Authorities, such as Verisign, Inc.
When a user submits his/her public key to such a Certificate Authority, it countersigns that user's public key with its own private key. The Certificate Authority then makes both the user's countersigned public key and the Authority's own public key available to the general public via a public, secure HTTP or FTP server. Other users can then download the Certificate Authority's public key and use it to verify keys belonging to persons or organizations whom they do not personally know, but which have been countersigned by the Authority.
Of course, most encryption products require their users to enter a password (or, more often, a "passphrase") and users forget their passwords all the time. Likewise, having to type in a password every time a user encrypts or decrypts an email message or other document discourages the use of crypto. A search for solutions to these problems has led to the development of password tokens.
Token-based encryption schemes simply substitute the contents of a token for the password that interactive encryption systems require. Crypto tokens are either generated automatically by a hardware or software product, or they are issued by a trusted third party. Tokens usually reside on a local disk, which makes them at least theoretically portable. Tokens may be used in either public-key or symmetric-key encryption, and they are the basis of most so-called "key escrow" or "key recovery" proposals.
In 1992, the Clinton Administration began a campaign to require all U.S.-made hardware-based crypto products to use the so-called Clipper encryption chip. It also began pushing Congress to enact laws that would mandate implementation of a national "key escrow" repository. Had it been implemented, that policy would have required all users of crypto to submit copies of their private or shared keys and the passwords to those keys to a State Department-controlled archive. The Clinton administration maintained that the widespread use of strong cryptography threatens both the ability of domestic law enforcement agencies to use wiretaps and investigate crimes involving computers and the ability of intelligence organizations to conduct espionage against foreign powers and terrorist organizations. The key escrow proposal generated so little support and so much opposition that it eventually was dropped. Instead, the Administration repackaged its proposed "key escrow" policy into a marginally less heavy-handed proposition it called "key recovery".
Key recovery would mandate that American companies develop and implement either third-party key escrow repositories or partial-key escrow schemes. In the partial-key escrow paradigm, three or more different entities would hold separate parts of a user's private key and password. In order to generate or "recover" the full private key and password, at least two of these parties would have to cooperate. If one was the user's employer, the second was the State Department and the third was a bank or a Certificate Authority, the government could subpoena either the employer's or the third party's portion of the key and password and "recover" the user's key. The Clinton administration's sales pitch for this strategy was that, if an employee was unavailable or unwilling to supply his or her private key and/or password, his/her employer could use the key recovery infrastructure to access data encrypted by that employee. What was mostly left unsaid was that the spies or the police could do the same thing without the employer's knowledge--although they'd need a warrant to do so.
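The "any two of three parties can recover the key" arrangement is a threshold secret-sharing scheme. A minimal sketch, using Shamir-style sharing over a small prime field (the prime, the "key" and the fixed random coefficient here are all toy assumptions):

```python
# 2-of-3 partial-key escrow via Shamir secret sharing.
P = 2**31 - 1          # a Mersenne prime, our field modulus
secret = 123456789     # the user's private key, encoded as a number
slope = 987654321      # random coefficient (fixed here for clarity)

# f(x) = secret + slope*x (mod P); each escrow agent holds one point.
shares = {x: (secret + slope * x) % P for x in (1, 2, 3)}

def recover(x1, y1, x2, y2):
    # Lagrange interpolation at x = 0 from any two shares.
    inv = pow(x2 - x1, -1, P)
    return ((y1 * x2 - y2 * x1) % P) * inv % P

# The employer (share 1) plus, say, the State Department (share 2)
# can reconstruct the key; any single share alone reveals nothing,
# since every possible secret is consistent with one point.
assert recover(1, shares[1], 2, shares[2]) == secret
assert recover(2, shares[2], 3, shares[3]) == secret
```

The same construction generalizes to "any k of n" by raising the polynomial's degree--which is why partial-key escrow proposals could tune how many subpoenas a recovery would take.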
Or, at least that was the theory.
The Man Who Knew Too Much
1991's Senate Bill 266--which never became law--would have mandated that all encryption products include "back door" access capabilities for law enforcement and national security agencies. Responding to that threat to individual privacy, Phil Zimmermann, a software engineer whose hobby was cryptography, developed and released a freeware program he called Pretty Good Privacy (PGP). PGP combined RSA algorithm-based public-key crypto with strong symmetrical-key encryption in one unfriendly, inconvenient-to-use package. One of Zimmermann's friends uploaded a copy of PGP to an anonymous FTP site on the Internet. In less than a week, PGP had spread across the planet and inside of a few months, despite its user-unfriendly interface, it was the de-facto Internet standard for encrypting email.
In 1993, U.S. Customs agents began investigating Zimmermann for theft of intellectual property. He was suspected of stealing the patent-protected RSA algorithms which he had incorporated into PGP. (The U.S. patent on the RSA public-key encryption algorithm will expire in the year 2000. In many other countries it was never patented and is effectively in the public domain. In a 1996 interview with me, Zimmermann was adamant that he developed the mathematical basis for PGP independently.) Customs spent the next three years pursuing their investigation, only dropping it in January, 1996.
Beginning in mid-1995, the business community discovered the Internet and, starting in 1996, it began to embrace the concept of online commerce. In short order, the hackers who were concerned with export restrictions on encryption strength were joined, first by software vendors and then by the larger business community. Successful online commerce applications were going to need a guarantee of data confidentiality which only strong crypto could offer. Unfortunately, the U.S. government was still firmly opposed to allowing strong cryptography to be used in products for the international market.
The major point of contention in the debate was then and still remains the issue of key lengths. As a rule of thumb, the strength of any given encryption scheme doubles with each additional bit of key length. Thus, the 40-bit maximum key length the then-current U.S. policy forced Netscape to use in the export version of its Navigator browser was so weak that, on August 16, 1995, French graduate student Damien Doligez announced he had broken it in a mere 8 days by using 120 networked Unix machines--including two supercomputers--to sequentially test each key.
Doligez found the correct key about halfway through the list of possibilities. If he had been forced to try literally every key with the computing power available to him, the process would have taken no more than about 16 days. However, had he tackled Navigator's 128-bit domestic version with the same computing power, testing half the possible keys would have taken on the order of 10^24 years--septillions of years. (For perspective, the Universe itself is only around 14 billion years old.)
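The arithmetic behind that comparison is straightforward: every added key bit doubles the brute-force work, so moving from 40 bits to 128 bits multiplies it by 2^88.

```python
# Brute-force scaling from a 40-bit key to a 128-bit key.
keys_40  = 2**40                   # about 1.1e12 possible keys
keys_128 = 2**128                  # about 3.4e38 possible keys

ratio = keys_128 // keys_40        # 2**88, roughly 3.1e26 times more work
assert ratio == 2**88

# Doligez searched half the 40-bit keyspace in 8 days; the same
# hardware against half the 128-bit keyspace:
days = 8 * ratio
years = days / 365.25
print(f"{years:.2e} years")        # on the order of 10**24 years
```

That exponential scaling is the entire policy argument in miniature: a 40-bit limit is trivially breakable, while a 128-bit key is beyond any conceivable computing budget.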
The government's position also cheerfully ignored the fact that, as early as June, 1996, Trusted Information Systems conducted a Worldwide Survey of Cryptographic Products which identified 1,262 crypto products from around the globe. 532 of those products were offered by non-U.S. companies from 28 different countries, including Russia, Poland, South Africa and Israel. Many of them offered considerably stronger encryption than the international versions of their U.S. competitors' products, which created a serious and growing "crypto gap" just as the international marketplace for cryptographic products was taking off.
Many industry observers are concerned that the public debate has thus far been couched mostly in terms of civil liberties vs. law enforcement and intelligence agencies. What has not been emphasized is the impact on business of America's outdated restrictions on crypto exports and of the Administration's opposition to reforming them.
In January, 1996 Qualcomm, Inc., for instance, tried to obtain State Department clearance to secure a Virtual Private Network (VPN) connection via the Internet between its San Diego, California headquarters and its Singapore branch office, using triple-DES encryption (three passes of the 56-bit DES cipher). Even though their Singapore operation was exclusively staffed by U.S. citizens, the State Department denied Qualcomm's request. Qualcomm's request would still be denied under current law.
Quite apart from being frustrated in its attempt to secure its Singapore connection, Qualcomm also has to worry about the impact of export restrictions on its ability to incorporate strong cryptography into its popular Code Division Multiple Access (CDMA) technology. CDMA is a spread spectrum technology for multiplexing large numbers of simultaneous calls onto cellular telephone carrier frequencies and it needs strong crypto both to prevent "cloning" of CDMA phones and to protect the privacy of customer calls. As it stands, Qualcomm must confine the use of strong crypto in CDMA phones to the U.S. marketplace.
Mr. Smith Goes to Washington
Until December 31, 1996, the export of all encryption technology and products was regulated by the U.S. State Department under the International Traffic in Arms Regulations (ITAR). Prior to 1997, the Clinton administration made exceptions for applications specific to the financial industry, but, for the most part, the State Department still refused to permit the export of anything stronger than 40-bit encryption.
In November, 1996, President Clinton amended his 1994 Executive Order 12924 regarding export controls on encryption products. The 1996 amendment transferred regulation and review authority for crypto exports from the State Department's Munitions List to the Commerce Department's Dual Use List. Approval of crypto exports was made subject to a one-time review and approval process, in place of the multi-stage, multi-agency review such exports had previously faced. The power to approve exports was split between an Export Administration Review Board and an Advisory Committee on Export Policy, whose members include representatives from the Commerce, Defense, Energy, Justice and State Departments and the Arms Control and Disarmament Agency. The power to break ties was reserved for the President--which, in effect, gave final approval to the President's National Security Adviser. The Executive Order was accompanied by a memorandum on encryption export policy which increased the allowable key strength of exportable crypto products from 40 bits to 56 bits.
Since encryption strength for any given algorithm is roughly 2 to the power of the key length, encryption using a 56-bit key is approximately 65536 times stronger than encryption employing the identical algorithm with a 40-bit key. That should have made the new rules highly attractive to the domestic computer industry. However, the amended Order 12924 contains several onerous conditions which make it less attractive to vendors seeking a one-size-fits-all solution to developing products for the global market.
First, EO 12924 made no provision for the export of existing 56-bit crypto products after December 31, 1998. That omission could leave vendors without any way to offer support, patches or inline upgrades to foreign clients who purchased 56-bit products and can't or won't "upgrade" to a version which supports key recovery. Second, vendors are forced to present a plan and timeline for implementation of key recovery, regardless of whether their customers are willing to accept it as a condition of future purchases and irrespective of foreign governments' resistance to key recovery schemes which would give the U.S. government access to the keys of foreign companies or individuals. Third, two-thirds of the members of the Export Administration Review Board are from defense, intelligence or law enforcement agencies, and all of those agencies have a vested interest in limiting the spread of strong crypto. Finally, in the event of any new legislation on encryption export, the Administration reserves the right to redesignate encryption products as munitions.
That's important because the 105th Congress may open the door to the virtually unrestricted export of strong crypto... or it may not. Senator Conrad Burns (R-Montana) introduced the Promotion of Commerce in the Online Digital Era (or "PRO-CODE") Senate Bill 1726 in the 104th Congress and reintroduced it in the 105th Congress after it died in committee at the end of 1996. Mike Rawson, Senator Burns' lead staffer on encryption issues, told me in late 1996, "We had the votes. The reason PRO-CODE was not reported out of committee was that the Administration assured folks on both sides of the aisle it was going to come forward with meaningful reform. From many members' perspective, not only was (the amendment to Executive Order 12924) not a meaningful reform, it was just a delay."
In February, 1997, Representative Bob Goodlatte (R-Virginia) introduced the Security and Freedom Through Encryption (SAFE) Act H.R. 695. If it had made it through the legislative minefield, SAFE would have modestly liberalized crypto export laws. It would have given the Secretary of State sole responsibility for export authorization and required that any crypto-capable product available to U.S. consumers be granted an export license. It would have forbidden government key escrow requirements. And, just incidentally, it would have added five years to the sentence of anyone caught using crypto "in furtherance of the commission of a criminal offense"--and ten years for a second offense.
Meanwhile, also in February, 1997, Burns' Democratic co-sponsor for the PRO-CODE bill, Senator Patrick Leahy (D-Vermont), introduced the Encrypted Communications Privacy Act of 1997 S.376. It was a virtual clone of PRO-CODE and it got as far as a July, 1997 hearing of the Judiciary Committee before it was put to sleep.
Also introduced in February, 1997 was Senator John McCain's (R-Arizona) Secure Public Networks Act S.909, which would have mandated creation of a "voluntary" key recovery infrastructure and provided penalties for unauthorized decryption, social engineering and other, similar transgressions against encrypted data. It confined itself strictly to domestic U.S. crypto issues and would have done a fine job of giving the police and spy communities pretty much everything the Clinton administration has been asking for. It made it as far as the Commerce Committee before going into limbo in June, 1997.
Then Representative James Sensenbrenner, Jr. (R-Wisconsin) introduced the Computer Security Enhancement Act of 1997 H.R.1903--which is much of a muchness with Senator McCain's Secure Public Networks Act. Predictably enough, it's the only one of the bunch to have actually made it through the House to be referred to the Senate, where Senator Burns has kept it bottled up in the Commerce Committee. If luck is with us, it will die there.
Senator Burns' ally, Senator John Ashcroft (R-Missouri) introduced the Encryption Protects the Rights of Individuals from Violation and Abuse in Cyberspace (E-PRIVACY) Act S.2067 on May 12, 1998. It's the best surviving hope for crypto reform. E-PRIVACY would permit any U.S. person at home or abroad to "use, develop, manufacture, sell, distribute, or import any encryption product, regardless of the encryption algorithm selected, encryption key length chosen, existence of key recovery or other plaintext access capability, or implementation or medium used." Ashcroft's bill would also flatly prohibit any key escrow or key recovery requirement. It's currently bottled up in the Senate Judiciary Committee and is unlikely to ever see the light of day.
In attempting to understand this dance of the legislative hippopotami, it's important to keep in mind that, as long as the Administration remains convinced that the priorities of the intelligence and police communities should supersede those of private citizens and the business community, any meaningful Congressional action will very likely face a Presidential veto. It's doubtful whether Congress can muster the two-thirds vote needed to override a veto, so even getting reform passed by a substantial majority may not be enough to solve the problem. It's also important to understand that a majority of Congressmen and Senators simply don't understand encryption and don't see the ramifications of the current Administration proposals. And it helps to remember that being seen as tough on crime is considered a big plus at election time, so the desires of law enforcement agencies carry a lot of weight with pols facing election.
You Can't Take It With You
Before it was snapped up by McAfee Associates (which, in its turn, merged with Network General to form Network Associates), Phil Zimmermann's company, PGP, Inc., chose to publish its own API, enabling third parties to integrate PGP encryption into their own products. PGP, Inc. also produced a PGP plug-in for Qualcomm's Eudora. However, since the McAfee-Network General merger, support for RSA-based encryption has been removed from the freeware version of PGP, producing howls of outrage from PGP loyalists, who must now go outside the U.S. or purchase the commercial version of PGP in order to continue to use their existing RSA-based public and private keys.
PGP's source code is published in the U.S. only in hard copy, in order to comply with government restrictions on source code distribution. Each new edition is promptly--and legally-- purchased by foreigners, exported in book form, then scanned, OCR'ed and republished in ASCII on myriad foreign servers within days of its release.
And therein lies one of the biggest flaws in the Administration's arguments in favor of restricting crypto exports. The genie can't be put back in the bottle--and it's been out and about ever since Phil Zimmermann released the original PGP in 1991. Even if there were never to be another release of PGP, there are thousands of bright programmers with an interest in digital cryptography who live and work outside the United States--and are thus not subject to its laws. Israel, Russia and Poland are all hotbeds of crypto technology and what they're producing is in no way inferior to--and much of it is actually superior to--the work being done here in the U.S.
As Ronald L. Rivest--the "R" in RSA--points out, encryption, per se, is not the only way to transmit information securely. Professor Rivest outlines a technique he calls "chaffing and winnowing", where valid information is hidden in a blizzard of plausible, but false, data--"chaff"--which the receiving party "winnows" out, leaving only the valid message. Rivest's technique--since it uses public-key encryption only for authentication and to exchange a key value which allows the receiver to distinguish "wheat" from "chaff"--would be fully exportable if implemented, because the use of encryption purely for authentication and key exchange is perfectly legal under existing Administration policy.
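A minimal sketch of chaffing and winnowing makes the point concrete. Each packet is a (serial number, bit, MAC) triple; "wheat" packets carry a valid MAC made with the shared authentication key, while "chaff" packets carry bogus MACs. Nothing is ever encrypted--the key is used only for authentication. (The packet format and the zero-byte chaff tags here are simplifying assumptions; Rivest's paper uses random chaff MACs.)

```python
import hmac, hashlib

KEY = b"shared-authentication-key"     # used only for authentication

def mac(serial: int, bit: int) -> bytes:
    return hmac.new(KEY, f"{serial}:{bit}".encode(), hashlib.sha1).digest()

def add_chaff(message_bits):
    # For each wheat packet, emit a chaff packet with the opposite
    # bit and a bogus MAC (a real sender would use random bytes).
    packets = []
    for serial, bit in enumerate(message_bits):
        packets.append((serial, bit, mac(serial, bit)))      # wheat
        packets.append((serial, 1 - bit, b"\x00" * 20))      # chaff
    return packets

def winnow(packets):
    # Keep only packets whose MAC verifies; discard the chaff.
    return [bit for serial, bit, tag in packets
            if hmac.compare_digest(tag, mac(serial, bit))]

sent = add_chaff([1, 0, 1, 1, 0])
assert winnow(sent) == [1, 0, 1, 1, 0]
```

An eavesdropper without the authentication key sees, for every bit position, both a 0 and a 1 with indistinguishable tags--so the traffic conveys nothing, even though no "encryption" has taken place.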
Then there's steganography--what Rivest calls "the art of hiding a secret message within a larger one in such a way that the adversary can not discern the presence or contents of the hidden message." Peter Wayner has written a great book on steganography, data compression and related technologies entitled "Disappearing Cryptography: Being and Nothingness on the Net" (AP Professional; ISBN: 0127386718). Steganography, like Rivest's chaffing and winnowing technique, is entirely legal under current law--and yet another example of why the Clinton Administration's attempt to control the spread of privacy technology is hopelessly doomed.
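The classic steganographic trick is to hide a message in the least-significant bits of innocuous cover data--in practice, the pixel bytes of an image, where a change of one in each byte is imperceptible. A bare-bones sketch (the cover data here is a stand-in; real tools worry about file formats and statistical detectability):

```python
def hide(cover: bytearray, secret: bytes) -> bytearray:
    # Spread the secret's bits, low bit first, across the cover's
    # least-significant bits.
    bits = [(byte >> i) & 1 for byte in secret for i in range(8)]
    assert len(bits) <= len(cover), "cover too small"
    out = bytearray(cover)
    for i, bit in enumerate(bits):
        out[i] = (out[i] & 0xFE) | bit     # overwrite only the low bit
    return out

def reveal(stego: bytearray, length: int) -> bytes:
    # Reassemble `length` bytes from the cover's low bits.
    secret = bytearray()
    for j in range(length):
        byte = 0
        for i in range(8):
            byte |= (stego[j * 8 + i] & 1) << i
        secret.append(byte)
    return bytes(secret)

cover = bytearray(range(200))              # stand-in for image data
stego = hide(cover, b"hi")
assert reveal(stego, 2) == b"hi"
```

Since no byte of the cover changes by more than one, the altered data looks, to a casual observer, exactly like the original--there is nothing encrypted for an export regulation to regulate.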
The argument that the spooks and the cops need to be able to tap digital conversations and break data encryption in order to protect us from international criminals and terrorists is equally bogus. To borrow a phrase from the NRA, "When crypto is outlawed, only outlaws will use crypto." And, make no mistake, they will use crypto, because--being outlaws--they will pay no attention to the government's edicts against using it. Heck, they're already breaking laws that could land them in prison for life, so why make a big deal about breaking one that could get them 20 years?
The death knell for the Administration's position may have rung on August 26, 1997, when Federal District Court Judge Marilyn Hall Patel struck down the existing restrictions on the export of encryption, as outlined in EO 12924, as an impermissibly broad prior restraint on Professor Daniel Bernstein's freedom of speech. As part of his doctoral dissertation, Professor Bernstein had developed an encryption algorithm he called Snuffle (ftp://idea.sec.dsi.unimi.it/pub/crypt/) and on June 30, 1992 applied to the State Department for permission to publish his algorithm online. His request was denied on August 20, 1992. On July 15, 1993, Bernstein filed separate requests to publish five items: his original paper, the source code files snuffle.c and unsnuffle.c, a description of Snuffle and a description of how to install Snuffle. On October 5, 1993, the State Department issued a ruling formally classifying all five documents as munitions and denying Bernstein permission to "export" them by publishing them online. Bernstein appealed on September 22, 1993. On February 21, 1995, after waiting almost 17 months for a response to his appeal, Bernstein sued.
On August 28, 1997, Judge Patel agreed to place the broader effect of her order on hold, pending its review by the 9th Circuit Court of Appeals. Whichever way the Court of Appeals rules, it's a safe bet that the decision will be appealed to the U.S. Supreme Court. How it will rule is anyone's guess.
In the meantime, we here in the U.S. run the real risk of a brain drain of epic proportions, as talented cryptology programmers are wooed away to build crypto-enabled products for foreign companies that are perfectly free to export their wares to the U.S. And their U.S. competitors will continue to be forced to fight for market share with crippled products and the threat of a Big Brother that demands the right to snoop on their customers at will.
The President's Analyst
The Economic Strategy Institute estimates that "the U.S. economy will lose between $35.16 and $95.92 billion over the next five years as a consequence of current administration policy". Their figures include both direct impacts--such as lost sales of encryption software and encryption-enabled hardware--and indirect effects--such as efficiency losses in business-to-business electronic commerce and forgone online shopping opportunities.
On June 8, 1998, an ad-hoc group of internationally respected cryptography experts released an update to their 1997 report on "The Risks of 'Key Recovery,' 'Key Escrow,' and 'Trusted Third-Party' Encryption". The group comprises Hal Abelson of MIT's Laboratory for Computer Science; Ross Anderson of the University of Cambridge; Steven M. Bellovin, AT&T Laboratories security expert and co-author of "Firewalls and Internet Security: Foiling the Wily Hacker"; Josh Benaloh of Microsoft; Matt Blaze of AT&T Labs; Whitfield Diffie, co-holder of the seminal public-key encryption patents, now of Sun Microsystems; John Gilmore, founder of the SWAN project, father of the alt.* newsgroup hierarchy and founding member of the Electronic Frontier Foundation; Peter G. Neumann of SRI International; Ronald L. Rivest of MIT's Laboratory for Computer Science and co-holder of the RSA algorithm patents; Jeffrey I. Schiller, security officer for MIT Information Systems; and Bruce Schneier of Counterpane Systems. They specifically address whether secure key recovery systems that meet government specifications are technically possible and what additional costs and risks such systems might entail.
Their conclusions are pretty damning. Among other things, they point out that the Administration's requirements include the ability to decode network data flows--a capability for which there is no legitimate business requirement other than industrial espionage--and the ability to access encrypted data without the knowledge of the data's owner. The report reiterates the 1997 version's determination that an extremely complex, secure, national (indeed, international), ubiquitous data network would need to be developed and deployed in order to give the FBI, the NSA and other government law enforcement and intelligence agencies real-time access to encrypted data flows and data storage.
Essentially, that network would have to bypass every corporate and organizational firewall in America (and, by extension, every firewall in the world) while, at the same time, being itself unbreakable. And the costs, risks and practicality of implementing the government's vision of a ubiquitous key recovery infrastructure appear to remain the same "regardless of the design of the recovery systems--whether the systems use private-key cryptography or public-key cryptography; whether the databases are split with secret-sharing techniques or maintained in a single hardened secure facility; whether the recovery services provide private keys, session keys, or merely decrypt specific data as needed; and whether there is a single centralized infrastructure, many decentralized infrastructures, or a collection of different approaches."
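For readers curious about the "secret-sharing techniques" the report mentions, here's a minimal sketch of the idea--a hypothetical XOR-based split of my own devising for illustration, not any escrow agent's actual scheme. The key is split into shares such that all of them are required to reconstruct it; any single share reveals nothing.

```python
import secrets

def split_secret(secret: bytes, n: int = 2) -> list[bytes]:
    """Split `secret` into n shares; ALL n are needed to reconstruct it.

    The first n-1 shares are pure random noise; the last is the secret
    XORed with all of them, so each share alone is statistically random.
    """
    shares = [secrets.token_bytes(len(secret)) for _ in range(n - 1)]
    last = bytes(secret)
    for share in shares:
        last = bytes(a ^ b for a, b in zip(last, share))
    shares.append(last)
    return shares

def combine_shares(shares: list[bytes]) -> bytes:
    """XOR all shares together to recover the original secret."""
    result = bytes(len(shares[0]))
    for share in shares:
        result = bytes(a ^ b for a, b in zip(result, share))
    return result

# Split a session key among three escrow agents; no two of them
# can recover it without the third.
key = b"session-key-0123"
parts = split_secret(key, 3)
assert combine_shares(parts) == key
```

Real escrow proposals contemplated far more elaborate threshold schemes (where, say, any 3 of 5 shares suffice), which is precisely why the report's authors conclude the required infrastructure would be so complex and fragile.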
Even the government's own experts agree with the ad-hoc group about the infeasibility of a global key recovery infrastructure. On June 30, 1998, after two years of trying, the Technical Advisory Committee to Develop a Federal Information Processing Standard for the Federal Key Management Infrastructure pronounced its effort to write a FIPS for a key recovery infrastructure for the Federal government itself a complete failure. In a letter to Commerce Secretary William Daley, the panel stated that its quest had foundered on the rocks of "technical problems that, without resolution, prevent the development of a useful FIPS." That means the government's own experts can't figure out a workable key recovery scheme, even after spending two years working on the problem. And, as far back as November 8, 1996, the Computer Science and Telecommunications Board (CSTB) of the National Research Council (NRC) concluded in its massive report on "Cryptography's Role in Securing the Information Society" "that widespread commercial and private use of cryptography in the United States and abroad is inevitable in the long run and that its advantages, on balance, outweigh its disadvantages."
The Clinton Administration should be listening to its own experts--but it's not. Instead, it's fighting a rear-guard action to delay as long as possible the liberalization of crypto export policy. The handwriting is on the wall in neon letters ten feet high--key recovery won't work and the police and intelligence communities are going to lose this one. The only question left is, how long will it take?
Want to know more? You should. And so should your users. You can give them a leg up by putting a link to the Encryption Privacy and Security Resource Page at http://www.crypto.com/ and to RSA Laboratories' FAQ 3.0 on Cryptography at http://www.rsa.com/rsalabs/newfaq/ on your user start page. And you can add a link to the home of the international version of PGP at http://www.pgpi.com/, which includes a version of PGP 5.5 that allows users to decrypt messages sent using the domestic version of PGP 2.6.2--which used RSA encryption--and to keep using their 2.6.2 keyfiles, if any. (Note that, as I mentioned earlier, the RSA algorithm is either unpatented or in the public domain in most of the world. Since Network Associates removed the RSAREF library from the domestic versions of PGP 5.x, the only freeware sources for RSA-compatible PGP 5.x are outside the U.S.)
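If you'd like a feel for the arithmetic behind the RSA encryption that PGP 2.6.2 relies on, here's the standard textbook illustration--toy primes, utterly insecure, and nothing like a production implementation, but it shows the public-key/private-key trapdoor in action:

```python
# Textbook RSA with toy primes -- for intuition only, never for real use.
p, q = 61, 53
n = p * q                  # public modulus: 3233
phi = (p - 1) * (q - 1)    # Euler's totient of n: 3120
e = 17                     # public exponent, chosen coprime to phi
d = pow(e, -1, phi)        # private exponent: modular inverse of e mod phi

message = 65                         # a message encoded as a number < n
ciphertext = pow(message, e, n)      # anyone can encrypt with (e, n)
recovered = pow(ciphertext, d, n)    # only the holder of d can decrypt
assert recovered == message
```

The whole game is that deriving d requires factoring n into p and q--trivial for 3233, but computationally infeasible for the hundreds-of-digits moduli real PGP keys use. That asymmetry is exactly what the export regulations treat as a munition.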
It's important that you do these things RIGHT NOW. Doing so will help educate your users on encryption issues that will be key to the future of electronic commerce--and make no mistake about it, electronic commerce will be key to your continued profitability. It will better equip them to safeguard their own data and will help them understand why they should care. It will make you a hero.
And--just incidentally--it will inform your users about the right to privacy that should have been part of the U.S. Constitution all along.
(Copyright © 1998 by Thom Stark--all rights reserved)