Cryptographic Abundance

Cryptography could give us data privacy today. Only no one’s asking for it.

My 82-year-old mother was never very good at arithmetic. She has now lost the ability to balance her checkbook. Yet this morning, at the touch of a button on her browser, she performed a fairly sophisticated arithmetic operation on her way to establishing a secure session with the e-commerce site where she orders her medications. This operation is called “modular exponentiation.” If Mom knew its nine-syllable name, she would be afraid to push the button. Fortunately, and crucially, the operation is hidden from her, as it is from most of us. It is part of a cryptographic system, a system designed to provide confidential communication.
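
For the curious, the operation itself is simple to state, if not to do by hand: raise one number to the power of another and keep only the remainder after dividing by a third. A minimal sketch in Python, with toy numbers standing in for the hundreds-of-digits values a real browser would use:

```python
# Modular exponentiation: (base ** exponent) % modulus.
# Toy values for illustration; a real secure session uses numbers
# hundreds of digits long.
base, exponent, modulus = 5, 117, 19

# Python's three-argument pow() computes this efficiently, without ever
# forming the astronomically large intermediate value base ** exponent.
shared_value = pow(base, exponent, modulus)
print(shared_value)  # 1
```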

To Mom’s eternal puzzlement, I am a cryptographer, an expert in making and breaking secrets. Cryptography is at least as old as writing. Indeed, before the rise of literacy, writing itself was a form of cryptography. Written messages among literate people could safely be transmitted via illiterate couriers. The spread of literacy led to the invention of ways to obscure the meaning of a message: reordering letters, for example, or substituting some letters for others. Modern cryptographic algorithms still rely on reordering and substitution. But of course, we now use computers to manipulate the symbols.
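
To make those two classical ideas concrete, here is a toy sketch of each in Python. Neither is remotely secure by modern standards; the messages and keys are made up, and the point is only to show what “reordering” and “substituting” mean:

```python
def substitute(message, shift=3):
    """Substitution: replace each letter with another, here by a fixed shift."""
    out = []
    for ch in message:
        if ch.isalpha():
            start = ord('A') if ch.isupper() else ord('a')
            out.append(chr((ord(ch) - start + shift) % 26 + start))
        else:
            out.append(ch)
    return ''.join(out)

def reorder(message, key=(2, 0, 1)):
    """Transposition: reorder letters within fixed-size blocks."""
    padded = message + ' ' * (-len(message) % len(key))
    blocks = [padded[i:i + len(key)] for i in range(0, len(padded), len(key))]
    return ''.join(''.join(block[k] for k in key) for block in blocks)

print(substitute("attack at dawn"))  # dwwdfn dw gdzq
print(reorder("attack at dawn"))
```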

Knowledge of cryptographic techniques used to belong almost exclusively to governments, which use cryptography to protect political, diplomatic and military secrets against the prying eyes of other governments. Historically, governments took steps to restrict the spread of cryptographic knowledge. Cryptographic activities were conducted in secret departments, some actually called “Black Chambers.” Cryptographic texts were suppressed or classified. Knowledge was passed from person to person, from master to apprentice. Trade in cryptographic information or equipment was banned. The field of cryptography was intentionally cloaked in mystery.

In the late 1970s, significant cryptographic knowledge, and the abilities to invent and implement cryptographic systems, spread beyond the black chambers of government into academic and industrial settings. We saw the start of annual research conferences on cryptography and the establishment of an open professional literature on the subject consisting of journals, conference proceedings, textbooks and eventually Web pages. We also saw some governments, perhaps bowing to the inevitable, reduce their attempts to repress knowledge of cryptography.

One knee-jerk reaction in the aftermath of the September 11 attacks was to call for a ban on the use of cryptography. Even if this were desirable, it would be difficult to achieve, as there are now thousands of competent cryptographers in more than 50 nations. The genie of cryptographic knowledge is out of its chamber.

What do cryptographers today believe about cryptography? Generally, they believe what they have learned through experience: that cryptography is hard to understand, hard to implement correctly and computationally expensive. As a result, system designers have learned to avoid cryptography wherever possible, and to use it only sparingly when they use it at all. This avoidance has led to, among other dysfunctional wonders, e-mail systems that tell everything to everyone and cellular telephones that not only expose conversations but also allow their device identities to be stolen and misused (see “The Undefended Airwaves,” TR September 2001).

But this view of cryptography is at least a quarter-century out of date. Moore’s Law, the rule of thumb that the number of transistors on a chip doubles every 18 months, has delivered a 100,000-fold increase in computational power in the past 25 years. We are therefore rapidly approaching the time when cryptographic operations will be cheap and easy, commonplace and unremarkable. Instead of avoiding or conserving cryptographic operations, designers should now be using them freely.
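
A quick back-of-the-envelope check of that figure, assuming the stated doubling every 18 months:

```python
# 25 years at one doubling every 18 months.
years = 25
doublings = years / 1.5       # about 16.7 doublings
growth = 2 ** doublings       # roughly 100,000-fold
print(round(doublings, 1), round(growth))
```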

For example, a new cryptographic algorithm called the Advanced Encryption Standard was published in February 2001 as a draft U.S. Federal Information Processing Standard. It’s about 4,000,000,000,000,000,000,000 times more secure than its predecessor, the Data Encryption Standard, yet it operates many times faster. Inexpensive chips will be available in 2002 that can execute the new algorithm at multigigabit-per-second rates, fast enough to make fiber-optic links secure. These same chips will be able to perform 10,000 modular exponentiations per second, thereby accelerating e-commerce applications such as the ordering of medications over the Internet.
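
Where does that enormous number come from? My reading, which the article does not spell out, is that it is the ratio of the two key spaces: the smallest AES key is 128 bits long, a DES key is 56 bits, and the ratio of the number of possible keys is 2 to the 72nd power. A quick check:

```python
# Assumption: the "times more secure" figure is the ratio of key spaces,
# comparing the smallest AES key size (128 bits) with DES (56 bits).
des_keys = 2 ** 56
aes_keys = 2 ** 128
ratio = aes_keys // des_keys   # 2 ** 72
print(f"{ratio:.2e}")          # about 4.72e+21
```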

The sudden abundance of previously scarce cryptographic resources will have profound effects on the ecology of people, systems and information. Abundant cryptography will protect us from identity thieves who exploit access to private information in order to usurp their unsuspecting victims’ personae. Similarly, we will be protected from exploitation, blackmail, extortion, gossip and unwanted surveillance, whether from businesses claiming to be “acting in our best interests” or from governments acting extralegally.

Businesses using abundant cryptography will be capable of creating systems that act on our behalf to validate the origin and content of every piece of software we execute, every Web page we read, and every message or telephone call we receive. Say goodbye to spam e-mail and to direct marketing by telephone, unless you like that sort of thing. Abundant cryptography therefore has the potential to help secure time for recreation and refuge, for increased productivity, and for introspection and spiritual growth.
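
One way such validation can work, sketched minimally here, is to attach a cryptographic authentication tag to each message so the receiver can check both its origin and its content. The key and messages below are made up for illustration, and a shared secret stands in for the richer public-key machinery a deployed system would use:

```python
import hashlib
import hmac

# Made-up shared secret and message, for illustration only.
key = b"shared-secret-key"
message = b"Refill order #1234 for the usual medication"

# Sender attaches an authentication tag computed over the message.
tag = hmac.new(key, message, hashlib.sha256).hexdigest()

# Receiver recomputes the tag and compares in constant time; any change
# to the message, or a tag forged without the key, makes this fail.
def verify(message, tag):
    expected = hmac.new(key, message, hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, tag)

print(verify(message, tag))                         # True
print(verify(b"Send 1,000 bottles instead", tag))   # False
```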

All these things are technically possible today. But despite the feasibility of cheap, easily implemented and abundant cryptography, social and economic obstacles stand between us and these potential benefits. Consumers, be they individuals, businesses or governments, have now been trained by their exposure to poor systems to have low expectations about their security and privacy.

One important problem is that the privacy interests of consumers are not aligned with the economic interests of the companies that provide their information systems. These companies have no incentive to protect their customers’ private data. In fact, they have every incentive to collect the data in order to advance their own marketing interests or to sell it to others. The foxes are building the henhouses.

The Ralph Nader of information security has yet to emerge, but I hope to meet him or her soon. Consumer advocacy, coupled with abundant cryptography and better information systems, can lead to a future in which information about us as individuals, our private information, will be protected by encryption no matter how it is generated, stored or transmitted. Some argue that this will also protect information about terrorists and is therefore undesirable. I believe that in a society based on law, necessary surveillance of the few can be conducted through the proper operation of law rather than by denying the entire citizenry access to effective self-protection technologies.

What can you do in the meantime? Educate yourself. Once enough of us realize that there is no technological barrier between us and the benefits I’ve described, we will naturally revise our expectations about personal privacy and security upward and demand better. When enough of us understand how easy it is to make truly secure systems, and refuse to buy anything that offers us less, we will give companies the economic incentive they currently lack. Refuse to trust the foxes of the world.