Privacy is good. Without privacy nothing that we care about can thrive: neither marriages, nor art, nor science, nor technology, nor contracts, nor democracy, nor anything much at all.
Technologies that protect our privacy, such as encryption, are therefore useful. In general, we want as much secrecy as we can bear. As more of our lives are conducted on smartphones, as we store more information in a digital cloud, and as we contemplate being ferried by autonomous vehicles or living in smart houses, we look to technology companies such as Apple to provide us with a reasonable degree of privacy.
But should technology companies create black boxes whose encryption is so strong that they cannot be unlocked without their users’ consent, a lucky guess, or treachery, even when law enforcement has a legitimate interest in seeing the boxes’ contents?
In “What If Apple Is Wrong?”, Brian Bergstein, MIT Technology Review’s executive editor, describes cases in which being able to unlock an iPhone would identify a murderer or help free an innocent person, and he asks, “Are we certain we want to eliminate an important source of evidence that helps not only cops and prosecutors but also judges, juries, and defense attorneys arrive at the truth?” That “essential question” was mostly overlooked during the confrontation between the FBI and Apple, when the company refused to disassemble the locks on an iPhone that Syed Rizwan Farook had used before he and his wife killed 14 people in San Bernardino, California.
President Obama, speaking at the South by Southwest conference in March, grasped that essential question, lecturing the audience of technologists, many of whom were fans of strong encryption: “Dangers are real. Maintaining law and order and a civilized society is important … And so I would just caution against taking an absolutist perspective … If in fact you can’t crack [phones] at all, if the government can’t get in, then everybody is walking around with a Swiss bank account in their pocket. There has to be some concession to the need to be able to get into that information somehow.”
Privacy rights cannot be guaranteed by technologies, which are contingent on their manufacturers’ willingness to create them and on those companies’ continued existence. But a more limited privacy than technologists promise is guaranteed by the Constitution of the United States, which (as the president reminded us) has always allowed the police to enter one’s house and rifle through one’s personal effects, so long as they have a warrant issued by a judge. If making phones subject to search warrants also makes them more vulnerable to hackers, then that is a trade-off we must accept in the real, fallen world of murders and human trafficking, so long as the increase in vulnerability is in fact small.
When Tim Cook, Apple’s CEO, vows to continue to increase the strength of the encryption on his company’s products, he is proposing to make commonplace something that has hitherto been rare: zones of privacy that are potentially impenetrable. But no one made Tim Cook king. At SXSW, President Obama warned against “fetishizing our phones above every other value,” and he insisted that “the notion that somehow our data is different and can be walled off from … other trade-offs we make is incorrect.”
No right is absolute, because all rights butt up against other rights with their own strong claims. In open, democratic societies, we are committed to continually negotiating rival claims as values change and technologies evolve.
But write and tell me what you think at jason.pontin@technologyreview.com.