If you watch the Jason Bourne movies, or movies like Enemy of the State, you’d think that the CIA and FBI are extraordinary in their ability to track us, to find us, to know everything about us. But in reality it took them over nine years to find Osama bin Laden in his villa in Pakistan, and over sixteen years to find Whitey Bulger on the beaches of Santa Monica. And now, it’s been two months since the San Bernardino shootings, and the FBI wants Apple to break into Syed Farook’s iPhone, because they can’t do it.
Really? Two months and the FBI can’t break into an iPhone?
To be honest, I have somewhat mixed feelings about this. First of all, I’m happy to hear that cell phone encryption is so strong that our government actually can’t break into it. And second, it concerns me a little bit that cell phone encryption is so strong that our government can’t break into it.
The danger is obvious: if cell phone encryption is that strong, then obviously terrorists really could use cell phones to communicate in a way that would evade our government’s ability to detect them. That is a concern.
On the other hand, if Apple and other manufacturers were to build “back doors” into their technology, then it’s clear that other “bad guys” would use those back doors to hack into our phones, steal our data, and use that data to their own advantage.
You can’t have it both ways, as Dante Ramos points out in the Boston Globe.
The same encryption that frustrates investigators in the San Bernardino case also protects Chinese human-rights advocates — not to mention the contents of the devices carried by American government officials and tech executives — from intrusions by, for instance, hackers in Beijing. As Bruce Schneier, a security technologist affiliated with Harvard’s Berkman Center for Internet and Society, noted in an email this morning, “I cannot build a technology that only operates in the presence of people with a certain morality.”
Now a judge has ordered Apple to find a way to open up Syed Farook’s phone. Apple has refused to do it. In an open letter to its customers, Apple defends its decision.
Some would argue that building a backdoor for just one iPhone is a simple, clean-cut solution. But it ignores both the basics of digital security and the significance of what the government is demanding in this case. In today’s digital world, the “key” to an encrypted system is a piece of information that unlocks the data, and it is only as secure as the protections around it. Once the information is known, or a way to bypass the code is revealed, the encryption can be defeated by anyone with that knowledge. The government suggests this tool could only be used once, on one phone. But that’s simply not true. Once created, the technique could be used over and over again, on any number of devices. In the physical world, it would be the equivalent of a master key, capable of opening hundreds of millions of locks — from restaurants and banks to stores and homes. No reasonable person would find that acceptable.
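Apple’s “master key” point can be made concrete with a toy sketch. This is an illustrative XOR cipher, not real cryptography, and the key and “phone” data here are invented — the point is only that with symmetric encryption, one known key (or one technique for recovering it) opens every device protected the same way:

```python
# Toy illustration of the "master key" problem: a single symmetric key
# decrypts EVERY message encrypted with it. XOR cipher for demo only --
# not real cryptography.
from itertools import cycle

def xor_cipher(data: bytes, key: bytes) -> bytes:
    """Encrypts or decrypts by XORing data with a repeating key."""
    return bytes(b ^ k for b, k in zip(data, cycle(key)))

key = b"master-key"  # hypothetical; real keys are long and random

# Two unrelated "phones" protected by the same key.
phone_a = xor_cipher(b"contacts and photos", key)
phone_b = xor_cipher(b"messages and email", key)

# Whoever learns the key opens both -- the digital equivalent of one
# key opening hundreds of millions of locks.
print(xor_cipher(phone_a, key))  # b'contacts and photos'
print(xor_cipher(phone_b, key))  # b'messages and email'
```

A backdoor works the same way: once the bypass technique exists, nothing ties it to a single device.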
Although I have mixed feelings about it, I think I’m with Apple on this. The government has enough other ways to track terrorists, enough other resources, and there is enough danger in creating an encryption backdoor, that we should leave well enough alone.