(Author’s Note: I am sharing this article – which originally appeared, in substantially the same form, in Inc. in 2016 – as Apple and the FBI begin the second major battle of their war over the protection of private citizens’ data. For information about the current January 2020 dispute concerning the unlocking of the alleged Pensacola, Florida Naval Air Station shooter’s smartphones, please see the article, Apple vs. FBI Round 2: Apple Refuses To Unlock Phones Belonging To Florida Naval Air Station Shooter.)
Yesterday (February 16, 2016), a federal judge ordered Apple to help the FBI unlock an encrypted iPhone used by Syed Farook, one of the two terrorists who murdered 14 people and injured 22 more in a mass shooting in San Bernardino, California, late last year. Among the reasons that the law enforcement agency wants to examine data on the smartphone is to determine whether the shooters had assistance from others in carrying out the murders. If they did, one could reasonably argue, examining the phone’s contents could help the FBI not only bring other terrorists to justice, but also prevent future attacks and save the lives of innocent Americans.
Apple CEO Tim Cook responded by publishing an open letter criticizing the order, in which he stated that while he and his colleagues at Apple have “great respect for the professionals at the FBI, and we believe their intentions are good,” the technology firm would do all that it could to oppose the order, as obeying it, in their mind, “would undermine the very freedoms and liberty our government is meant to protect.”
I have previously written about why we should all oppose government efforts to weaken encryption in computers and smartphones; the reasons that I have spelled out in the past remain true, and my position has not changed.
The San Bernardino iPhone case, however, is more complicated.
There seems to be no question that the phone whose contents the FBI wants to access was used by a mass murderer. The issue, therefore, is not whether the government should be able to spy on American citizens in general, whether civilian devices that contain encryption should, in general, have backdoors, or whether warrantless spying is acceptable. It is whether law enforcement agencies, after obtaining a judge’s order or warrant, should be able to gain access to the data of criminals who are unquestionably guilty of a serious crime. It is, of course, reasonable to believe that they should.
Furthermore, in the case of the San Bernardino iPhone court order, there is obviously probable cause to believe that there may be time-sensitive information on the particular device in question that will help investigators catch other criminals involved in the mass murder of Americans, and potentially prevent other terrorist attacks. Such a scenario is quite different than the case of general spying.
It should also be noted that the device in question belonged to a county government that may approve of the data being accessed, not to the shooter himself.
That said, there are major concerns.
The FBI has demanded that Apple effectively create a new version of the Apple iOS containing a backdoor that would allow the FBI to circumvent various security features, and that Apple install that new operating system onto the iPhone in question so that the FBI can access the device’s contents. Presumably, the FBI cannot do this on its own, as iPhones come equipped with technology that allows only software approved by Apple to be installed – a security feature that reduces iPhone users’ exposure to malware, but also limits law enforcement’s ability to install its own tools.
Apple’s CEO, however, has noted that the creation of such an operating system opens a Pandora’s box: “In the wrong hands,” Cook said, “this software–which does not exist today–would have the potential to unlock any iPhone in someone’s physical possession.” Clearly, he is correct: Once such software is created, there is no guarantee that it would be used only this once – or, if it were allowed to exist after being used, that it would always be deployed in situations such as the present one (on the phone of a mass murderer suspected of having terrorist ties, and only after a judge’s order).
Once the technology exists, who says the government will not abuse it by utilizing it without a court order? Who says that other parties won’t figure out how to use the backdoor as well? Furthermore, the government does not have a perfect record of information security–could criminals or foreign states obtain such software if it existed somewhere within a government facility?
There may even be an ironic effect: Should Apple create a backdoor, criminals would likely switch to third-party encryption apps to keep their data unreadable, while innocent Americans would be left exposed to surveillance.
So, who is right — Apple or the FBI?
Here is what the issue boils down to:
The FBI and the judge view this order as a demand that a technology firm with the capability to help law enforcement investigate the worst terrorist attack on American soil since 9/11 – and potentially prevent future murders of Americans – do so. They correctly state that the order applies only to this particular case; in theory, then, any technology that Apple creates would be used only in this one case, where the need to access the data is easy to accept.
Apple and privacy advocates, including the Electronic Frontier Foundation, on the other hand, view this order as highly problematic on multiple fronts. In short, they view it as setting a dangerous precedent and as calling for the creation of dangerous technology whose very existence would create many serious risks.
Clearly, this issue is, as Igor Baikalov, chief scientist at Securonix, told me, “a really sensitive topic.”
“The San Bernardino case is a no-brainer,” he said, “but when one considers the inquiries lined up after that one, claiming similar urgency plus preventive potential, but not having a benefit of hindsight, the question becomes where to draw the line and who gets to draw it.”
Perhaps, if it had not been for various government surveillance efforts that have been exposed over the past decade, and for numerous breaches of government IT systems by hackers, people would be more trusting. Or perhaps not.
Furthermore, one must wonder: Could the court have tried another approach, such as asking Apple to help the FBI disassemble the phone and create replicas so that it could crack the password? That’s still invasive – but far less so than demanding the creation of a backdoor, and not something that other parties could easily exploit. I also wonder why, if the phone may contain information about other terrorists and connections to foreign terrorist organizations, the FBI did not seek an order from the United States Foreign Intelligence Surveillance Court (rather than a regular federal court), which would have kept the order out of the public eye.
In any case, there is little doubt that the court order sets a dangerous precedent, and I believe that other, more palatable options should be exhausted before the creation of a security-crippled operating system is even discussed. The judge gave Apple five business days to challenge her order, and Apple is certainly going to do so. Let us hope that, as Apple escalates the matter, a better mechanism will be found and established by which the necessary information can be obtained to ensure justice and save lives, yet preserve the rights of the innocent.