The world’s most valuable company and the most powerful government are clashing in court over what seems to be a defining case for the future of privacy.
The FBI wants Apple to unlock the phone of one of the shooters from December’s terrorist attack in San Bernardino, California.
But Apple CEO Tim Cook published an open letter vowing to defend his customers’ right to privacy:
“Up to this point, we have done everything that is both within our power and within the law to help them … But now the US government has asked us for something we simply do not have, and something we consider too dangerous to create. They have asked us to build a backdoor to the iPhone.”
At first glance, it may seem that the tech giant is making a big deal out of a single phone. But the FBI’s real aim is to establish a legal precedent, not to recover the data on this one suspect’s phone.
This piece by Julian Sánchez on Cato.org nails it:
Four important pieces of context are necessary to see the trouble with the Apple order.
1. This offers the government a way to make tech companies help with investigations
Law enforcement and intelligence agencies have for years wanted Congress to update the Communications Assistance for Law Enforcement Act of 1994, which spells out how telephone companies and Internet providers must assist government investigations, to address the growing prevalence of encryption. One proposal would require companies to build government backdoors into secure devices and messaging apps.
In the face of strong opposition from tech companies, security experts and civil liberties groups, Congress has thus far refused to do so.
By falling back on an unprecedentedly broad reading of the 1789 All Writs Act to compel Apple to produce hacking tools, the government is seeking from the courts an entry point it hasn’t been able to obtain legislatively.
Moreover, saddling companies with an obligation to help break their own security after the fact will raise the cost of resisting efforts to mandate vulnerabilities baked in by design.
2. This public fight could affect private orders from the government
Several provisions of the federal laws governing digital intelligence surveillance require companies to provide “technical assistance” to spy agencies.
Everything we know suggests that government lawyers are likely to argue for an expansive reading of that obligation, and may already have done so. That fight, however, will unfold in secret, through classified arguments before the Foreign Intelligence Surveillance Court.
The precedent set in the public fight may help determine how ambitious the government can be in seeking secret orders that would require companies to produce hacking or surveillance tools meant to compromise their devices and applications.
3. The consequences of a precedent permitting this sort of coding conscription are likely to be enormous in scope
This summer, Manhattan District Attorney Cyrus Vance wrote that his office alone had encountered 74 iPhones it had been unable to open over a six-month period.
Once it has been established that Apple can be forced to build one skeleton key, the inevitable flood of similar requests (from governments at all levels, foreign and domestic) could effectively force Apple and its peers to develop internal departments dedicated to building spyware for governments, just as many already have full-time compliance teams dedicated to handling ordinary search warrants.
This would create an internal conflict of interest: the same company must work both to secure its products and to undermine that security, and the better it does at the first job, the larger the headaches it creates for itself in doing the second.
It would also, as Apple’s Cook has argued, make it far more difficult to prevent those cracking tools from escaping into the wild or being replicated.
4. Most ominously, the effects of a win for the FBI in this case almost certainly won’t be limited to smartphones
Over the past year, I worked with a group of experts at Harvard Law School on a report that predicted governments will respond to the challenges encryption poses by turning to the burgeoning “Internet of Things” to create a global network of surveillance devices.
Armed with code blessed by the developer’s secret key, governments will be able to deliver spyware in the form of trusted updates to a host of sensor-enabled appliances. Don’t just think of the webcam and microphone on your laptop: think of voice-controlled devices like Amazon’s Echo, smart televisions, network routers, wearable computing devices, and even Hello Barbie.
…The developer keys that mark code as trusted are critical to that ecosystem, which will become ever more difficult to sustain if developers can be systematically forced to deploy those keys at the behest of governments.
Users and consumers will reasonably be even more distrustful if the scope of governments’ ability to demand spyware disguised as authentic updates is determined not by a clear framework, but by a hodgepodge of public and secret court decisions.
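The “developer keys” in point 4 are the core of code signing: a device ships with the vendor’s public key and installs only updates whose signatures verify against it. As a rough illustration (not Apple’s actual update mechanism), here is a minimal sketch in Python using the third-party cryptography package and Ed25519 keys; the key names and payload are invented for the example:

```python
from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

# Vendor side: the developer's secret key signs an update payload.
signing_key = Ed25519PrivateKey.generate()
device_trusted_key = signing_key.public_key()  # baked into every device

update_payload = b"firmware-v2.0.1 contents"
signature = signing_key.sign(update_payload)

# Device side: install only if the signature verifies against the
# baked-in public key. Note that any code signed with the genuine
# secret key, including compelled spyware, passes this same check.
try:
    device_trusted_key.verify(signature, update_payload)
    print("Signature valid: update trusted, installing.")
except InvalidSignature:
    print("Signature invalid: update rejected.")
```

The article’s warning follows directly from the comment in the sketch: a signature check can confirm who signed the code, but not whose interests the code serves, so a compelled use of the signing key defeats the entire trust model.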