Apple vs. FBI: What Happened and Why It Still Matters
Apple's convoluted fight with the FBI over San Bernardino shooter Syed Rizwan Farook's iPhone may be over, but the larger issue of smartphone encryption remains unresolved. Apple and other tech companies continue to insist that total, unbreakable data encryption is necessary and inevitable, while the FBI and other law enforcement agencies say they need access to suspects' communications and devices.
Protestors in Portland, Oregon, supporting Apple's position against the FBI. Courtesy: Fight for the Future
The case that made all the headlines was sparked by a judge's order on Feb. 16 that compelled Apple to help the FBI break into an iPhone used by an apparent terrorist. Apple refused to comply, and on Feb. 25 filed a 65-page motion to vacate the court order.
After several weeks of legal maneuvering and media posturing, as well as surprising pro-encryption statements from former national-security officials, the FBI disclosed in late March that a third party, as yet unknown, had stepped forward and successfully broken into the iPhone. The court order was vacated, at the Department of Justice's request, on March 29.
In a related case, a federal judge in Brooklyn, New York, had ruled Feb. 29 that Apple doesn't have to help the FBI unlock an iPhone used by a Queens drug dealer. That had no direct impact on the San Bernardino case, but bolstered Apple's overall argument and may affect future cases.
It may seem strange that the largest American company defied the U.S. government, that its CEO was cheered by shareholders for doing so, and that Facebook, Google and Microsoft backed Apple. But there was -- and is -- a lot more at stake than the data on a dead terrorist's iPhone.
One thing is clear: Most people didn't, and may still not, completely understand what's going on. Here are answers to some frequently asked questions. Also check out our explainer on the larger battle over smartphone encryption and encrypted communications.
Whose phone was this?
The iPhone in question, an iPhone 5c running iOS 9, was used by Syed Rizwan Farook, who, with his wife Tashfeen Malik, killed 14 of Farook's co-workers in San Bernardino, California, on Dec. 2, 2015, in what looks to have been both a terrorist attack and a workplace shooting. Farook and Malik were themselves killed by police later that day.
But Farook didn't own the iPhone himself. It belonged to his employer, the County of San Bernardino Department of Public Health, which issued it to Farook as a work phone.
Farook and Malik destroyed their personal cellphones the day of the shootings. The FBI wanted to get into Farook's work phone to see if there was any stored data on it that could indicate whether Farook and Malik were communicating with anyone overseas before the attacks.
(After the couple killed Farook's co-workers, Malik posted a statement on Facebook pledging allegiance to the Islamic State. But as of yet, there's no evidence that they coordinated or planned the attack with anyone else.)
Why couldn't the FBI read the data on the phone?
The phone had a locked screen that prompted the user to input a passcode to access the device. Neither Apple, nor the FBI, nor the County of San Bernardino Department of Public Health knew that passcode, which could have been a 4-digit PIN, a 6-digit PIN or a much longer alphanumeric password. If it was the latter, brute-force guessing would likely have been fruitless. (One FBI agent said in a legal proceeding that the display indicated a 4-digit PIN.)
Couldn't someone just have guessed the phone's passcode?
No. Apple built safeguards into iOS 8 and iOS 9 to prevent random passcode guessing. After three incorrect passcode inputs, the user must wait 60 seconds before trying another one. After a few more incorrect passcodes, the delay grows to 5 minutes. After 10 incorrect passcodes, the phone may factory-reset itself, erasing all user data. (One expert says that on an iPhone 5c, a device reboot restarts the incorrect-input count, and that the wipe-after-10-bad-inputs feature is turned off by default.)
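The lockout scheme described above can be modeled in a few lines. This is an illustrative sketch of the behavior as described, not Apple's actual firmware logic, and the exact thresholds are a simplification:

```python
# Illustrative model of iOS passcode-guessing safeguards; thresholds
# are a simplification of the description above, not Apple's code.

def lockout_delay(failed_attempts: int) -> int:
    """Seconds to wait before the next passcode guess is allowed."""
    if failed_attempts < 3:
        return 0          # first few tries: no forced delay
    if failed_attempts < 6:
        return 60         # after three failures: one-minute wait
    return 5 * 60         # after a few more: five-minute wait

def device_wiped(failed_attempts: int, wipe_enabled: bool) -> bool:
    """With the optional erase-data setting on, 10 failures wipe the phone."""
    return wipe_enabled and failed_attempts >= 10
```

The point of the escalating delays is that even the modest 4-digit PIN space becomes impractical to search by hand: at five minutes per guess, 10,000 PINs would take over a month of continuous trying.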
So what did the FBI want Apple to do?
It wanted Apple to disable the incorrect-passcode entry delay and the factory-reset safeguard. If those were disabled, then the FBI could have used special hardware to try to "brute force" the passcode by running through all 10,000 possible 4-digit PINs, then all possible 6-digit PINs.
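To see why removing the safeguards matters, here is a minimal sketch of the search space involved. The guesses-per-second rate is an assumed figure for illustration, not a number from the case:

```python
from itertools import product

def all_pins(length: int):
    """Yield every numeric PIN of the given length (the brute-force space)."""
    for digits in product("0123456789", repeat=length):
        yield "".join(digits)

four_digit = list(all_pins(4))   # 10,000 candidates: "0000" through "9999"

# At a hypothetical 12.5 guesses per second (an assumed rate, not a
# figure from the case), the whole 4-digit space falls in minutes.
seconds_needed = len(four_digit) / 12.5   # 800 seconds, under 15 minutes
```

With the delays and the wipe disabled, exhausting every 4-digit PIN is trivial for automated hardware; only a long alphanumeric password would have put the search out of reach.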
Why didn't Apple want to help?
Helping the FBI break into an iPhone would be undermining the security of all iPhones, Apple said.
This is just one phone. What's the big deal?
It's not just this one phone at all, Apple said. If it caved in this case, the company argued, it would be compelled to use the same technique to unlock iPhones in police custody all over the United States.
On Feb. 24, Apple's attorneys released a list of nine other cases involving 12 different iPhones of different models and configurations that law-enforcement authorities had asked Apple to help crack. None of the nine other cases involved terrorism. And on March 30, an Arkansas county prosecutor said the FBI had agreed to help unlock an iPhone and an iPod connected to a murder case.
Wasn't this case a matter of national security?
It sure seems that way. But this isn't the first time Apple had refused to unlock an iPhone. In October, it refused to do so in the Brooklyn drug case mentioned above. That didn't make many headlines, perhaps because the Justice Department chose not to publicize it.
Don't iPhones back up all data to iCloud automatically?
Yes, and Apple did provide the FBI with everything from the phone that had been backed up to iCloud. But the user can turn off automatic backups, and it appears Farook did just that more than a month before the killings. The FBI had iCloud data only up to Oct. 25, 2015.
Couldn't the County of San Bernardino Department of Public Health just have forced the phone to backup to iCloud?
Yes, normally it could have, but...
What happened with the iCloud backup?
In the rush to collect evidence from Farook's devices in the days after the shootings, someone at the FBI asked the County of San Bernardino Department of Public Health to reset the iCloud password. It did, and the reset made further automatic backups impossible.
Couldn't the encrypted data just have been copied off the phone, and then decrypted?
No. Beginning with iOS 8, the encryption key is generated from information unique to the device as well as from the passcode. Pull the encrypted data off the phone, and you won't be able to decrypt it on other hardware.
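Here is a conceptual sketch of that entanglement. PBKDF2 stands in for Apple's actual, undisclosed key-derivation scheme, and the device identifiers are made-up placeholders; the point is only that the same passcode yields different keys on different hardware:

```python
import hashlib

def derive_key(passcode: str, device_uid: bytes) -> bytes:
    """Mix the passcode with a hardware-unique secret ("UID") so the
    resulting key cannot be recomputed off-device. PBKDF2 is a stand-in
    for Apple's real derivation, which is not public."""
    return hashlib.pbkdf2_hmac("sha256", passcode.encode(), device_uid, 100_000)

# Hypothetical hardware UIDs, for illustration only.
key_on_phone = derive_key("1234", b"uid-of-one-device")
key_elsewhere = derive_key("1234", b"uid-of-another-device")
assert key_on_phone != key_elsewhere   # same passcode, different key
```

This is why copying the ciphertext onto a fast computer doesn't help: every guess still has to be checked against a key that only the original phone's hardware can produce.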
What if Farook used a long alphanumeric passcode?
Unless he used one of the 10,000 or so most common passwords (and lists of those do exist), the FBI would have been unlikely to get in. But since we don't know how the ultimately successful method worked (the FBI has classified it as secret), it's possible the passcode was bypassed altogether.
Could Apple have disabled the safeguards?
Yes. On an iPhone 5c, it definitely could have. Apple could have put the phone into Device Firmware Upgrade (DFU) mode and loaded custom software, written specially for the purpose, that disabled the incorrect-passcode safeguards.
Was the FBI asking Apple to create an encryption master key or 'backdoor'?
Technically, not in this case. Apple wouldn't have been breaking this phone's encryption. Instead, it would have removed obstacles that stood in the way of the FBI trying to "brute force" the passcode by trying every possible numerical PIN.
But Apple says that distinction was irrelevant, because it was asked to undermine the security of its own products.
Why couldn't the FBI have just run that specially designed software by itself?
Because an iPhone will run only software that's "signed" with Apple's secret signing key, which Apple shares with no one. It's possible that the successful method somehow managed to install new firmware without Apple's signature -- that's essentially what "jailbreaks" of older versions of iOS have done.
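In outline, the check works like the sketch below. A real iPhone uses public-key cryptography rather than the shared-secret HMAC shown here, and the names and secret are hypothetical; the sketch only illustrates why firmware without a valid signature is refused:

```python
import hashlib
import hmac

# Hypothetical stand-in for Apple's signing secret; in reality Apple
# uses a private key and the phone verifies with a public key.
APPLE_SIGNING_SECRET = b"known-only-to-apple"

def sign(firmware: bytes, secret: bytes) -> bytes:
    """Produce a signature over the firmware image."""
    return hmac.new(secret, firmware, hashlib.sha256).digest()

def phone_will_run(firmware: bytes, signature: bytes) -> bool:
    """The phone recomputes the expected signature and compares."""
    expected = sign(firmware, APPLE_SIGNING_SECRET)
    return hmac.compare_digest(expected, signature)

fw = b"unlock-tool-v1"
assert phone_will_run(fw, sign(fw, APPLE_SIGNING_SECRET))   # Apple-signed: runs
assert not phone_will_run(fw, sign(fw, b"someone-elses-key"))  # rejected
```

This is why the FBI couldn't simply write and install the unlocking tool itself: without Apple's secret, no firmware it produced would pass the phone's signature check.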
Could hackers use the software the FBI wanted to break into iPhones?
Theoretically not, because they would need Apple's secret digital signature. But iOS 9.1, the latest version of iOS that Farook's iPhone could have been running on Dec. 2, 2015, was indeed jailbroken on March 11, 2016, by the Chinese hackers known as Pangu Team.
Can Apple disable the incorrect-passcode safeguards on other iPhones?
It's not completely clear if Apple could do this on an iPhone 5s, 6, 6 Plus, 6s or 6s Plus. Those phones have a Secure Enclave that's tied into the fingerprint reader and is difficult to tamper with without erasing the entire phone. But some experts say it can be done.
However, Apple is rumored to be working on an iPhone that no one, not even its own technicians, could hack.
The Apple Store in midtown Manhattan.
What was Apple's argument against helping the FBI?
Apple said that being compelled to write software to defeat its own security would violate Apple's First Amendment right to free speech and its Fifth Amendment right to due process.
Furthermore, it argued that creating software to break into the iPhone would be like asking a manufacturer of safes to create a special key that could unlock a specific safe.
From there, Apple argued, it wouldn't be long before Apple would be required to create a master key, or "backdoor" that could be used to break into any iPhone. The existence of such a backdoor would undermine the security of its products.
How did the First and Fifth Amendments fit into this?
Years ago, it was established in federal courts that software is a form of speech. Like the language in a book or newspaper, software's value lies in the information that it communicates. By that argument, Apple is protected from being compelled to communicate information that it does not wish to communicate, and from communicating information that would contradict previous communications — such as its security software.
What about the Fourth Amendment right against unreasonable search?
That would only apply if the phone's owner had been opposed to the search. In this case, the County of San Bernardino Department of Public Health had given its consent.
Has Apple helped police get into locked iPhones before?
Yes, at least 70 times. Until iOS 8, police could just bypass the lockscreen with special equipment, but after that, they needed Apple's help. One report said that police departments simply sent locked devices to Apple, and received a file with all the extracted data a few weeks later.
I've heard this involves something called the All Writs Act. What's that?
The All Writs Act is a law, initially dating from 1789, but most recently amended in 1911, that allows a federal judge to order specific actions on a case-by-case basis if no applicable law governing the case exists. It's what the U.S. government has been using to get Apple to unlock phones.
Wait, so a judge can order anyone to do anything using the All Writs Act?
Not exactly. The order can't compel anyone to do anything illegal. But it provides a means by which a judge can make things happen when there's no particular law governing a case.
THE BROOKLYN DRUG CASE
What happened in the Brooklyn drug case?
In October 2015, Federal Magistrate Judge James Orenstein asked whether Apple had any objections to what the U.S. Justice Department had asked him to do — to compel Apple to unlock a drug dealer's iPhone. (The drug dealer resides in Queens, but the federal court that covers Queens and Long Island is in Brooklyn.)
Until then, Apple thought it had no choice but to comply with such orders. But after Orenstein raised the question of whether Apple wanted to, or had to, comply, the company stopped complying with requests to break into iPhones.
Orenstein asked Apple to give him its objections, which the company did in February. On Feb. 29, Orenstein ruled in favor of Apple, saying that the All Writs Act does not grant the U.S. government the power to order Apple around.
What is the Brooklyn judge's problem with the All Writs Act?
Judge Orenstein believes the government is abusing the All Writs Act to get whatever it wants, whenever it wants, and that its interpretation of the act may be unconstitutional. The New York Times says he's called it a "Hail Mary play" that the government turns to when it runs out of other options.
In his Feb. 29 ruling (which has no bearing beyond the Eastern District of New York), Orenstein wrote: "The implications of the government's position [on the All Writs Act] are so far-reaching — both in terms of what it would allow today and what it implies about Congressional intent in 1789 — as to produce impermissibly absurd results."
Did the drug dealer go free?
No, he pleaded guilty even without his phone being unlocked. But the FBI says there might still be information on the phone that could be useful in other cases.
So if the Brooklyn judge hadn't raised that question, this whole terrorist iPhone case might not be a problem?
Maybe not. But this standoff between Apple and the FBI, and more generally between Silicon Valley and Washington, D.C., has been brewing for a long time. The Justice Department simply decided to bring it to a head when it had a locked iPhone in a case involving terrorism.
Could authorities in other countries force Apple to unlock iPhones?
Possibly, and that's what Apple is really worried about. If it did this for the U.S. government, the company argued, then China or repressive Middle Eastern countries could force Apple to cooperate with authorities in those countries — and ban sales of iPhones if it didn't play ball.
Do other technology companies agree with Apple?
Many do. Facebook's Mark Zuckerberg and Google CEO Sundar Pichai both spoke out in support of Apple's position. Microsoft's top lawyer said his company would file a legal brief in support of Apple, and indeed Microsoft, Facebook, Google, Twitter and many other companies joined that brief.
Could the NSA have hacked the iPhone? If so, then why didn't it?
It's possible that the NSA could have broken into the iPhone. We don't know, and we also don't know whether the FBI asked the NSA for help in this case.
Why didn't President Obama get involved?
Obama made a few general public remarks about encryption in early March, but didn't directly address the case. But it's noteworthy that the White House was more receptive to Silicon Valley's arguments for encryption when Eric Holder was the U.S. attorney general. According to the New York Times, the pendulum has swung back in the FBI's favor now that Loretta Lynch has replaced Holder.
Where will this all lead?
During the month that the issue was unresolved, there appeared to be two possible legal outcomes. The more likely one was that the case would have been appealed all the way to the Supreme Court. The Supreme Court would normally have rendered the final judicial decision, but if the justices split evenly (possible since the death of Justice Antonin Scalia), then the most recent previous ruling by a lower court would have stood.
The other possibility was, and still is, that Congress creates a law, and the president signs it. The law could compel device makers to help police whenever possible, shield them from having to cooperate, or ban the creation of devices that no one can break into.
Thanks to Jeanna Bryner, Henry Casey, Andrew Freedman, Marshall Honorof, Jeanette Mulvey and Adam Uzialko for suggesting questions.