iPhone 'Back Door' Is Slippery Slope That Could Save Lives

In the middle of Apple CEO Tim Cook's impassioned letter explaining why he doesn't want to give the FBI a back door to the iPhone to investigate one of the San Bernardino shooters, he states that Apple has "no sympathy for terrorists." At first, I glossed over this odd and obvious line. Then, I realized why he included it. 

Some people are characterizing Apple's refusal to hand over a firmware update to break the iPhone's encryption as choosing to protect the rights of terrorists over pursuing justice and gaining insight that could aid future investigations. Sure enough, one of the very first comments on a Washington Post article reporting on Apple's stance says, "So, Tim Cook and Apple support terrorism. It's absurd to think that Apple cannot unlock the one phone without compromising any other Apple user."

(Image credit: Lowe Llaguno / Shutterstock)

You can expect a lot more of that kind of rhetoric in response to Cook's letter, not just from the public but from politicians and pundits. Donald Trump has already sided against Apple, telling Business Insider, "To think that Apple won't allow us to get into her cellphone? Who do they think they are? No, we have to open it." 

But lost in the battle between those who are standing up for privacy at all costs and those who will scream that Apple is anti-American is the debate we should really be having: Will giving the FBI access to evildoers' devices possibly save lives? And is such access worth the risk to privacy?

The FBI has spent two months trying, and failing, to unlock the iPhone 5c used by Syed Rizwan Farook, who, along with his wife, opened fire at a holiday party at the Inland Regional Center in December, killing 14.

According to Apple, the FBI wants the company to create specialized software that, once installed on the device, would override the feature that erases the phone's data after 10 failed passcode attempts. There are two main reasons Cook doesn't want to comply.

First, he says, creating such a tool would "undermine decades of security advancements that protect our customers — including tens of millions of American citizens — from sophisticated hackers and cybercriminals." Second, Apple objects to the government's expansion of its powers under the All Writs Act, which Cook says would give it the "power to reach into anyone's device to capture their data" and eventually "intercept your messages, access your health records or financial data, track your location, or even access your phone's microphone or camera without your knowledge."
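
To make the technical sticking point concrete, here is a minimal sketch of how a 10-tries-and-wipe policy works in principle. It is an illustration only, not Apple's actual iOS code; the names (PasscodeGuard, tryUnlock, wipeDevice) are hypothetical, and on a real iPhone erasure works by destroying the encryption keys that protect the data rather than by overwriting it.

    // Illustrative sketch only: not Apple's iOS implementation.
    // All names here (PasscodeGuard, tryUnlock, wipeDevice) are hypothetical.
    struct PasscodeGuard {
        let attemptLimit = 10               // erase data after 10 failed tries
        private(set) var failedAttempts = 0
        private let storedPasscode: String

        init(passcode: String) {
            self.storedPasscode = passcode
        }

        mutating func tryUnlock(with guess: String) -> Bool {
            if guess == storedPasscode {
                failedAttempts = 0          // success resets the counter
                return true
            }
            failedAttempts += 1
            if failedAttempts >= attemptLimit {
                wipeDevice()                // the step the FBI wants disabled
            }
            return false
        }

        private func wipeDevice() {
            // A real device would destroy its encryption keys at this point,
            // making the stored data permanently unreadable.
            print("Device erased after \(attemptLimit) failed attempts")
        }
    }

The software the FBI is requesting would, in effect, keep that final step from ever firing, so investigators could guess passcodes without limit.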

There's no question that the tools for creating a back door to the iPhone could "fall in the wrong hands," as Cook says, but it's even worse for Apple to sit on its hands while potentially invaluable information lies hidden behind a passcode.

Do I want my highly sensitive personal information in the wild? Absolutely not. But I'm willing to take that chance if it means one less person gets killed because we didn't do everything we could to learn more about the people whose mission is to destroy us.

In its defense of Apple's stance, the Electronic Frontier Foundation makes a very good argument for Cook not to cave in to the government's demands. "Even if you trust the U.S. government, once this master key is created, governments around the world will surely demand that Apple undermine the security of their citizens as well," the organization wrote. There is precedent for that kind of pressure: Google and other companies have faced sweeping censorship demands in China.

But that's not a good enough reason to refuse to cooperate with the FBI in this specific case. Apple and other companies must find a way to balance our right to privacy with our right to live.

Imagining where this slippery slope could go, Ahmed Ghappour, a professor at the University of California's Hastings College of the Law, posed the following question to The Washington Post: Could the government "compel Facebook to customize an algorithm that predicts crime?"

I wouldn't want Facebook or anyone else to be compelled to do that. But I want more tech companies to volunteer: to be more proactive in identifying potential threats and to work with law enforcement to prevent tragedies, instead of trying to crack open black boxes after the damage is done. That's a slippery slope I can live with.