If The FBI Can Make Apple Open Syed Farook’s iPhone, It Can Do Anything

The government wants Apple to make a hack-friendly iOS to combat terrorism, but it will open the door for unprecedented levels of cyber terrorism.
By Brandon Morse

On February 16, California Judge Sheri Pym issued a court order to Apple, Inc. to create new software that would allow the FBI to bypass the safety protocols in iPhones that force them to lock down or wipe memory after 10 failed passcode attempts. The FBI needs it to gain access to a phone that belonged to Syed Farook, one of the San Bernardino shooters.

Apple CEO Tim Cook issued a public letter to customers from the Apple website not only stating the government’s intentions, but also declaring Apple’s resistance to the FBI’s request to create such a program.

“We have great respect for the professionals at the FBI, and we believe their intentions are good,” states Cook. “Up to this point, we have done everything that is both within our power and within the law to help them. But now the U.S. government has asked us for something we simply do not have, and something we consider too dangerous to create. They have asked us to build a backdoor to the iPhone.”

The FBI has asked Apple to create a master key for government bodies to use that would allow them to break user encryption by entering number combinations in rapid succession without the phone wiping itself. This gross overreach of power has even gotten the attention of the Electronic Frontier Foundation (EFF), which has taken Apple’s side on the issue.

“We are supporting Apple here because the government is doing more than simply asking for Apple’s assistance,” said the EFF in a statement on its site. “For the first time, the government is requesting Apple write brand new code that eliminates key features of iPhone security—security features that protect us all. Essentially, the government is asking Apple to create a master key so that it can open a single phone. And once that master key is created, we’re certain that our government will ask for it again and again, for other phones, and turn this power against any software or device that has the audacity to offer strong security.”

The EFF also plans on submitting an amicus brief in support of Apple resisting the government order.

The FBI’s Demand Isn’t Simple Terrorism Fighting

I’m more afraid of my own government than I am of any terrorist. I can shoot at ISIS when they decide to get forceful with me. Al-Qaeda can’t legally imprison me for resisting them. This is not the case with the American government, which can confiscate your money for mere suspicion of trafficking and use violence against you if you resist. The idea that the government could brute-force my phone into giving up my location, texts, and turning on my microphone and camera to gain information—or, more likely, proof of their suspicions of me—seems more dangerous to my personal security than a terrorist I can shoot back at.

Second, I don’t believe the FBI asked Apple to make this software just to force its way into this one phone. If the agency merely wanted that phone’s data, it could have gotten it by asking, as Apple has helped in similar instances in the past.

You don’t ask for a master key because you want access to one lock. You ask for a master key because you want access to all locks. The very nature of this master key is that it unlocks any phone the FBI wants access to. The pretense that it’s strictly the San Bernardino shooter’s information that they want to acquire by creating this new hack-friendly operating system—coined the “FBiOS” by National Review’s Kevin Williamson—seems like weak reasoning at best.

Furthermore, this operating system creates too many security issues. Eliminating the consequences of entry failure defeats the purpose of passcode security. Any Aiden Pearce wannabe with the capability to brute-force a passcode would be a security nightmare to customers and providers alike. The government wants this operating system to combat terrorism, but it will, in effect, open the door for unprecedented levels of cyber terrorism.
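To make concrete why the attempt limit is the whole ballgame, here is a toy sketch in Python. The class and passcode below are invented for illustration; this is not Apple’s actual implementation, just a minimal model of a four-digit lock with and without the wipe-after-ten-failures safeguard.

```python
import itertools

class ToyPasscodeLock:
    """Toy model of a 4-digit passcode lock (illustrative, not Apple's design)."""

    def __init__(self, passcode, max_attempts=10):
        self._passcode = passcode
        self.max_attempts = max_attempts  # None models the safeguard removed
        self.failed = 0
        self.wiped = False

    def try_code(self, guess):
        if self.wiped:
            return False
        if guess == self._passcode:
            return True
        self.failed += 1
        # The protection at issue: wipe the device after too many failures.
        if self.max_attempts is not None and self.failed >= self.max_attempts:
            self.wiped = True
        return False

def brute_force(lock):
    """Try every 4-digit code in order; trivial once the limit is gone."""
    for digits in itertools.product("0123456789", repeat=4):
        code = "".join(digits)
        if lock.try_code(code):
            return code
        if lock.wiped:
            return None  # data destroyed before the search could finish
    return None

# With the 10-attempt wipe in place, brute force destroys the data first:
print(brute_force(ToyPasscodeLock("7391")))                      # None
# With the safeguard removed, all 10,000 codes fall in milliseconds:
print(brute_force(ToyPasscodeLock("7391", max_attempts=None)))   # 7391
```

The point of the sketch: the passcode space itself is tiny (10,000 combinations for four digits), so the wipe limit, not the passcode, is what makes the lock meaningful. Software that disables the limit converts every passcode into a solved problem.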

Remember ‘Innocent until Proven Guilty’?

Like Cook, I have no sympathy for terrorists. In the San Bernardino case, we have a confirmed terrorist, and access to his phone is more than warranted. However, Farook is one man with one phone that Apple could help the FBI investigate. What the government is asking for is access to all phones. I can’t help but feel this is the government saying that every citizen is a potential Syed Farook, and that all phones should be easily accessed…just in case.

In this situation, “just in case” denotes the mere suspicion that a citizen might be a terrorist. The trouble is that “terrorist” is defined rather loosely, and sometimes inaccurately in today’s world. It wasn’t long ago that Janet Napolitano’s Department of Homeland Security was vaguely defining Tea Partiers and veterans as likely terrorists. Mere suspicion has also put innocent civilians on the “no fly list.”

If suspicion is enough to violate the Fifth Amendment rights of American citizens without trial regarding travel, then it’s certainly enough to give bureaucrats reason to utilize the master key against citizens they suspect…just in case. Some would say this cannot be done without a warrant, but they should remember that the government is no stranger to warrantless spying. Just ask the National Security Agency, which engaged in warrantless surveillance at least from 2001 to 2007.

Wisdom says that if someone has abusable power, expect them to abuse that power. The government is no exception, and has proven repeatedly to be the rule.

Most important in Apple’s resistance is its refusal to set a precedent. That a corporation or business could be court-ordered to create a product that specifically caters to the needs of government entities would be a massive overreach officials would undoubtedly abuse for years to come. In this specific case, Apple’s success in resisting the government isn’t just a security issue, it’s a matter of how much further government can push itself into the private sector. This isn’t about one iPhone, this is about all iPhones, and possibly more than that.

Brandon writes for The Federalist, and is front page editor at RedState.com. Direct all hate to @TheBrandonMorse on Twitter.