Apple's refusal to unlock terrorist's phone exposes rift with FBI

An iPhone is seen in Washington, Wednesday, Feb. 17, 2016. A U.S. magistrate judge has ordered Apple to help the FBI break into a work-issued iPhone used by one of the two gunmen in the mass shooting in San Bernardino, California, a significant legal victory for the Justice Department in an ongoing policy battle between digital privacy and national security. (AP Photo/Carolyn Kaster)

Privacy advocates have applauded Apple CEO Tim Cook's decision to oppose a court order demanding that the company help the FBI access data on a terrorist's phone, but cybersecurity experts worry the defiance could set a dangerous precedent for future investigations.

A federal magistrate judge ordered Apple on Tuesday to disable a feature on San Bernardino terrorist Syed Farook's work iPhone that would wipe its data after 10 incorrect attempts to enter a password. Investigators have been able to access a cloud-based backup of some of Farook's information, but that data had not been updated since October, so they believe he may have deliberately turned off automatic backups in the weeks before the December 2 attack.

FBI Director James Comey revealed last week that after two months of investigation, officials have been unable to break the encryption on Farook's phone. If the auto-erase function were deactivated, the FBI could use brute force to test passcode combinations until it found the right one.
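The arithmetic behind that brute-force approach can be sketched briefly. The guess rate below is an assumption for illustration only, loosely based on public reporting about delays iOS imposes on each passcode attempt, not a figure from the case:

```python
# Illustrative sketch (not Apple's actual implementation): why the FBI wants
# the auto-erase limit disabled. With only 10 tries allowed, guessing is
# hopeless; without the limit, a short numeric passcode falls to brute force.

def seconds_to_exhaust(digits: int, guesses_per_second: float) -> float:
    """Worst-case time to try every numeric passcode of the given length."""
    keyspace = 10 ** digits  # e.g. 10,000 possible codes for a 4-digit PIN
    return keyspace / guesses_per_second

# Assumed rate: roughly 12.5 guesses per second (about 80 ms per attempt).
for digits in (4, 6):
    hours = seconds_to_exhaust(digits, guesses_per_second=12.5) / 3600
    print(f"{digits}-digit PIN: {hours:.1f} hours worst case")
```

At that assumed rate, a 4-digit PIN falls in well under an hour, while a 6-digit PIN takes about a day, which is why removing the 10-attempt wipe is the decisive step rather than the speed of guessing itself.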

Cook posted a lengthy open letter explaining why Apple is opposed to assisting with this process, calling it "overreach" by the government that would inevitably put other customers' data at risk.

"We are challenging the FBI's demands with the deepest respect for American democracy and a love of our country," he wrote.

NYPD Commissioner Bill Bratton said Wednesday that Apple customers face much greater risk from terrorist attacks than they do from government overreach.

"This is the crux of the issue. We need to get this issue resolved, the profit motive under the guise of protecting the interest of their customers over the interest of government to protect the lives of those customers," Bratton said at a press conference.

Cook's position is not surprising, given the privacy concerns of his customers, said David Gomez, a retired FBI executive and a senior fellow at the Center for Cyber and Homeland Security at George Washington University.

"Apple as a business, as a company has a responsibility to their customers and shareholders to take this position," he said. "It's part of the essence of who they are as a business, so they have to fight the fight."

With Farook dead and the phone belonging to the county that employed him, Gomez sees no specific personal privacy concerns in this scenario. From a legal perspective, it seems Apple's interest is primarily in protecting its software, so the question is whether that is a legitimate reason to deny the court's order.

"[Apple's] argument is, how can we trust the government?" Gomez said. "The interesting thing is I'm not sure that's a legal argument."

The letter does not indicate that Apple is incapable of doing what the court is asking, according to Morgan Wright, a senior fellow at the Center for Digital Government and a cybersecurity analyst.

"He's basically saying they don't want to do it," Wright said. Technicians would need to develop a parallel operating system specific to Farook's phone that bypasses the auto-erase feature. Some experts believe that is possible.

"This is not a technological issue," Wright explained. "This is a moral issue, a political issue, and a legal issue."

As a political and legal issue, though, digital privacy groups are standing by Apple because they worry about implications for other cases.

"If the order stands, Apple and other technology companies could be ordered to build backdoors - essentially defects - into other devices, rendering them insecure and vulnerable to attack by law enforcement and by others as well. We will fight against this result," said Greg Nojeim of the Center for Democracy & Technology in a statement.

According to the Electronic Frontier Foundation, the government is asking Apple to create new software that subverts its own security measures and it wants the public to trust that it will never be misused.

"Essentially, the government is asking Apple to create a master key so that it can open a single phone," the foundation said in a statement. "And once that master key is created, we're certain that our government will ask for it again and again, for other phones, and turn this power against any software or device that has the audacity to offer strong security."

The organization also warned that if the U.S. government can demand that a company reveal users' private data, other countries like China and Russia may try the same thing.

"The @FBI is creating a world where citizens rely on #Apple to defend their rights, rather than the other way around," NSA whistleblower Edward Snowden wrote on Twitter.

Many other Twitter users also praised Cook's stance.

Republican presidential candidate Donald Trump blasted Apple's decision, though.

"Who do they think they are? They have to open it up," Trump said on Fox News Wednesday.

Apple would argue it is not so easy to just "open it up." Cook insisted in his letter that what the government is asking for would essentially be a "master key" to sidestep encryption on its devices, which is "something we consider too dangerous to create."

The conflict revives an ongoing debate that has gotten more intense as encryption methods have become more complex. Law enforcement agencies have repeatedly urged technology companies to provide backdoors to their systems to assist with criminal and terrorism investigations.

Companies like Apple have resisted such demands both out of a concern for user privacy and a fear that any shortcuts created for authorities would also be easily exploited by hackers.

"If there's a way to get in, then somebody will find the way in," Cook explained in a recent "60 Minutes" interview. "There have been people that suggest that we should have a backdoor. But the reality is if you put a backdoor in, that backdoor's for everybody, for good guys and bad guys."

In that interview, Cook rejected the notion that the encryption debate is about a trade-off between privacy and national security, suggesting that it should be possible to protect both in America. Finding that balance is something the FBI has struggled with as encryption becomes more sophisticated, though.

"The FBI's been dealing with this for a while," Gomez said.

Counterterrorism officials have frequently expressed fears that terror suspects will "go dark" and rely on encrypted communication technology that authorities cannot access, even with a warrant. The Farook case presents a unique example of how that challenge can manifest itself, and the difficult position it puts private companies in.

However the case is resolved, the broader challenge of terrorists going dark may change the way intelligence and law enforcement agencies operate.

"When you're faced with a high-tech problem, you got to go low-tech," Wright said, emphasizing the importance of human intelligence.

In this particular case, the suspect is dead, but generally he said getting a passcode is no different from getting a confession. It requires good human sources and effective investigative techniques.

"When you get to the point where we can't break the encryption, then we're back to social engineering," Wright said.

Gomez similarly argued that, in a typical situation, investigators could focus on individuals, since "the weakest link is the human being," but the San Bernardino case is "kind of a perfect storm" where that is not an option.

If Apple continues to fight the order, the case may take years to work its way to the Supreme Court, but Gomez expects the situation will somehow be resolved before it gets to that point.

"They'll have to figure out a workaround," Gomez said, "and what that workaround is, I don't know."

"There's a way for the public sector and the private sector to work together to solve this problem," Wright said.

He suggested a possible compromise would be for Apple to take the phone, use the software to access it, and provide the data to the FBI without ever giving the government direct access to the program it uses.

However, Gomez said that scenario would create chain of custody issues for a potentially vital piece of evidence. The FBI may allow Apple technicians to install the software in their presence, but he does not think investigators would give up custody of the device.

"A deal can be made that everybody's happy with," he said.

The stakes are larger than this one case, though. If the government fails to get this data from Apple, it will be a clear signal to terrorists and other criminals who rely on digital communications that this is a blind spot in law enforcement's abilities.

"These guys will know that there's no way for the government to get hold of their data," Wright explained. A phone will become a weapon for terrorists, child molesters, and gangs, and the threat that presents will only grow as encryption continues to evolve.

"The government will always be a step behind the engineers," Gomez said, because developing these technologies takes time and money law enforcement does not have.

He is confident authorities can eventually find a way around any security measure if they have enough time, but criminals will then move on to new methods of encryption.

"There's nothing that's unbreakable in terms of encryption...Ultimately, history has shown that everything is decipherable," Gomez said.
