Apple CEO Tim Cook: Backdoor To iPhones Would Be Software Equivalent Of Cancer
"Some things are hard and some things are right. And some things are both," Apple CEO Tim Cook said during a Wednesday night interview on ABC News' World News Tonight with David Muir. "This is one of those things," he said, doubling down on the company's refusal to create a way for the FBI to access data on the iPhone of one of the San Bernardino shooters.
Last week, a federal judge ordered Apple to help the FBI crack into the iPhone of Syed Rizwan Farook who, along with wife Tashfeen Malik, killed 14 people and wounded 22 others in December. As the Two-Way reported, shortly after government officials obtained the iPhone Farook used, a San Bernardino County employee working with federal authorities reset the password for its iCloud account — meaning the phone could no longer perform an automatic wireless backup that could have enabled Apple to recover information.
In the interview, Cook called this a crucial mistake, saying there is now only one way to get information from the phone.
"The only way to get information — at least currently, the only way we know — would be to write a piece of software that we view as sort of the equivalent of cancer. We think it's bad news to write. We would never write it. We have never written it — and that is what is at stake here," Cook said. "We believe that is a very dangerous operating system."
The government has said that the software key would be limited in scope, but Cook rejected that characterization.
"This case is not about one phone. This case is about the future," Cook said. "If we knew a way to get the information on the phone — that we haven't already given — if we knew a way to do this, that would not expose hundreds of millions of other people to issues, we would obviously do it. ... Our job is to protect our customers."
Following the federal magistrate's ruling, Cook posted a statement on Apple's website in which he argued that the government was effectively ordering Apple to put its customers at risk by compromising their privacy. "We can find no precedent for an American company being forced to expose its customers to a greater risk of attack," Cook wrote.
But that's not exactly true, NPR tech reporter Aarti Shahani says.
As Shahani reports on Morning Edition, in the 1990s federal regulators blocked Microsoft from selling software with advanced encryption abroad because it could have a military application. Shahani says Microsoft altered its software to satisfy regulators' concerns.
"Microsoft had to basically weaken its product so they could enter foreign markets," Shahani says. "That definitely impacted privacy, exposed customers to greater risk."
On Sunday, FBI Director James Comey made his case in a blog post on the Lawfare website, writing:
"We don't want to break anyone's encryption or set a master key loose on the land. ... Maybe the phone holds the clue to finding more terrorists. Maybe it doesn't. But we can't look the survivors in the eye, or ourselves in the mirror, if we don't follow this lead."
But Cook contends that creating a way around the encryption would put hundreds of millions of people at risk and "trample on civil liberties."
"Our smartphones are loaded with our intimate conversations, our financial data, our health records. They're also loaded with the location of our kids in many cases. It's not just about privacy, it's also about public safety," Cook said. "No one would want a master key built that would turn hundreds of millions of locks ... that key could be stolen."
Cook also said that he would be speaking with President Obama about the issue, and that he would be willing to fight the government's order all the way to the Supreme Court.
Copyright 2020 NPR. To see more, visit https://www.npr.org.