One Justice Department official said that if the new systems work as advertised, they will make it harder, if not impossible, to solve some cases. Another said the companies have promised customers "the equivalent of a house that can't be searched, or a car trunk that could never be opened."
Andrew Weissmann, a former Federal Bureau of Investigation general counsel, called Apple's announcement outrageous, because even a judge's decision that there is probable cause to suspect a crime has been committed won't get Apple to help retrieve potential evidence. Apple is "announcing to criminals, 'use this,' " he said. "You could have people who are defrauded, threatened, or even at the extreme, terrorists using it."
The level of privacy described by Apple and Google is "wonderful until it's your kid who is kidnapped and being abused, and because of the technology, we can't get to them," said Ronald Hosko, who left the FBI earlier this year as the head of its criminal-investigations division. "Who's going to get lost because of this, and we're not going to crack the case?"

That Hosko guy apparently gets around. Here he is freaking out in the Washington Post as well:
Ronald T. Hosko, the former head of the FBI’s criminal investigative division, called the move by Apple “problematic,” saying it will contribute to the steady decrease of law enforcement’s ability to collect key evidence — to solve crimes and prevent them. The agency long has publicly worried about the “going dark” problem, in which the rising use of encryption across a range of services has undermined government’s ability to conduct surveillance, even when it is legally authorized.
“Our ability to act on data that does exist . . . is critical to our success,” Hosko said. He suggested that it would take a major event, such as a terrorist attack, to cause the pendulum to swing back toward giving authorities access to a broad range of digital information.

Think of the children! And the children killed by terrorists! And just be afraid! Of course, this is the usual refrain any time there's more privacy added to products, or when laws are changed to better protect privacy. And it's almost always bogus. I'm reminded of all the fretting and worries by law enforcement types about how "free WiFi" and Tor would mean that criminals could get away with all sorts of stuff. Except, as we've seen, good old-fashioned police/detective work can still let them track down criminals. The information on the phone is not the only evidence, and criminals almost always leave other trails of information.
No one has any proactive obligation to make life easier for law enforcement.
Orin Kerr, who regularly writes on privacy, technology and "cybercrime" issues, announced that he was troubled by this move, though he later downgraded his concerns to "more information needed." His initial argument was that since the only thing these moves appeared to do was keep out law enforcement, he couldn't see how it was helpful:
If I understand how it works, the only time the new design matters is when the government has a search warrant, signed by a judge, based on a finding of probable cause. Under the old operating system, Apple could execute a lawful warrant and give law enforcement the data on the phone. Under the new operating system, that warrant is a nullity. It’s just a nice piece of paper with a judge’s signature. Because Apple demands a warrant to decrypt a phone when it is capable of doing so, the only time Apple’s inability to do that makes a difference is when the government has a valid warrant. The policy switch doesn’t stop hackers, trespassers, or rogue agents. It only stops lawful investigations with lawful warrants.
Apple’s design change is one it is legally authorized to make, to be clear. Apple can’t intentionally obstruct justice in a specific case, but it is generally up to Apple to design its operating system as it pleases. So it’s lawful on Apple’s part. But here’s the question to consider: How is the public interest served by a policy that only thwarts lawful search warrants?

His "downgraded" concern comes after many people pointed out that by leaving backdoors in its technology, Apple (and others) are also leaving open security vulnerabilities for others to exploit. He says he was under the impression that the backdoors required physical access to the phones in question, but if there were remote capabilities, perhaps Apple's move is more reasonable.
Perhaps the best response (which covers everything I was going to say before I spotted this) comes from Mark Draughn, who details "the dangerous thinking" by those like Kerr who are concerned about this. He covers the issue above about how any vulnerability left by Apple or Google is a vulnerability open to being exploited, but then makes a further (and more important) point: this isn't about them, it's about us and protecting our privacy:
You know what? I don’t give a damn what Apple thinks. Or their general counsel. The data stored on my phone isn’t encrypted because Apple wants it encrypted. It’s encrypted because I want it encrypted. I chose this phone, and I chose to use an operating system that encrypts my data. The reason Apple can’t decrypt my data is because I installed an operating system that doesn’t allow them to.

Furthermore, he notes that nothing Apple and Google are doing now on phones is any different than tons of software for desktop/laptop computers:
I’m writing this post on a couple of my computers that run versions of Microsoft Windows. Unsurprisingly, Apple can’t decrypt the data on these computers either. That this operating system software is from Microsoft rather than Apple is beside the point. The fact is that Apple can’t decrypt the data on these computers because I’ve chosen to use software that doesn’t allow them to. The same would be true if I was posting from my iPhone. That Apple wrote the software doesn’t change my decision to encrypt.
I’ve been using the encryption features in Microsoft Windows for years, and Microsoft makes it very clear that if I lose the pass code for my data, not even Microsoft can recover it. I created the encryption key, which is only stored on my computer, and I created the password that protects the key, which is only stored in my brain. Anyone that needs data on my computer has to go through me. (Actually, the practical implementation of this system has a few cracks, so it’s not quite that secure, but I don’t think that affects my argument. Neither does the possibility that the NSA has secretly compromised the algorithm.)
Microsoft is not the only player in Windows encryption. Symantec offers various encryption products, and there are off-brand tools like DiskCryptor and TrueCrypt (if it ever really comes back to life). You could also switch to Linux, which has several distributions that include whole-disk encryption. You can also find software to encrypt individual documents and databases.

In short, he points out, the choice of encrypting our data is ours to make. Apple or Google offering us yet another set of tools to do that sort of encryption is them offering a service that many users value. And shouldn't that be the primary reason why they're doing stuff, rather than catering to the desires of FUD-spewing law enforcement folks?
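To make Draughn's point concrete, here's a minimal sketch of the pattern he's describing, written in Python with the widely used cryptography package: an encryption key that exists only on your machine, protected by a password that exists only in your head. This is an illustration under my own assumptions, not how BitLocker, FileVault, or iOS actually implements things (real products layer hardware-backed keys, secure enclaves and so on on top of this idea), and the function names are made up for the example.

```python
# Sketch: user-controlled encryption, where the vendor never holds the key.
# Requires the third-party "cryptography" package (pip install cryptography).
import base64
import os

from cryptography.fernet import Fernet
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.kdf.pbkdf2 import PBKDF2HMAC


def derive_key(password: bytes, salt: bytes) -> bytes:
    """Stretch the user's password into a Fernet key (PBKDF2-HMAC-SHA256)."""
    kdf = PBKDF2HMAC(algorithm=hashes.SHA256(), length=32,
                     salt=salt, iterations=600_000)
    return base64.urlsafe_b64encode(kdf.derive(password))


def encrypt_file(path: str, password: bytes) -> None:
    """Write path + '.enc': a random salt followed by the encrypted contents."""
    salt = os.urandom(16)  # not secret; stored next to the ciphertext
    with open(path, "rb") as f:
        token = Fernet(derive_key(password, salt)).encrypt(f.read())
    with open(path + ".enc", "wb") as out:
        out.write(salt + token)


def decrypt_file(enc_path: str, password: bytes) -> bytes:
    """Return the plaintext; raises InvalidToken if the password is wrong."""
    with open(enc_path, "rb") as f:
        blob = f.read()
    salt, token = blob[:16], blob[16:]
    return Fernet(derive_key(password, salt)).decrypt(token)


# Example (hypothetical file and passphrase):
# encrypt_file("notes.txt", b"correct horse battery staple")
# print(decrypt_file("notes.txt.enc", b"correct horse battery staple"))
```

The salt sits in the clear next to the ciphertext; the password never touches the disk and the software vendor never sees it. So anyone who wants the data has to go through the person who knows the password, which is exactly the property Draughn values and law enforcement is complaining about.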
https://www.techdirt.com/articles/20140923/07120428605/law-enforcement-freaks-out-over-apple-googles-decision-to-encrypt-phone-info-default.shtml