There are a couple of items that seem to be missing from the February 2016 debate over whether Apple should assist the FBI in hacking into the cell phone used by the deceased criminal/terrorist murderers.
Supposedly, this is a perfect case for avoiding most privacy issues: the evildoers are dead, so no privacy concerns, and the phone was owned by their employer anyway, which is a public agency. Just one problem. If you create a way to crack into this phone, you’ve essentially created a way to get into any phone. And even if you haven’t built a general backdoor, you’ve demonstrated that you can, so are you then going to face a flood of court orders demanding you keep doing it?
So let’s look at one obvious thing, but then a couple of items I think aren’t being looked at very closely just yet…
- The FBI claims the order to hack the phone is narrowly drawn to focus on ONE phone. But how does the FBI know what it takes to get in there? Apple says the only way to do it is for them to re-cast their entire operating system. There’s really no apparent way – that I can think of, in any case – that a hack would be inherently specific to one individual device.
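To make concrete why a “one phone” hack generalizes: what the FBI reportedly asked for was modified firmware that disables the passcode retry limits and the wipe-after-ten-failures feature. Once those guards are gone on any device, brute-forcing a short numeric passcode is trivial. Here’s a minimal sketch of that idea – the `check_pin` oracle and the passcode are illustrative assumptions, not Apple’s actual implementation:

```python
import itertools

def brute_force_pin(check_pin, pin_length=4):
    """Try every numeric passcode of the given length until one unlocks.

    `check_pin` is a stand-in for whatever unlock oracle an attacker
    would have once retry limits and auto-erase are disabled.
    """
    for digits in itertools.product("0123456789", repeat=pin_length):
        guess = "".join(digits)
        if check_pin(guess):
            return guess
    return None  # no passcode of this length matched

# Hypothetical device with a 4-digit passcode: at most 10,000 guesses.
secret = "7351"
found = brute_force_pin(lambda guess: guess == secret)
print(found)
```

Nothing in that loop is tied to one serial number, which is the point: the capability, once built, works against the whole product line.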
What I’m not seeing anyone really talk about…
This court order raises some other issues that haven’t gotten much focus yet.
- Government Coercion of Forced Labor. Supposedly, when the Government requires companies to comply with various expensive regulations, it pays some form of reasonable accommodation fee to help them accomplish the required tasks. But in this case, how could you possibly calculate that? You’re essentially ordering a private business to embark on a technical research project to discover what it would take to accomplish this goal, and then to implement it. How much time and money will that take? And what business opportunities will be lost by assigning those personnel to this task instead of whatever they’d otherwise be doing?
- Self Determination by Workers. Employment is at will in most states, so a company can fire you for pretty much any reason at any time. Maybe not in Government, union, or certain other kinds of work, but at most tech firms I’ve ever heard of, it’s at-will employment. So what happens if an Apple coder says s/he won’t do this work for philosophical reasons? Do they get fired? Or are they found in contempt of a court order and sent to jail? Either way, what would Apple do then? Would they have to seek out and hire additional talent to accomplish this goal?
The slippery slope here is about a lot more than just privacy violations. It’s about what level of Government coercion can be used to force Specific Performance by companies and individual workers. The State certainly has an interest in gathering evidence regarding terrorism (or any other illegal activity) and in protecting its citizens. And for that matter, I’d personally hope most would agree we’re all for our various police agencies aggressively seeking to protect us and stop the bad guys.
It’s probably true that with modern encryption, more and more bad actors are keeping their plans in digital form only, leaving little to none of the kind of evidence police agencies might have collected in the past. Police are then left with little beyond observing actual actions, which can be harder to track. Perhaps after an incident there’s physical evidence, but with a complete inability to get at any plans or communications, criminals clearly have an advantage unlike ever before.
But here’s the thing: this is where we are. This is what we’re left with. We can’t allow the U.S. Government the kind of reach it’s looking for here, because the philosophical and practical costs are too high and, as importantly, DON’T SOLVE THE PROBLEM. Even if you can get into a device, users can run apps that do their own encryption. And there will be devices made outside the U.S. that add these capabilities, regardless of any rules against deploying them here. Even if this order ends with this phone being hacked, all we will have done is move the threat down to the app level.
If this order prevails and Apple is compelled to comply, all that will happen is a major loss of end-user privacy, major new hacking risks for everyone, and likely a loss of Apple’s business overseas, as customers prefer providers that don’t cripple the encryption in their products.