Giving Police Backdoor Access To Smartphones Is An Invitation To Be Hacked

With both Android and iOS rolling out privacy updates that will make it impossible for Google or Apple to unlock a device without the user’s passcode, even with a warrant, authorities from local police to the head of the FBI to the U.S. Attorney General are saying there should be some sort of backdoor way to gain access to these devices. But what they don’t realize is that leaving in that additional point of access just makes phones more vulnerable to other forms of snooping.

Over at the Washington Post, writer Craig Timberg puts it in pretty straightforward terms, using the analogy of a windowless, doorless brick building: put any sort of opening in that structure and it’s no longer as secure.

“No matter how thick the door or tough the lock, the house is now more vulnerable to intrusion in at least three ways: The door can be battered down,” writes Timberg. “The keys can be stolen. And all the things that make doors work – the hinges, the lock, the door jamb – become targets for attackers. They need to defeat only one to make the whole system fail.”

So even if you agree with the notion that the police should, in certain warranted cases, have a way to access a smartphone without the user’s passcode, it would seem difficult to deny that this portal would be a tempting point of entry for hackers.

“It’s not just that somebody is going to use the same back door that law enforcement uses,” explains cryptology expert Matthew Blaze from the University of Pennsylvania. “It’s that introducing the back door is very likely to either introduce or exacerbate a flaw in the software.”

To go back to the brick house analogy, you could put in a sturdy, air-tight steel door, but what if putting in that door reveals a weakness in the mortar in nearby bricks?

So not only do you have a known point of entry into the structure, you now have other flaws that could be taken advantage of.

Should the police be given special keys that will unlock any house, any room, any locker, any safe deposit box?

Sure, it would make things easier for the police. It would also make things that much more tempting for thieves, who would try to duplicate those master keys — or exploit a locking system that allows for an all-access key — to aid in the commission of crimes.

In his argument earlier this week for backdoors, outgoing Attorney General Eric Holder claimed that “It is fully possible to permit law enforcement to do its job while still adequately protecting personal privacy.”

The AG resorted to playing the kid card to get his point across.

“When a child is in danger, law enforcement needs to be able to take every legally available step to quickly find and protect the child and to stop those that abuse children,” he said, without actually explaining how having backdoor access to a smartphone would aid any of this. “It is worrisome to see companies thwarting our ability to do so.”

Thing is, the decision by Apple and Google to batten down the hatches on personal privacy does little to prevent law enforcement agencies from catching criminals. It only protects data stored on the phone itself. Police can still get warrants for wiretaps, and can still compel wireless providers and app companies to hand over transmitted or remotely stored data.

So if a child pornography ring is e-mailing or texting photos to each other, or storing those images on the cloud or on a remote server, that’s all still available to police.

We live in a world where a trip to the hardware store, or the bank, or the sandwich shop, or the beauty supply store can result in you having your personal and financial information stolen and sold to anyone willing to pay for it. This is not the time to be giving anyone — police or criminals — additional points of access to our data.
