There’s a lot of talk going on right now in Silicon Valley about the case the FBI and Apple are grappling over: an attempt to unlock the iPhone 5C of Syed Farook, the American-born terrorist who, along with his Pakistani-born wife, killed 14 people in San Bernardino, California.
At the heart of the case is a request by the FBI, now backed by a court order, compelling Apple to help the FBI unlock this device. As you know, iPhones can be locked with a numeric passcode, and after too many attempts to unlock the device with the WRONG code, iOS can erase the device’s contents entirely. This is a built-in protective measure. The FBI is specifically requesting that Apple write NEW software to override these built-in protections, so that the FBI can try millions of different passcodes without erasing the contents of the device.
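To make the mechanics concrete, here is a toy sketch of the kind of passcode gate described above: a counter of failed attempts that, past a limit, wipes the device. Every name and number here is illustrative, not Apple’s actual implementation, and the real iOS logic (escalating delays, hardware-backed counters) is far more involved.

```python
MAX_ATTEMPTS = 10  # illustrative threshold; the real iOS behavior is more complex


def try_unlock(entered_code, correct_code, failed_attempts, erase_enabled=True):
    """Toy model of a passcode gate with an erase-after-N-failures policy.

    Returns a (status, failed_attempts) pair. This is a hypothetical
    sketch of the concept, not Apple's code.
    """
    if entered_code == correct_code:
        return "unlocked", 0  # success resets the failure counter
    failed_attempts += 1
    if erase_enabled and failed_attempts >= MAX_ATTEMPTS:
        return "erased", failed_attempts  # contents wiped; brute force fails
    return "locked", failed_attempts
```

In these terms, the FBI’s request amounts to a version of iOS where the `erase_enabled` branch (and the delays between attempts) can be switched off, so unlimited guessing becomes possible.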
The first issue, of course, is why the FBI can’t figure out how to get into this device on its own. You could probably throw a rock in Silicon Valley and hit a dozen entry-level hackers who could circumvent the passcode to get into the device. How does the FBI not have 10 of these people on staff already?
Sure, the nature of this attack was awful and would get even your garden-variety patriot up in arms about getting to the bottom of the crime, especially finding out whether anyone else was involved. It makes perfect sense, from an investigative perspective, to see what information that phone holds.
However, I’m not sure that’s Apple’s responsibility. And as the Tim Cook-penned response indicated, complying would set a dangerous precedent about privacy and the reach of government. He wrote:
“Opposing this order is not something we take lightly. We feel we must speak up in the face of what we see as an overreach by the U.S. government.
We are challenging the FBI’s demands with the deepest respect for American democracy and a love of our country. We believe it would be in the best interest of everyone to step back and consider the implications.”
But more than all that lofty ideology, this seems like a pretty cut-and-dried case of “it’s not our problem.”
In some sense, I think the FBI is confusing Apple with other tech giants like Google or Facebook. Those companies DO have oodles of information about their users – usernames, passwords, location history and much more. But this is not the business Apple is in. Not by a long shot. They make their money by selling devices.
And that’s what I think this discussion hinges on. All Apple did was sell a device to a consumer. What that consumer subsequently did with it is his business. Now, if that device was used in a horrible crime (and that’s not an established fact, only an angle being pursued), it certainly makes sense to see what information is on there. But how that is Apple’s responsibility seems, at the very least, confusing – if not downright dangerous.
From a brand perspective, this is a high-stakes game for Apple. If they capitulate (or are forced to), the floodgates are likely to open – expect every country around the world (especially countries with high iPhone penetration, like China and India) to issue similar orders to the company. And then expect similar orders to flood the executive offices of every technology company, social media company, e-commerce company and so on.
If Apple holds firm here, they could come off looking like heroes of privacy and standard-bearers for the boundary between technology and civics. But at what cost? Will there be a public backlash against the brand for “not cooperating in the war on terror”? Or will public sentiment favor the right to privacy that Apple stood steadfast to uphold?
This is dicey, indeed. And if I’m a brand manager for Apple, I would be constructing about 17 different contingency plans. Let’s stay tuned.