Privacy, patriotism or profits? Apple’s fight with FBI goes deeper than iPhone encryption
Cybersecurity community weighs in on implications for consumer trust in data security
By Byron Acohido, ThirdCertainty
Apple CEO Tim Cook is something of a Johnny-come-lately to a campaign that America’s most prominent tech companies have been waging for a while now—posturing to be viewed as champions of consumer privacy.
Cook this week drew a ton of attention to Apple’s resistance to a court order requiring the company to help the FBI decrypt data on the iPhone of one of the shooters in the December 2015 terrorist attack in San Bernardino, Calif.
What we are seeing unfold is the continuation of the mad scramble—led by Google, Microsoft, Yahoo and Facebook—to distance the U.S. tech sector from anything to do with U.S. government surveillance. America’s top tech companies have been in scramble mode ever since whistle-blower Edward Snowden outed the NSA’s Prism surveillance program in the summer of 2013.
Prism, you’ll recall, is the NSA’s covert operation to systematically drink from the rivers of consumer behavioral data flowing through our daily search queries, email, social media postings and general Web surfing.
The tech giants are all hustling to swell profits derived from this activity. And any hint that their voracious collection of consumer preferences and behavioral data for commercial purposes can be leveraged by Big Brother could erode consumer trust in the Internet of Things.
Cook, interestingly, does little to champion the core principle, namely that the FBI has no business asking for an iPhone decryption tool, no matter the circumstances.
Instead, Cook is making the nuanced argument that Apple’s engineers are, in effect, incapable of executing the type of decryption the FBI has requested. Doing so could create the “master key” to a “backdoor” the feds could use to decrypt anyone’s iPhone, security experts warn.
Nuanced or not, Cook’s maneuver attracted global news coverage this week—and earned Steve Jobs’ successor a gold star from consumer and privacy advocate groups. But Cook’s move has wider implications. ThirdCertainty gathered these reactions from the cybersecurity community.
Brad Taylor, chief executive officer, Proficio
There shouldn’t be a backdoor to encryption that a manufacturer holds on to for any consumer or business product. Why would Apple want to have a backdoor key in the first place? If Apple does have a ‘backdoor’ key to unlock any data on an iPhone, it should not simply turn it over to a federal judge.
This is a constitutional question that belongs before the Supreme Court. Once people with bad intentions discover that a vendor is maintaining a backdoor key, they will readily turn to products available via the Internet—outside of the U.S.—to securely encrypt data and transmissions.
French Caldwell, chief evangelist, MetricStream
The assertion that the FBI is demanding that Apple create a backdoor is a stretch. Until now when tech companies have discussed a backdoor, they’ve referred to encryption. In this case, the government is not asking for a backdoor to Apple’s encryption, but rather is demanding Apple’s assistance in unlocking the screen of the phone of an alleged terrorist. This demand for assistance is not the first of its kind, and Apple will have to comply.
However, Apple has made it very difficult to unlock the screen, and to do so requires creating a unique version of its OS; thus, the one legal argument that Apple has is the burdensomeness of compliance with the court order. Apple has 5 business days to demonstrate why the order is too burdensome.
John Gunn, communications vice president, VASCO Data Security
Many people have the mistaken impression that if Apple and other mobile OS providers are forced to build in backdoors, then suddenly law enforcement officials will have a magical and lasting backdoor to all encrypted information.
In reality, if backdoors are built in, then two things will happen: criminals will still keep their secrets using any one of the more than 100 third-party encryption products, and average citizens will be left more vulnerable to criminal and state-sponsored hacking.
Csaba Krasznay, product manager, Balabit
Apple is proud of its product security, and it already holds several security certifications. If Apple builds a way to circumvent its own security functionality, it will have to document that publicly, or risk losing those certifications.
On one hand, the U.S. government requires highly secure devices. On the other, law enforcement wants devices that can be bypassed. But law enforcement agencies have several ways to collect evidence and information, and decryption of mobile devices is merely one option.
Although a master key would facilitate law enforcement’s work, there are other solutions. In the best interests of all users, including the U.S. government, built-in security shouldn’t be touched.
Jeff Hill, channel marketing manager, STEALTHbits
Despite the very real concerns about privacy and potential abuse, the time is quickly approaching when it may make sense for the tech community to collaborate with the government. There is a long history of judicial precedent that favors the government over individual rights in the context of national security.
With the San Bernardino and Paris attacks fresh in the minds of the public, it behooves those in technology to at least attempt, in good faith, to find common ground with authorities.
There should be some recognition of the legitimate concerns of the citizenry and those in law enforcement. Historically, the court system has been willing to value safety over privacy when the nation is confronting aggressive enemies.