Last week, the FBI called on Apple for assistance in gaining access to a phone used by one of the terrorists responsible for the San Bernardino shootings in December. On the surface, this seems like an easy request to grant: tech companies comply with law enforcement subpoenas on a daily basis. But this case is significantly different.
Unlike many scenarios in which tech companies have full access to users’ personal data, there is currently no way to access the data on the iPhone in question. Herein lies the issue: the FBI is demanding that Apple create a modified version of the device’s software that bypasses a crucial security feature, one that wipes the device’s data after a series of unsuccessful passcode attempts. With that safeguard removed, the FBI could crack the encryption by brute force: inputting every possible four-digit passcode until the phone unlocks.
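To see why the wipe feature matters so much, consider a minimal sketch of the brute-force attack in Python. The `try_passcode` function here is purely hypothetical, a stand-in for submitting a guess to the device; on a real iPhone, the auto-wipe and escalating retry delays are exactly what stop this loop after a handful of attempts.

```python
def try_passcode(code):
    """Hypothetical stand-in for submitting a passcode to the device."""
    return code == "7391"  # placeholder "correct" passcode, for illustration only

def brute_force():
    # A four-digit numeric passcode has only 10,000 possibilities.
    for n in range(10_000):
        code = f"{n:04d}"  # zero-padded: "0000" through "9999"
        if try_passcode(code):
            return code
    return None

print(brute_force())  # exhausts at most 10,000 guesses
```

With only 10,000 candidates, even a heavily rate-limited attack finishes quickly once the wipe-after-ten-failures safeguard is disabled, which is precisely the capability the FBI is asking Apple to build.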
Helping the government gain access to a terrorist’s phone in pursuit of criminal justice sounds great on paper, but the consequences of these actions would set a chilling precedent for compulsory software modifications for years to come. This is made evident in part through an audacious public statement from Apple’s CEO Tim Cook, published on Apple’s website shortly after the demands from the FBI were made public. In the statement and its corresponding Q&A, Cook projects that such a precedent would pave the way for the government to be allowed to order tech companies to “create other capabilities for surveillance purposes, such as recording conversations or location tracking”.
Tech companies are obligated by law to make reasonable efforts to assist law enforcement with criminal proceedings. Rewriting software to compromise its own security features goes far beyond a reasonable effort. Rather, it is a definite and perceptible step toward weakened civil liberties in pursuit of a merely potential gain in national security.