Contemplating morality today is more complex than ever, because society is now connected in so many new and different ways. The clearer actions and reactions of the past are muddied where the digital falls onto the terrain of the physical, particularly where laws exist to establish privacy and personal rights. Legislation moves at a tortoise's pace to big tech's hare, so governments are constantly playing catch-up with businesses. This has created moral dilemmas unlike any we have seen before.

Now that many of these big tech companies are larger than most sovereign governments, they span regions where traditional social codes of morality are translated into law in different ways. Big tech has helped the American laissez-faire system propagate, but that does not give it any moral charter to act outside of what a local government demands. Facebook has to abide by Europe's GDPR, it has to store Russian citizens' information in Russia, and it cannot operate in China at all.

This lack of moral authority is at the heart of many of the problems that social media poses to our society today. No one elected Mark Zuckerberg to decide what we should or should not see on Facebook. We chose his platform and service to share photos and updates with our friends; never in any terms and conditions did we cede him the keys to our souls. Yet who else is going to decide, when we all know that we need limits?

So for social networks, questions about morality will not be resolved any time soon. But there is one subject that comes up repeatedly and seems, on the surface, easier to settle: should the US government be able to force Apple to build a backdoor so that it can unlock the phones of serious criminals? The case this month involving a mass shooter in Florida is the latest example of this recurring argument.

It might seem straightforward, but it is actually a complicated question with two defensible sides.

Yes, Apple should have to create a backdoor so that law enforcement can access people’s phones

If you build a literal bridge between the physical world and the precedents set by desktop computing, the baseline is that yes, law enforcement agencies should be able to scrutinize the contents of an iPhone when they have a warrant to search someone or their home. Agencies already wheel computers away in cases where pertinent evidence may be found on them. In theory, an iPhone is no different from a locked drawer: if it is inside the residence and a possession of the accused, it should be opened and its contents revealed. It is, after all, nothing more than a container of information that happens to communicate.

Moreover, if you have nothing to hide, why do you need encryption at all? To take this argument to the extreme, our phones contain so much information about our lives that you could make a moral case that they should be a mandatory part of any serious investigation, since access to so many aspects of someone's life would make it easier to determine guilt or innocence.

No, Apple should not create a backdoor for law enforcement agencies

For people who think that the government already has too much power, it is tempting to hold iPhones outside the reach of current law as a safeguard against the slippery slope of building backdoors into all of our digital platforms and services. The trend towards encryption shows that people take this threat seriously and do not want to take the chance of being spied upon. Encryption, after all, is not about illegal messages; it is about protecting access to our bank accounts, email, and private photos.

The common refrain here is that law enforcement will simply have to do without iPhones in order to preserve everyone's rights and privacy, because it is not just about the criminal (or suspect); it is about each and every iPhone user.

A police officer could force someone to unlock their own iPhone, but the government should not be able to compel Apple to make its phones more vulnerable. If Apple could weaken one specific phone, this might not even be an issue, but Apple is famous for running the most secure major consumer tech platform, and a change to one device means a change to them all.

Apple is right not to develop a backdoor, because a backdoor means that anyone could eventually gain access to any iPhone. We all remember how DVD and Blu-ray copy protection was supposed to be uncrackable, only for hobbyists to break both.

Governments are unlikely to force Apple either, since most of the politicians who would need to lead the charge use iPhones themselves. They are exactly the kind of people most likely to be hacked if a backdoor were ever built.

Better to keep everyone as secure as possible than to hand law enforcement a few extra leads in a few cases. As much as I may dislike what Apple does these days, I stand with them on this principle.
