
From the Desk of an Avid Smartphone User: The Privacy Dilemma

It should come as no surprise that I am an extremely avid smartphone user. Every day that I can (which excludes Shabbos and Yom Tov), I probably wake my phone and unlock it 367 times on average, to send texts and emails and Facebook posts and Instagram pictures and so on. It’s an unfortunate necessity that I need to stay connected during the day, in case an important email from a teacher/yeshiva/college (University of the Moon Class of 2020?) pops in. I could dwell on whether I need to limit my phone usage more or whether the always-on-call expectation is problematic, but there’s a different issue I’d like to talk about for now.

It’s scary how much personal information I have stored on my phone. If my phone is unlocked, one can access all my contacts, my Facebook account, my emails, my LinkedIn (the hottest social network for high schoolers), many of my documents on Google Drive and so much more (as well as an embarrassing number of selfies, but that’s beside the point). Artistically, it could paint a portrait of who I am and what I do… but on a more literal level, what’s on my phone gives away a lot of my personal information, and if someone accessed it, I’m sure they could easily find me, or impersonate me (particularly on my social media accounts), or perhaps steal my identity (although I don’t have any credit card numbers or SSNs stored on my phone as far as I know).

That’s why phone security is so important nowadays; unlike in the “days of yore,” when a phone just contained a list of contacts and maybe a few grainy pictures, today phones contain a wealth of information that could be used to devastating effect in the wrong hands. And even beyond that, doesn’t everyone want to keep some things on their phones private? Doesn’t everyone have a right to some personal space? I have an antivirus app on my phone to make sure I don’t accidentally download any malware that could steal my data, and I have a very strong password: hahathisisnotmyrealpasswordWHYWOULDITELLYOUIT700. (I actually use a PIN.) It’s important to deter would-be hackers from getting to your data. But what if the group trying to hack into the phone was—the FBI? What if the FBI wanted to hack into your phone because you had committed a terrible crime—perhaps a terrorist attack—and they wanted to see your texts and call logs to search for evidence?

Most people would think that’s reasonable. I would agree, and Apple (of iPhone and Mac fame) does too; in a terrorism case in particular, the right to phone privacy is essentially forfeit.

You might have heard of the case currently surrounding the phone of one of the San Bernardino terrorists, which is the raison d’être of this article. The FBI wants to unlock the terrorist’s iPhone in order to find evidence and build up more of a case against him. Apple is fully willing to help them unlock the phone—to a point. But when the FBI couldn’t unlock the phone (clearly the passcode wasn’t 1-2-3-4), they wanted Apple to go a step further. From what I understand, they want Apple to create a special version of the iPhone software that would allow more than 10 tries at the passcode and would let them enter guesses in quick succession, likely from an external computer program. (I have personal experience with phones being wiped because of too many passcode tries. Years and years ago, I had the pleasant experience of erasing the data on my uncle’s BlackBerry because I thought it would be fun to keep punching in numbers.)
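To get a rough sense of why that 10-try limit (and the delay between attempts) matters so much, here is a small, purely illustrative Python sketch. The numbers are my own back-of-the-envelope assumptions, not anything Apple or the FBI have published; in particular, the guessing rate of 12 attempts per second is invented for the example.

# Purely illustrative: rough time to brute-force a numeric passcode
# once the 10-try limit and the delays between attempts are removed.
# The attempts-per-second rate is an assumed figure, not a measured one.

def brute_force_hours(digits, attempts_per_second):
    combinations = 10 ** digits  # e.g. 10,000 possible 4-digit passcodes
    seconds = combinations / attempts_per_second
    return seconds / 3600

for digits in (4, 6):
    hours = brute_force_hours(digits, attempts_per_second=12)
    print(f"{digits}-digit passcode: about {hours:.2f} hours to try every combination")

Even at that modest assumed rate, a 4-digit passcode falls in well under an hour and a 6-digit one within about a day, which is exactly why the retry limit (the same feature that wiped my uncle’s BlackBerry) is what the FBI wanted worked around.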

The FBI promises that they would only use this special iPhone software once, for this case. Apple argues, very publicly, that if this software, a “backdoor” or “master key” for iPhones, were ever created, it would be a very dangerous tool even in the hands of the government. Apple says it would have many ramifications for phone privacy: could the government use this backdoor again? Could it be used to violate others’ privacy? So many people in the USA use iPhones, so what would happen if the government or someone else had the power to unlock them all? It’s the whole NSA debate all over again.

Many tech companies have sided with Apple, even its fiercest rivals, Google and Microsoft, but others have argued that Apple should help the FBI out. I think (and this ties back to what I said earlier about all of the data stored on my phone) that this is a very complex issue, and not just when it comes to privacy. There’s the question of privacy vs. the security of the USA, of course, but also of doing something drastic as a one-off act vs. setting a dire precedent, and of the government’s right to search and take action vs. the private sector’s right to act as it likes within the boundaries of the law. There are ethical and legal ramifications here as well.

If I had to pick a side now, I would side with Apple, although my opinions could change should any game-changing revelations about the phone or security come to light. But right now, I think Apple has a good point when it comes to protecting the integrity of its iPhones, and that creating a backdoor even just for one case could set a terrible precedent. Once that software exists, it exists, and even if it’s fully deleted after it’s used, it would be possible to force its creation again, and there could be (God forbid) more cases where a backdoor like it would come in handy. Then, to use a famous quote from the Gemara (albeit in a different context from its source), ein l’davar sof—there is no end to the thing. There would be no end to when the backdoor would be requested, and that could be very dangerous. Obviously it’s not a given that would happen, but in this case I think it’s better to err on the side of caution.

Just to clarify, I do not think that every case of possible precedent setting is harmful; see my recent piece about the new area at the Kotel, which I didn’t think set a precedent of setting aside Orthodox Jewry. But in the iPhone case, when it’s something so volatile and so intertwined with citizen privacy, I think a precedent is dangerous.

Now, as a gift to all of you, I’d like to give you my “real” phone password: justkiddingImnottelling.

By Oren Oppenheim

Oren Oppenheim, age 18, is really wishing that he had already been accepted to University of the Moon. You can email him at [email protected].

 
