
Everyday Magic: Why Apple's Fighting The Law, And We Hope They Win


Hello, Kossacks!

Been a long time since I’ve done one of these, but there’s a pretty meaty technology story hitting the headlines, and I think it’s important to give it a thorough examination and explanation.

So, many of you have probably already read (or at least seen on the FaceTwitters) the open letter that Apple wrote to its customers.  What would make Apple write a letter like that?

Well, see, the letter concerns an iPhone that was in the possession of one Syed Farook, otherwise known as the jackass who shot up his place of work in San Bernardino in the name of Daesh.  The phone is now in the possession of the FBI, and it actually belonged to his employer, the County of San Bernardino, which provided it for work use.  Warrants are in place, and the County has given the FBI permission to search the phone.

The FBI can’t do that, however, since the phone is locked.  So why can’t they just brute force the passcode?  After all, the phone is only locked with a 4-digit PIN, meaning it would take them at most 10,000 guesses to get in.

Well, Apple has a nice customer-protection feature.  See, many of us use a smartphone for things like shopping, banking, and even as a stand-in for a payment card, and that information could cost someone a lot of money if their phone were stolen and the thief had access to it.  So iOS will erase the data on the phone after 10 consecutive failed attempts at the unlock PIN or password.  Therefore, the FBI can’t brute force the phone without risking the loss of any evidence on it.
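To make the logic concrete, here’s a toy sketch of that wipe-after-10-failures policy.  This is my own illustration, not Apple’s implementation; the class name, the PIN, and the attempt counter are all made up for the example:

```python
# Toy model (NOT Apple's code) of the "erase after 10 failed attempts"
# protection described above.
MAX_ATTEMPTS = 10

class LockedPhone:
    def __init__(self, pin):
        self._pin = pin
        self._failed = 0
        self.wiped = False   # once True, the data is gone

    def try_unlock(self, guess):
        if self.wiped:
            return False
        if guess == self._pin:
            self._failed = 0
            return True
        self._failed += 1
        if self._failed >= MAX_ATTEMPTS:
            self.wiped = True   # brute forcing past this point is useless
        return False

phone = LockedPhone("4821")
for guess in range(10000):            # naive brute force: "0000", "0001", ...
    if phone.try_unlock(f"{guess:04d}"):
        break
print(phone.wiped)   # True: the wipe triggers long before guess 4821
```

The brute force never reaches the right PIN: the tenth wrong guess wipes the device, which is exactly the FBI’s problem.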

Anyway, a judge has now ordered Apple to assist the FBI in creating a version of iOS (the software that runs the various iDevices) that does not have that protection, and that allows passcodes to be entered electronically rather than only by manual entry, so that the FBI can brute force the phone at the speed of another computer.
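Some back-of-the-envelope arithmetic shows why electronic entry is the whole ballgame.  The per-attempt timings below are my own assumptions for illustration, not figures from the court order:

```python
# Rough arithmetic (assumed timings, not official figures): electronic
# passcode entry vs. typing each guess by hand.
total_pins = 10 ** 4                 # every 4-digit PIN, 0000 through 9999

secs_electronic = total_pins * 0.08  # assume ~80 ms per electronic attempt
secs_manual = total_pins * 5         # assume ~5 s to type each PIN by hand

print(f"electronic: ~{secs_electronic / 60:.0f} minutes")   # ~13 minutes
print(f"manual:     ~{secs_manual / 3600:.1f} hours")       # ~13.9 hours
```

Even with generous assumptions, a computer churns through every 4-digit PIN in minutes, which is why the order demands an electronic entry path and not just the wipe feature disabled.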

Mind you, Apple couldn’t normally even do this, except that the model in question (the 5C) doesn’t come with one of their newer features (I hesitate to call it a security feature, since it’s actually a bit anti-consumer) that blocks non-standard versions of iOS from being installed on the 5S and newer models.

So, what’s the big deal?  This all seems relatively reasonable thus far.  The Fourth Amendment has been satisfied: legal warrants are in place, and the phone’s owner has consented to the search.

Well, there are two big problems with this request.

The first is that anyone pretending this will be a one-time request is an idiot.  See, once a tool like this is out there, it will be requested again.  And again.  And when Apple says, “Oh, we can’t do it to that phone because of the feature the Technomancer mentioned earlier in his diary,” that feature quickly gets outlawed (the argument can even be made that that’s a victory for consumers, since, like I said, not being able to modify your own software is a bit anti-consumer), or it gets tackled in a roundabout manner, like export restrictions strict enough that it makes more business sense for Apple not to bother maintaining one OS version for the US and another for everywhere else it sells phones.

And once an attack like this is out there, anyone with the know-how can take advantage of it.  That software won’t stay the secret of Apple and the FBI for long, and Google will face just as much pressure to do the same thing to Android.  Once this cat is out of the bag, it’s a whole new world.

The second is that for consumer end-to-end encryption schemes to be trusted, the receiving device generally has to be treated as trusted.  The reason encryption is so easy and seamless on today’s smartphones and other consumer devices is that it’s been made simple on the consumer end.  Rather than having to manage your half of an encryption key pair yourself, or remember a separate password for every encrypted transaction, your browser, your phone, your tablet, or your desktop/laptop is treated as trusted.
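Here’s a toy picture of the kind of bookkeeping your trusted device quietly does for you.  This is a Diffie-Hellman-style key exchange with deliberately tiny demonstration parameters, not a real cryptographic scheme and not how iOS actually does it; the point is that each device holds a secret, only public values cross the wire, and both ends derive the same key without you ever typing a password:

```python
import secrets

# Toy Diffie-Hellman exchange (demonstration parameters only, NOT secure):
# the devices handle the key pair so the user never sees it.
P = 0xFFFFFFFB  # small public prime, fine for illustration only
G = 5           # public generator

phone_secret = secrets.randbelow(P - 2) + 2    # stays on the phone
laptop_secret = secrets.randbelow(P - 2) + 2   # stays on the laptop

phone_public = pow(G, phone_secret, P)         # only these go over the wire
laptop_public = pow(G, laptop_secret, P)

key_on_phone = pow(laptop_public, phone_secret, P)
key_on_laptop = pow(phone_public, laptop_secret, P)
print(key_on_phone == key_on_laptop)   # True: both ends share a secret key
```

The private values never leave the devices, which is precisely why the device itself has to be trustworthy: compromise the endpoint and you’ve compromised the scheme.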

This, by the way, is why you should always put a strong password on your devices.  Just sayin’.

Anyway, if the device itself can’t be afforded a minimum level of trust, encryption suddenly becomes a pain in the ass again, and available only to geeks like me that are willing to tinker with a device enough to put it in place.  And while you may be saying “But Technomancer, I have nothing to hide!”, I would argue that you do.

You have financial information to hide.

You have personal information to hide.

You have plenty of things you want to keep private from prying eyes.

Because here’s the thing with an encryption scheme.  Just because you may trust one set of prying eyes doesn’t mean you trust them all.  And even if we pretend for a moment that the government doesn’t make mistakes and only targets the correct people 100% of the time and follows all laws 100% of the time when it comes to criminal investigations...they’re not the only prying eyes now, because this is an effective backdoor to an encryption scheme.  And the thing about encryption is that you either have it, or you don’t.

More specifically, once there is a weakness in it, the party that weakness was intended for is not the only party that can exploit it.  And I’m not sure if you’ve all been paying attention to the news lately, but we have presidential candidates saying we should have a Manhattan Project-style effort to give the government a backdoor into consumer encryption, mainly because they don’t understand this simple point: once there’s a backdoor in encryption, it’s not encryption any longer.
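A toy example makes the point.  The “cipher” below is a deliberately weak XOR scheme of my own, and the escrowed master key is hypothetical, but the logic is the same for any backdoored system: the key doesn’t check who’s holding it.

```python
# Toy XOR "cipher" (deliberately weak, illustration only): a backdoor key
# is not a government-only key, because keys don't know who holds them.
def xor_crypt(data: bytes, key: bytes) -> bytes:
    return bytes(b ^ key[i % len(key)] for i, b in enumerate(data))

MASTER_KEY = b"escrowed-backdoor"    # hypothetical "lawful access" key

ciphertext = xor_crypt(b"my bank password", MASTER_KEY)

# The intended party decrypts...
print(xor_crypt(ciphertext, MASTER_KEY))   # b'my bank password'
# ...but so does any thief or leaker who gets hold of the same key.
stolen_key = MASTER_KEY                    # keys leak; see any breach headline
print(xor_crypt(ciphertext, stolen_key))   # b'my bank password' again
```

There is no mathematical way to make the master key work only for the good guys, which is the whole argument in three lines of code.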

So I’m glad Apple’s fighting the law on this one, and I hope they win. You should too.

