The unbeatable iPhone (or so they say)…

There are a lot of news articles written about the case of the FBI against Apple, and I personally think that law enforcement agencies should get access to the devices and data of suspects who are somehow involved with terrorism, after a judge decides that it is necessary. Like today's attacks in Brussels.

I cannot believe that Apple won't set up a lab for this and use hardware to get access to locked iDevices when ordered by law. Is it just me, or what?

But… someone (apparently) already found a new attack vector. Hmm. I wonder who that is and what he did. Perhaps he removed the SIM tray and used a specially crafted SIM card, or used the port at the bottom of the iPhone. Two possible attack vectors. I mean, we all know that iOS has (security-related) flaws. Otherwise jailbreak software would not have been possible.

So yeah. I'm all in for Apple, or some independent organisation, getting access with a hardware device that won't be shared with anyone else. I mean, who gives a BEEP about my stupid pictures and conversations? My bank info? Same story. Law enforcement agencies worldwide already have access to it, so whatever. Is your data really more important to you than the safety of you and your family? Really? Well, to me it isn't. Not in this universe, so go go Feds!!!

Sure, it may be questionable whether this will help law enforcement agencies prevent future attacks, but to say no upfront… is IMHO not a smart thing to do. Not something Apple should do. And companies like Samsung and Google – well, you name it – should all have some kind of hardware device in a lab to help law enforcement agencies. I'm not saying that it should be simple and free of charge, but a reasonable price should be acceptable. And I personally have absolutely zero issues with a solution like this.

25 thoughts on “The unbeatable iPhone (or so they say)…”

  1. I don’t agree with you on this. It is my personal data, even though I’ve nothing to hide, it is mine… I should be able to decide what to do with it and not anyone else (legal or illegal)…

    • Don’t get me wrong. I like to have some privacy, too, but not at all costs. Luckily this is not about mass surveillance, since it is limited to specific cases linked to terrorism and other criminal activities, and thus people like you and me, who are not involved with terrorism or other criminal activities, have nothing to worry about.

      • Cellebrite is the same company that was being used by Michigan State Police during routine traffic stops to search phones without a warrant. The ACLU stepped in because the police were compelling (guilt tripping) people into giving them access to copy everything off their phones.

        You say “since it is limited to specific cases linked to terrorism and other criminal activities” as if the mere fact that terrorism is a criminal activity makes it OK to require the backdooring of any security feature that might ever get in the way of law enforcement. Remember, there are no limits being proposed here. Suspected of smoking pot? Neighbor called in a noise complaint? Cop just doesn’t like you? Yeah, that’s covered by this ruling too.

        So this is very much about surveillance. The FBI just picked a terrorist case to push the issue so they could ride the emotions of the average person, then use the case as precedent for all future cases to show they should always be able to compel any company into providing backdoors for security features. The request is a complete overstep of jurisdiction. Law enforcement should NEVER create or push laws; they ENFORCE them. The legislative branch creates laws. Every person who complains about the president enacting laws is making the same case that I am when I say police shouldn’t create laws either.

        And one final argument: the assumption that the police are somehow magically always the good guys. Just like any other segment of the population, there are plenty of cops who do things they shouldn’t, and I’m not about to give them access to snoop on my phone. Real-world example? Would you want this cop to have access to your 16-year-old daughter’s iPhone?

      • Hi Tom,

        This is exactly why I would love to see a hardware device that only Apple has and keeps safe in a protected lab, like they have for new hardware, so that no street cop could ever use it. We don’t want third-party companies to get access to our data.

  2. Another problem is your perspective on the issue is “My Photos”. My photos are the least of my worries. My iPhone can be used to access all of my company’s SSL certs. My iPhone can be used to recover passwords from all my bank accounts, credit cards, private sites. My iPhone can be used to impersonate me to commit crimes. My iPhone has ssh keys allowing access to things I do not own but am responsible for.

    Do I care about my photos? Not in the least. Most of them are already on Facebook. You can’t use Facebook to take over my life, credit score, banking, corporate resources, or trade secrets.

    • Let me try to put this into perspective. First, photos are of course just one example of data on a phone, and they can be important to catch pedophiles. The thing is that you already trust Apple, so this should not be a problem, since the concept here is that only Apple can access the data, with a special hardware kit, when required by law. And not just any ordinary Apple employee, but a specialist hired by Apple, who won’t misuse any of your data. Not to mention that the terrorist is dead and thus we should not give a BEEP about his rights… which he lost after his attack. Let’s not forget that.

      And for the record: Apple said that they could have restored the data from the cloud, but the FBI changed the password, and that was the end of the story. In other words, iCloud storage is far from perfect.

      • I don’t care about any terrorist, but what if that hired “specialist” leaks that tool, or if it gets stolen and can be used to impersonate me as a “terrorist”?

      • Why would anyone want to make you look like a terrorist? Not that I think anyone would need your data for that, but still. It all boils down to trust, and Apple will certainly do everything in their power to protect that piece of hardware. Just look at the zero leaks from Apple employees about new hardware, for example. Absolutely nothing. Unsurprisingly, because their career would be over after a leak, let alone a leak of such an important piece of hardware. Not going to happen.

        Also, since iOS is just a piece of software, and with software come flaws, maybe your data isn’t that secure after all. You just think that it is secure, but based on what?

      • P.s. I thought they finally started to encrypt the iCloud backup with the device password in the latest iOS 9.3.

    • Yup. Without encryption there’s no point in even trying to develop new products anymore, because the Chinese will just steal it before you even have the opportunity to create it.

  3. If I had to guess, I’d say that the vulnerability on an iOS 9 device probably involves booting the phone and then swapping out a certain IC or other component on the board while it is still powered up, since the chain of trust is, I believe, mostly verified at boot time. That’s my somewhat-educated guess.

    Oh, and I mainly agree that the original warrant was not an undue burden on Apple and was reasonable given the circumstances (they are arguing mainly about judicial precedent, and they’ve not done a good job at all with their PR). But I also think that no law should be allowed to prevent Apple from continuing to improve its security until recovery just isn’t possible.

  4. I agree to disagree, for multiple reasons. First of all, at least in most western countries the tool to be created must go to multiple parties for review (to ensure that it doesn’t alter data in an unwanted manner, etc.). This of course leads to an increased risk of a leak – even if Apple engineers are trustworthy – and the government itself does not have such a good track record in keeping information safe (Snowden, anyone?). And small attorney offices are maybe also not the safest places. Of course the existence of this tool in the wild would put every iOS user at risk, no matter if it’s (worthless) pictures or bank accounts or medical records or or or… But another argument surely is that the creation of such a tool by Apple puts Apple itself in a difficult position: deciding which request is legitimate. US court trials on terrorism? Chinese court trials regarding reporters? The North Korean government recently charging a US citizen with espionage? I think Apple doesn’t want to be put in that position, and at the same time they don’t want to approve all requests blindly either, as that would lead to an image problem for the company.
    Another scenario is of course industrial espionage, either supported by governments (at border inspections) or by big corporations. Just imagine Samsung decrypting an Apple representative’s iPhone while he or she is negotiating the next A10 CPU or advanced flash storage.

    All in all, I see a lot of reasons for Apple not to create such a tool. For the safety of the users, but also for self-protection.

    • Listen. Apple now simply states that the data on iDevices cannot be decrypted, and so far everyone has had to accept this. No matter what. Now imagine that Apple would indeed make a special hardware kit to extract the data from iDevices, that only Apple could extract the data, and that it delivers the data in an unaltered format, whenever possible. Do you really think that Apple is going to mess with the data, and that law enforcement agencies won’t trust Apple if only Apple has access to it?

      The fact that (some) governments and small attorney offices are unable to protect your data is another issue. This can and should be solved by implementing additional requirements and extending the rules (by law).

      Sure, Apple will be in a (more) difficult spot, but it has offices and lawyers in most countries, so deciding which request is legitimate is not impossible (to me, any court order is legitimate).

      About industrial espionage: since only Apple would have access to this device, in a protected lab, nobody else would be able to decrypt the data. And espionage is usually done by compromising servers/certificates and bribing people, which is much easier if you ask me.

      The bottom line is simple: no company should be put above the law and jeopardise the safety of human beings, even if that means that some people, due to their own actions, will have to share their data.

      • If it were a hardware device, then it would surely be more difficult to leak. But as I understand it, a software solution was being sought (and I agree the FBI probably doesn’t care whether it’s hardware or software). Anyway, there is a pretty interesting article describing how such a tool would go through multiple parties before being accepted by a court (in the US). I am confident Apple has the expertise to build such a tool without altering the data – still, it would have to be reviewed and analyzed by different parties, making the risk of such a (software) tool leaking quite real (like the leaked TSA master keys). Of course the bad guys will just move to more secure (open source) solutions without any known backdoors anyway. Ultimately Apple is not above the law – that’s why they are trying to find out what the law is. I guess it’s acceptable within the law to fight a court decision. It’s a standard legal option.

      • Thank you for sharing that link here. Good information like that is vital to understanding what it requires. Not so easy after all. I still hope that Apple finds a reliable and secure way to help fight criminals and extremists, without the need for third-party companies, who may be easier to hack, and who on top of that sell tools to regimes and organisations that should not get access to our data.

    • Thanks. That is a cool picture of Tim Cook, and he is absolutely right. That is, there should not be a backdoor in iOS. However, law enforcement agencies should get help from Apple with unlocking and data extraction, so that iDevices won’t become safe havens for criminals and extremists.

      • You’re the expert here, so correct me if I’m wrong: even if the government were to get help from Apple every single time, wouldn’t at least _one_ of the times require a backdoor, which Apple does not want to provide/create/etc in the first place?

        So in other words: is it feasible that one (help from Apple) cannot exist without the other (backdoor)?

        Always a pleasure, Pike!

  5. “Like today’s attacks in Brussels.” Instead of decrypting a phone, it would be better to stop supporting and arming Wahhabists, don’t you think?

    • Listen. I do not want a backdoor in iOS so that anyone can get in, but the server flaw that showed up recently – that one may be of help to law enforcement agencies like the FBI. All Apple would have to do is mod their server software for the phone, and Siri could show them what they want, like contact info. Is that really such an issue for you?

  6. The question is: to which law enforcement agencies do you think it’s acceptable for Apple to provide people’s data? As soon as Apple were willing to provide data to the FBI, every other secret service in the world would demand the same, and it would become a matter of losing the ability to do business in those countries if they refuse. The Ministry of State Security in China, the Federal Security Service in Russia, the Mabahith in Saudi Arabia – you name it.

    • Well, this is pretty simple IMHO: if Apple wants to do business in a country, then they should obey the law, even when they don’t like it, or stop doing business there. So in short: all of them.
