Yes, following this with great interest. From a technical perspective, I think it illustrates the ultimate problem with using short PINs (4 digits long is what I read) plus biometrics. Strong 2FA is not based on biometrics, either for security or privacy.
Something doesn't feel right about this phrase. If you encrypt some data, there is no possibility for an attacker who holds the encrypted message to somehow circumvent the encryption by using another piece of software (they can try breaking the encryption itself, of course). If such a method or piece of software exists, that means the data in question was not properly encrypted in the first place.
It's unclear what is meant. Perhaps upgrading iOS to the altered new version could bypass the screen lock? I don't think they're going to tell us more.
Right, one would expect that devices can't be "upgraded" while they're locked and encrypted. But I don't think that Tim Cook is admitting that this can be done. He's arguing that Apple shouldn't be forced to try.
From what I read of the court request, the "upgraded" version of iOS was supposed to get rid of the delay and lockout features to allow brute-forcing of the 4-digit PIN, though that's hardly brute forcing. Reputationally, and from a future sales perspective, the damage is large if they are forced to do this (and are technically able to do so), and I don't imagine they'll be reimbursed for that pain. Of course, they could respond by issuing updates to iOS that make it impossible to upgrade unless the PIN is entered correctly (if it is the case that you can do so now). The drawbridge would well and truly be up if that were done.

There's a lot about what is being asked for which makes little sense from the perspective of follow-up, because presumably the NSA, FBI, etc. will have trawled through online accounts already. That only leaves data confined to the phone being hidden. It makes much more sense as a Comey ploy to advance his agenda.
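To put some numbers on why removing the delay and lockout features matters so much, here's a quick sketch of exhaustive-search time for short numeric passcodes. The ~80 ms per attempt is an assumed, widely reported figure for the hardware-enforced key-derivation delay, not something confirmed in this thread:

```python
# Worst-case time to try every numeric passcode of a given length,
# assuming ~80 ms per guess enforced in hardware (an assumed figure
# for illustration, not an Apple-confirmed one).

PER_ATTEMPT_S = 0.08  # assumed per-guess delay, in seconds

def worst_case_seconds(digits: int) -> float:
    """Seconds to exhaust all passcodes of the given length."""
    return (10 ** digits) * PER_ATTEMPT_S

for n in (4, 6):
    secs = worst_case_seconds(n)
    print(f"{n}-digit PIN: {secs:,.0f} s (~{secs / 3600:.1f} hours)")
```

With the software lockouts gone, a 4-digit PIN falls in about 13 minutes even at the hardware rate, which is why calling it "brute forcing" is generous.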
Details can be found here: https://blog.trailofbits.com/2016/02/17/apple-can-comply-with-the-fbi-court-order/
This clarifies things a lot, thanks. Apparently the proposed bypass would only work on that one specific phone and wouldn't work on the iPhone 6 at all.
I'm afraid that this noble stance by Tim Cook is only theatrics; the iPhone already is a tracking device merrily carried by millions of happy and naive citizens.
Well, there is that. True for all smartphones. And Android is even worse. There's also the fact that ~60% of iPhones are backed up to iCloud, and Apple has keys to decrypt.
I would also consider the questions raised when Apple bricked devices that did not have the screen replaced at an Apple store. Apple claims this is to mitigate a vulnerability, and that only Apple can pair a replacement screen with the phone. It raises the question of whether it is possible to replace the screen with a modified screen designed to aid in access to the phone.
Really, only 60%? I'd guess it's much closer to 99%+ of all US-based iPhones being backed up to the cloud. Seriously, I'd be shocked if less than 90% are backed up to the cloud.
I thought that I saw that estimate on Hacker News, but now I can't find it. Maybe it was in one of the articles that I've read about Cook's letter.
No, it's not. If the home button is changed, then the fingerprint scanner is disabled. Have a look at the post from Jessa Jones, which goes into detail about this, in reply to an article at PCMag: http://au.pcmag.com/apple-pay/42065/opinion/how-apple-missed-the-boat-on-error-53
It may be that 99% of users have iCloud backup turned on, but: (a) the terrorists and those with a criminal inclination are likely within the 1% that know to turn it off; and (b) not all iPhone content may necessarily be backed up even if it's enabled.

For example, we don't really know what it is that the FBI is looking for. Is it really the contact list, the calendar, photos, and music playlists? Sure, those can be backed up to iCloud, but only the contact list seems especially insightful. Could it be that what the FBI wants is some internal metrics like recent calls received or placed (although this can likely be gathered from the cellular provider), internal GPS information (i.e. where the phone has been), etc.? It's possible that most of the forensically valuable info (other than the contact list) is not backed up to iCloud, unless they are wanting to see if the San Bernardino Pakistani wife had a selfie with Osama Bin Laden from 8 years ago on there.

Also, I agree that modern smartphones can be powerful government tracking and privacy invasion devices, but that doesn't necessarily make Apple's stance hypocritical. Apple is trying to protect the content on the phone, not stuff like calls made or current GPS location... which the government already has access to through Verizon and AT&T.

Honestly, the more I think about it, I just don't see much practical value in what the FBI expects to find on the phone that they don't already have. The FBI wants this custom FBiOS that does away with manual passcode entry, induced 'wrong guess' delays, and the potential lockout associated with brute-forcing the phone... as well as some other detailed voodoo having to do with the Secure Enclave. They claim it's just for this one phone. But that's sort of like telling a girl to go ahead and post a naked selfie ("no one will know")... but once it's out there... it's out there. You know someone malicious would get ahold of this tool somehow and use it for their own purposes.
I just don't see that the tactical / forensic value to be gained in this one case... is worth the risk to everyone else's personal privacy.
@Alec - "I just don't see that the tactical / forensic value to be gained in this one case... is worth the risk to everyone else's personal privacy." In a way this is the nub of most of the privacy landscape we face. Different organisations and interests have different motives and rewards in doing what they do regarding surveillance and privacy, and are indifferent or hostile to the cost to other groups. Rather clearly, Apple's commercial interests are badly harmed here because they are a global company. More generally, this has led to the rise of the rent-seekers and lobbyists, to the obvious detriment of the population (who are "just" the product). Meanwhile, the uncle of murdered Lee Rigby has criticised Apple's stance as reported by the BBC: http://www.bbc.co.uk/news/technology-35595840
I'm by NO means a crapple fan boy. But I'd bet crapple could publicly cave to the *** & their sales wouldn't be off by 1%. In other words, 99%+ of people are sheep. And who knows, this could just be a show for the public & have already been settled in private.
@zapjb - indeed, the various interests are sometimes congruent, sometimes opposed, and in any case, you certainly cannot believe their public pronouncements anymore, trust is pretty much in the pits. I think I'd bet around 70% on the scenario you outline. In a longer-term way that's good though, because there are different providers in different jurisdictions that will benefit and provide us with more alternatives.
Maybe the FBI doesn't know either? Maybe they simply want to feel confident that there isn't anything else of value for understanding what happened and possibly stopping future events?
Short of a *** screensaver nothing will deter folks from buying their products. Other than the usual market concerns. Heck a public fight will probably jump up their sales a bit.
Yeah, I agree, zapjb... I don't see this as a huge sales driver for Apple one way or the other. Even if they give the FBI what they want, sales aren't likely to tank because of it.

Also, I want to say that I'm not a diehard "privacy rights trump all" kind of guy. I don't have a problem with the FBI and NSA collecting phone metadata from the phone companies, as to me... the calls you make *are* in the public domain, because you are using an essentially public service to conduct the activity (phone companies are private entities but are publicly regulated because society has deemed them essentially a public service). However, in this case, the FBI wants onto the phone itself. The data you put on there is yours. If you have iCloud backup & Photostream turned off, then those contacts & photos are purely yours. If the FBI can get the data from the iCloud servers, then so be it. I'm fine with that also, as that again is like a public service, imho.

To me it's sort of like attorney/client privilege... your conversations with your attorney are private, BUT if you talk with your attorney in a public fashion, or in front of someone who is not part of the privileged relationship, then you can't claim attorney/client privilege anymore.

And, really, it doesn't matter too much to me personally. I don't have naked selfies or incriminating photos or contacts with terrorists or pedophiles on my phone. However, I just believe there is a line to be drawn somewhere, and for me the FBI is sort of crossing that line in this case.
"John McAfee Offers To Hack Terrorist's iPhone For FBI" Story here: http://www.huffingtonpost.com/entry/john-mcafee-fbi-iphone_us_56c62bbee4b0928f5a6b3f7f Classic McAfee quote from the article: "And why do the best hackers on the planet not work for the FBI? Because the FBI will not hire anyone with a 24-inch purple mohawk, 10-gauge ear piercings, and a tattooed face who demands to smoke weed while working and won't work for less than a half-million dollars a year," he wrote. "But you bet your ass that the Chinese and Russians are hiring similar people with similar demands and have been for many years. It's why we are decades behind in the cyber race."
Isn't Apple compelled to comply with the court order anyway? They can appeal it, but if they lose the appeal, they have to unlock the phone. I cannot see Cook or any Apple executive being willing to go to jail over this. If the terrorists had entered a random 11-digit passcode, it would be impossible to unlock the phone in less than two centuries. Many believe that Apple already has the ability to unlock this phone, so it is not a matter of writing a backdoor to get it done. If the industry believes that, then the FBI believes that.
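The "two centuries" figure roughly checks out as a back-of-envelope calculation, again assuming the oft-cited ~80 ms hardware-enforced delay per attempt (an assumed figure, not one confirmed anywhere in this thread):

```python
# Back-of-envelope check of the "two centuries" claim for a random
# 11-digit numeric passcode, at an assumed ~0.08 s per guess.

PER_ATTEMPT_S = 0.08
SECONDS_PER_YEAR = 365 * 24 * 3600

attempts = 10 ** 11                  # every possible 11-digit passcode
worst_years = attempts * PER_ATTEMPT_S / SECONDS_PER_YEAR
avg_years = worst_years / 2          # on average, found halfway through

print(f"worst case: {worst_years:.0f} years")
print(f"average:    {avg_years:.0f} years")
```

Exhausting the full keyspace takes roughly 250 years; the expected time (half the keyspace) is closer to 125 years. So "less than two centuries" is about right for the worst case, and the point stands either way: a long random passcode makes brute force hopeless even with the lockouts removed.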