I think it comes down to this: the FBI will get into the phone. That's never really been at issue. The question at hand is how easily they will get into it, and what that means for device manufacturers moving forward. The FBI is arguing that they can compel Apple to cripple encryption after the fact; Apple is arguing they can't. It's a hell of a precedent to set, and most manufacturers have been trying hard not to set it.

American manufacturers are already hindered by laws against strong encryption in the United States; those laws don't effectively keep Americans from using strong encryption, but they do keep American manufacturers from selling it, giving European and Asian vendors a leg up. Aside from issues of civil liberties (which I in no way mean to discount), giving American law enforcement agencies the precedent of cracking into any American device however they want, whenever they want, because they decide they should, is likely to have a chilling global effect on American technology companies.

And the thing of it is, weak encryption is adequate for most people. They're not looking to protect their information from the NSA; they're trying to keep their credit card info safe from their children. They want to know that if they drop their phone on the street, they have a few hours to wipe it remotely before someone hooks up a dongle to it to brute-force the combo. The type of encryption and the methods of decrypting it aren't really at issue. It comes down to the government wanting the legal right to compel companies to disable their security features whenever the government says so, and Apple, in this case, arguing that complying with the government would be bad for their customers and bad for their bottom line.

And I suspect that Apple will win.
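The "few hours to wipe it remotely" window is easy to sanity-check with back-of-the-envelope arithmetic. A minimal sketch, assuming a hypothetical hardware-throttled rate of about 12.5 passcode attempts per second (the rate and PIN lengths are illustrative assumptions, not measurements of any real device):

```python
# Hypothetical numbers for illustration: the attempt rate is an
# assumption about secure-element throttling, not a real measurement.

def brute_force_hours(digits: int, attempts_per_second: float) -> float:
    """Worst-case time to try every numeric passcode of the given length."""
    keyspace = 10 ** digits
    return keyspace / attempts_per_second / 3600

rate = 12.5  # assumed: one hardware-throttled attempt every 80 ms

print(f"4-digit PIN: {brute_force_hours(4, rate):.2f} hours")  # 4-digit PIN: 0.22 hours
print(f"6-digit PIN: {brute_force_hours(6, rate):.1f} hours")  # 6-digit PIN: 22.2 hours
```

Under those assumptions a four-digit passcode survives well under an hour of sustained attack, which is exactly why the remote-wipe window matters more than the cipher strength for the average lost phone.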
> And the thing of it is, weak encryption is adequate for most people. They're not looking to protect their information from the NSA, they're trying to keep their credit card info safe from their children. They want to know that if they drop their phone on the street, they have a few hours to wipe it remotely before someone hooks up a dongle to it to brute-force the combo. The type of encryption and the methods of decrypting it aren't really at issue.

True, but only because if you're a criminal, attacking the encryption is the stupid way to get at someone's data. A dictionary attack on stupid passwords, or phishing, or some other sort of social engineering is much better. Getting at the data through the user is almost always going to be easier than through the machine. But strong encryption is still important because it creates trust. "Even the resources of a nation-state would be insufficient to crack this" engenders trust; "only thieves willing to shell out for some FPGAs can crack this" does not.
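The dictionary-attack point can be made concrete with a toy example: an unsalted hash of a common password falls to a word list instantly, with no cryptanalysis of the cipher at all. The word list and target password here are made up for illustration:

```python
import hashlib

# Toy illustration of why the user is the weak point: a dictionary
# attack against an unsalted SHA-256 hash of a common password.
# The word list and target are invented for this example.

wordlist = ["letmein", "password", "123456", "qwerty", "hunter2"]

def crack(target_hash, words):
    """Return the word whose SHA-256 digest matches, or None."""
    for word in words:
        if hashlib.sha256(word.encode()).hexdigest() == target_hash:
            return word
    return None

target = hashlib.sha256(b"hunter2").hexdigest()
print(crack(target, wordlist))  # hunter2
```

No amount of key length saves a password that appears in the attacker's word list, which is the sense in which going "through the user" beats going through the machine.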
> the FBI is arguing that they can compel Apple to cripple encryption after the fact. Apple is arguing they can't. It's a hell of a precedent to set, and most manufacturers have been trying hard not to set it.

You're right about nearly everything in your statement. I don't really want to respond to all of it; I just want to make clear one thing that I think you're already aware of. (I feel weird adding this disclaimer, since when people quote a single part of a statement and comment on just that, it often looks like they're trying to take down the other person's whole factual platform off one thing, which I'm not. It's strange how even civil quotation and discussion can look confrontational when it's direct back-and-forth between two users.) Anyway, I just wanted to point out: technically they aren't crippling encryption after the fact. It was poorly designed from the beginning, and they instead relied on tamper-proofing the device itself to protect the data on the phone. It's not actually possible to cripple encryption after the fact without solving a new research problem on the algorithms themselves. That happens from time to time, but it would be remarkable on some of the most well-researched algorithms like AES and RSA, which even the NSA can't get around. PRISM and many other programs wouldn't need to exist if they could break AES or RSA (keep in mind the "or" rather than "and" there: in hybrid encryption like SSL/TLS, if they break one, the other naturally fails with it).

> American manufacturers are already hindered by laws against strong encryption in the United States

I'm actually unaware of this; could you talk about which laws you're referring to? I know of no strong-encryption laws on the books, as basically every project that calls SSL (basically everything) is using strong cryptography. Every phone or other device manufacturer, simply by installing OpenSSL, has packaged in strong crypto and regularly uses it.
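The hybrid-encryption point (break RSA *or* AES and SSL/TLS falls) can be sketched with a toy version of the pattern: an asymmetric cipher wraps a random session key, and a symmetric cipher encrypts the data under that key. Textbook RSA with tiny primes and a SHA-256 keystream stand in for real RSA and AES here; this is deliberately insecure, illustration only:

```python
import hashlib
import secrets

# TOY sketch of hybrid encryption, the pattern used by SSL/TLS.
# Textbook RSA with tiny primes + a hash-based keystream are stand-ins
# for real RSA/AES. NOT secure; for illustration of the structure only.

p, q = 2003, 2011                     # absurdly small primes (toy)
n, e = p * q, 65537
d = pow(e, -1, (p - 1) * (q - 1))     # private exponent

def keystream(key: int, length: int) -> bytes:
    """Hash-counter keystream standing in for a real symmetric cipher."""
    out = b""
    counter = 0
    while len(out) < length:
        out += hashlib.sha256(f"{key}:{counter}".encode()).digest()
        counter += 1
    return out[:length]

def encrypt(message: bytes):
    session_key = secrets.randbelow(n)      # random symmetric session key
    wrapped = pow(session_key, e, n)        # asymmetric layer wraps the key
    ct = bytes(m ^ k for m, k in zip(message, keystream(session_key, len(message))))
    return wrapped, ct                      # symmetric layer carries the data

def decrypt(wrapped: int, ct: bytes) -> bytes:
    session_key = pow(wrapped, d, n)        # unwrap with the private key
    return bytes(c ^ k for c, k in zip(ct, keystream(session_key, len(ct))))

wrapped, ct = encrypt(b"attack at dawn")
print(decrypt(wrapped, ct))  # b'attack at dawn'
```

Recovering the session key by breaking the asymmetric layer, or the keystream by breaking the symmetric layer, each exposes the plaintext on its own, which is why breaking either algorithm defeats the whole scheme.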
> Technically they aren't crippling encryption after the fact, it was poorly designed from the beginning and they instead relied on tamper-proofing the device itself to protect the data on the phone.

See, I'm not sure that distinction matters. It's like this:

- the device shipped with mediocre encryption, protected by tamper-proofing
- stronger encryption is currently available
- the FBI wants an end-run around the tamper-proofing

So whether the FBI wants to cripple "encryption" or "security" is a legal point, to be sure, but the precedent set is all about the after-the-fact part. They want to be able to compel a company to crack open something that was secure. That makes everything that is secure potentially insecure whenever the government can shove a writ through. You're right that strong encryption will protect you. But there's also the pain-in-the-ass factor: if most people are using weak encryption, then using weak encryption is a great way to blend into the crowd. If everyone uses strong encryption, then using strong encryption becomes anonymous. In the NSA/FBI/CIA/TLA's horror world, everyone shifts to strong encryption, meaning they can't single people out just by what encryption they're using. And then they're sure going to want to be able to compel Apple into cracking that strong encryption, rather than just weak encryption.

> I'm actually unaware of this, could you talk about which laws you are referring to? I know of no strong encryption laws on the books, as basically every project that calls SSL (basically everything) is using strong cryptography.

https://en.wikipedia.org/wiki/Export_of_cryptography_from_the_United_States