FBI already has the suspects' data and is using the iPhone encryption issue to widen powers

Spectrum


http://www.mirror.co.uk/news/world-news/san-bernardino-killers-iphone-password-7406906


Apple has blasted the FBI for changing a password associated with the iPhone used by one of the San Bernardino terror attackers.

Investigators have demanded access to an iPhone used by terrorists who killed 14 people and are in a row with the tech giant.

Two senior Apple executives said the company had worked hard to help investigators trying to crack the phone for clues about Syed Rizwan Farook, who committed the December 2 atrocity with his wife Tashfeen Malik.

San Bernardino County reset the password on the iCloud account at the request of the FBI, said county spokesman David Wert.

The government first disclosed the identification change in a footnote to its filing Friday.


Row: The passcode on the phone was changed


The Apple executives said that the reset occurred before Apple was consulted. The Justice Department declined to comment on that contention.

They criticised government officials who reset the Apple identification associated with the phone, which closed off the possibility of recovering information from it through an automatic cloud backup.
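Roughly why that reset mattered, in a toy model (hypothetical names in the Python sketch below, not Apple's actual iCloud authentication): a phone performs its automatic backups only while the credentials it cached at the last sign-in still match the account, and changing the account password on the server breaks that match until someone can unlock the phone and sign in again.

```python
# Toy model only -- not Apple's real iCloud authentication. It illustrates why a
# server-side password reset stops a device from making automatic backups: the phone
# still holds credentials tied to the *old* password, so its backup attempts are
# rejected until someone signs in again on the device itself.
import hashlib

def derive_token(password: str) -> str:
    """Stand-in for whatever credential the device caches after sign-in."""
    return hashlib.sha256(password.encode()).hexdigest()

class CloudAccount:
    def __init__(self, password: str):
        self._token = derive_token(password)

    def reset_password(self, new_password: str) -> None:
        # What the county did at the FBI's request: change the account password.
        self._token = derive_token(new_password)

    def accept_backup(self, device_token: str) -> bool:
        return device_token == self._token

account = CloudAccount("original-password")
cached_on_phone = derive_token("original-password")   # stored when the phone last signed in

print(account.accept_backup(cached_on_phone))   # True  -- auto-backup would still work
account.reset_password("new-password-set-by-county")
print(account.accept_backup(cached_on_phone))   # False -- the backup route is now closed
```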

Tim Cook, CEO of the tech giant, said the FBI has asked the firm to help create a "backdoor" which could potentially allow hackers to crack into any iPhone in the world.


But Cook claimed the only way to do this was to create a dangerously insecure version of iOS, the operating system which powers iPhones and iPads.

"The government suggests this tool could only be used once, on one phone," Cook wrote in a lengthy statement.




"But that’s simply not true. Once created, the technique could be used over and over again, on any number of devices. In the physical world, it would be the equivalent of a master key, capable of opening hundreds of millions of locks — from restaurants and banks to stores and homes.

"No reasonable person would find that acceptable."

He claimed the alleged back door would leave millions of customers open to hack attacks and said the "demand would undermine the very freedoms and liberty our government is meant to protect".

Some technology experts and privacy advocates backing Apple suggest Farook's work phone likely contains little data of value.

They have accused the Justice Department of choreographing the case to achieve a broader goal of gaining support for legislation or a legal precedent that would force companies to crack their encryption for investigators.
 
http://www.newyorker.com/news/amy-davidson/a-dangerous-all-writ-precedent-in-the-apple-case?

Tim Cook, the C.E.O. of Apple, which has been ordered to help the F.B.I. get into the cell phone of the San Bernardino shooters, wrote in an angry open letter this week that “the U.S. government has asked us for something we simply do not have, and something we consider too dangerous to create.” The second part of that formulation has rightly received a great deal of attention: Should a back door be built into devices that are used for encrypted communications? Would that keep us safe from terrorists, or merely make everyone more vulnerable to hackers, as well as to mass government surveillance? But the first part is also potentially insidious, for reasons that go well beyond privacy rights.

The simple but strange question here is exactly the one that Cook formulates. What happens when the government goes to court to demand that you give it something that you do not have? No one has it, in fact, because it doesn’t exist. What if the government then proceeds to order you to construct, design, invent, or somehow conjure up the thing it wants? Must you?


The F.B.I.’s problem is that it has in its possession the iPhone used by Syed Rizwan Farook, one of the San Bernardino shooters, but the phone is locked with a passcode that he chose. (The phone, which investigators found while executing a search warrant for Farook’s car, is actually the property of the San Bernardino County Health Department, Farook’s employer, which has consented to its search—and so, as Orin Kerr points out on the Washington Post blog the Volokh Conspiracy, there is no Fourth Amendment issue there.) If the F.B.I. enters the wrong passcode ten times, the data may be turned to gibberish. And if the F.B.I. disables that feature, allowing it to enter every possible passcode until it hits the right one, it may still come up against another barrier: a built-in delay between wrong entries, so that typing in five thousand possibilities, for example, might take thousands of hours. Both sides agree that Apple has given significant technical assistance with the San Bernardino case already; in response to a separate warrant, it gave the F.B.I. the iCloud back-ups for Farook’s phone (the most recent was from some weeks before the shooting). In the past, in response to court orders, Apple has helped the government extract certain specific information from older iPhones—perhaps seventy times, according to press reports. But there is apparently no way for the company to do so on the newer operating system, iOS 9, which the shooter was using and which was built without a “back door.” In other words, there is no set of instructions or a skeleton key in a drawer somewhere in Cupertino that Apple could give the F.B.I. to allow it to get in.
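To put rough arithmetic behind that "thousands of hours" figure, here is a back-of-envelope sketch in Python; the lockout schedule and the five seconds allowed per manual entry are assumptions for illustration, not Apple's published behaviour.

```python
# Back-of-envelope sketch only -- the delay schedule and per-entry typing time below
# are assumptions for illustration, not Apple's specification.

def lockout_delay(attempt: int) -> float:
    """Forced waiting time, in seconds, before attempt number `attempt` (1-indexed)."""
    if attempt <= 5:
        return 0.0            # early guesses come with no penalty
    if attempt == 6:
        return 60.0           # 1 minute
    if attempt == 7:
        return 5 * 60.0       # 5 minutes
    if attempt in (8, 9):
        return 15 * 60.0      # 15 minutes
    return 60 * 60.0          # 1 hour for every guess after that

def worst_case_hours(keyspace: int, seconds_per_entry: float = 5.0) -> float:
    """Hours needed to try every code by hand, including typing time and lockouts."""
    total_seconds = sum(lockout_delay(i) + seconds_per_entry
                        for i in range(1, keyspace + 1))
    return total_seconds / 3600.0

print(f"4-digit passcode (10,000 codes): ~{worst_case_hours(10_000):,.0f} hours")
print(f"6-digit passcode (1,000,000 codes): ~{worst_case_hours(1_000_000):,.0f} hours")
```

Under those assumptions, even a four-digit code runs past ten thousand hours in the worst case and a six-digit code past a million, which is why the built-in delay and the ten-try wipe are the obstacles the F.B.I. wants removed.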

And so Judge Sheri Pym, a California district-court magistrate, has ordered Apple to come up with a new software bundle that can be loaded onto the phone and, in effect, take over the operating system and tell it to let the F.B.I. in. (Apple will have a chance to object to the order in court.) As an added point of convenience, this bundle is also supposed to let the agents enter passcodes electronically, rather than tapping them in, which is one of the many points on which the government seems to have moved from asking for compliance with a subpoena to demanding full-scale customer service. In its request for the order, the government says that “Apple has the exclusive technical means which would assist the government in completing its search,” for a number of reasons. One is that iPhones look for a cryptographic signature before accepting operating-system software as legitimate. But the government, again, is not asking for a signature or even for the equivalent of a handwriting guide (which would be problematic, too) but for an entire ready-to-run bundle. It has said that it wants Apple to put in a code that makes the bundle usable only on Farook’s phone—but that is a desire, not a description of an existing, tested, software protection. (The government also says that it will pay Apple for its work.) The other reasons that the government says that Apple should be compelled to do this work come down to Apple being Apple—being a smart company that designs this kind of thing. What is the government’s claim on that talent, though? Would it extend to a former engineer who has left the company? The government’s petition notes that the operating system is “licensed, not sold,” which is true enough, but conveys the darkly humorous suggestion that Apple’s terms of service are holding the F.B.I. back.
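That cryptographic-signature check is the heart of the matter: the phone refuses any operating-system bundle that is not signed with a key only Apple holds, so the government cannot simply write the unlocking tool itself. Below is a minimal sketch of that property using a stand-in Ed25519 key pair and the third-party `cryptography` package; the keys, image names, and function names are illustrative assumptions, not Apple's actual secure-boot code.

```python
# Conceptual sketch only -- not Apple's secure-boot implementation. It shows the
# property the government's filing leans on: the device runs only an OS image whose
# signature verifies under a key the vendor controls.
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey
from cryptography.exceptions import InvalidSignature

# Stand-in for the vendor's signing key pair (hypothetical, generated on the fly here).
vendor_private_key = Ed25519PrivateKey.generate()
vendor_public_key = vendor_private_key.public_key()   # the only key baked into the device

def vendor_sign(os_image: bytes) -> bytes:
    """Only the vendor can do this step: sign an OS image with the private key."""
    return vendor_private_key.sign(os_image)

def device_will_boot(os_image: bytes, signature: bytes) -> bool:
    """The device's check: accept the image only if the signature verifies."""
    try:
        vendor_public_key.verify(signature, os_image)
        return True
    except InvalidSignature:
        return False

official_image = b"signed OS build"
tampered_image = b"unlocking tool built by someone else"

sig = vendor_sign(official_image)
print(device_will_boot(official_image, sig))   # True
print(device_will_boot(tampered_image, sig))   # False -- wrong or missing signature
```

The asymmetry is the point: anyone can check a signature with the public key baked into the device, but only the holder of the private key can produce one, which is what makes Apple's cooperation, rather than just its documentation, the thing the order demands.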

The government's chief legal authority is the All Writs Act, and its key precedent is a 1977 Supreme Court decision involving telephone taps, called pen registers. In that case, the F.B.I. wanted New York Telephone, which was already helping it to set up a tap in an illegal-gambling sting, to let it use some spare cables that were, physically, in the same terminal box as those hooked up to the suspect’s phone. The telephone company told the F.B.I. to get its own wires and string them into the apartment of one of the alleged gamblers some other way. When the F.B.I. objected that the suspects might spot the rigged cables, the Court agreed that it could legitimately ask the telephone company for its technical help and “facilities.” But the F.B.I. wasn’t asking New York Telephone to design a new kind of cable.

If a case involving a non-digital phone network could be applied to smartphones, what technologies might an Apple precedent be applied to, three or four decades from now? (The N.S.A. used, or rather promiscuously misused, another pen-register case from the same era to justify its bulk data collection.) It no longer becomes fanciful to wonder about what the F.B.I. might, for example, ask coders adept in whatever genetic-editing language emerges from the recent developments in CRISPR technology to do. But some of the alarming potential applications are low-tech, too. What if the government was trying to get information not out of a phone but out of a community? Could it require someone with distinct cultural or linguistic knowledge not only to give it information but to use that expertise to devise ways for it to infiltrate that community? Could an imam, for example, be asked not only to tell what he knows but to manufacture an informant?

This is the situation that Apple is in, and that all sorts of other companies and individuals could be in eventually. There are problems enough with the insistence on a back door for devices that will be sold not only in America but in countries with governments that feel less constrained by privacy concerns than ours does. And there are reasons to be cynical about technology companies that abuse private information in their own way, or that jump in to protect not a principle but their brands. But the legal precedent that may be set here matters. By using All Writs, the government is attempting to circumvent the constitutionally serious character of the many questions about encryption and privacy. It is demanding, in effect, that the courts build a back door to the back-door debate.
 
This is all false and simply tin-foil-hat propaganda.
 
feds are some nosey muthafuckas....

once they get in, they are going to charge firms big money

for their digital espionage services...
 
Apple is doing this for marketing and nothing more. Apple is not and never was concerned with your privacy. If they were, Apple would not be monitoring your emails, location, what you surf on the web, text messages and photos. That is more than the government has ever done, and we have people actually complaining about privacy.
 
feds are some nosey muthafuckas....

once they get in, they are going to charge firms big money

for their digital espionage services...
Apple knows everywhere you have been, the stores you have walked in, scans the content in your emails and text messages, which apps you use and photos you have taken but the FBI are the nosey ones?
 
Apple knows everywhere you have been, the stores you have walked in, scans the content in your emails and text messages, which apps you use and photos you have taken but the FBI are the nosey ones?

well its THEIR customers they want to know what they are doing so they can sell them more shit..

its the digital age, everybody from google to amazon is doing it, when you use THEIR services..

the fuck the feds want with that info for??

they keep their little operations secret... talkin bout some damn national security..

and apples claiming corporate security...

whats the difference??

feds couldnt even stop a terrorist from fuckin afghanistan,

from invading their airspace and knockin down towers...

they still aint answer for that shit!!
 
Gone are the days of doing real police work; everybody is just looking for the cheat code, and even then the FBI has done little to stop any real terrorist that they didn't manufacture.
 
Apple knows everywhere you have been, the stores you have walked in, scans the content in your emails and text messages, which apps you use and photos you have taken but the FBI are the nosey ones?

Where do you think Apple stores all of that information?
 
Apple is doing this for marketing and nothing more. Apple is not and never was concerned with your privacy. If they were, Apple would not be monitoring your emails, location, what you surf on the web, text messages and photos. That is more than the government has ever done, and we have people actually complaining about privacy.


Apple doesn't need additional marketing, they've already got the most successful smart phone and profitable ecosystem.
 
well its THEIR customers they want to know what they are doing so they can sell them more shit..

its the digital age, everybody from google to amazon is doing it, when you use THEIR services..

the fuck the feds want with that info for??

they keep their little operations secret... talkin bout some damn national security..

and apples claiming corporate security...

whats the difference??

feds couldnt even stop a terrorist from fuckin afghanistan,

from invading their airspace and knockin down towers...

they still aint answer for that shit!!
I guess the best answer I could provide is a question: what's more important, corporate security or national security?

Side note: why do you type the way you do? :lol: You always space out your sentences. Is that just your style and you stay snapping no matter what the subject? :lol:
 
Apple doesn't need additional marketing, they've already got the most successful smart phone and profitable ecosystem.
This would be false also. If they want to maintain their market share, Apple will always need to market while also finding creative ways to market its products. This is one of those ways, to me.
 
Apple doesn't need additional marketing, they've already got the most successful smart phone and profitable ecosystem.

Need it? No. Would take full muthafuckin' advantage of it? Absolutely!

This has nothing to do with what's already profitable and successful, this has everything to do with "more".
 
Question tho

With Apple refusing to unlock phones, and with this being a work phone, would this negatively impact Apple's enterprise business, as companies may opt for handsets they can keep full control over?

The way it is set up now, an employee could simply lock the phone and the company could never find out what the employee did if under investigation.
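For what it's worth, employer-owned iPhones can already be set up so the company keeps that kind of control without Apple's help: a phone enrolled in mobile device management hands an unlock capability to the company's management server at enrollment, and the county reportedly owned such software but had not finished enrolling this phone. The sketch below is a toy model of that escrow idea in Python; the class and method names are hypothetical, not Apple's real MDM protocol.

```python
# Toy sketch (hypothetical, not Apple's real MDM protocol): the idea behind "full
# control" on an employer-managed phone is that an unlock token is escrowed with the
# organization's management server at enrollment, so the employer can clear a lost
# passcode later without needing the user or the vendor.
import secrets

class ManagedPhone:
    def __init__(self, device_id: str):
        self.device_id = device_id
        self._unlock_token = secrets.token_bytes(32)
        self.locked = True

    def issue_unlock_token(self) -> bytes:
        return self._unlock_token

    def clear_passcode(self, token: bytes) -> bool:
        if secrets.compare_digest(token, self._unlock_token):
            self.locked = False
        return not self.locked

class ManagementServer:
    """Stand-in for a company MDM server that escrows unlock tokens at enrollment."""
    def __init__(self):
        self._escrow = {}   # device_id -> escrowed unlock token

    def enroll(self, device: ManagedPhone) -> None:
        # At enrollment the device hands over a token that can later clear its passcode.
        self._escrow[device.device_id] = device.issue_unlock_token()

    def clear_passcode(self, device: ManagedPhone) -> bool:
        token = self._escrow.get(device.device_id)
        return token is not None and device.clear_passcode(token)

server = ManagementServer()
phone = ManagedPhone("county-issued-5c")
server.enroll(phone)                  # the step the county reportedly never completed
print(server.clear_passcode(phone))   # True -- no passcode or vendor help needed
```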
 
I guess the best answer I could provide is a question: what's more important, corporate security or national security?

Side note: why do you type the way you do? :lol: You always space out your sentences. Is that just your style and you stay snapping no matter what the subject? :lol:

well since america is a corporation, with its own ceo aka president...

I would have to say corporate security...

thats just my writing style..

makes for easy reading... in this colin friendly

environment we dwell in....
 
I mean, "


And if they lose and are forced to open a backdoor?.... why would they chance that.


Apple only cares about opening a backdoor to the extent that they can no longer make claims about how tight their ecosystem is and will gladly point the finger at the feds for any subsequent security breach. If they win, it is a boon for them as far as new customers and for corporations everywhere in regards to the extent that they have to cooperate with the feds.....on any level.
 