
FBI Won't (or Can't) Say How It Broke Into Terrorist's iPhone


Credit: magerleagues / photo on flickr


The FBI says it doesn't know how the third-party tech firm it hired succeeded in breaking into San Bernardino terrorist Syed Farook's iPhone. Therefore, it can't tell Apple how it was done or alert the company to a security risk potentially affecting its customers. Or so the FBI says.

That's the breaking news from the Wall Street Journal this afternoon. There was a big question mark as to whether, per typical policy, the federal government would inform a company about a security risk in its software. Apple, we all know, resisted the FBI's efforts to force it to develop code to help officials break its own security. Right before a planned court confrontation, the FBI withdrew its demands because it had found another company (an unidentified third party) that was able to figure out how to bypass the phone's security (at significant expense).

Sources told the Wall Street Journal that the FBI will tell the White House it doesn't know how the tool used to break into the iPhone worked—that it "knows so little" that there's no point in even having a review process to determine whether the information should be passed along to Apple.

This means that American customers with phone models similar to Farook's have a security vulnerability that might not be fixable unless Apple is informed or figures it out on its own (one suspects the company is working on it).

Should we actually believe the FBI when it says it doesn't know how the tool works? It's easy to be skeptical of its honesty given how pettily the Department of Justice responded to Apple's attempts to defend itself in court, dismissing the company's very real need to protect the security of its customers as a "marketing" concern. But a post by Susan Landau at the Lawfare blog suggests the FBI may well be telling the truth, and that is itself a cause for concern. The bureau is trying to terrify us all about the threat of terrorists and child predators "going dark," but it doesn't seem to be making budget recommendations that reflect this fear:

The FBI is going dark, but the cause is not encryption; it is the Bureau's approach to investigations involving encryption and other types of anonymizing tools. Consider the FBI's 2017 budget request. It includes a requested increase of $38.3 million and 0 positions for "challenges related to encryption, mobility, anonymization, and more"; current services are at "39 positions (11 agents) and $31 million." This explains the FBI's problem. Despite six years of publicly pressing for laws to control encryption's deployment, the FBI staffing is at a remarkably low level, one that fits the attack profile of quite a few years ago, not the present time. By contrast, the 2017 request for additional physical surveillance capabilities is for $8.2 million and 36 positions (18 agents); this request is on top of the current 1770 positions (549 agents) and $297.8 million budget. (The 2017 FBI budget request also includes a separate cyber component with 1,753 positions (897 agents) along with a current budget of $541.4 million, and a 2017 request of $85.1 million and 0 positions. While the cyber component interacts with the Going Dark program and small amounts of funds are fungible, the cyber effort does not substitute for the missing Going Dark capabilities.)

If that's how the FBI is prioritizing spending, no wonder it's so hot to draft tech companies to do the work for it.

Read more here.
