One can understand the instinct, when the one who actually caused you tortious harm is beyond any judgment but the eternal one, to lash out at whatever hefty pockets seem within reach. Still, the legal gambit should make any believer in free expression, and in the ability to technologically and legally facilitate it, nervous: the families of three of the people killed in Omar Mateen's murder rampage at Orlando's Pulse nightclub in June (Tevin Crosby, Javier Jorge-Reyes and Juan Ramon Guerrero) are suing Facebook, Twitter and Google because the tech services allegedly "provided the terrorist group ISIS with accounts they use to spread extremist propaganda, raise funds, and attract new recruits." I certainly hope no U.S. judge sees any merit in it.
The suit was filed this week in the U.S. District Court for the Eastern District of Michigan, as first reported yesterday by Fox News.
What we all want out of communication networks like Facebook and Twitter and search services such as Google, and usually get, at least insofar as it actively affects us, is that they neither interfere with nor even worry overmuch about how we are using them. For them to be what we want them to be, they should be as neutral as possible.
To the degree they choose not to be neutral, they open themselves up to exactly these sorts of accusations: that by providing a means for people to communicate or earn money via ads, they are somehow complicit in the nature of the communications or in their real-world harms, if any. That risk is a reason for such companies to be as close to content-neutral as possible in practice, though, as the lawsuit itself notes, the entities being sued do try to avoid appearing to facilitate terror.
Section 230 of 1996's Communications Decency Act has generally been interpreted, correctly, as shielding the providers of these communications services from being held responsible for the content posted on them.
The families' lawyer is arguing, though, that, as Fox puts it:
sites like Facebook may be violating the provision with their heavily-guarded algorithms….this lawsuit alleges something much more nefarious behind one of the tech world's most secretive processes. "The defendants create unique content by matching ISIS postings with advertisements based upon information known about the viewer," [lawyer Keith] Altman said. "Furthermore, the defendants finance ISIS's activities by sharing advertising revenue."… While these social platforms have cracked down and deactivated accounts affiliated with terrorist groups in the past, Altman argued that another account will almost immediately pop up and that companies think they're not responsible because they are not ones producing the content.
Yes, that is exactly the point, and no one who enjoys using any of those services would want them to have to act otherwise. (Some applaud the companies when they do police content in certain cases, but the services don't, and shouldn't, concede that policing or barring certain content makes them responsible for everything they don't bar.)
If these companies felt compelled to behave as if every use of their services were their legal responsibility, nearly everything good about them would be in danger.
The lawsuit is the latest to target popular Internet services for making it too easy for the Islamic State to spread its message. In June, the family of a California college student killed in last year's terrorist attacks in Paris sued Facebook, Google and Twitter. Keith Altman, the attorney representing the three families in the Orlando nightclub lawsuit, also represents the family of that student, Nohemi Gonzalez, in the Paris terrorist attacks lawsuit.
The services aren't always neutral in allowing their customers to use them, as noted above and in the suit, and according to Reuters:
Facebook said on Tuesday there is no place on its service for groups that engage in or support terrorism, and that it takes swift action to remove that content when it is reported. "We are committed to providing a service where people feel safe when using Facebook," it said in a statement. "We sympathize with the victims and their families."

Twitter declined to comment. In August, the company said it had suspended 360,000 accounts since mid-2015 for violating policies related to promotion of terrorism.

Representatives of Google could not immediately be reached.

The three companies plus Microsoft Corp said this month they would coordinate more to remove extremist content, sharing digital "fingerprints" with each other.
That the companies try to do such policing is their prerogative, and it will doubtless be used against them in legal arguments: since they clearly are not content-neutral, the argument goes, their failure to sufficiently expunge terror-related communications shows they are responsible for them.
The question of whether Google's ad model makes it complicit, in a criminal or tortious sense, in funding whatever person or entity makes money from that model also threatens Google's existence. As the lawsuit states, "For at least one of the Defendants, Google, revenue earned from advertising is shared with ISIS… YouTube approves of ISIS videos allowing for ads to be placed with ISIS videos. YouTube earns revenue from these advertisements and shares a portion of the proceeds with ISIS."
No deviation from an ideal of neutral facilitation of communication could begin to justify the notion that the companies are to blame for the use of their services to communicate ideas that may (or may not) have inspired or in some sense caused Mateen to commit his act of murderous terror. Respect for free speech requires acknowledging that the cause of the mayhem lies in the mind and body of the man who performed it, and he is dead.
The lawsuit itself also details how it believes Twitter's attempts to quash ISIS-related accounts are feeble, narrow, and too easily circumvented, and that the company can and should stop some of the obvious means used by ISIS-related accounts to rebuild and rebrand after they get deleted.
It further argues that the combination of content and ads targeted via algorithm constitutes Google's participation in the creation of the content as perceived by a user.
The suit wants the companies to pay "compensatory damages…treble damages pursuant to 18 U.S.C. § 2333….any and all costs sustained in connection with the prosecution of this action, including attorneys' fees, [and] an Order declaring that Defendants have violated, and are continuing to violate, the Anti-Terrorism Act, 18 U.S.C. § 2331."
Elizabeth Nolan Brown reported recently on the value of the Communications Decency Act in hobbling the attempt to prosecute the people running the Backpage website.