
Supreme Court to hear lawsuit accusing YouTube of complicity in 2015 Paris attacks

For years, social media companies have been accused of being complicit in violent attacks, including against Muslims
In 2019, YouTube announced it would take a "stronger stance" against threats and personal attacks made on the platform (AFP/File photo)

A US lawsuit blaming YouTube, in part, for the 2015 Paris attacks is heading to the Supreme Court next month, paving the way for the nation's highest court to discuss whether or not social media companies play a role in aiding large-scale attacks against civilians.

The family of Nohemi Gonzalez, the only American killed in the 2015 attacks, has filed a lawsuit against Google, YouTube's parent company, accusing it of complicity in the attacks.

It states the platform's algorithm "recommended that users view inflammatory videos created by ISIS, videos which played a key role in recruiting fighters to join ISIS in its subjugation of a large area of the Middle East, and to commit terrorist acts in their home countries".


The central issue in the case is the scope of Section 230 of the Communications Decency Act of 1996, a US law that states that internet companies cannot be sued over third-party content uploaded by users or for decisions site operators make to moderate or filter what appears online.

The lower courts previously ruled in favour of Google, leading Gonzalez's lawyers to appeal all the way up to the Supreme Court, which said in October it would hear the case. Oral arguments are set for 21 February.

"If some changes can be done to prevent these terrorist people [from] keeping killing human beings, then that is a big thing," Beatrice Gonzalez, Nohemi Gonzalez's mother, said in an interview with ABC News on Monday.

Gonzalez was a 23-year-old college student who was studying in France when she was killed in the attack.

Beatrice Gonzalez alleges that YouTube's algorithms, the proprietary software that recommends video content to users, amplified IS-produced material in support of the militants who killed her daughter.

YouTube says it bans terrorist content across its platform and that its algorithms help catch and remove videos that promote violence, noting that 95 percent of the videos removed last year were automatically detected, and most were taken down before receiving 10 views.

"Undercutting Section 230 would make it harder for websites to do this work," YouTube spokesperson Ivy Choi told ABC.

"Websites would either over-filter any conceivably controversial materials and creators, or shut their eyes to objectionable content like scams, fraud, harassment and obscenity to avoid liability - making services far less useful, less open and less safe."

Middle East Eye reached out to Google for comment but did not receive a reply by the time of publication.

Social media companies and hate speech

The lawsuit is not the first time that social media giants have been accused of inciting violence and being complicit in violent acts, particularly against marginalised communities around the world.

In 2021, dozens of Rohingya refugees in the UK and US sued Facebook, accusing the social media giant of allowing hate speech against them to spread.

Their lawsuit demanded more than $150bn in compensation. An estimated 10,000 Rohingya Muslims were killed during a military crackdown in Buddhist-majority Myanmar in 2017.

Beyond the Rohingya case, Facebook has been accused of providing a platform for the incitement of violent attacks against Muslims around the world, including the Christchurch attacks in New Zealand and violence against Muslims in India.

In April 2021, a US civil rights group filed a lawsuit against Facebook, claiming the company's alleged failure to enforce its own moderation policies was responsible for a wave of anti-Muslim abuse.

An independent civil rights audit of the social media company released last July outlined that, despite having policies that did not allow for hate speech against religious groups, incidents of hate speech continued to persist across Facebook.

In 2019, YouTube announced it would be taking a "stronger stance" against threats and personal attacks made on the video-sharing site.

Muslim advocacy groups welcomed the announcement, which came after YouTube was criticised for refusing to remove a homophobic video, and said the anti-harassment policy should also extend to protecting Muslims.

This article is available in French on Middle East Eye French edition.
