X and Meta approved antisemitic and anti-Muslim content in Germany

Study finds that Facebook and X approved ads calling for the imprisonment of migrants, violence against Jews, a stop to the 'Jewish globalist agenda,' or the extermination of Muslims

Rephael Kahan
A recent investigation found that Facebook and X approved ads, submitted ahead of Germany’s federal elections, calling for the imprisonment of migrants and violence against Jews. Meta approved half of the ads, while Elon Musk’s X approved all of them. One ad urged action against the "Jewish globalist agenda," while another called for the extermination of Muslims.
Elon Musk in a video speech in Germany (Photo: Sean Gallup/Getty Images)
The findings, released by German corporate responsibility organization Eko, highlight serious flaws in content moderation on social media platforms. Researchers submitted hate-filled ads containing slurs against Muslim and Jewish migrants, calls for violence, and extreme proposals such as placing migrants in concentration camps or exterminating them. The results were alarming: Meta approved five out of 10 ads, while X greenlit all 10.
Among the approved ads were messages likening Muslim refugees to a "virus" and "vermin," as well as calls for their sterilization or eradication. One ad explicitly urged setting fire to synagogues to "stop the Jewish globalist agenda." Although the ads were taken down before they could be distributed, their approval raises concerns about the oversight mechanisms of social media giants. The findings have been submitted to the European Commission, which may launch an investigation into Meta and X for potential violations of the Digital Services Act.
The timing of these revelations is particularly sensitive, coming just ahead of Germany's federal elections, held on Sunday. The spread of hate speech on social media has been a recurring issue during politically charged periods. Facebook has faced scrutiny before, most notably during the Cambridge Analytica scandal, when data-driven political manipulation influenced elections worldwide. That case resulted in a $5 billion fine for Facebook.
Mark Zuckerberg and Donald Trump (Photo: AFP)
Elon Musk, the owner of X, has also been accused of interfering in German elections, including by openly endorsing the far-right Alternative for Germany (AfD) party. It remains unclear whether X’s approval of such ads stems from Musk’s political leanings or his self-proclaimed commitment to "free speech." Since acquiring the platform, Musk has dismantled content moderation teams and instead relied on the "Community Notes" system, in which users add context to posts. Mark Zuckerberg, CEO of Meta, has adopted a similar approach for Facebook, though he has said that AI-powered moderation will still detect hate speech and illegal content.
The rise of extremist content on social media coincides with growing political and social tensions. A recent study found that far-right content is spreading widely on X and TikTok, potentially shaping public opinion in Germany. Economic instability and a surge in violent attacks by radicalized migrants have only added to the volatile climate.
It remains uncertain whether this investigation will push the EU to tighten regulations on platforms like X, Facebook, and TikTok. However, the findings serve as a stark warning: unchecked extremist content often aligns with political agendas that may conflict with democratic values. Musk himself recently criticized the Community Notes system after it fact-checked his claim about Ukrainian President Volodymyr Zelensky's low approval ratings. When a user poll showed Zelensky with 57% support, Musk dismissed the system as being "manipulated by political groups."
Ultimately, this raises a larger question: Should social media companies be responsible for policing hate speech, or should government agencies take on that role with transparency and adherence to national laws? While hate speech has no place in public discourse, relying on private tech companies to regulate content may be an unrealistic expectation. Just as traditional media outlets must follow advertising laws and journalistic ethics, similar oversight may be necessary for social media platforms.