In Germany, antisemitism on social media can be linked to offline violence
Not enough is being done in Germany to combat antisemitism on social media platforms.
On 8 September 2020, the 10th day of the Halle trial, several Jewish survivors gave remarkable testimony about the horrific crime. The attack took place on 9 October 2019, on Yom Kippur, the holiest day in the Jewish calendar, when Stephan B. killed two people and injured many more in a terror attack in Halle, Germany. Among the testimonies, one survivor pointed to the inability of German domestic intelligence and police to deal with social media and the gaming community that surrounded the attack.
Stephan B., so it appears, had radicalized himself online. Shortly before his attack, he started a live stream on Twitch and published several files on the imageboard Meguca. The documents he uploaded reveal a worldview of ‘extermination antisemitism’ (Vernichtungsantisemitismus) interwoven with misogyny, racism, Islamophobia, and incitement to violence. “Go in and kill everything,” he wrote in bold letters.
Even though Stephan B. is a native German, he spoke mostly English during the live stream and wrote his documents in English. This attests to his ties to global radical online communities and the alt-right, with a particular affinity for the gaming and manga scenes, rather than to traditional German right-wing extremist networks.
The global dissemination of hate by malicious actors via social networks, and its potential effects offline, is an issue that researchers on antisemitism have begun to pay attention to, but research remains insufficient. Current examples from Germany show how urgently policymakers, lawmakers, and practitioners depend on such research to devise appropriate restrictions and mechanisms to combat antisemitism on networks like Facebook, Twitter, Instagram, YouTube, and TikTok.
As early as 2008, Andre Oboler noted that, with the support of social media, antisemitism had taken on a new quality. Today, antisemitism on social media appears in all languages, is algorithm-driven, and can be weaponized in troll attacks or through social bots, for example. Antisemitic content can thus be disseminated on an unprecedented scale, cost-free, and in appealing formats such as GIFs, memes, and social media posts.
In his 2013 report, Oboler contrasts the many forms antisemitism takes on Facebook with the platform’s insufficient strategies to combat it. Recently, Facebook made headlines for failing to take down profiles and groups of the radical right-wing conspiracy movement QAnon, on which virulently antisemitic content had been posted frequently for years. Even relatively new platforms like the video-sharing social network TikTok have rampant antisemitism problems.
A recent study by Gabriel Weimann and Natali Masri focuses on TikTok, whose main audience consists of children and adolescents. The study shows how radical right-wing extremists use social platforms to disseminate and normalize their ideologies, encompassing antisemitism, Holocaust denial, xenophobia, homophobia, and misogyny, on a large scale and specifically tailored to young people.
The ongoing COVID-19 pandemic has also become a conduit for antisemitic expression. Examples include memes that depict the anti-Jewish “Happy Merchant” caricature as a virus; posts that refer to the coronavirus as the ‘Jew flu’; and QAnon’s conspiracy myth that Bill Gates is a Jew who secretly wants to implant microchips in people receiving a COVID-19 vaccination.
In Germany, antisemitic content from social media platforms has informed protests on the streets, where COVID-19 deniers and protesters against COVID-19 restrictions have merged with right-wing extremists. Demonstrators not only marched toward the German Reichstag carrying Reichsbürger (Reich citizen) and Imperial flags, both symbols of the German radical right, but also trivialized the Holocaust by wearing the yellow star badge of the Nazi era in protest against the restrictions. Connections to global right-wing extremist ideology such as QAnon were visible among the demonstrators in Berlin.
In Germany, antisemitism can be prosecuted as a hate crime, even on social media. To avoid detection, whether by automated systems or human content moderators, users employ simple techniques: instead of writing Rothschild or Rubinstein, names linked to the stereotype of rich and powerful Jews, they simply write -schild or -stein. The insinuation is subtle but still understandable, as in the following exchange:
A: Whose bread I eat, his song I sing. These people get donations from companies. Or, as the Arabic proverb goes: Do not spit into the fountain from which you drink.
B: Thank you for this informative insight. Are they, coincidentally, donors whose names end in -schild or -stein?
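A minimal sketch can illustrate why such truncations slip past simple keyword matching. The code below is purely illustrative: it assumes a hypothetical blocklist-based filter, not any platform’s actual moderation system, and the names and patterns are invented for the example.

```python
import re

# Hypothetical blocklist of full names tied to the antisemitic
# stereotype discussed above (illustrative only).
BLOCKLIST = {"rothschild", "rubinstein"}

def naive_filter(text: str) -> bool:
    """Flag text only if it contains a blocklisted name verbatim."""
    tokens = re.findall(r"[a-zäöüß]+", text.lower())
    return any(token in BLOCKLIST for token in tokens)

def suffix_aware_filter(text: str) -> bool:
    """Also flag hyphen-truncated variants such as '-schild' or '-stein'.

    Note the trade-off: looser patterns risk over-matching ordinary
    German words ('Schild', 'Stein'), which is one reason automated
    moderation of such insinuations remains hard.
    """
    return bool(re.search(r"-(schild|stein)\b", text.lower()))

evasive = "Are they donors whose names end in -schild or -stein?"
print(naive_filter(evasive))         # the verbatim blocklist misses the truncation
print(suffix_aware_filter(evasive))  # the suffix pattern catches it
```

The point of the sketch is not a workable moderation rule but the asymmetry it exposes: evasion costs the poster two keystrokes, while every counter-pattern the moderator adds widens the net over innocent text.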
Antisemitism on social media is also being transformed by deplatforming. A study by Richard Rogers shows how social media networks deplatform malicious actors in their efforts to combat hate speech. An infamous example is the former vegan chef and social media personality Attila Hildmann from Germany.
Hildmann used social media such as Instagram and Facebook extensively before being banned for his increasingly radical right-wing content. He has since moved to the messaging service Telegram, where he has over 84,000 followers and frequently posts antisemitic content referencing QAnon. He has claimed, among other things, that German chancellor Angela Merkel leads a Zionist regime that aims to destroy the German race, that the Holocaust did not happen, and that Jews and Zionists are parasites and subhumans.
Hildmann was also one of the main actors at the above-mentioned demonstration in Berlin. He was arrested at the Russian embassy, where he had called on Putin through a loudspeaker to liberate the Germans from what he perceived as the current German dictatorship.
These examples offer a glimpse into the problem of antisemitism at a time when digital communication is exploding and social media exerts a particularly strong influence on societal discourse. Since the emergence of social media, antisemitic incidents and radical right-wing actors using the platforms to advance their agendas have been a concern. Yet scholarly attention to the problem has so far been surprisingly sparse. The danger that antisemitic hate speech poses, and the numerous violent incidents offline that have been linked to antisemitic content on social media platforms, call for a comprehensive academic intervention.