On Wednesday, lawmakers from nine different countries including the United States, Canada and the United Kingdom convened for a hearing at the European Parliament in Brussels with policy representatives from Meta, Twitter and Google to discuss ongoing challenges in the fight against online antisemitism.
The session was part of the conference of the Inter-Parliamentary Task Force on Combating Antisemitism, which was founded in 2020 following hearings with the platforms held that year in Israel’s 23rd Knesset. The task force was initiated to bring legislators together to tackle the global phenomenon of online antisemitism, which has continued to rise each year, and is made up of parliamentarians from Israel, the United States, Canada, Australia, New Zealand, South Africa and the United Kingdom.
Ahead of the hearing, the lawmakers were advised by NGOs and civil society members on the status of online antisemitism and new challenges posed by technological developments in AI and other fields.
The representatives of the social media companies were there voluntarily, and provided updates on the progress their companies have made since the previous hearing, which was held in Washington D.C. at the US Congress in September 2022; however, the EU lawmakers were visibly not satisfied with the answers and lack of transparency on the part of the platforms.
The effort was led by European Parliament member David Lega of Sweden, as well as steering committee members of the task force US Rep. Debbie Wasserman Schultz, Canadian Member of Parliament Anthony Housefather, and former Knesset member Michal Cotler-Wunsh. Representing Israel were Knesset members Orit Farkash-Hacohen and Simcha Rothman.
One key point of contention was the failure of social media companies to adopt the IHRA (International Holocaust Remembrance Alliance) definition of antisemitism, which is already being used by the European Union and 41 other countries, and addresses both modern and classic forms of antisemitism.
Meta Public Policy Manager for EU Affairs, Lara Levet, told the lawmakers: “We need to build our community standards for a global application and then our implementation to scale, and that’s why we built all of them for that purpose. So we have used [IHRA] as a resource and we built it into community standards that can be applied in a way that is consistent at scale by our customers.”
However, when lawmakers asked if the new standards required by the Digital Services Act (DSA) in Europe would be applied globally just as the platforms claimed they do for current hate speech policies, Levet responded: “I’m sure some of them [the regulations] will be applied globally and others will be very country specific.” This is an apparent contradiction of the reason the company has given for not formally adopting the IHRA definition of antisemitism into the community standards of Meta.
Both YouTube and Twitter told lawmakers they incorporate the IHRA definition in some capacity but, like Meta, have not formally adopted it; they did not confirm during the hearing whether their content moderators are trained to use the IHRA definition.
Former MK Michal Cotler-Wunsh told Ynet, “This hearing further underscores the imperative of and urgency for social media giants to use the IHRA consensus definition, the result of a long democratic process that has been adopted by 41 countries and over 1,000 entities, as the benchmark definition. You cannot identify and combat antisemitism without first defining it clearly and comprehensively.”
All of the social media platforms also refused to answer whether specific content violated their policies, to the point of near absurdity, when Housefather asked: “If you had a post that said ‘Jews are all white supremacists that support apartheid,’ would that violate your hate speech standards?”
The YouTube representative David Wheeldon responded that such a statement needs to be evaluated in context, while other platforms declined to comment.
Housefather posed the exact same question to the companies in September 2022, when the representatives also refused to state that such a comment violates their hate speech policies.
Housefather told Ynet after the hearing: “I’m absolutely shocked anybody would say that such a statement requires context. Furthermore, I asked it a year ago and they still haven’t changed their answer…I’m deeply concerned.”
Both Lega and Wasserman Schultz raised another challenge: coded language on social media, in which seemingly innocuous words replace terms that would be flagged as blatantly antisemitic speech by the platforms’ algorithms.
When asked if Twitter would commit to implementing a specific reporting procedure for this phenomenon, a company representative claimed that it is already possible to report “coded language” hateful content in its own category; however, no such function exists in the reporting procedure on Twitter’s website.
“We have a list of terms that we consider as being antisemitic and it’s not only us considering it, it’s [formed] by working with civil society partners that we take into account,” the Twitter representative said.
Israeli lawmaker Orit Farkash-Hacohen also brought up the issue of Iran’s Supreme Leader Ayatollah Khamenei and Twitter’s continued refusal to remove him from the platform, reading aloud several antisemitic comments posted by Khamenei on the platform in recent years.
When she served as Israel’s Minister of Strategic Affairs, Farkash-Hacohen penned a letter to Twitter on the topic and received a response that such speech was permitted because Khamenei is a public figure.
“There is a double standard here in a way…I would think that you should be stricter with people who are leaders and have millions of followers…Do you think that the argument that somebody who’s a public figure and actually has a bigger audience to echo antisemitism is a reason to give this guy [Khamenei] immunity?” she asked the Twitter representative.
The Twitter representative responded: “The rules apply to every user on Twitter and so as well to Ayatollah Khamenei, and so if the content is being reported then we can see if it goes against a rule and then take action if it violates the rules or not.”
Given that users have reported this content repeatedly, and that Khamenei remains on the platform after tweets referring to Zionism as a cancerous growth that needs to be removed, as well as his open calls to violence and support for terrorist organizations, Twitter essentially confirmed that such speech does not violate its policies.
The session concluded with British Member of Parliament Sarah Jones emphasizing the need to regulate the social media platforms, given the company representatives’ failure to demonstrate success in identifying and moderating antisemitic content on their platforms.
While the representatives came of their own volition, and progress has been made in banning Holocaust denial across all platforms and in implementing features like Community Notes, which allow users to correct false information on Twitter, the hearing demonstrated the dire need for reform, as well as the hypocrisy of social media platforms in dealing with hatred against the Jewish community.
Image copyright: Emily Schrader