Raphael Uzan, Jan. 12, 2021
With online antisemitism and hate speech spiking over the past few years, the policies used to combat them are in desperate need of an update. So concluded Quebec Liberal MP Anthony Housefather, who chaired a Canadian House of Commons Justice and Human Rights Committee investigation into these matters.
In July 2019, the committee published a report suggesting policy changes that included adopting a clear definition of hate, training enforcement officials on online hate, and regulating social media platforms. The report paved the way for a national policy on online hate.
A member of parliament for the Montreal riding of Mount Royal since 2015, Mr. Housefather previously served as mayor of the Montreal borough of Côte Saint Luc. Before that, he worked as an executive in a communication software company.
This subject is personal to MP Housefather, as he told anchor Vassy Kapelos on CBC’s Power & Politics: “As a Jewish MP, I received a lot of antisemitic comments online in social media and in other ways, something I had rarely experienced in my life until I was elected federally. However, that gives me the capacity to understand what other people are going through, particularly young people on campuses who are subject to this type of hate, and recognize we need a common definition of antisemitism, not only in one country but across boundaries.”
Subsequently, in September 2020, MP Housefather joined forces with legislators from other countries to create a global Inter-Parliamentary Task Force to combat online antisemitism, alongside counterparts from the United Kingdom (Andrew Percy), Australia, the United States (Chris Smith), and Israel (Michal Cotler-Wunsch); its purpose is to raise awareness of the issue and establish common policies in their respective countries.
I had the pleasure of speaking with MP Housefather via Zoom about the 2019 Standing Committee on Justice and Human Rights report, Taking Action to End Online Hate, and about what actions the Canadian government should take to combat online hate. The interview has been edited for clarity.
What first led you to open a committee investigation into online hate?
MP Housefather: In its most recent studies, B’nai Brith Canada notes an increase in antisemitic incidents since 2016, most of which started online. These facts convinced our committee that our current policies on social media were not working. There are many reasons for this, chief among them that social-media providers make up their own rules and self-regulate. For the most part, the algorithms used by most platforms offer people content that feeds their extremist beliefs, be it white nationalist or far left. Moreover, even when this kind of content violated the guidelines of social-media companies, it was still posted online.
The problem is an international one; one country alone cannot solve it. Still, there are solutions that Canada could offer to combat online hate. For one, clearly defining hate and antisemitism. For another, working with online providers to see how we could make them more responsible through regulations. Acting on this issue, the Liberal Party heard from several online providers, different communities, free speech advocates and scholars. We then came out with our 2019 report.
Do you distinguish online hate from online antisemitism?
MP Housefather: No. Online antisemitism is a form of online hate, which differs from the legal definition of hate speech as defined by the criminal code. [The legal definition limits itself to the advocacy of genocide and public incitement of hatred.]
Are platform providers such as Twitter or Facebook willing to help, and to what extent?
MP Housefather: Some of them say they want to help but have only acted in countries where they are forced by law to do so. In Germany, Facebook has close to 100 people reviewing the content, while in Canada they don’t have any people doing it. Without the government imposing rules, there’s just no way these providers will take these issues seriously.
Imposing regulations, though, can be tricky. As a businessman, I understand that multinationals need consistency: it is unfair to ask Facebook to follow different rules in every country. Regardless, we must make sure that the rules Facebook follows in Canada are like those it follows in most countries. Subsequently, we must work with our allies to create rules that all companies will follow.
How hard is that to do?
MP Housefather: It’s certainly not easy. There are many issues to consider. How do you create a common definition? How do you flag the content? What crosses the line into illegal content? I believe that what is considered hate speech off the internet should also be considered hate speech on the internet. The intent (to provoke a negative or hateful reaction from people) is the same whether offline or online. To make this happen, we must work with providers to ensure that the policy is reasonable so they can put it into action.
So, you favour greater regulation?
MP Housefather: I’m in favour of government regulating providers to ensure they adequately deal with the issue of hate. In my view (though it’s not the final view), providers should be required by law to remove content that is criminal and could lead to prosecution.
But regarding content that isn’t hate in the legal sense (yet is still hateful), we need to educate people: they need to know when they are tweeting something antisemitic, and why, by flagging the content and, at times, referring them to authoritative information. Nor should such content be promoted.
What about smaller players that are less financially secure, how would they take to more regulations?
MP Housefather: Common rules are not just gentlemen’s agreements but must be enforced. However, they must also be reasonable. We want platforms to operate in Canada, but we can’t just leave it to providers to self-regulate.
How do we get a consensus on what is hate speech and what is legitimate criticism?
MP Housefather: You start by creating a definition. That’s mainly what the IHRA Working Definition of Antisemitism has tried to do. Since our report, the Liberal government has adopted this definition, which applies to all federal jurisdictions. Providers can use this or some other consensual definition under their own guidelines. The definition of antisemitism can’t be changing all the time. We need common rules that social-media providers must follow, so they can’t simply say they are unable to do different things in 200 different countries.
The IHRA defines antisemitism as “a certain perception of Jews, which may be expressed as hatred toward Jews. Rhetorical and physical manifestations of antisemitism are directed toward Jewish or non-Jewish individuals and/or their property, toward Jewish community institutions and religious facilities.”
Raphael Uzan studies law at the Université de Montréal and is a CIJR Baruch Cohen intern.