Every week seems to bring another scandal about content on social media. Last week, it was Tory political advertisements in the UK election, some of which were banned from Google and YouTube for violating copyright law and other Google ad policies. Many of the scandals surround political content, such as Facebook leaving up a doctored video of Nancy Pelosi in May 2019 that made her appear drunk, and not swiftly labelling the video as altered.
Political posts are understandably top of mind for politicians. But there are many other issues with content on social media. In June 2019, Vox journalist Carlos Maza finally took to Twitter to complain about harassment on YouTube. YouTube star Steven Crowder had been harassing Maza for months using anti-LGBTQ language that contravened YouTube’s own terms of service. But YouTube had not removed Crowder. After Maza’s Twitter complaint went viral, YouTube flipped within 24 hours from stating that Crowder’s anti-gay speech was allowed on the platform to demonetizing Crowder for using anti-gay, anti-Latino language.
Maza’s case highlighted a plight faced by many users from marginalized communities. It also raised bigger questions about social media companies’ policies and their enforcement. Would YouTube have responded if Maza were based in Chile, not the United States? Would YouTube have responded if Maza were not a journalist? Would YouTube have removed Crowder completely if he were not a star? It was impossible for civil society or governments to understand why certain cases like Maza’s were elevated while others went unaddressed. The attention to Maza’s case may not have been arbitrary, but it seemed that way to many outside observers.
Complicating factors and unintended consequences
It is tempting to focus on solving individual cases. But the speed and scale of posts on social media make that impossible. Moreover, focusing on content can potentially lead to troubling consequences for freedom of expression. Singapore’s new Protection from Online Falsehoods and Manipulation Act, for example, enables the government to demand content takedowns or corrections if it deems content to be false. The law has already been used against an opposition member of parliament. Any solution needs to account for freedom of expression and to think through unintended consequences.
A further complication is that US-based social media companies understand the concept of freedom of expression differently from other democracies such as Canada. While the United States focuses on negative obligations to stop the suppression of speech, European countries and Canada also incorporate positive obligations to ensure a diversity of perspectives. A commitment to multiculturalism is embedded in the Canadian Charter of Rights and Freedoms.
Any solutions to these problems need to account for core democratic values like freedom of expression, while also reckoning with the massive structural problems of content online. One approach is to create institutions rather than regulate content itself. We need institutions to discuss the broader standards and mechanisms used by companies to create and execute their content policies.
A new institution for a new set of problems
This institution could be a social media council. A council would convene top-level representatives from social media companies and civil society organizations, especially those working with marginalized communities. The council would meet regularly to discuss the myriad problems in social media, such as the use of artificial intelligence, the work of content moderators, or advertising policies. In Canada, social media councils could also help the government to fulfill its mandate to promote multiculturalism. By mandating regular meetings and information-sharing, social media councils could become robust institutions to address problems that we have not yet even imagined.
Although social media councils could take many forms, the overall idea is supported by civil society organizations like ARTICLE 19, the Stanford Global Digital Policy Incubator, and the UN Special Rapporteur on the Right to Freedom of Opinion and Expression. With Chris Tenove and Fenwick McKelvey, I have written a policy report on how social media councils might work in Canada. For a medium-sized country, a council could help to create more meaningful and robust exchanges with social media companies whose business interests mean that they focus on the countries with the largest numbers of users.
It would be foolish to pretend that there are simple solutions to complex structural problems like content moderation. But a social media council in Canada could be an excellent first step.