- The document says that social media users should be able to defend themselves before a post is deleted through content moderation;
- The entities believe that big techs’ self-regulated moderation is not reliable.
Latin American entities last week launched the manifesto “Standards for democratic regulation of large platforms that guarantee online freedom of speech and a free and open Internet”. The text takes a Latin American perspective on making big tech content moderation processes compatible with international human rights standards, and addresses points raised by Brazil’s Fake News Bill that are under scrutiny across the region.
Signed by Intervozes – Collective of Social Communication and the Brazilian Institute of Consumer Protection (Idec), both from Brazil, OBSERVACOM – Latin American Observatory of Media and Convergence, from Uruguay, and Desarrollo Digital, from Argentina, the document also has the support of other entities in the region, from countries such as Paraguay, Ecuador, and Costa Rica, as well as specialists such as Javier Pallero, Latin America policy director at Access Now.
READ ALSO: WhatsApp allows forwarded messages to be fact-checked
According to these organizations, regulation must not hinder innovation, competition, or the development of startups, while at the same time it must follow human rights guidelines. The Brazilian Senate recently approved a bill intended to harden the fight against the spread of fake news. The manifesto criticizes the bill: the Latin American entities believe that most such legal initiatives are disproportionate solutions that increase the risk of violations of the right to freedom of speech, since they assign responsibilities and obligations that turn platforms into private judges or police over third-party content circulating on the Internet.
READ ALSO: Facebook will label political propaganda and say who pays for it
“The signatories to this document have opposed these proposals and will continue to do so”, they say. “We believe that the model of self-regulation that has prevailed until now also presents risks to the effective exercise of basic human rights. A few corporations have centralized and concentrated the power to manage the circulation, exchange, or search of information and opinions, and exercise this power arbitrarily, without any mechanism of accountability to institutions that guarantee rights”.
According to the document, the terms of service of all content platforms, for instance, as well as other complementary documents (such as content enforcement guidelines), must be written in a clear, precise, intelligible form that is accessible to all users in their national languages. In addition, on services aimed at children and teenagers, the terms and conditions must adopt language that is understandable to this group.
Gustavo Gómez, Executive Director at OBSERVACOM, said during the presentation of the manifesto that in Brazil, even though legislators have good intentions, the text of the bill ends up affecting fundamental rights.
Fundamentally, the proposal holds that platforms must be required to be transparent about their algorithms, and it seeks to protect users’ freedom of expression vis-à-vis the platforms by requiring big techs to provide information about what they do when moderating content. The document also asks that users be able to defend themselves before their content is removed.
According to him, big techs face great pressure worldwide to make decisions regarding hate speech and fake news, including economic pressure, as seen in the recent advertiser boycott of Facebook.
READ ALSO: Fake news crackdown in Brazil opens a dangerous precedent for censorship
But he believes this content moderation debate is polarized: on one side are those who want platforms to self-regulate; on the other, those who defend the abusive moderation imposed by authoritarian governments trying to silence criticism.
That is why the document prepared by the Latin American entities proposes a “third way”, which, according to Gómez, is more compatible with human rights and more democratic. It establishes more clearly the responsibility of intermediaries for their own acts, but also sets limits on government interference in what platforms can do, including due-process guarantees in content moderation. The full document is available in Portuguese here.