Summary by Moomoo AI
Meta Platforms, the parent company of Facebook and Instagram, is facing scrutiny over its content moderation policies in non-US markets. Shareholders are urged to vote for Proposal Eight, which calls for a report on the company's effectiveness in preventing and mitigating human rights risks, particularly those tied to hate speech, disinformation, and incitement to violence. The proposal highlights recent content moderation failures in India and Brazil, two of Meta's largest markets. In India, investigations found that Meta's platforms were used to spread Islamophobic hate speech and disinformation during elections, with far-right networks spending over $1 million on ads that potentially violated election laws. In Brazil, disinformation circulating on Meta's platforms has hindered flood response efforts. The company's own 2021 Civil Rights Audit Update acknowledges the need for ongoing policy evolution to meet user needs. The requested report would provide transparent data on content moderation performance in these critical markets, underscoring the potential regulatory and financial consequences of inadequate enforcement.