Meta’s Oversight Board Shows How it’s Helping to Evolve Meta Policies in New Report

The Meta-funded Oversight Board, which provides an alternative appeals tribunal for Facebook and Instagram users whose content has been removed or otherwise penalized, has shared its Q1 2022 Transparency Report. The report provides a full overview of the cases the Board heard, the recommendations it made, and how Meta actioned those recommendations throughout the first quarter of the year.

The Oversight Board, which began hearing cases in October 2020, is a collection of independent, external experts that collaboratively review appeals of content decisions made by Facebook and Instagram’s moderation teams. That provides an extra layer of governance, and enables Meta’s users to seek another form of recourse for any decision made.

And users are certainly seeking to utilize that capacity.

As per the report:

“From January to March 2022, we estimate that users submitted nearly 480,000 cases to the Board. This represents an increase of two-thirds on the 288,440 cases submitted in the fourth quarter of 2021.”

As you can see in this chart, over time, more people are seeking to question Meta’s moderators, and the initial rulings made about their content. The Oversight Board isn’t able to investigate every one of these cases, but it works to select specific instances where Meta’s policies are the core issue, which can then help to evolve Meta’s overall approach.

The most commonly appealed decisions in Q1 related to removals based on ‘violence and incitement’, followed by ‘hate speech’ and ‘bullying and harassment’.

The data could reflect Meta’s increased enforcement of its rules around each element, with removals based on ‘violence and incitement’ in particular seeing a big increase.

Over time, Meta has become increasingly aware of the role its apps can play in the dissemination of information, and how that can incite real-world violence, and these stats, as noted, may well reflect increased action from Meta’s teams to curb that risk, as opposed to more posts inciting violence being shared. The latter is also a possibility, but since this data is based on content appeals, not raw instances, it more likely points to a change in Meta’s enforcement of its rules, rather than a change in users’ posting behavior.

In the majority of the cases the Board has heard, Meta has agreed with the Board’s assessment.

That’s then led to Meta updating its policies in many cases, with the Board noting that, most of the time, Meta has taken adequate action, even if it hasn’t implemented all of its suggestions.

Many of the Board’s recommendations relate to clarity and transparency in Meta’s content rulings:

“Our recommendations have repeatedly urged Meta to be clear with people about why it removed their posts. In response, the company is giving people using Facebook in English who break its hate speech rules more detail on what they’ve done wrong and is expanding this specific messaging to more violation types.”

So Meta is updating its approaches, in line with each case. Though it’s not entirely in lockstep with the Board’s decisions:

“As of Q1 2022, most of the Board’s 108 recommendations are either in progress or have been implemented by Meta in whole or in part. However, the Board continues to lack data to verify progress on or implementation of the majority of recommendations.”

So not everything’s being implemented. But the Oversight Board is still helping to evolve Meta’s approach by providing independent, expert assessment from outside Zuck and Co.’s internal thought bubble, which, really, is what the project was designed to achieve.

Meta’s independent Oversight Board is essentially an experiment to demonstrate how additional oversight, via third-party regulation, could help to improve social media platforms overall, with Meta’s longstanding view being that it shouldn’t be running this type of double-checking process of its own accord.

Meta has repeatedly called for the establishment of an official regulatory body, overseeing all social networks, made up of an independent group of experts like this. That would, ideally, take these types of decisions entirely out of its hands, while also ensuring that every social platform operates on a level playing field, under the same, centrally determined rules and parameters. Because right now, each company is being forced to make tough calls that, arguably, shouldn’t be left to the determination of a corporate entity, especially one which benefits from in-app engagement.

The Oversight Board does provide an independent perspective on this, but at the end of the day, Meta still funds the group. That means that there will always be a level of perceived vested interest, whether it actually exists or not, while Meta’s also not beholden to the Board’s rulings or recommendations.

Based on these new stats, you can see how a global, independent assessment authority might help to enhance platform rulings and policies, with the board making a range of recommendations on Meta’s current rules around adult content, racist/divisive remarks, COVID misinformation, the banning of former President Donald Trump, attempts to silence anti-government speech, etc.

Meta, as you can see, hasn’t actioned all of these. But maybe it should – and maybe, as Meta says, all platforms should be held to the same standards, based on independent assessment of this type.

You can read the Oversight Board’s full Q1 2022 Transparency Report here.
