The Oversight Board is recommending that Meta reconsider its policy on manipulated media.
A review board is criticizing Facebook owner Meta for policies on manipulated content that it calls inadequate and inconsistent, saying they fail to combat the spread of false information that has already begun to affect elections around the world this year.
The quasi-independent board said Monday that its review of an altered video of President Joe Biden that spread widely on Facebook exposed gaps in the current policy. The board recommended that Meta broaden the policy to cover not only videos generated with artificial intelligence but all manipulated media, regardless of how it was made. That includes faked audio recordings, which have already convincingly impersonated political candidates in the United States and elsewhere.
The board also said Meta should spell out the harms it is trying to prevent and, instead of removing posts outright, should label manipulated images, videos, and audio clips as altered.
The board’s feedback reflects the intense scrutiny facing many tech companies over their handling of election falsehoods in a year when voters in more than 50 countries will go to the polls. As both generative AI deepfakes and lower-quality “cheap fakes” threaten to mislead voters on social media, the platforms are scrambling to catch up and respond to false posts while protecting users’ rights to free expression.
Oversight board co-chair Michael McConnell called Meta’s current policy incoherent. In a statement Monday, he urged the company to close the gaps in the policy while vigilantly safeguarding political expression.
Meta said it is reviewing the oversight board’s guidance and will respond publicly to the recommendations within 60 days.
Company spokesperson Corey Chambliss said that although audio deepfakes are not explicitly covered by the manipulated media policy, they are still eligible for fact-checking and will be labeled or down-ranked if fact-checkers rate them false or altered. He added that the company takes action against any content that violates Facebook’s Community Standards.
Facebook, which turned 20 this week, remains the most popular social media site for Americans to get their news, according to the Pew Research Center. But other platforms, among them Meta’s Instagram, WhatsApp and Threads, as well as X, YouTube and TikTok, are also potential hubs where deceptive media can spread and fool voters.
Meta established the oversight board in 2020 to serve as a referee for content on its platforms. The board’s latest recommendations came out of its review of the doctored video of Biden and his adult granddaughter, which was misleading but did not violate the company’s rules because it did not manipulate his words.
The original footage showed Biden placing an “I Voted” sticker high on his granddaughter’s chest, at her instruction, then kissing her on the cheek. The version that appeared on Facebook was altered to remove the important context, making it seem as if he touched her inappropriately.
On Monday, the board upheld Meta’s 2023 decision to leave the seven-second clip on Facebook, since it did not violate the company’s existing policy on manipulated media. That policy bars videos created with artificial intelligence tools that make someone appear to say words they did not say.
Because the video in this post was not altered using AI, and because it shows President Biden doing something he did not do rather than saying something he did not say, it does not violate the existing policy, the ruling said.
The board recommended that the company update its policy so that similar videos are covered as manipulated media in the future. It said that to protect users’ freedom of expression, Meta should label such content as manipulated rather than remove it from the platform, unless the post violates other policies.
The board acknowledged that some altered media is made for humor, parody, or satire and should be protected. Rather than focusing on how a distorted image, video, or audio clip was created, the company’s policy should focus on the harm manipulated posts can cause, such as disrupting the electoral process, the ruling said.
In a statement on its website, Meta said it welcomes the Oversight Board’s decision on the Biden post and will update the post after reviewing the board’s recommendations.
Meta is required to heed the oversight board’s rulings on specific content decisions, though it is under no obligation to follow the board’s broader recommendations. Still, the board has persuaded the company to make some changes over the years, including making its messages to users who violate its policies more specific, so they understand what they did wrong.
Jen Golbeck, a professor at the University of Maryland’s College of Information Studies, said Meta is big enough to take a leading role in labeling manipulated content, but that follow-through and the necessary policy changes are what matter.
Golbeck questioned whether the proposed changes would actually be implemented and enforced in the face of pushback from bad actors, noting that adopting the changes without enforcing them would only deepen the erosion of trust that misinformation causes.
Source: voanews.com