
The YouTube video depicting a beheading has been removed, but there are still unanswered questions.

The spread of a graphic video on YouTube, allegedly showing a Pennsylvania man beheading his father, has once again raised questions about social media platforms' ability to stop disturbing content from circulating online.

On Wednesday, authorities announced that Justin Mohn, 32, has been arrested and charged with first-degree murder and desecration of a corpse for decapitating his father, Michael, in their Bucks County residence. Mohn also posted a 14-minute video of the gruesome act on YouTube, accessible to viewers worldwide.

Reports of the killing, which drew comparisons to the beheading footage circulated by Islamic State militants nearly a decade ago, coincided with federal lawmakers grilling the CEOs of Meta, TikTok, and other social media giants over their perceived failure to address child safety concerns on their platforms. YouTube, owned by Google and one of the platforms most used by teenagers, was notably absent from the hearing.

The disturbing video from Pennsylvania follows other horrific clips broadcast on social media in recent years, including domestic mass shootings livestreamed from Louisville, Kentucky; Memphis, Tennessee; and Buffalo, New York, as well as massacres filmed abroad in Christchurch, New Zealand, and the German city of Halle.

The video was uploaded in Pennsylvania around 10 p.m. on Tuesday and remained online for approximately five hours, according to Police Captain Pete Feeney of Middletown Township. That lag raises questions about the effectiveness of social media platforms' moderation, especially amid the wars in Gaza and Ukraine and a highly contentious U.S. presidential election.

This is yet another instance of companies failing to safeguard users, said Alix Fraser, director of the Council for Responsible Social Media at Issue One, who added that the companies cannot be relied on to evaluate their own performance.

A representative from YouTube stated that the platform took down the video, removed Mohn’s channel, and is actively removing any re-uploads. YouTube uses a combination of AI and human moderators to oversee its platform, but the company did not explain how the video was detected or why it was not removed sooner.

A vehicle is parked in the driveway of a home that was the scene of a murder in Levittown, Pa., on Jan. 31, 2024.


Large social media corporations use advanced automated systems to moderate online content, which can efficiently detect and remove prohibited content before it is seen by a human. However, these systems may not always be effective when dealing with violent and graphic videos that are unique or uncommon, according to Brian Fishman, co-founder of trust and safety technology company Cinder.

Human moderators remain crucial in those cases, he said; AI is advancing, but it is not yet up to the task on its own.

The Global Internet Forum to Counter Terrorism, an organization established by technology companies to stop the spread of such videos online, was in contact with all of its members about the incident on Tuesday evening, according to group spokesperson Adelina Petit-Vouriot.

At about 12:40 a.m. EST on Wednesday, the forum, known as GIFCT, activated a “Content Incident Protocol” to notify its members and other involved parties of a violent event that had been livestreamed or recorded. Under the protocol, the platform hosting the original footage submits a “hash,” a unique digital fingerprint of the video, and roughly 20 other member companies are alerted so they can block it from their platforms.
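The hash-sharing workflow described above can be sketched in a few lines of code. This is purely an illustration of the concept, not GIFCT's actual system: real content-matching tools use perceptual hashes that survive re-encoding and cropping, whereas the cryptographic hash below flags only exact byte-for-byte copies.

```python
import hashlib

def video_hash(data: bytes) -> str:
    """Return a digital fingerprint ("hash") of a video's raw bytes."""
    return hashlib.sha256(data).hexdigest()

# Shared list of fingerprints, seeded by the platform that found the footage.
blocklist: set[str] = set()

def submit_hash(data: bytes) -> None:
    """The originating platform adds the video's fingerprint to the shared list."""
    blocklist.add(video_hash(data))

def is_blocked(data: bytes) -> bool:
    """Member platforms check each upload against the shared fingerprints."""
    return video_hash(data) in blocklist

original = b"\x00fake video bytes"  # stand-in for a real video file
submit_hash(original)
```

In this sketch, each member company would run `is_blocked` on incoming uploads: an exact re-upload of the flagged file is caught, while any other file passes through, which is why production systems rely on perceptual rather than cryptographic hashing.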

By Wednesday morning, however, the video had already spread to X, where a graphic clip of Mohn holding his father’s head remained online for at least seven hours and drew 20,000 views. The company, formerly known as Twitter, did not respond to a request for comment.

Radicalization experts say social media and the internet have lowered the barrier for individuals to explore extremist groups and beliefs, enabling anyone with a propensity for violence to connect with a community that reinforces those ideas.

In the video posted after the killing, Mohn described his father as a federal employee of 20 years, espoused several conspiracy theories and ranted against the government.

Many social media platforms have rules requiring the removal of violent or extremist content. However, they cannot catch everything, and the rise of newer, less-moderated sites has given hateful ideologies room to fester, according to Michael Jensen, a senior researcher at the Consortium for the Study of Terrorism and Responses to Terrorism (START) at the University of Maryland.

Jacob Ware, a research fellow at the Council on Foreign Relations, suggested that social media companies should increase their efforts in monitoring and controlling violent content, despite the challenges they may face.

Social media has become a primary battleground for extremism and terrorism, Ware said, and combating it will require a more robust and dedicated effort.

Nora Benavidez, an attorney with the media advocacy group Free Press, said she wants to see reforms from tech companies, specifically more transparency about which employees are affected by layoffs and greater investment in trust and safety personnel.

This month, Google, the parent company of YouTube, laid off numerous staff members from its hardware, voice assistance, and engineering departments. Last year, the company announced a reduction of 12,000 employees “across Alphabet, various product divisions, job roles, levels, and geographic locations,” without providing further specifics.

Source: voanews.com