Meta's Fight Against Child Pornography on Instagram

sintania.amalia00 · June 08, 2023



Techspace – Meta has set up an internal task force to investigate how Instagram has been used to facilitate the distribution and sale of child sexual abuse material.

The move by Instagram's parent company follows a report from the Stanford Internet Observatory, which found that a large network of accounts, apparently operated by minors, was openly advertising self-generated child sexual abuse material for sale.

The researchers found that buyers and sellers of self-generated child sexual abuse material connected through Instagram's direct message feature, and that Instagram's recommendation algorithm made advertising the prohibited material more effective.

The findings offer fresh insight into how internet companies have struggled for years to detect and prevent the spread of sexually explicit images that violate their platforms' rules.

The report also highlights how the prevalence of intimate image abuse, so-called revenge porn, skyrocketed during the pandemic, prompting tech companies, porn sites, and civil society groups to step up their moderation tools.

In April, the Guardian said its two-year investigation found that Facebook and Instagram had become the main platforms for buying and selling children for sex.

“Due to the widespread use of hashtags, the relatively long lifespan of seller accounts, and, above all, its effective recommendation algorithm, Instagram serves as the primary discovery mechanism for this niche community of buyers and sellers,” the researchers stated.

Stanford researchers said the overall seller network comprises between 500 and 1,000 accounts at any given time, and that they began their investigation after a tip from the Wall Street Journal, which first reported the findings.

Civil society groups and regulators have long been concerned about predators on these platforms, about privacy, and about the mental health impact of social networks on children and young people, as well as how the platforms supervise underage users.

In September 2021, the company abandoned its controversial plan to create a separate version of Instagram designed specifically for children under 13. Later that year, lawmakers berated the head of Instagram, Adam Mosseri, over disclosures in internal documents shared with regulators by Meta whistleblower Frances Haugen, which suggested that Instagram is harmful to many young users, especially young girls.

Meta says it has strict policies and technology in place to prevent predators from finding and interacting with minors. In addition to the task force, the company said it dismantled 27 abusive networks between 2020 and 2022, and in January it disabled more than 490,000 accounts for violating its child safety policies.

"Child exploitation is a terrible crime," Meta spokesman Andy Stone said in a statement. "We are working aggressively to fight it on and off our platform, and to support law enforcement in their quest to catch and prosecute the criminals behind it."

While Instagram is a major player in facilitating the dissemination and sale of child sexual abuse material, other technology platforms also play a role, according to the report. For example, it found that accounts promoting self-generated child sexual abuse material were also highly prevalent on Twitter, although that platform appeared to be taking them down more aggressively.

Some Instagram accounts also advertise links to groups on Telegram and Discord, some of which appear to be managed by individual sellers, according to the report.

Last year, the European Commission proposed rules under which Google, Meta, and other online service providers would be required to find and remove child pornography online. Companies that fail to comply face fines of up to 6% of their annual revenue or global turnover, to be set by EU countries.

The proposal follows more than one million reports of child sexual abuse in the 27-country bloc in 2020, with the COVID-19 pandemic a factor in the 64% increase in such reports in 2021 compared with the previous year. In addition, 60% of child sexual abuse material worldwide is hosted on EU servers.

"The proposed rules introduce an obligation for relevant online service providers to assess the risk of misuse of their services for the dissemination of child sexual abuse material or the solicitation (treatment of) children," the Commission said in a statement.

Companies would then have to report and remove known and new images and videos, as well as cases of grooming. An EU Centre on Child Sexual Abuse would be set up to act as a hub of expertise and to forward reports to the police.
