The Philippine government has once again urged Facebook to remove pages that are reportedly selling newborns online under the guise of adoption.

The country's social welfare secretary, Rex Gatchalian, slammed the social media giant on Tuesday for its inaction on the problem, according to GMA News. During a press conference, he stressed Facebook's "lack of responsibility" over unregulated content, including human trafficking violations, despite reports filed by the National Authority for Child Care (NACC) since 2023.

No Response from Facebook

The official emphasized that Facebook had not responded to the NACC's letter from last year. Gatchalian noted that the platform's lack of regulation effectively allows mothers to "sell" their children on the site.

The Department of Social Welfare and Development (DSWD) secretary said that selling babies and arranging adoptions via social media is a form of child exploitation and human trafficking. He asked the public to report such cases while the agency works to regulate the platform.

Meanwhile, Janella Estrada, Executive Director of the NACC, stated that the agency is actively monitoring 20 to 40 Facebook pages involved in newborn and child trafficking, per the Inquirer. These private pages, where newborns are being sold, have thousands of followers.

Estrada said that since February, her agency has been working with the police to stop such criminal activity. Selling newborns is unlawful in the Philippines, punishable by life imprisonment and a fine ranging from P1 million to P5 million.

Meta, the parent company of Facebook and Instagram, defines human trafficking as the use of deceit, force, or coercion to exploit individuals for commercial sex, labor, or other activities.

Read Also: Intelligence Agencies Continue to Warn About AI's Threat on Election Security

(Photo : KIRILL KUDRYAVTSEV/AFP via Getty Images)
A photo taken on February 22, 2024, shows the logo of US online social media and social networking service Facebook on a smartphone screen in Frankfurt am Main, western Germany.

Facebook's Flawed Content Moderation System Slammed

Previously, Meta was criticized in Canada for its unreliable automated moderation systems. 

Facebook's parent company said it would improve its content-monitoring system after initially declining to delete illicit drug ads on the grounds that they did not violate its advertising rules.

While researching a study on Canada's unlawful cannabis sector, Deloitte Canada senior manager Christopher McGrath came across Facebook ads for illegal substances, possibly surfaced by the platform's algorithms, as reported by the National Post.

Following the cannabis research, the writer saw several Facebook ads for tax-free cigarettes, illegal cannabis, LSD, and hallucinogenic mushrooms.

Facebook's automated moderation system determined that these advertisements did not violate its advertising rules. After the publication approached Facebook for comment, the ads were removed.

The underlying issue is social media platforms' reliance on machine learning and automated algorithms for content policing, according to University of Toronto media economics expert Brett Caraway. Due to the sheer volume of content, some companies now use algorithms instead of human moderators.

Users flag objectionable content, which is automatically assessed to determine whether it needs human review, but that review seldom occurs, according to reports.

According to the report, one Canadian mail-order cannabis company promoted psilocybin-infused edibles. Another ad, placed by an adhesives firm, promised mushrooms and LSD, while a third featured a woman holding a labeled vial of LSD and offering two- to three-day shipping within Canada.

Meta told the publication that its rules forbid Facebook ads promoting the "buying and selling of pharmaceutical and non-medical drugs." The company added that it removes such content once it detects a violation.

The tech giant also noted that it keeps enhancing its system "to keep our platforms safe."

Related Article: Meta's Ad Approval Process Fails to Block Violent Content, Disinformation in India

