X CEO Linda Yaccarino to outline platform’s measures against child sex abuse at Senate hearing

Leaders of several social media platforms, including X CEO Linda Yaccarino, are set to testify before the Senate Judiciary Committee on Wednesday regarding efforts to counter the sexual exploitation of children online.

Yaccarino, who took the helm of the platform formerly known as Twitter in June, will testify alongside Mark Zuckerberg, CEO of Facebook and Instagram parent Meta, as well as TikTok CEO Shou Chew, Snap CEO Evan Spiegel and Discord CEO Jason Citron. Their testimony comes as senators on the panel have advanced multiple bills that would ramp up protections aimed at eliminating online child sexual exploitation (CSE).

The testimony also comes days after X temporarily blocked searches for Taylor Swift when fake, AI-generated sexually explicit images of the pop star began circulating on social media last week.

Since the takeover of Twitter by the Elon Musk-led ownership group in late 2022 and its subsequent rebranding as X, the platform has struggled to reassure advertisers that it has sufficient content moderation policies in place. In a draft version of Yaccarino’s opening remarks provided to FOX Business, she sought to emphasize the steps that the platform has taken to root out CSE.

“In the last 14 months, X has made material changes to protect children: Our policy is clear – X has zero tolerance towards any material that features or promotes child sexual exploitation,” Yaccarino said. 

“My written testimony details X’s extensive policies on content or actions that are prohibited and that includes – grooming, blackmail, and identifying alleged victims of CSE,” she said. “We’ve also strengthened our enforcement with more tools, and technology to prevent bad actors from distributing, searching for, or engaging with CSE content across all forms of media. If CSE content is posted on X, we remove it.”

“Now we also remove any account that engages with CSE content – whether it’s real or computer generated,” Yaccarino added. 

In addition to backing a range of legislative responses, Yaccarino plans to ask the Senate and the social media industry to focus on two areas: ensuring law enforcement has the resources needed to bring bad actors to justice, and fostering collaboration, since artificial intelligence (AI) will allow offenders’ tactics and capabilities to evolve.

In a blog post on the X website last week, the company said it’s “determined to make X inhospitable for actors who seek to exploit minors” and that, “In 2023, we made clear that our top priority was tackling CSE online.”

The company noted that it is “improving our detection mechanisms to find more reportable content on the platform to report to the National Center for Missing and Exploited Children (NCMEC). Plus, we are also building a Trust and Safety center of excellence in Austin, Texas, to hire more in-house agents so we can keep accelerating our impact.” X is planning to hire an additional 100 social media content moderators to work out of the Austin-based center.

X said it has developed an automated reporting system that sends reports to NCMEC’s CyberTipline and has partnered with Thorn to automatically suspend, deactivate and report suspected CSE incidents to NCMEC “in minutes without human involvement.”

The company also increased training for its human content moderators, which it said boosted the number of manually submitted reports sent to NCMEC tenfold, from an average of 6,300 reports per month to an average of 64,000 reports per month from June through November 2023.

The company’s safety team posted a notice informing users, “Posting Non-Consensual Nudity (NCN) images is strictly prohibited on X and we have a zero-tolerance policy towards such content. Our teams are actively removing all identified images and taking appropriate actions against accounts responsible for posting them.”
