X Provides Updates on Its Countermeasures for CSAM in the App




In the last week of December 2023, X published its latest performance update, addressing ongoing concerns about the platform's revised content moderation approach.

Reports have shown that more offensive and harmful content remains active on the platform, prompting more of X's ad partners to pause their campaigns. Elon Musk, X's owner, has called the issue a top priority, and the company is now aiming to clarify its efforts in this key area.

The update mainly focuses on X's efforts to stamp out child sexual abuse material (CSAM). The company claims that such material has been significantly reduced in the app thanks to improved processes rolled out over the last 18 months. Some third-party reports contradict that claim, but X maintains that it is doing far more to detect and address CSAM.


First, X says that it is suspending far more accounts for violating its CSAM policies:

“From January to November of 2023, X permanently suspended over 11 million accounts for violations of our CSE policies. For reference, in all of 2022, Twitter suspended 2.3 million accounts.”

X is clearly enforcing its rules against more accounts, though mass suspensions can also lead to wrongful removals and appeals. As such, raw suspension numbers may not be a perfect reflection of improvement on this front.

X also says that it is reporting far more CSAM incidents to the authorities:

“In the first half of 2023, X sent a total of 430,000 reports to the NCMEC CyberTipline. In all of 2022, Twitter sent over 98,000 reports.”

That looks impressive, but X has also fully automated its NCMEC reporting, which means reported posts are no longer subject to manual review. As a result, far more posts are being passed on as a matter of course.

This may lead to better outcomes, since more reports could reduce risk in the app. But the raw numbers are not conclusive on their own: no data from NCMEC confirms how many of these automated reports were valid. The report count is rising, but there is no insight into what results it has produced.

At one stage, X also claimed that CSAM had been virtually eliminated overnight by identifying and blocking certain hashtags used to share it. That is likely what X is referring to here:

“Not only are we detecting more bad actors faster, we’re also building new defenses that proactively reduce the discoverability of posts that contain this type of content. One such measure that we have recently implemented has reduced the number of successful searches for known Child Sexual Abuse Material (CSAM) patterns by over 99% since December 2022.”

Identifying and blocking such tags will have some effect. However, experts note that when certain tags are blacklisted, CSAM peddlers simply switch to new ones. Activity may dip for a time, but it is hard to say whether the measure will have any lasting effect on the app.

Third-party analyses have suggested that under X's new rules and processes, CSAM has actually become more widely accessible on the platform. A New York Times study from February found that such content was easy to find in the app, and that X was slower to act on reports than Twitter had been.

A report from NBC reached the same conclusion, finding that despite proclamations from Musk that detecting CSAM is a key priority, the action taken had been minimal and produced little measurable result. It also suggested that Musk's decision to cut the team responsible for policing this content had potentially exacerbated the problem rather than solving it.

In theory, the measures X describes should limit such content on the platform, and the data certainly suggests that X is making a bigger push on this front. However, the real-world effectiveness of these actions remains in question.
