The graphic suicide video that went viral on TikTok in early September was "the result of a coordinated dark web raid," the company told MPs. Testifying before the Commons Digital, Culture, Media and Sport (DCMS) Committee, Theo Bertram, TikTok's European Director of Public Policy, said the video, which was originally broadcast live on Facebook, was used in a "coordinated attack" on the social video app a week after it was first recorded.
"We learned that groups on the dark web were planning to raid social media platforms, including TikTok, to spread the video across the internet," Bertram said.
"What we saw was a group of users who repeatedly tried to upload the video to our platform, splicing it, editing it and cutting it in various ways," he added. "I would prefer not to say too much publicly in this forum about how we detect and deal with this, but our emergency AI services kicked in and detected them."
News of the death nonetheless spread widely on the site, and prominent users ended up sharing tips with others on how to detect and avoid the video before it played automatically.
Almost a week separated the death from the "raid" on TikTok, which prompted the company to propose a "global coalition" to help protect users from such harmful content.
The coalition would include Facebook, Instagram, Google, YouTube, Twitter, Twitch, Snapchat, Pinterest and Reddit. "We suggest that, just as these companies are already working together around [child sexual abuse images] and terrorism-related content, we now need to establish a partnership around working with this type of content," Bertram said.
Such a partnership would allow, for example, Facebook to share the technical details of the graphic video so that TikTok could block it from being uploaded in the first place.
It would also ease the moderation burden on smaller companies. In the UK, TikTok has 363 moderators out of 800 employees, Bertram revealed.
In addition to working to keep the platform clear of malicious content, these moderators are also tasked with enforcing TikTok's age guidelines. Every time a human reviewer moderates a video, no matter what else they are reviewing it for, they also check whether the account appears to belong to someone under 13, Bertram said. If it does, the account is deleted.
Bertram also asked MPs to help streamline the process, urging Apple and Google to enforce stricter age restrictions in their app stores.
"There are only two app stores," he argued. "So if you said to the app stores, 'this is the main place where we will check for proof of age,' then you are not increasing the risk of data loss by forcing parents to provide data for each app, but you can ensure that every app must then verify age through the app store.
"I can't help thinking this would be a solution for stronger age verification."
Responding to Tory MP Steve Pickle's complaint about "shoddy" content on the app, such as "mother and daughter pushing their butts, if you know what I mean," Bertram encouraged MPs to watch Andrew Lloyd Webber's videos instead.