TikTok has written to social media companies asking them to join together to remove content that depicts self-harm or suicide more quickly.
It comes after a clip of a man killing himself was widely circulated on its platform and seen by young children.
Theo Bertram, TikTok's head of public policy for Europe, said the pattern of sharing of the video suggested a co-ordinated attack, possibly from bot accounts.
He declined to discuss ongoing negotiations over the future of TikTok.
Mr Bertram was being grilled by MPs on the Digital, Culture, Media and Sport Committee, who are investigating how social media platforms deal with online harms.
They were also keen to hear more about the future of the company outside China, in the wake of President Donald Trump's threat to ban the app in the US unless a deal is struck with American firms.
Owner ByteDance is currently in talks with Oracle and Walmart over its future, but reports suggest that China is unlikely to approve what it sees as an unfair deal.
Mr Bertram said he was not able to comment on the details of the ongoing negotiations.
"I think there are broader concerns around China and China's role in the world. And I think that those concerns are projected on to TikTok, and I don't think they're always fairly projected," he told MPs.
When pressed on how the platform dealt with content sensitive to the Chinese government, such as the protests in Hong Kong and the treatment of the Uighur Muslims, he told MPs: "TikTok is a business outside of China and is led by European management who have the same concerns and the same world view that you do, and we care about our users."
Some of those users were recently traumatised by a clip circulating on the platform showing a US man killing himself, and Mr Bertram acknowledged that the firm needed to "do better".
He explained that the firm had seen a huge spike in sharing of the clip a week after the original broadcast took place on Facebook Live.
"Following an internal review, we found evidence of a co-ordinated effort by bad actors to spread this video across the internet and platforms, including TikTok.
"And we saw people searching for the content in a very specific way. Repeatedly clicking on a profile of people as if they're kind of expecting that those people had uploaded a video."
He said the firm had written to the chief executives of Facebook, Instagram, Google, YouTube, Twitter, Twitch, Snapchat, Pinterest and Reddit.
"What we're proposing is that, the same way these companies already work together around child sexual imagery and terrorist-related content, we should now establish a partnership around dealing with this type of content."
And for TikTok itself, he promised "changes to machine learning and emergency systems", as well as improvements to how the algorithms that detect such content work alongside the firm's human content moderators.
He was also asked about reports that TikTok had removed content around disabilities or LGBTQ issues.
He explained that "unfortunately" there had been a policy of not promoting content that might encourage bullying, which restricted content from people with disabilities and LGBTQ content.
"That is no longer our policy," he said.
He was less clear on whether the firm restricted the promotion of LGBTQ hashtags in Russia, saying: "Not as far as I'm aware… The only time we'll remove that content is when we have a legal requirement to do so."