Hit play on the player above to hear the podcast and follow along with the transcript below. This transcript was automatically generated, and then edited for clarity in its current form. There may be some differences between the audio and the text.
Hey there, listeners. It’s Brett Molina. Welcome back to Talking Tech. TikTok is once again capturing attention for an alleged viral trend involving US schools. Several school districts in the US issued warnings, beefed up security, and even canceled classes, with some citing threats that were allegedly made on the social media app TikTok. I write about this in a story that you can read on tech.usatoday.com. A lot of the focus is on TikTok and its response. So I write about how TikTok responded, but I also look more broadly at what the company does when it comes to moderating what is available on its platform, because one of the first things I saw on Twitter as people were discussing this was: What is TikTok doing? Should it be doing more to prevent these types of things from spreading? Now, to be very clear here, TikTok put out a statement.
They said, “We handle even rumored threats with utmost seriousness, which is why we’re working with law enforcement to look into warnings about potential violence at schools, even though we have not found evidence of such threats originating or spreading via TikTok.” TikTok has said from the beginning, as this has all come down, that it has continued to aggressively search for any content related to these threats on its platform. The company also said it was working to remove “alarmist warnings that violate our misinformation policy.” This was after it had confirmed with local authorities, the FBI, and Homeland Security that there was no credible threat tied to this rumored violence at schools. Let’s get to the moderating part. What does TikTok do to moderate the content that’s on its platform? A lot of this is similar across the board, no matter the social media app. They use technology to inspect whatever’s uploaded.
So when someone uploads a fun video or anything of that sort, it’s inspected for anything that possibly violates TikTok’s policies. They also have a safety team, meaning an actual person — not that the safety team is one person, but a representative of that team — will review the video for these possible policy violations and then determine whether it violates policy or not. Creators can appeal the removal of a video, and TikTok will review that appeal. TikTok has also rolled out new tools to automatically remove uploaded content that might violate its guidelines, to help out this safety team. They also have policies in place to help law enforcement in the event of an emergency: if there’s an emergency happening, they can hand over user data necessary to prevent whatever potential harm is there, as permitted by law.
So that gives you a bit of a sense of what TikTok does on its end. We’ve heard a lot about these online challenges time and again. If you’re a parent, you’ve probably heard lots of talk about them, and this isn’t the first time TikTok has been associated with a viral trend of this form, although not at the severity of what we saw on Friday, with these rumors of potential violence at schools. In October, students at some schools faced charges over what was called a “slap a teacher” challenge that had reportedly started on TikTok.
TikTok said that the challenge didn’t trend there and that it would remove any related content. The fact-checking site Snopes also reported there was little evidence the challenge even existed on TikTok to begin with. In September, we also had the “devious licks” trend, which involved stealing or vandalizing school property. Along the same lines, TikTok said it would remove any content related to it and redirect related hashtags to its community guidelines, so that users knew: this is how you’re supposed to behave on TikTok.
This is obviously something that’s not going to go away, and it’s something social media companies are going to have to continue to deal with: policing the millions, and in some cases billions, of users on their platforms. How can they keep track of every little thing that gets posted all the time? I don’t know what kind of advances are in store on that front, but we’ll keep watch of it. Any updates related to this story you can read on tech.usatoday.com. Listeners, let’s hear from you. Do you have any comments, questions, or show ideas? Any tech problems you want us to try to address? You can find me on Twitter @brettmolina23. Please don’t forget to subscribe and rate us or leave a review on Apple Podcasts, Spotify, Stitcher, or anywhere you get your podcasts. You’ve been listening to Talking Tech. We’ll be back tomorrow with another quick hit from the world of tech.