TikTok Is More Dangerous Than We Thought

TikTok is far more dangerous than we thought. In the last two years, at least 15 kids, aged 12 or younger, across the globe from Milwaukee to Sicily, have died after attempting what seemed to them like a harmless challenge they found on the world’s most popular app: TikTok. The blackout challenge encourages users to record themselves holding their breath until they pass out, so viewers can, essentially, watch as they regain consciousness. 


Despite the fact that this is obviously dangerous for anyone on the planet, in the hands of children the trend has had far more devastating consequences. The NyQuil challenge, the penny challenge, the milk crate challenge. Time after time, we’ve seen one dangerous viral trend after another spring up on TikTok, all of which put users, especially younger users, in serious physical danger. 


In a previous Aperture episode, we looked at the mental health effects of TikTok. But it seems as though the self-inflicted risks directly linked to the app are more far-reaching and have greater consequences than we could have ever imagined. From health and safety to privacy and security, given the long list of problems that TikTok seems to pose to the general public, what do we do? What can we do? Can we “fix” TikTok? Or should we just ban the app entirely?

 

If you’ve ever opened TikTok, you’ll see its genius at first glance: an endless stream of content from strangers all across the globe, curated specifically for you by one of the most refined algorithms humanity has ever created. You get a rush of dopamine, from watching dogs do dances to amateur chefs making elaborate meals. Before we look at just how dangerous TikTok has become, I think it’s important to mention that it’s definitely not all bad. 

 

TikTok’s rise to the top was greatly accelerated during the pandemic. Everyone was forced to stay indoors, without physical access to their friends, family and community. Looking for a way to connect, young people flocked to TikTok, doing silly dances and challenges to cure their boredom and help them feel part of a community again. They found solace in an app, from doctors explaining the COVID symptoms people were experiencing to a random comment saving someone from a potentially life-threatening condition.

 

Whether you were bullied at school or living in an unhappy home, you could log onto the app and see other people in the exact same situation as you, offering help, support and a sense of belonging. Sadly, as with most things, there’s far more here than meets the eye. Spend just a few minutes on the app and you can instantly see how difficult, heck, almost impossible, it is to leave. 

 

Today, people spend more time there than on any other social media platform, with a global average of 96 minutes a day. And for some, the number is far greater. That’s way too much time on just one app for anyone, but it’s even worse when you realize that around half of TikTok’s users are young people, many of them below the age of 13. Creators and users who try to destigmatize mental health on the app see the benefits of more people learning and talking about these issues. But isn’t that only the case if users are getting the right information? 

 

One of the greatest problems on TikTok is that just about anyone can buy scrubs on Amazon and claim to be a doctor, spreading misinformation with ease. There’s no formal fact-checking, so, often, we might just hear what we want to hear. If we have a symptom we’re worried about, these videos can have the WebMD effect of making us think we’re dying when really we just have mild allergies. When you think about it like that, you realize the videos pushed to us by TikTok can actually worsen what we’re feeling. If we’re anxious about a relationship or a meeting at work and we’re continually shown videos of other anxious people, is that going to help us feel better or make us spiral? 

 

All of this is without even mentioning the things that plague every social media app: cyberbullying, social exclusion, the temptation to compare ourselves to others. We get addicted to scrolling and posting… and scrolling and posting… and scrolling until we’re convinced that we’re just not as good as everyone else. The scariest thing is that no one is immune to the grasp of the algorithm, not even those who should be better informed. One doctor, Brian Boxer Wachler, grew an impressive TikTok following by offering medical advice and reacting to other health-related videos. Knowing his audience, he became fluent in Gen-Z slang to be more relatable, despite being middle-aged.

 

He became so obsessed with growing his following that his family had to stage an intervention to help him curb his addiction and approach his channel as a healthy distraction rather than an obsession. In late 2022, he released a book detailing his experiences with social media addiction. Creators have to produce new videos all the time to maintain view counts. The pressure, and the burnout that comes with it, is real no matter what your age, education level or IQ score. In that sense, TikTok is a great democratizer of our time, but with debilitating side effects. 

 

Seeing all the dangers of TikTok, the big question remains: is this fixable? Is there a way to combat those dopamine hits we feel every time we open the app? The problem is that the app wants us to be addicted. That’s how it keeps users, increases downloads and stays relevant. We can restrict our screen time or make our phones require breaks from the app as much as we want, but if the app doesn’t change, chances are we won’t either. And for most people, that’s okay. They might obsess over the latest TikTok dance or unfairly compare themselves to a beauty influencer, but they’ll be fine.

 

Their lives won’t be dramatically altered. But for kids, the potential for damage is much higher. TikTok took down 41 million underage accounts in the first half of 2022 alone, but that’s a fool’s errand: those users can just sign up again with a different account. TikTok’s army of 40,000 global moderators reviews potentially harmful videos, but catching everything is an impossible task. Over one billion videos are viewed on the app every single day. Even if each of those needed only one look, every moderator would have to review 25,000 videos a day. And if the videos average out to around a minute each, that’s still roughly 416 hours of content to watch in just 24 hours. It is literally impossible. 
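If you want to check that math yourself, here is the back-of-the-envelope calculation in a few lines of Python, using the figures cited above (one billion daily views, 40,000 moderators, and an assumed one-minute average video length):

```python
# Back-of-the-envelope check of the moderation workload described above.
daily_videos = 1_000_000_000   # videos viewed per day (figure cited above)
moderators = 40_000            # TikTok's global moderation staff (figure cited above)
avg_video_minutes = 1          # assumed average video length, in minutes

videos_per_mod = daily_videos / moderators                # 25,000 videos each
hours_per_mod = videos_per_mod * avg_video_minutes / 60   # ~416.7 hours of footage

print(f"{videos_per_mod:,.0f} videos ≈ {hours_per_mod:,.1f} hours per moderator, per day")
# Output: 25,000 videos ≈ 416.7 hours per moderator, per day
```

That’s more than 17 full days of footage per moderator, every single day, which is why no amount of human review can keep up.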

 

So how do we protect kids? There’s no effective way to block underage users from social media platforms because it’s impossible to verify their age. But what if it wasn’t? In 2021, TikTok met with providers of facial age estimation software, which can distinguish between a child and a teen and can work without directly identifying an individual or storing any data. That could be a game changer for an app trying to be safer for children. But, unfortunately, child safety isn’t the only scandal swirling around TikTok. Facial recognition technology on an app that’s been accused of spying on its users and sharing data with a government would not be a great look. 

 

We can’t really talk about TikTok in today’s world without talking about privacy. Most of us know, and passively accept, that our data is being stored, seen and used in some way when we surf the internet. It’s the classic clicking “accept all cookies” without thinking. But are TikTok and its parent company, ByteDance, taking it to a new level? An internal investigation found that a group of ByteDance employees had surveilled several U.S. journalists who covered the app in an attempt to track down potential anonymous sources. In 2020, an iPhone security update caught TikTok tracking the keystrokes of Apple users while on the app.

 

But here’s the thing: ByteDance isn’t the first company to be accused of any of this. Uber and Facebook have been known to track the locations of journalists who report on their apps, just like TikTok employees were found to be doing. And in more than 124,000 documents leaked to the press in 2022, spanning 2014 to 2017, Uber was shown to be doing everything it could to bypass regulations across the globe. Meta, Twitter, Google and even Apple all collect and use our personal data in some way. So what’s the big issue with ByteDance? Geopolitics. 

 

ByteDance is a Chinese company. TikTok says it does not share user data with the Chinese government, but politicians, journalists and other critics are quick to call its bluff. China already has a practice of stealing massive quantities of data on Americans and other governments, but do TikTok and its billion-plus user profiles offer a more direct line? If the security concerns turn out to be true, we might see widespread fraud, hacking or influence operations launched through the platform. As a result, United States lawmakers have issued warnings about the app and enacted executive orders to address the potential security risks it poses.

 

Calls from inside the U.S. Congress have, in a feat of modern-day politics, brought Republicans and Democrats together against a common enemy. Closed-door talks between TikTok executives and the Committee on Foreign Investment in the United States have been going on for years. A security contract governing how TikTok will handle Americans’ user data is being negotiated with the Treasury Department. All of this in an effort to fix things. To curb the threat. To limit our exposure as users. But will it work? 

 

Many think that contracts, talks and hearings won’t get the job done. That fixing just isn’t an option. The calls to ban TikTok in the United States and many other countries grow by the day. The U.S. military banned the app in 2019. Now TikTok is off-limits on all government devices and there’s a bill sitting in Congress to prohibit it completely. Take a second and think. What would that look like? The most popular app in the world unavailable in the United States? Would there be revolts? Cheers? Confusion? A restructuring of society as we know it?

 

To understand what life post-TikTok could be like, look no further than China’s neighbor, India. The South Asian country banned TikTok in 2020 after a geopolitical dispute with China. Of course, there were consequences. People employed by TikTok in India lost their jobs. Influencers who had amassed followings on the app lost their income. But it wasn’t all bad news in India. In fact, for many of its citizens, it was surprisingly good. Replacement apps developed in India are trying to fill the hole TikTok’s departure created, with the hope of focusing on the needs of their users rather than doing whatever it takes to fulfill business objectives.

 

If India does this well, will other countries follow suit? For now, we can’t really say for sure. As for U.S. national security? There’s no smoking gun. No evidence of an urgent threat. This raises the question: is all the discussion by politicians and regulators really about a unique national security issue, or is it a way for them to talk about larger issues like privacy, disinformation and content moderation that bolster their own political platforms? Despite any ulterior motives, the potential for danger seems to be enough to at least keep conversations about partial or full bans of the app going. And you don’t even need to look through a global lens to understand that there are harmful aspects to TikTok. Remember the kids who suffered a terrible fate attempting the blackout challenge they were never supposed to see in the first place? 

 

The reality is that technology is always one step ahead of us, our governments and our ability to maintain our mental health. And as long as there’s money to be made, it won’t slow down. Because users aren’t the customers; we are the product, something to be sold and analyzed in the name of financial gain. If measures like enforcing age restrictions to make an app safer aren’t in the interest of a company’s bottom line, why would it ever enact them? So, can we fix TikTok? Or should we just get rid of it? The app is making small efforts to fix itself, adding new features like one that tells users why the algorithm recommends certain videos. But the algorithm is designed to keep people watching for as long as possible, so it promotes the most extreme, the most controversial, the most eye-catching videos. And even if you know why you’re watching a video, will that really change the way that video makes you feel or act? 

 

TikTok could take a route like the one YouTube took with YouTube Kids to protect its youngest users: a version of the platform designed for children 12 and under that hosts age-appropriate videos and enforces screen-time limits. As for our privacy and security, perhaps it’s up to the governments of the world to place restrictions on what information about their citizens a foreign body has access to. Or, if a full ban of the world’s most popular app is in our future, to protect us, our mental health and the safety of our homes, those 96 minutes we spend on average every day on TikTok would have to go toward something else. 

 

I would encourage us all to focus that time on something more productive, because the reality is that a world without TikTok will still have other platforms embracing the blistering pace and addictive nature of short-form content: things being lit on fire, flash mobs performing goofy choreographed dances, people eating too much pizza on camera. In fact, YouTube Shorts and Instagram and Facebook Reels already exist. So we’d better watch out.