The danger of TikTok to children is real

Three months ago, before Covid-19 took over, computers were not central to my 10-year-old daughter’s life. Neither was the iPad, email, texting, or an insidious short-video app called TikTok.

If you haven’t seen TikTok in action, in a nutshell, it allows users to record 15-second videos and upload them for the world to see. People post everything from comedy routines to lip-syncing acts, but in this house it’s dancing.

Every day my daughter films herself doing a short sequence of hip-hop-esque moves as she mumbles the lyrics to some song. Then she uses the TikTok app to share the video with her friends. Seems innocent enough, right?

Wrong.

TikTok claims its mission is to “capture and present the world’s creativity, knowledge, and moments that matter.” Um, right. Try capturing kids’ attention and personal data by getting them addicted to filming themselves.

Previously known as Musical.ly, TikTok was acquired by the Chinese company ByteDance in 2017, and has since become one of the most-used apps across the globe. It’s available in more than 150 countries and has about 800 million monthly active users. The pandemic has fueled an explosion in its use. According to Sensor Tower, in the first quarter of 2020, TikTok was downloaded 315 million times — making it the best quarter for any app, ever.

Before TikTok came along, the mere suggestion that my daughter post a video of herself online doing anything, let alone dancing, would have sent her into a fit of rage. But in the space of a few weeks, my husband and I have seen this app take our child hostage.

Now we’re in the process of negotiating her release, trying to explain just how dangerous TikTok is. But it’s an uphill battle. How, she wonders, can something so fun and dancy that everyone else is doing possibly be so scary and terrible?

As a service to other parents who may not have read the fine print on TikTok, I want to share a few facts with the hope that you will get your child off this app — which will help me get my child off this app — immediately.

For starters, U.S. lawmakers were so worried that TikTok might be sharing user data with its parent company (again, that’s ByteDance, a Chinese conglomerate) that the federal government opened a national security review.

Senator Chuck Schumer said the review was “validation of our concern that apps like TikTok — that store massive amounts of personal data accessible to foreign governments — may pose serious risks to millions of Americans.”

In an emailed statement, a TikTok spokesperson wrote, “We take privacy seriously and are committed to helping ensure that TikTok continues to be a safe and entertaining community for our users.”

Yet, in January, Check Point, a cybersecurity company in Israel, published research that found serious vulnerabilities within TikTok that would have allowed hackers to manipulate user data and reveal personal information.

According to reporting on that research, “The weaknesses would have allowed attackers to send TikTok users messages that carried malicious links. Once users clicked on the links, attackers would have been able to take control of their accounts, including uploading videos or gaining access to private videos. A separate flaw allowed Check Point researchers to retrieve personal information from TikTok user accounts through the company’s website.”

TikTok said it learned about the conclusions of Check Point’s research in November and had fixed all of the vulnerabilities by the next month.

If all of that’s not enough for you, this should be:

In May, a coalition of 20 leading U.S. child advocacy, consumer, and privacy groups filed a complaint urging the Federal Trade Commission to investigate and sanction TikTok for putting kids at risk by continuing to violate the Children’s Online Privacy Protection Act (COPPA).

According to a statement from the Center for Digital Democracy, “In February 2019, TikTok paid a $5.7 million fine for violating COPPA, including illegally collecting personal information from children. But more than a year later, with quarantined kids and families flocking to the site in record numbers, TikTok has failed to delete personal information previously collected from children and is still collecting kids’ personal information without notice to and consent of parents.”

The statement continues, “TikTok makes it easy for children to avoid obtaining parental consent. When a child under 13 tries to register using their actual birthdate, they will be signed up for a ‘younger users account’ with limited functions, and no ability to share their videos. If a child is frustrated by this limited functionality, they can immediately register again with a fake birthdate from the same device for an account with full privileges, thereby putting them at risk for both TikTok’s commercial data uses and inappropriate contact from adults. In either case, TikTok makes no attempt to notify parents or obtain their consent. And TikTok doesn’t even comply with the law for those children who stick with limited ‘younger users accounts.’ For these accounts, TikTok collects detailed information about how the child uses the app and uses artificial intelligence to determine what to show next, to keep the child engaged online as long as possible.”

Josh Golin, Executive Director of Campaign for a Commercial-Free Childhood, said, “For years, TikTok has ignored COPPA, thereby ensnaring perhaps millions of underage children in its marketing apparatus, and putting children at risk of sexual predation. Now, even after being caught red-handed by the FTC, TikTok continues to flout the law.”

I know, I know. You enabled all the safety settings. Okay, that’s great, but they’re toggle settings and thus very easy for a child to change. And have you considered all that is unintentionally revealed every time your child uploads a video? For example, is that a Red Sox cap in the background? You just gave away where you live. Let’s zoom in on the logo on that folder on your desk. Now let me Google it. Ah, so that’s where you go to school. You mentioned in that video last week that you’re 10, so I already know you’re in fourth grade.

And if you think someone who spends their time preying on children can’t bypass a few security settings, you’re out of your mind. This is what child predators do. It’s their expertise. They also know how to talk to children, how to flatter and charm them, make them feel pretty and noticed and special. If your kid’s videos haven’t already been viewed by a pedophile, or someone who collects and distributes videos to pedophiles, you’d better believe they’re trying to access those videos at this very moment.

If you want a visual of one of those brokers, a better understanding of how what appears to be a harmless website run by harmless folks can in reality be a hub for perverted freaks, a front for a dark, vast, terrifying underworld, I’d urge you to watch the first scene of the first episode of the television series “Mr. Robot.” There’s a link below.

Another link I’ve included below is an overview of the TikTok experience written by a mother who started out thinking the app was “cute,” but quickly became “truly scared” by everything from the warped idea it was giving her daughter about friendship to the potential damage it was doing to her self-esteem. “The anxiety, social pressure and insecurity our children are feeling is only amplified by apps such as TikTok,” she wrote. “The solution isn’t getting rid of the app, because there will always be another one. The solution is hands-on parenting.”

I think it’s both. This particular app has to go and I have to explain the reason to my daughter. I want this done fast, but I don’t want to impose a swift ban and have it backfire on me — so I started with a conversation about how TikTok may seem super fun and harmless, but it’s not. It poses significant danger to everything from her confidence to her body image to her physical safety.

She pushed back, saying that her dad and I, and her school, had taught her how to be safe online. And besides, she had a private account that only her friends had access to, and only 10 people had ever viewed her videos.

“That you know of,” I said.

I went on to tell her that people who target children online have ways of moving around these apps like predatory ghosts, stalking and seeking their prey. It doesn’t matter if you’ve taken all the precautions, they can still get to you. And not just on TikTok — on any app or social media platform. I didn’t want her to just take my word for it, so I found an article and shared what I could of it in an age-appropriate way.

In essence, the piece tells the story of a 14-year-old girl who posted on Instagram shortly after getting her first smartphone, and ended up being raped in her home by a 22-year-old man.

The girl’s mother says, “We had conversations about privacy. About not giving out personal information, not telling people where you live, not sharing details about your life, not talking to strangers online… We all kind of have a tendency to go ‘not my kid’ and ‘it won’t happen to us,’ but … regardless of who you are or where you’re from, this can happen to your kid.”

Don’t let it. Get your daughter off TikTok now.

www.scarymommy.com/week-life-children-social-media/

https://www.youtube.com/watch?v=sO2raiSHOhc

