Tech Legal (Parents): TikTok (Musical.ly) hit with record fine for collecting data on children

playahaitian

Rising Star
Certified Pussy Poster
https://www.cnn.com/2019/02/28/tech/tiktok-ftc-fine-children/index.html

TikTok hit with record fine for collecting data on children

By Sherisse Pham, CNN Business



Updated 1:22 AM ET, Thu February 28, 2019







Popular video-sharing app TikTok has agreed to pay $5.7 million to settle allegations that it illegally collected personal information from children under the age of 13, such as names, email addresses and their location.
The US Federal Trade Commission said in a statement Wednesday that the TikTok fine is a record for a child privacy case.


TikTok, which belongs to $75 billion startup ByteDance, has more than half a billion users worldwide, giving it an international edge over other Chinese-owned social media platforms, which have struggled to expand outside their home market.


The US fine relates to Musical.ly, a video-sharing app ByteDance bought in 2017 and merged with TikTok last August.
The FTC said its investigation of Musical.ly had "uncovered disturbing practices, including collecting and exposing the location" of young children. Despite receiving thousands of complaints from parents, the company failed to comply with requests to delete information about underage children and held onto it longer than necessary, according to the commission.

TikTok said in a statement that it is committed to "creating measures" to protect users, including tools for parents to protect their kids.
The company on Wednesday introduced a "separate app experience" for younger US users in which they "cannot do things like share their videos on TikTok, comment on others' videos, message with users, or maintain a profile or followers."
Creating accounts without parents' knowledge
Launched in 2014, Musical.ly let users upload short videos of themselves lip-syncing to popular songs. To register for the app, people had to give an email address, phone number, username, first and last name, short bio and profile picture.
The collection of that personal information gave Musical.ly more than enough evidence that it was violating the US Children's Online Privacy Protection Act, or COPPA, the FTC said.
The youth of Musical.ly's userbase was "easily apparent in perusing users' profile pictures and ... profiles, many of which explicitly note the child's age, birthdate, or school," according to a complaint filed by the FTC.
Musical.ly, which had more than 65 million registered users in the United States before it merged with TikTok, received thousands of complaints from parents saying their underage children had created an account without their knowledge. COPPA requires internet companies to obtain "verifiable parental consent" before collecting, using or disclosing personal information from children.

When parents asked, Musical.ly would shut down the accounts of preteens. But it didn't delete the users' videos or profile information from its servers, according to the FTC complaint.
"There have been public reports of adults trying to contact children via the Musical.ly app," the commission said. "In addition, until October 2016, the app included a feature that allowed users to view other users within a 50-mile radius of their location."
As part of the settlement with the FTC, TikTok must take down all videos made by children under age 13.
Other big tech companies have run afoul of the United States' child privacy law in the past.
Oath, which is owned by Verizon (VZ), agreed to pay $5 million in December for using data such as cookies and geolocation to place online ads on websites targeted at children under 13.
Disney (DIS) subsidiary Playdom was fined $3 million in 2011 for collecting and displaying children's information in multiplayer online games.
 

Owes

Rising Star
BGOL Investor
I'm all for sexual liberation but do that shit when you're 18+.
There should be no reason why underaged girls are dressing and behaving overtly sexual.

It was very disturbing to see 9/10/11 yr old girls (& early teens) doing sexual skits with middle-aged men.
 

ansatsusha_gouki

Land of the Heartless
Platinum Member
YouTube was hit with a $170 million fine for the same thing, and starting January 1st a lot of YouTube channels might be gone because of YouTube's stupidity.

 

Adam Knows

YouTube: Adam Knows
Platinum Member
that is the one thing that is fucked up about tok...

minors and adults easily co-mingling.... never seen that on instagram, etc..

it's minors all day in my feed when in reality they should be on a completely different server separate from adult posters
 

Cannibal

Rising Star
BGOL Investor
Just removed this shit from my kids phone. I allowed her to download it not knowing what it was. Thought it was just like a junior youtube, but it ain't. I flipped through her phone just perusing, making sure everything was still sweet, and it wasn't. There was a lot of homosexual cutesy kissy videos, kids cussing and talking about eating pussy, and other shit and the content was much older than she was. Watch your kids on TikTok. As a matter of fact, don't let your kids on that shit.
 

cold-n-cocky

BGOL vet down since the “56k stay out!” days
BGOL Gold Member
A lot of these apps are no good. I’d let my kid download Roblox; then realized how easy it was for absolute strangers to chat with and connect; deleted that shit a day later and had a long talk about how digital strangers are just as dangerous as in-person ones.

This article about digital predators and kids is chillin.

 

Cannibal

Rising Star
BGOL Investor
A lot of these apps are no good. I’d let my kid download Roblox; then realized how easy it was for absolute strangers to chat with and connect; deleted that shit a day later and had a long talk about how digital strangers are just as dangerous as in-person ones.

This article about digital predators and kids is chillin.

Reading...
 

playahaitian

Rising Star
Certified Pussy Poster
A lot of these apps are no good. I’d let my kid download Roblox; then realized how easy it was for absolute strangers to chat with and connect; deleted that shit a day later and had a long talk about how digital strangers are just as dangerous as in-person ones.

This article about digital predators and kids is chillin.


^^^^
 

cold-n-cocky

BGOL vet down since the “56k stay out!” days
BGOL Gold Member
My kid plays roblox, but she can barely read now. So... I'll be getting on there to check it out more. Thanks. May be a way to turn off chat, I'll see.

Man in one day my kid, < age 8, and who can read, had friend requests and had someone inbox message her “hi”. Then I read thru the chat and while there was little that was overly sexual, the commentary was way above her age range and my comfort level. I also blocked chat on Minecraft.

Read that NY Times article I posted a few posts ago; it’s scary shit.
 

Cannibal

Rising Star
BGOL Investor
Man in one day my kid, < age 8, and who can read, had friend requests and had someone inbox message her “hi”. Then I read thru the chat and while there was little that was overly sexual, the commentary was way above her age range and my comfort level. I also blocked chat on Minecraft.

Read that NY Times article I posted a few posts ago; it’s scary shit.
Broham, I read it, then called my kid up here, and re-read it with her. Let her know that talking to strangers is completely out. Pointed out parts of the story where freaks were trying to get nudes and coercing kids with fear of pics being sent. Showed her the date to let her know it's recent. Then went through my other kid's Roblox and changed the settings to friends only for chatting, which is just her cousin, removed all other requests, which were 22. She's so young she didn't accept, but there were 22 requests for friends, and 4 chat conversations that people had started with "Hello". Changed her profile page to a message like "My dad is watching so be careful". Did all this after you posted that article.
 

cold-n-cocky

BGOL vet down since the “56k stay out!” days
BGOL Gold Member
Broham, I read it, then called my kid up here, and re-read it with her. Let her know that talking to strangers is completely out. Pointed out parts of the story where freaks were trying to get nudes and coercing kids with fear of pics being sent. Showed her the date to let her know it's recent. Then went through my other kid's Roblox and changed the settings to friends only for chatting, which is just her cousin, removed all other requests, which were 22. She's so young she didn't accept, but there were 22 requests for friends, and 4 chat conversations that people had started with "Hello". Changed her profile page to a message like "My dad is watching so be careful". Did all this after you posted that article.
Well done fam; glad it was of help. Reviewed all mines settings on her iPad apps as well and we had that same discussion.

what really was telling is how at the gaming conference few wanted to discuss the elephant in the room, and the companies are making money so they don’t care about protecting kids.
 

playahaitian

Rising Star
Certified Pussy Poster
Video Games and Online Chats Are ‘Hunting Grounds’ for Sexual Predators
Criminals are making virtual connections with children through gaming and social media platforms. One popular site warns visitors, “Please be careful.”
By NELLIE BOWLES and MICHAEL H. KELLER DEC. 7, 2019


Conversations excerpted from court documents:

"You have 5 seconds to take each pic"
"Take your pj's off. Just pull them down"
"Show your entire stomach. Retake pic."
"im going to kill you"
"I'm about to post your pics If you don't chat me back in 10 minutes."
"Reply now"
"Either you talk to me or you can suffer"
"You're so pretty :) How old are you"
"Your time is close to running out, too"
"you boys did good"
"DO you want your mom seeing your pics?"
"Show full body. Not just half. Retake pic"
"You have 1 minute. Hurry up"
"I mean you can trust me right"
"Show us that you’re crying"
When Kate’s 13-year-old son took up Minecraft and Fortnite, she did not worry.
The video games were hardly Grand Theft Auto — banned in their home because it was too violent — and he played in a room where she could keep an eye on him.
But about six weeks later, Kate saw something appalling pop up on the screen: a video of bestiality involving a young boy. Horrified, she scrolled through her son’s account on Discord, a platform where gamers can chat while playing. The conversations were filled with graphic language and imagery of sexual acts posted by others, she said.
Her son broke into tears when she questioned him last month.
“I think it’s a huge weight off them for somebody to step in and say, ‘Actually this is child abuse, and you’re being abused and you’re a victim here,’” said Kate, who asked not to be identified by her full name to protect her family’s privacy.
Exploited
Articles in this series examine the explosion in online photos and videos of children being sexually abused. They include graphic descriptions of some instances of the abuse.


Sexual predators and other bad actors have found an easy access point into the lives of young people: They are meeting them online through multiplayer video games and chat apps, making virtual connections right in their victims’ homes.
The criminals strike up a conversation and gradually build trust. Often they pose as children, confiding in their victims with false stories of hardship or self-loathing. Their goal, typically, is to dupe children into sharing sexually explicit photos and videos of themselves — which they use as blackmail for more imagery, much of it increasingly graphic and violent.
Reports of abuse are emerging with unprecedented frequency around the country, with some perpetrators grooming hundreds and even thousands of victims, according to a review of prosecutions, court records, law enforcement reports and academic studies. Games are a common target, but predators are also finding many victims on social platforms like Instagram and Kik Messenger.


A gamer at DreamHack, a festival held last month in Atlanta. Kholood Eid for The New York Times
The New York Times reported earlier this year that the tech industry had made only tepid efforts to combat an explosion of child sexual abuse imagery on the internet. The Times has also found that the troubled response extends to the online gaming and chat worlds, where popular and successful companies have created spaces that allow adults and children to interact, despite efforts to create some safeguards.
There are tools to detect previously identified abuse content, but scanning for new images — like those extorted in real time from young gamers — is more difficult. While a handful of products have detection systems in place, there is little incentive under the law to tackle the problem as companies are largely not held responsible for illegal content posted on their websites.
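To make that distinction concrete, here is a minimal, hypothetical sketch of how "previously identified" matching works: an upload is fingerprinted and checked against a database of known hashes. Real systems such as PhotoDNA rely on perceptual hashes that survive resizing and re-encoding; the plain SHA-256 digest, the placeholder hash set, and the function names below are simplifications for illustration only.

```python
# Hypothetical sketch of "known image" matching. A plain SHA-256 digest stands
# in for the perceptual hashes real systems use; the hash database is a placeholder.
import hashlib
from pathlib import Path

# Placeholder database of fingerprints for previously identified material;
# in practice such lists come from clearinghouses, not from the platform itself.
KNOWN_HASHES = {"9f2c0000example0000"}

def fingerprint(path: Path) -> str:
    """Return a hex digest of the file's raw bytes."""
    return hashlib.sha256(path.read_bytes()).hexdigest()

def is_previously_identified(path: Path) -> bool:
    """True only if the upload matches known material. Brand-new imagery --
    like photos extorted in real time -- never matches, which is the gap
    described above."""
    return fingerprint(path) in KNOWN_HASHES
```

The limitation is visible in the last function: anything not already in the database sails through, which is why imagery extorted in real time is so much harder to catch.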
“Our society says we’re going to protect kids in the physical world, but we’ve yet to see that in the same way on the digital side,” said Steven J. Grocki, who leads the child exploitation and obscenity section at the Justice Department.
A spokesman for Discord said in a statement that the company had a “zero-tolerance policy for any illegal activity.”
“No parent should have to worry that their child is exposed to inappropriate content,” he added, “and we deeply empathize with the challenges that families face in protecting their children online.”
Six years ago, a little over 50 reports of the crimes, commonly known as “sextortion,” were referred to the federally designated clearinghouse in suburban Washington that tracks online child sexual abuse. Last year, the center received over 1,500. And the authorities believe that the vast majority of sextortion cases are never reported.
There has been some success in catching perpetrators. In May, a California man was sentenced to 14 years in prison for coercing an 11-year-old girl “into producing child pornography” after meeting her through the online game Clash of Clans. A man in suburban Seattle got a 15-year sentence in 2015 for soliciting explicit imagery from three boys after posing as a teenager while playing Minecraft and League of Legends. An Illinois man received a 15-year sentence in 2017 after threatening to rape two boys in Massachusetts — adding that he would kill one of them — whom he had met over Xbox Live.
“The first threat is, ‘If you don’t do it, I’m going to post on social media, and by the way, I’ve got a list of your family members and I’m going to send it all to them,’” said Matt Wright, a special agent with the Department of Homeland Security. “If they don’t send another picture, they’ll say: ‘Here’s your address — I know where you live. I’m going to come kill your family.’”
The trauma can be overwhelming for the young victims. An F.B.I. study reviewing a sample of sextortion cases found that more than a quarter of them led to suicide or attempted suicide. In 2016, a Justice Department report identified sextortion as “by far the most significantly growing threat to children.”
It makes sense the gaming world is where many predators would go: It’s where the children are. Almost every single teenage boy in America — 97 percent — plays video games, while about 83 percent of girls do, according to the Pew Research Center.
In many states, gaming counts as a team sport and can earn players a varsity letter. Colleges offer scholarships to elite gamers, and cities are racing to establish professional teams. The industry is enormously profitable, generating over $43 billion in revenue last year in the United States.
There are many ways for gamers to meet online. They can use built-in chat features on consoles like Xbox and services like Steam, or connect on sites like Discord and Twitch. The games have become extremely social, and developing relationships with strangers on them is normal.
In many instances, the abusive relationships start in the games themselves. In other cases, adults posing as teenagers move conversations from gaming sites and chat rooms to platforms like Facebook Messenger, Kik and Skype, where they can communicate more privately.
To report online child sexual abuse or find resources for those in need of help, contact the National Center for Missing and Exploited Children at 1-800-843-5678.


“These virtual spaces are essentially hunting grounds,” said Mary Anne Franks, a professor at the University of Miami School of Law and president of the Cyber Civil Rights Initiative, a nonprofit group dedicated to combating online abuse.
One platform frequently used by predators is the video chat site Omegle — users need look no further than the site’s home page to find that out. “Predators have been known to use Omegle, so please be careful,” the site advises under a banner that exclaims, “Talk to strangers!” Omegle did not respond to requests for comment.
This fall, the F.B.I. rolled out an awareness campaign in middle and high schools to encourage children to seek help when caught in an exploitive sexual situation. “Even if you accepted money or a game credit or something else, you are not the one who is in trouble,” material from the campaign explains.
All of it worries Bubba Gaeddert, the executive director of the Varsity Esports Foundation, which gives high schools financial assistance to promote “healthy gaming habits” and develop students’ science and technology skills.
Every time there’s a new report of predatory behavior, he said, people ask, “Who’s to blame — schools, society, parents?”
“Well, it’s all of us.”
Playing With Strangers
It was a gamer’s paradise. Last month, 35,000 people had registered for a video game conference in Atlanta called DreamHack, a weekend-long festival for fans to meet and play and for companies to showcase new offerings.
There were exhilarated gamers everywhere, some passed out on sofas from exhaustion.
Ben Halpert was not one of them. As he surveyed the exhibition space, he saw potential peril around every corner. Mr. Halpert runs Savvy Cyber Kids, a nonprofit focused on online safety, a subject he was there to talk about.


"You’ve been tricked. I’m a guy. And I will email all the pics u sent to your school I would not block me or delete kik, because if you do that, I'm still sending it to your school. If you obey me and do everything I want, then I will delete your pics and I will not email your school. But I warn you, if you disobey me once, I will send your pics"
"Okay what do I have to do?"
"It’ll involve nudes"
"Take ur bra off and show your boobs"
"I’m crying"
"Please is there anything else I can do?"
"Just 1 pic?"
"No. I will let you know when we are done."
"If I don’t send you nudes, what is my other option?"
"There is literally no other option."
"Please give me another option."

Source: United States District Court for the Middle District of Louisiana.
Note: This conversation has been edited for clarity.
“Tech has made it easier for predators to get our kids faster and more efficiently,” he said, adding that it made children vulnerable by “normalizing communication with strangers.”
Today, even games meant for small children, like on Roblox, allow players to chat with others. His own daughter was just 6, Mr. Halpert said, when another player in a children’s animal game asked who she was.
“I didn’t even know they could chat,” he said. “It had nothing to do with gameplay, and none of the other animals were talking to each other. That was someone reaching out to my daughter.”
After making contact, predators often build on the relationship by sending gifts or gaming currency, such as V-Bucks in Fortnite. Then they begin desensitizing children to sexual terms and imagery before asking them to send naked pictures and videos of their own.
“Parents aren’t telling their kids at 6 years old, ‘Keep your clothes on online,’” Mr. Halpert said. “But they need to.”
[Read experts’ advice to parents on internet safety.]
Mr. Halpert had arranged to hold a panel on sextortion at DreamHack, but it proved difficult to find experts to join the discussion, he said. Some prominent cosplayers, a game community manager and a former e-sports professional sat onstage with him, but they focused on other problems like cyberbullying.

Ben Halpert, head of Savvy Cyber Kids, an online safety nonprofit. The audience for his DreamHack panel on sextortion. A recording of the panel, which he found difficult to organize because of the topic.
Kholood Eid for The New York Times


When he brought up grooming, the panelists fell silent or changed the subject.
The audience, too, was quiet. In a festival with tens of thousands of attendees, Mr. Halpert’s talk had attracted just a half-dozen.
“People don’t want to talk about it,” he said.
Catching the Predators
New Jersey police departments were flooded with phone calls from parents and teachers, alarmed about pedophiles lurking on game sites and in chat rooms.
And so law enforcement officials from across the state took over a building near the Jersey Shore last year, and started chatting under assumed identities as children.
In less than a week, they arrested 24 people.
"Nearly every game has a chat, so it's hard for parents to keep track, even if they're doing their homework," said Lilianne Daniel, a deputy attorney general in New Jersey and one of the operation's leaders. "So we realized there were these unregulated conversations happening."
“We needed to take the next step,” said Christine Hoffman, an assistant attorney general who helped lead the sting.
The authorities did it again, this time in Bergen County, a suburb close to New York City. They made 17 arrests. And they did it once more, in Somerset County, toward the center of the state, arresting 19. One defendant was sentenced to prison, while the other cases are still being prosecuted.
After the sting, the officials hoped to uncover a pattern that could help in future investigations. But they found none — those arrested came from all walks of life. Among them were a police officer, a teacher, a minister, a nurse, a bank manager, a mechanic, a waiter, a dental hygienist, a college student and a deliveryman.
“It cuts across all social and racial lines, across class lines — it cuts across every line,” Ms. Hoffman said. “There is no profile.”
When announcing the arrests, the authorities highlighted Fortnite, Minecraft and Roblox as platforms where suspects began conversations before moving to chat apps. Nearly all those arrested had made arrangements to meet in person.
In a separate case in Ohio, the digital abuse of a young boy led to his physical abuse. The offender, Jason Gmoser, would encourage boys to show their genitals while on PlayStation, according to court records. Mr. Gmoser, who was found with over 500 videos recorded while gaming with boys, often offered gift cards that could be used on the network.
He told detectives in 2014 that he spent years interacting with an 8-year-old who had appeared in several of the videos, including one in which the boy exposed himself and said he would “do anything” for a $20 gift card.
Mr. Gmoser traveled to Missouri to visit the boy and his family, showering them with gifts and paying some of their bills. On at least one trip, he said, he sexually abused the child. He is now serving a life sentence in a separate case, for running a child sexual abuse site on the dark web.



Lilianne Daniel, a deputy attorney general in New Jersey, and Christine Hoffman, an assistant attorney general, who helped lead a raid on online predators. Lt. John Pizzuro, who took part in the operation, with the game Minecraft.
Kholood Eid for The New York Times


Any young person who converses online with strangers is at risk.
In 2018, three men from across the country were convicted of running a sextortion ring for years that lured hundreds of children, some as young as 8, from social and video-streaming platforms that included LiveMe; Omegle; Musical.ly, the predecessor of TikTok; Skype; Snapchat; and Twitter’s Periscope. The men pretended to be teenage boys and girls and coerced children into undressing and performing explicit acts. They shared files and information about the victims over Discord.
In another case, a girl attending high school in Tennessee thought she had made a new female friend on Kik Messenger. They both loved volleyball, and they even resembled each other physically, their profile photos showed. They chatted for six months.
After the teenager shared a partially nude photo of herself, the “friend” became threatening and demanded that she record herself performing explicit acts. “You literally have no choice but to obey unless u want ur pics spread to your friends,” the person wrote, according to court records.
The girl told her mother, who called the police. The offender was a Louisiana man, Matthew Chaney Walker, who was 24 years old at the time of the chat in 2014. The police said Mr. Walker, now in prison, had forced more than 50 girls to send him nude and sexually explicit photos.
The photos of the Tennessee teenager were never shared publicly, but she said in an interview with The Times that she was haunted by the experience years later.
“I thought for a long time that there was something wrong with me or that I was a bad person,” she said. “Now that I’ve gotten to college, I’ll talk to my friends about it, and there have been so many girls who have said, ‘That exact same thing happened to me.’”
An Industry Without Answers
There are a few seemingly simple protections against online predators, but logistics, gaming culture and financial concerns present obstacles.
Companies could require identification and parental approvals to ensure games are played by people of the same age. But even as some platforms have experimented with programs like Real ID, a verification effort, gamers have resisted giving up anonymity.
“There’s been community-layer rejection of those systems because people like to be able to be anybody,” said Todd Harris, who co-founded Hi-Rez, a game development studio based in Atlanta.
While Facebook has algorithms that can detect some text-based grooming, many gamers use audio and video chat. And eliminating audio and video interactions would be a death sentence for a gaming company fighting for customers. “You can’t seriously compete without talking,” Mr. Harris said. “The team with the best communication will win.”
In 2016, an industry coalition was formed to create a common front on online bullying, hate speech and child exploitation, but officials with the group said they would not publish a series of guidelines for companies until next year.
Separately, some gaming companies deploy automated systems they say can detect some grooming behaviors, including attempts to move a chat off platform. Roblox, for instance, uses software from Two Hat Security, a Canadian firm, designed to block explicit language and contact information.
The company takes other measures, a Roblox spokesman said, prohibiting users from asking about names and ages. For children under 13, the spokesman said, filters are more aggressive.
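As a rough illustration of what that kind of filtering can look like, here is a short, hypothetical sketch: it masks contact information and a word blocklist, and applies a stricter rule for under-13 accounts. The regex patterns and blocklist are invented for the example and are not Roblox's or Two Hat's actual rules.

```python
# Illustrative chat filter: not Roblox's or Two Hat's actual rules.
import re

EMAIL_RE = re.compile(r"\b[\w.+-]+@[\w-]+\.[\w.]+\b")        # e-mail addresses
PHONE_RE = re.compile(r"(?:\d[\s.\-]?){7,}")                  # 7+ digits in a row
EXPLICIT_WORDS = {"badword1", "badword2"}                     # placeholder blocklist
AGE_PROBE_RE = re.compile(r"\bhow old are (you|u)\b", re.IGNORECASE)

def filter_message(text: str, user_age: int) -> str:
    """Mask contact info and blocklisted words; block age probes for under-13s."""
    text = EMAIL_RE.sub("###", text)
    text = PHONE_RE.sub("###", text)
    text = " ".join("###" if w.lower().strip(".,!?") in EXPLICIT_WORDS else w
                    for w in text.split())
    if user_age < 13 and AGE_PROBE_RE.search(text):
        return "###"                                          # drop the whole line
    return text

# Example: an age probe sent to a 12-year-old account is blocked entirely.
print(filter_message("hey, how old are u? text me at 555 123 4567", user_age=12))
```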
Microsoft, which owns Xbox and the popular game Minecraft, said it planned to release software early next year that could recognize some forms of grooming and sextortion. The company said it would offer the software to other tech businesses free of charge.
Marc-Antoine Durand, the chief operating officer of Yubo, a video chat app based in France that is popular among teenagers, said it started working this year with a company called Yoti to analyze users’ selfies and estimate their ages. Yubo also examines profile information to separate adults from minors, Mr. Durand said, and watches for grooming behavior with the Two Hat software.
Yubo disables accounts when it finds an age discrepancy, often requiring users to provide a government-issued ID. But users frequently object to providing documentation, and many children do not possess it. In February, a 26-year-old Ohio man was charged with sexual exploitation after claiming to be 13 on Yubo and luring a 12-year-old girl, the authorities said.
A Facebook spokeswoman said it had made a variety of efforts to separate adults from children, including limiting how adults can message and connect with them. The Times was able to find minors by looking at users’ lists of friends and activities on pages popular with children.


"I’m about to post your pics If you don’t chat me back in 10 minutes"
"Sorry. I haven’t been on since thanksgiving morning. Ive had a lot of stuff going on with my mom in the Hospital."
"I’m with my little sister my aunt , and my granda . I can’t talk right now."
"Yes you can do u want me to post your pics everywhere?"
"I only want 10 more pics and I will leave you alone forever If not I will show everyone"
"Okay I swear I will get you 10 pics by tonight. Can you tell me what 10 pics you want,.? I swear I will get them to you tonight."
"Ok how old is your sis baby?"
"Send some pics of her instead"
"Wait. What kind of pics????"
"I don’t care naked LOL"
"My little sister is 7 years old. I’m not going to take naked pictures of her."
"Ok well I will post your pics then I want a few of her and you ok"
"Please Thats my little sister. I can’t do that to her. I beg you. Please"

Source: United States District Court for the Middle District of Florida.
Note: This conversation has been edited for clarity.
Instagram, owned by Facebook, does not have the same restrictions. Until this past week, it had allowed users to send private messages to anyone, and a Times reporter was able to contact and video-chat with a 13-year-old girl who had a private account (the girl and her parents gave permission to conduct the test). After The Times asked about the policy, Instagram announced new features on Wednesday that allow users to block messages from people they do not follow. The company said it would also require users to enter their age when signing up.
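The messaging change Instagram describes boils down to a simple permission check before a stranger can open a thread. A minimal sketch, with hypothetical class and field names, assuming a recipient can opt in to messages from anyone:

```python
# Hypothetical sketch of a "messages only from people I follow" rule.
from dataclasses import dataclass, field

@dataclass
class Account:
    user_id: str
    following: set = field(default_factory=set)   # ids this account follows
    allow_messages_from_anyone: bool = False      # off by default in this sketch

def can_message(sender: Account, recipient: Account) -> bool:
    """A sender can open a DM thread only if the recipient follows them or opts in."""
    return recipient.allow_messages_from_anyone or sender.user_id in recipient.following

# A stranger messaging a teen who follows only one friend is rejected.
teen = Account("teen01", following={"friend42"})
print(can_message(Account("stranger99"), teen))   # False
```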
Other companies have taken a more hands-off approach, citing privacy concerns.
A member of the trust and safety team at Discord said that it scanned shared images for known illegal material, and that moderators reviewed chats considered a high risk. The company does not, however, automatically monitor conversations for grooming, suggesting it would be unreliable and a privacy violation.
“We use industry-leading technology such as photo-detection software to scan every image and video for child pornography,” a Discord spokesman said.
Some of the biggest gaming companies provided few, if any, details about their practices. Epic Games, the creator of Fortnite, which has roughly 250 million users, did not respond to multiple messages seeking comment.
Sony, the maker of PlayStation, which had nearly 100 million monthly active users earlier this year, said it took sextortion seriously, pointing to its tutorials on parental controls and tools that let users report abusive behavior.
But the solution many game developers and online safety experts return to is that parents need to know what their children are playing, and that children need to know what tools are available to them. Sometimes that means blocking users and shutting off chat functions, and sometimes it means monitoring the games as they are being played.
“‘Literacy’ is the word I say a billion times a day,” said Mr. Gaeddert, of the e-sports foundation.
‘A Friend of Somebody’
When Kate started scrolling through her son’s Discord account, she saw how the sexualized chats had unfolded. The imagery became increasingly disturbing, moving from innocuous anime figures to pornographic illustrations — and finally to actual children being abused.
So Kate started asking her son about some of the user names of his fellow gamers. “And he’s saying, ‘That’s so-and-so who goes to this school.’ And they all think it’s a friend of somebody,” she said, “but then they realize it’s not a friend of anybody.”
Kate reached out to Discord to alert them to the problem. The company wrote back saying that because her son had deleted the messages, it no longer had the data and could take no action.
“First I was very sad, but I’m really angry,” she said.
At DreamHack, the festival in Atlanta, many parents accompanying their children described how confusing it was to keep track of the various chats and streams their young gamers use.
“We’re not gamers, so we’re relying on our son to tell us about how it all works, which is probably naïve,” said John Marshall of Tuscaloosa, Ala., whose 17-year-old son was competing at the festival with his high-school gaming team.



Parents who attended DreamHack said they were confounded by the volume of new apps to which their children had access.
Kholood Eid for The New York Times


His wife, Rhonda, compared the predicament to watching a child get behind the wheel of a car: “You have to trust them.”
Ms. Marshall said it was also hard to ignore the potential upside of their son’s gaming. “He kept telling me, ‘Mom, you told me I shouldn’t spend so much time on computers, but I can get scholarships for this,’” she said.
Still, there’s no hiding the dangers. Kristy Custer, the principal at Complete High School Maize in Kansas, helped design the curriculum used by many high-school e-sports teams.
“Right now, in the curriculum, we have a section on, ‘If this happens to you, this is what you do,’” she said. “But we probably need to say, ‘When this happens to you, this is what you need to do.’”
Dr. Custer said parents should react carefully when their children report encounters with online predators. Punishing the children — no more video games or social media, for example — could backfire by pushing them into even more dangerous places for their online activity.
“You just did exactly what that predator wanted them to do — and drove them into the darker space,” she said.
Gabriel J.X. Dance contributed reporting.
 

4 Dimensional

Rising Star
Platinum Member
Just removed this shit from my kids phone. I allowed her to download it not knowing what it was. Thought it was just like a junior youtube, but it ain't. I flipped through her phone just perusing, making sure everything was still sweet, and it wasn't. There was a lot of homosexual cutesy kissy videos, kids cussing and talking about eating pussy, and other shit and the content was much older than she was. Watch your kids on TikTok. As a matter of fact, don't let your kids on that shit.

I’ve been dropping the ban hammer on my 11 year old pretty regular. I’m not sure how cool we are lately. Lol.
 

Ballatician

Rising Star
BGOL Investor
A lot of these apps are no good. I’d let my kid download Roblox; then realized how easy it was for absolute strangers to chat with and connect; deleted that shit a day later and had a long talk about how digital strangers are just as dangerous as in-person ones.

This article about digital predators and kids is chillin.


Damn my 8 yr old plays with Roblox, I need to remove it.
 

4 Dimensional

Rising Star
Platinum Member
My kid plays roblox, but she can barely read now. So... I'll be getting on there to check it out more. Thanks. May be a way to turn off chat, I'll see.

I’ve had this rodeo with my daughter chatting. You can’t delete chats (you can disable) on roblox, so she was caught after we told her not to.

I have a tech savvy daughter, but she isn't as slick as she thinks she is.

I disable the internet (through the router) just to let her know what the deal is.

Problem is they’ll always find a way to chat and the older they get, the slicker they’ll become. You can monitor all day long, but they’ll eventually learn how to create emails and register accounts with multiple user names. My daughter created 8 roblox accounts. Lol

I think we got em all.
 

Jumbodicc

Rising Star
BGOL Investor
https://www.inputmag.com/tech/reddi...tok-app-claims-massive-data-collection-scheme


....Data-collection chaos — The data allegedly being collected by TikTok ranges far and wide, extending way past what’s considered normal by tech industry standards. “If there is an API to get information on you, your contacts or your device...well, they’re using it,” writes bangorlol.
Here’s a brief run-down of the information TikTok is allegedly collecting from its users:
  • Phone hardware information, including CPU type, screen dimensions, memory usage, and hardware IDs
  • Other installed apps
  • Network information (IP address, MAC address, WiFi access point name, and router MAC address)
  • Jailbreak information, if applicable
  • GPS pinging (if you’ve ever location-tagged a post)
  • Local proxy server set-up with no authentication
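
For a sense of how little code this kind of collection takes, here is a rough sketch, using only Python's standard library, of a device-and-network fingerprint along the lines the post describes. The payload shape and field names are assumptions for the illustration, not TikTok's actual telemetry, and the sketch covers only a subset of the items listed above.

```python
# Rough illustration only: the kind of device/network fingerprint described
# above, built from Python's standard library. Field names and payload shape
# are assumptions, not TikTok's actual telemetry.
import json
import platform
import socket
import uuid

def device_fingerprint() -> dict:
    try:
        local_ip = socket.gethostbyname(socket.gethostname())
    except OSError:
        local_ip = "unknown"
    return {
        "hardware": {
            "cpu_arch": platform.machine(),           # e.g. "x86_64", "arm64"
            "os": platform.platform(),                # OS name and version string
        },
        "network": {
            "hostname": socket.gethostname(),
            "local_ip": local_ip,
            "mac_address": f"{uuid.getnode():012x}",  # primary interface MAC
        },
    }

if __name__ == "__main__":
    print(json.dumps(device_fingerprint(), indent=2))
```

The point of the sketch is only that gathering this class of information takes a handful of standard calls; the real question the post raises is what gets bundled with it and where it is sent.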

Not the first reports — Worries over TikTok’s potential for exploiting user data have been around for basically as long as the app has been popular. That’s led to some pretty serious bans — U.S. Army operatives aren’t allowed to use it, for instance, and the TSA asked employees to stop using it at work in February.
But hard evidence is hard to come by in the case of TikTok. There are at least two ongoing official U.S. investigations into TikTok, but neither has released any results.
High-level executives, such as Reddit’s CEO and Facebook COO Sheryl Sandberg, are willing to publicly condemn TikTok as a threat, but none have brought hard evidence to the table to back up their claims. Instead, abstract “security concerns” are cited and re-cited — and they’re beginning to come off more as alarmist than realistic.
Anti-China sentiment — All it takes is a cursory review of the replies under bangorlol’s Reddit tell-all to understand that this isn’t just about data privacy — it’s about China. More specifically, it’s about the U.S.’s fear of China.
Back in January 2019, the Peterson Institute set off all this concern over TikTok by publishing a report calling the app a national security threat. Basically every report condemning TikTok since has cited that report, or at the very least piggybacked off its research.
Here’s the thing about that report: it centers on privacy concerns in China, rather than on TikTok itself. Peterson essentially called the app a threat because of its immense popularity and its parent company ByteDance’s roots in China.
More transparency, please — It’s impossible to tell whether or not bangorlol’s assessment of TikTok is correct; the post notes that all of their documentation has been lost to a motherboard failure. And even if their assessment is factual, there’s no proof that TikTok is actually collecting this information on any servers or using it in any way.
What we really need from TikTok is more transparency about what data it’s collecting from users and how that data is stored and used by the company. This is true across the board, though — just about every tech company could benefit from being more transparent. Users deserve to have a complete understanding of how and why their data is being collected.
Rather than face privacy concerns head-on, TikTok and ByteDance continually attempt to skirt them with convoluted location-based developer restrictions and subterfuge about where, exactly, the company is headquartered.
Anti-TikTok sentiment has mostly quieted down in the U.S. amid larger concerns like the COVID-19 pandemic and racial equality protests. But this isn’t over for ByteDance. As recently as last month, a European Union watchdog set up a task force to assess TikTok’s privacy risks.
TikTok may be massively popular right now — but if it hopes to achieve long-term success, the company behind it will need to learn to be more open about how it handles user data.
 

playahaitian

Rising Star
Certified Pussy Poster

US TikTok ban temporarily blocked as judge grants preliminary injunction
A federal judge has granted TikTok’s request for a preliminary injunction following an emergency hearing. The move temporarily halts a ban on new TikTok app downloads in the US, hours before it was due to come into effect.



 