Judge denies TikTok's request to dismiss Utah's case alleging sexual exploitation of children
In the lawsuit, Utah Division of Consumer Protection v. TikTok, filed in June 2024, the state alleges TikTok is aware its livestream feature allows young users to be sexually exploited, basing the allegations on the company's internal studies and admissions by its employees.
'TikTok's design tactics encourage and allow it to profit from crime and the sexual exploitation of children,' the suit states. 'These deceptive and unconscionable practices violate Utah's Consumer Sales Practices Act and harm Utah's consumers.'
TikTok tried to get the case dismissed, but on Friday, 3rd District Court Judge Coral Sanchez denied the request.
'Protecting Utah children from exploitation and the harms that TikTok knowingly inflicts upon them is one of my highest priorities as attorney general,' Utah Attorney General Derek Brown said in a press release responding to the ruling. 'I am grateful for the court's decision that allows Utah's lawsuit against TikTok to move forward, helping us protect young people from sexual exploitation. This decision will help me in my fight to protect and defend Utah children.'
About the case, a TikTok spokesperson previously said, 'TikTok has industry-leading policies and measures to help protect the safety and well-being of teens. Creators must be at least 18 years old before they can go LIVE, and their account must meet a follower requirement. We immediately revoke access to features if we find accounts that do not meet our age requirements.'
Over the past few years, attorneys general across the country have tried to address safety concerns associated with children and teens using social media platforms like TikTok and Snapchat. In a 2022 joint letter to the two platforms, 44 attorneys general warned of 'online dangers including cyberbullying, drug use and sexual predation' that children are allegedly subjected to while using the apps.
Utah filed its first lawsuit against TikTok in October 2023. In that consumer protection suit, the state alleged TikTok knew its social media app was addictive, pointing to features such as its algorithm and push notifications.
About that suit, a TikTok spokesperson previously said, 'TikTok has industry-leading safeguards for young people, including an automatic 60-minute time limit for users under 18 and parental controls for teen accounts. We will continue to work to keep our community safe by tackling industry-wide challenges.'
Related Articles
Yahoo · 2 hours ago
Beauty retailers must partner new brands to attract trend-led consumers
In an evolving UK health and beauty market in which consumers' preferences constantly shift, retailers must increase their partnerships with emerging brands to capture the attention of discerning younger generations. Social media platforms such as TikTok have accelerated shoppers' exposure to emerging beauty trends, and 41% of UK consumers stated they would try a trending beauty technique out of curiosity, up 6 percentage points (ppts) from 2024. Given the growing influence of social media, especially among young consumers, retailers must swiftly incorporate new brands into their offers and seek exclusive partnerships to capitalise on the latest social media trends.

GlobalData's June 2025 monthly survey of 2,000 UK respondents reveals that 34% of consumers rely on social media to stay up-to-date on current beauty trends, a 6 ppt increase on 2024. Younger consumers are driving this sentiment, with 67% of 16-to-24-year-olds relying on social media, indicating this is a primary source of inspiration for the demographic.

However, the fleeting nature of trends on social media presents a challenge for retailers, as what was popular a few months ago may no longer hold the same relevance. Retailers should partner with trending brands, ensuring they improve their perception among this younger age group. For example, awareness of trends such as slugging (the practice of applying a thick, occlusive ointment such as Vaseline as the final step in a nighttime skincare routine) and at-home teeth whitening has declined in 2025 compared with the previous year, while awareness of skin flooding (a skincare technique that involves layering multiple hydrating products on damp skin to maximise moisture absorption and retention) and light therapy has risen. Working closely with third-party brands specialising in these trends to secure small-batch or limited-edition runs of trending products will enable retailers to keep up with rapidly changing trends without the risk and cost of developing and launching their own-brand alternatives.

Given the pace at which new health and beauty trends emerge through social media, retailers must ramp up investment in category analysis capabilities. This must involve prioritising trending brands within social media marketing and allocating high-visibility shelf space in stores to these items. Market leader Boots has recognised this shift in consumer preferences and has been actively refreshing its brand selection to reflect it. In June 2025 the retailer introduced popular K-beauty brands such as Laneige and Beauty of Joseon, along with new brands such as Humantra. Boots' strategy for expanding its market reach has been effective; it is attracting younger consumers while still appealing to its existing diverse customer base.

"Beauty retailers must partner new brands to attract trend-led consumers" was originally created and published by Retail Insight Network, a GlobalData owned brand.


The Hill · 3 hours ago
Utah accuses Snapchat of designing algorithm addictive to children
Top Utah officials are suing Snap Inc., which owns the social media platform Snapchat, accusing it of creating an algorithm that addicts children to the app and of enabling illegal drug sales and sexual exploitation. Republican Gov. Spencer Cox and state Attorney General Derek Brown filed the lawsuit on Monday, saying Snap 'profits from unconscionable design features created to addict children to the app, and facilitates illegal drug sales and sextortion.'

The image-sharing app allows users to send pictures that disappear after they are viewed, which the lawsuit calls a 'favored tool for drug dealers and sexual predators targeting children.' The lawsuit details four cases since 2021 in which men groomed, sexually abused or assaulted children through Snapchat. It also cites the arrest of a drug dealer running a 'truly massive' drug ring through Snapchat in 2019.

The lawsuit also targets the platform's AI feature, 'My AI,' which allows users to send it text, pictures and video; that claim 'comes as states confront the harsh realities of AI technology's impact on children.' The suit accuses the AI model of 'hallucinating false information and giving dangerous advice' to users, including minors. 'Tests on underage accounts have shown My AI advising a 15-year-old on how to hide the smell of alcohol and marijuana; and giving a 13-year-old account advice on setting the mood for a sexual experience with a 31-year-old,' the lawsuit states.

'This lawsuit against Snap is about accountability and about drawing a clear line: the well-being of our children must come before corporate profits,' Cox said in a statement. 'We won't sit back while tech companies exploit young users.'

The state also accuses Snap of deceiving users and their parents about the safety of its platform, alleging it violates the Utah Consumer Privacy Act by not informing users of its data-sharing practices and by failing to allow users to opt out of sharing their data. The suit states that the AI feature still collects user geolocation data even when 'Ghost Mode,' which hides users' location from other users, is activated. 'Snap's commitment to user safety is an illusion,' the lawsuit reads. 'Its app is not safe, it is dangerous.'

The Hill has reached out to Snap Inc. for comment. The filing is Utah's fourth lawsuit against social media companies, following suits against Meta, which owns Facebook and Instagram, and TikTok. Utah is not the first state to sue the platform over its impact on children; in April, Florida sued the platform as well, making similar allegations about its harm to children.


WIRED · 3 hours ago
AI Videos of Black Women Depicted as Primates Are Going Viral
Jul 1, 2025 1:31 PM

Some Instagram creators are using Google's Veo 3 and racking up millions of views on AI videos of 'bigfoot baddies.' They'll teach you how to make them for $15.

An AI-generated 'bigfoot baddie,' with acrylic nails and a pink wig, speaks directly to her imaginary audience using an iPhone. 'We might have to go on the run,' she says. 'I'm wanted for a false report on my baby daddy.' This AI video, generated by Google's Veo 3, has racked up over a million views on Instagram. It's just one of many viral posts on Instagram and TikTok viewed by WIRED that depict Black women as primates and perpetuate racist tropes using AI video tools.

Google's Veo 3 was a hit with online audiences when it dropped at the company's developer conference in May. Surreal generations of Biblical characters and cryptids, like bigfoot, doing influencer-style vlogging quickly spread across social media. AI-generated bigfoot vlogs were even used by Google as a selling point in ads promoting the new feature. With 'bigfoot baddies,' online creators are taking what was a fairly innocuous trend on social media and repurposing it to dehumanize Black women.

'There's a historical precedent behind why this is offensive. In the early days of slavery, Black people were overexaggerated in illustrations to emphasize primal characteristics,' says Nicol Turner Lee, director of the Center for Technology Innovation at the Brookings Institution. 'It's both disgusting and disturbing that these racial tropes and images are readily available to be designed and distributed on online platforms.'

One of the most popular Instagram accounts posting these generated clips has five videos with over a million views, less than a month after the account's first post. The AI videos feature the animal-woman hybrids speaking African American Vernacular English in a caricatured manner, with the characters often shown wearing a bonnet and threatening to fight people. In one clip, the AI generation, using a country accent, implies she pulled out a bottle of Hennessy liquor that was stored in her genitals. Veo 3 can create everything seen in videos like this, from the scenery to the spoken audio to the characters themselves, from a single prompt.

The bio of the popular Instagram account includes a link to a $15 online course where you can learn how to create similar videos. In videos with titles like 'Veo 3 does the heavy lifting,' three teachers use voiceover to step students through the process of prompting the AI video tool for bigfoot clips and creating consistent characters. The email address listed as the administrator of the online course bounced back messages when WIRED attempted to contact the creators.

A spokesperson for Meta, which owns Instagram, declined to comment on the record. Google and TikTok both acknowledged WIRED's request for comment but did not provide a statement prior to publication.

Our social media analysis found copycat accounts on Instagram and TikTok reposting the 'bigfoot baddie' clips or generating similar videos. A repost of one video on Instagram has 1 million views on an AI-focused meme page. A different Instagram account has another 'bigfoot baddie' video with almost 3 million views. It's not just on Instagram; an account on TikTok dedicated to similar AI-generated content currently has over 1 million likes. These accounts did not immediately respond to a request for comment.

'If I die here, I better get resurrected with a BBL,' says an AI-generated female bigfoot on a different account, talking to the camera as she dodges bombs while vacationing in Israel.

'One of the problems with generative AI is that the creators of AI tools cannot conceive of all of the ways that people can be horrible to each other,' says Meredith Broussard, a professor at New York University and author of More Than a Glitch, a book about biases in technology. 'So, they can't put up a sufficient number of guardrails. It's exactly the same problem we've seen on social media platforms.'

A screenshot of one of the 'Bigfoot Baddies' videos WIRED found on Instagram. The video was generated by AI tools. Courtesy of Reece Rogers

After clicking on a few of the female bigfoot videos, the Instagram Reels feed for our test account was soon filled by the algorithm with other racist videos, including an AI generation of a Black man on a fishing boat excitedly catching a piece of fried chicken and referring to a chimpanzee as his son. While these AI videos are upsetting, they are not necessarily surprising. Back in 2023, as an AI-generated video of Will Smith eating spaghetti was going viral on social media, WIRED senior writer Jason Parham dissected the video as a form of minstrelsy. 'This coming age of new minstrelsy will assume an even more cunning chameleon form, adaptive and immediate in its guile, from humanistic deepfakes and spot-on voice manipulations to all manner of digital deceit,' Parham wrote at the time.

With this latest wave of generative AI video tools, helmed by Google's Veo 3, it's never been easier to produce photorealistic AI videos. The ease of generating numerous videos, paired with the consistent spread of AI slop on social media platforms, is part of what has popularized these 'bigfoot baddies.' Social media trends in which creators use AI to attack minority groups will likely continue. 'AI has not only made it easier to manipulate images,' Turner Lee says. 'But the algorithm itself, and the ecology of the algorithm, has also made it easier to share or to ramp up your consumption of this content.'