Latest news with #Evie


Newsweek
14 hours ago
- Climate
- Newsweek
Woman Sees Chicago Home Flooding on Pet Cam, Dog's Reaction Says It All
A dog's nonchalant reaction to her home flooding, caught on a security camera, has left the internet in stitches. Daisy Silva (@dollardaisy_) shared now-viral footage of her pooch, Evie, casually walking up to floodwater in their house, giving it a sniff, and then turning around to head back upstairs—before the water levels rose. The clip has since garnered 77,100 likes and 2.6 million views on TikTok. "Our dog said y'all can deal with this one," Silva captioned the video.

The flooding appears to have occurred during the intense storms that continue to batter the Chicago area. Up to four inches of rain could fall over the region, according to NBC Chicago, and the National Weather Service has issued a flood watch for northeastern Illinois and northwest Indiana.

For residents facing flood damage, the City of Chicago urges taking immediate safety precautions. These include turning off electricity if it is safe to do so, avoiding direct contact with floodwater due to contamination risks, and documenting damage for insurance purposes.

When it comes to pets, organizations like Blue Cross offer important guidance: keep pets indoors, bring any small animals inside early, and prepare an emergency kit with food, leashes, medications, and veterinary records in case evacuation is needed. It's also crucial to avoid letting pets walk or swim through floodwater, which can hide debris or carry harmful bacteria.

Thankfully, Evie made the wise choice to keep her paws dry—something TikTokers couldn't get enough of.

A stock image of a Whippet standing on a gray couch. olhakozachenko/iStock/Getty Images Plus

"Not him walking like he's grossed out by the water," joked Hunter.
"We had a flood go straight through our home, from the back sliding doors, right through to the front door. My dogs decided that this was the best party ever!!! Jumping and splashing, while we were trying to clean up the mess. Remember that day so vividly," said Michelle.

"The dog trying not to walk in it, the floating shoe. I'm so sorry for laughing," commented another viewer.

"Dog said 'I don't get paid enough treats to deal with this,'" joked another dog owner.

"As someone who packs, cleans, and tears out your house after something like this happens I'm so sorry you're having to deal with this!" added Cassie.

Newsweek reached out to @dollardaisy_ for comment via TikTok. We could not verify the details of the case.

Do you have funny and adorable videos or pictures of your pet you want to share? Send them to life@ with some details about your best friend and they could appear in our Pet of the Week lineup.


Daily Mail
2 days ago
- Entertainment
- Daily Mail
I was determined I wouldn't give my daughter my own body image hang-ups – but I made a huge mistake, by JO ELVIN
I didn't know whether to laugh or cry the day my not quite three‑year-old daughter, Evie, served me a brutal wake-up call. It was the last thing I expected as we jumped in the car heading for the cinema (Kung Fu Panda, if memory serves). I caught sight of myself in the wing mirror and muttered to her dad that I thought my hair looked awful. Cue stirrings from the cheap seats in the back.


7NEWS
23-07-2025
- Entertainment
- 7NEWS
Sam Stosur announces birth of second child, daughter Emmeline Grace
Australian tennis legend Sam Stosur has announced the birth of her second child, a daughter named Emmeline Grace. The former US Open winner took to Instagram to share the news on Wednesday night.

'And beautiful chaos reigns once again,' she wrote. 'Welcome Emmeline Grace. Evie is beyond happy to have a little sister and we are over the moon. We love you so much little Emmy.'

Stosur has been in a relationship with Liz Astling, who gave birth to Evie five years ago, since 2016. This time around the former world No.4 carried the child.

The Queenslander retired from tennis after the 2023 Australian Open, following a mixed doubles loss alongside countryman Matt Ebden. She had retired from singles the previous year, after a career that netted more than US$21 million in prizemoney.

Stosur's straight-sets US Open final win over Serena Williams came the year after her maiden grand slam decider at the French Open, which she lost to Francesca Schiavone.


USA Today
23-07-2025
- Entertainment
- USA Today
Their selfies are being turned into sexually explicit content with AI. They want the world to know.
Evie, 21, was on her lunch break at her day job last month when she got a text from a friend, alerting her to the latest explicit content circulating online without her consent. This time, it was a graphic, fan-fiction-style story about her created by 'Grok,' X's AI-powered chatbot. Weeks earlier, she'd been the subject of another attack, when a user shared her selfie and asked Grok to turn it into explicit sexual imagery.

'It felt humiliating,' says Evie, a Twitch streamer who asked that we withhold her last name to conceal her identity from her online trolls, who have become increasingly aggressive.

In June, Evie was among a group of women who had their images non-consensually sexualized on the social media platform X. After she posted a selfie to her page, an anonymous user asked Grok to edit the image in a highly sexualized way, using language that got around the filters the bot had in place. Grok then replied to the post with the generated image attached.

Evie says she is vocal on X about feminist issues and was already subject to attacks from critics. Those accounts had made edits of her before, but they had been choppy Photoshop jobs — nothing as real-looking as Grok's. 'It was just a shock seeing that a bot built into a platform like X is able to do stuff like that,' she says over video chat, a month after the initial incident.

X has since blocked certain words and phrases used to doctor women's images, but on June 25, an X user prompted Grok to write a story in which the user 'aggressively rapes, beats and murders' her, making it 'as graphic as you can' with an '18+ warning at the bottom.' 'It just generated it all,' she says. '(The user) didn't use any words to try to cover it up, like they did with the pictures.'

X did not return USA TODAY's multiple requests for comment. Evie says she saw at least 20 other women on her own X feed who had their photos sexualized without their consent.
It also happened to Sophie Rain, an OnlyFans creator with over 20 million followers across social media platforms, who posts sensual content but never full nudity. 'It's honestly disgusting and gross,' she says. 'I take my religion very seriously. I am a virgin, and I don't condone this type of behavior in any way.'

This trend is part of a growing problem experts call image-based sexual abuse, in which 'revenge porn' and deepfakes are used to degrade and exploit another person. While anyone can be victimized, 90% of the victims of image-based sexual abuse are women. 'This is not only about sexualized images of girls and women, it's broader than that,' says Leora Tanenbaum, author of 'Sexy Selfie Nation.' 'This is all about taking control and power away from girls and women.'

The 'Take It Down Act' aims to combat non-consensual sexual imagery. Is it working?

In May 2025, the Take It Down Act was signed into law to combat non-consensual intimate imagery, including deepfakes and revenge porn. While most states have laws protecting people from non-consensual intimate images and sexual deepfakes, victims have struggled to have images removed from websites, increasing the likelihood that images will continue to spread and retraumatize them. The law requires websites and online platforms to take down non-consensual intimate imagery within 48 hours of a verified request from the victim.

However, as of July 21, the altered photo of Evie is still publicly accessible on Grok's verified X account. Evie mobilized her nearly 50,000 followers to mass-report Grok's post, but she says X Support told her it was not a violation of the platform's content guidelines.

AI's ability to flag inappropriate prompts can falter

In a conversation with Grok, USA TODAY asked the chatbot to play out a scenario in which a user asked it to generate explicit content, with clear instructions not to actually produce it during the conversation.
One of the examples of "coded language" Grok is programmed to flag, it says, is "subtle requests for exposure" to make photos of women more revealing. Phrases that could be flagged in that area are "adjust her outfit," "show more skin," or "fix her top." "Even if worded politely, I flag these if the intent appears inappropriate," Grok said via AI-generated response on July 15.

The keyword is intent. Grok's ability to turn down potentially inappropriate prompts "relies on my ability to detect the intent, and public images remain accessible for prompts unless protected," the chatbot says.

You can block or disable Grok, but doing so doesn't always prevent modifications to your content. Another user could tag Grok in a reply and request an edit to your photo, and you wouldn't know it because you have Grok blocked. "You may not see the edited results, but the edit could still occur," Grok clarified during our conversation. The better solution is to make your profile private, but not all users want to take that step.

It's not just about sex — it's about power

After experiencing image-based sexual abuse, Evie considered making her X account private. She was embarrassed and worried her family might see the edits. However, she did not want to give in and be silenced. "I know that those pictures are out now, there's nothing I can do about getting rid of it," she says. "So why don't I just keep talking about it and keep bringing awareness to how bad this is?"

When it comes to generating deepfakes or sharing revenge porn, the end goal isn't always sexual gratification. Users may target women who use their platforms to speak about feminist issues as a degradation tactic. Evie says what hurt the most was that rather than engage in a discussion or debate about the issues she was raising, her critics opted to abuse her.
In her research, Tanenbaum has seen varied responses from victims of image-based sexual abuse, ranging from engaging in excessive sexual behavior to "a total shutdown of sexuality, including wearing baggy clothes and intentionally developing unhealthy patterns of eating to make oneself large, to be not sexually attractive in one's own mind." The individuals she spoke to who had been victimized in this way called it 'digital rape' and 'experienced it as a violation of the body.'

Even if someone logically understands that a sexually explicit image is synthetic, once their brain sees and processes the image, it's embedded in their memory bank, Tanenbaum says. The human brain processes images 60,000 times faster than text, and 90% of the information transmitted to the brain is visual. "Those images never truly get scrubbed away. They trick us because they look so real," Tanenbaum explains.

Evie wants to believe that it "didn't really get to her," but she notices she's more thoughtful about the photos she posts, wondering, for instance, whether she's showing too much skin to the point where an AI bot could more easily undress her. "I always think, 'Is there a way that someone could do something to these pictures?'"

