
Latest news with #geography

Jannik Sinner's three-month drugs ban from tennis felt all too convenient, too light on the fading principle of strict liability... excuse those of us rooting for Carlos Alcaraz at Wimbledon, writes RIATH AL-SAMARRAI

Daily Mail

2 hours ago

  • Science
  • Daily Mail

This is a true story - a dog once ate my homework. Well, he clawed a couple of pages into a state of disrepair, but my geography teacher didn't quite buy that a border collie was solely responsible for the late submission. Nor did a different teacher when the same pet peed on my sister's English essay. I miss that dog, a good boy mostly, but I also wonder what those teachers might have brought to the arbitration of doping disputes in elite sport.

Alex Garland Reveals The Original Very Different Story Idea For 28 YEARS LATER — GeekTyrant

Geek Tyrant

21 hours ago

  • Entertainment
  • Geek Tyrant

Before 28 Years Later became the haunting, emotional, feverishly intense and wild film that we got, it was something else entirely. According to screenwriter Alex Garland, that original idea probably would've tanked the franchise.

In a recent interview with Rolling Stone, Garland opened up about a very early concept for the long-awaited third film in the rage-virus saga. Let's just say, if things had gone that route, we wouldn't be talking about 28 Years Later like we are right now. 'I had a version of this story that was basically a big, dumb action movie,' Garland admitted. The story would've followed a group of Chinese Special Forces soldiers who break quarantine and sneak into the U.K. to find the lab where the virus started, hoping to find a cure. But when they arrive, another group is already there... trying to weaponize it. 'It was completely and utterly f***ing generic,' Garland said. 'Shootouts and mass attacks and big, action-adventure-style set pieces.' Oh, and it would've been entirely in Mandarin with English subtitles.

Danny Boyle, his longtime creative partner, didn't exactly embrace the idea. 'He just laughed,' Garland recalled, adding that they both eventually tried to rework it, but 'finally, we both gave up on it.' Still, the process wasn't a total loss. Garland said: 'Writing something so generic was the freeing element to all of our problems. It gave us permission to have a totally blank slate.'

That 'blank slate' gave way to the film we eventually got, a movie that builds on the legacy of the original without trying to imitate it. It centers on a father and son (played by Aaron Taylor-Johnson and Alfie Williams) who leave their isolated island and reenter a U.K. that has mutated with the virus, both biologically and psychologically. The infected have evolved.

The reaction has been divisive, but I loved it! I liked the big swings that it took and enjoyed what it ultimately delivered. It also got me super excited about The Bone Temple. It's kind of a small miracle we got this version instead of the Mandarin-language military shoot-'em-up. Sure, Garland's scrapped idea could've made for something interesting, but it's clear the soul of the 28 Days universe lies in something more grounded, personal, disturbing, and bonkers. It's certainly not a generic film!

AI And Domestic Violence: Boon, Bane — Or Both?

Forbes

a day ago

  • Forbes

[Image: Output of an artificial intelligence system from Google Vision performing facial recognition on a photograph of a man, with facial features identified and bounding boxes drawn. Photo by Smith Collection/Gado/Getty Images]

One evening, a woman clicked open the camera app on her phone, hoping to capture what nobody else was around to see — her partner's escalating rage. Their arguments followed a familiar pattern: she would say something that set him off, and he would close the physical distance between them, screaming in a high-pitched voice, often threatening her with violence. 'I'd never actually do it, of course,' he would often say later on, once the dust had settled between them. 'You're the one mentally torturing me.' Sometimes, the response would escalate even further. He would throw her phone across the room, upset by the 'invasion of [his] privacy', or snatch an object from her hands, raising it as if to strike her with it. No physical bruises were left, but the writing was on the wall — with no device to capture it, no alert to trigger and no safe place to store the evidence.

For many women, this isn't a plot point from a cringey Netflix drama — it's near-daily reality, and it's the kind of behavior that rarely results in a police complaint. Notably, while threats of physical harm are explicitly criminal in many jurisdictions — including India and the U.S. — they've long gone undocumented and unprosecuted. Experts note that this very pattern — escalating verbal threats, threatened or actual destruction of property and intimidation — often marks the early stages of more serious and damaging domestic violence.

In certain contexts, AI-enabled tools are making it easier to discreetly gather evidence, assess personal risk and document abuse — actions that were previously unsafe or more difficult to carry out. At the same time, these technologies open up unprecedented avenues for new forms of harm. Increasingly, the most common 'eyewitness' in situations like these is a phone, a cloud account or a smart device — secretly recording, storing and offering support or a lifeline. But just as easily, the same tools can be turned into instruments of control, surveillance and even manipulated retaliation.

Tech For Good

Around the world, one in three women has experienced physical or sexual violence by a partner, according to the World Health Organization. As AI becomes more embedded in everyday devices, a growing number of tools have emerged, often with the stated goal of making homes safer for those at risk — particularly those experiencing intimate partner violence.

During the COVID-19 pandemic, as cases of domestic violence surged, Rhiana Spring, a human rights lawyer and founder of the Swiss-based nonprofit Spring ACT, saw an opportunity to deploy AI for good. Her organization developed Sophia, a chatbot that offers confidential, 24/7 assistance to domestic violence survivors. Users can talk to Sophia without leaving a digital trace, covertly seek help and even store evidence for use in legal proceedings. Unlike traditional apps, Sophia doesn't require a download, minimizing surveillance risks. 'We've had survivors contact Sophia from Mongolia to the Dominican Republic,' Spring told Zendesk after winning a Tech for Good award in 2022.

Meanwhile, smart home cameras, like those from Arlo or Google Nest, now offer AI-driven motion and sound detection that can distinguish between people, animals and packages. Some can even detect screaming or unusual sounds and send alerts instantly — features that can be valuable for creating a digital record of abuse, especially when survivors are worried about gaslighting or lack physical evidence. Several CCTV systems also allow cloud-based, encrypted storage, which prevents footage from being deleted or accessed locally by an abuser. Services like Wyze Cam Plus offer affordable cloud subscriptions with AI tagging, and features like 'privacy masking' allow selective blackouts in shared spaces.

For discreet assistance, several smartphone apps also integrate AI with panic alert features. Examples include MyPlan, Aspire News — which poses as a news app but offers emergency contacts and danger assessment tools — and Circle of 6. Smart jewelry like InvisaWear and Flare hides panic buttons in accessories; with a double-tap, users can clandestinely notify emergency contacts and share their GPS location.

Beyond home safety and personal apps, AI is also entering hospitals and law enforcement in the context of domestic violence response and prevention. Dr. Bharti Khurana, a radiologist at Brigham and Women's Hospital, developed an AI-powered tool called the Automated Intimate Partner Violence Risk Support (AIRS) system, which scans medical records and imaging data for subtle injury patterns often missed by doctors and flags patients who may be victims of abuse. According to Khurana's team, AIRS has helped identify domestic violence up to four years earlier than patients typically report it.

Another U.S.-based initiative, Aimee Says, was launched in Colorado to help survivors navigate the complexities of the legal system. The chatbot walks users through the process of filing protection orders, finding support organizations and understanding their rights. The app features guest-mode sessions that disappear after use, as well as a hidden exit button for quick redirection if an abuser walks into the room. 'We want to be there before the person is ready to reach out to a victim service organization — hopefully, early enough to prevent a future of violence,' said co-founder Anne Wintemute in a December 2024 interview with The Decatur Daily.

Double-Edged Sword

In India and much of the Global South, domestic violence continues to be rampant and hugely underreported. According to the National Family Health Survey (NFHS-5), nearly one in three Indian women aged 18 to 49 has experienced spousal violence — but only a fraction seek help, often due to stigma, dependency, fear of escalation or lacunae in response frameworks and accountability. In these contexts, AI has the potential to be a particularly powerful tool — helping survivors document abuse or seek help — but its reach is limited by access, resources and trust in the technology itself. Surveillance concerns also loom large, especially in environments where privacy is already compromised.

Moreover, the same technologies that support survivors can also open new avenues for harm — particularly when wielded by abusers. Deepfake technology, which uses generative AI to produce hyper-realistic fake audio, images or video, is already complicating legal proceedings, with fabricated call logs, messages or videos sometimes used to falsely implicate victims. In restraining order hearings or custody disputes, which often happen quickly and with limited fact-finding, courts may have little time or capacity to assess the authenticity of digital evidence.

Products that store data, enable remote surveillance and monitor behavior can just as easily be weaponized by abusers. Few tech companies offer transparency and accountability on how their tools could be misused in these ways, or build in strong enough safety features by design. 'In parallel, the emergence of deepfake technology … also raises alarms regarding privacy invasion, security risks and propagation of misinformation,' warned Supreme Court Justice Hima Kohli of India, explaining how easy it has become to manipulate trust in digital content.

The same code that serves as a lifeline, then, can become a weapon in the same breath. As AI evolves, the real test for the tech industry is not just how 'smart' its tools can become, but how safely and justly they can adapt to serve those who need them most.
