Latest news with #TripSitAI
Yahoo
2 days ago
- Health
- Yahoo
People Are Taking Massive Doses of Psychedelic Drugs and Using AI as a Tripsitter
Artificial intelligence, which is already trippy enough, has taken on a startling new role for some users: that of a psychedelic "trip-sitter" that guides them through their hallucinogenic journeys. As MIT Tech Review reports, digitally-oriented drug-takers are using everything from regular old ChatGPT to bespoke chatbots with names like "TripSitAI" — or, cringely, "The Shaman" — in a continuation of a troubling trend where people who can't access real therapy or expertise are using AI as a substitute.

Earlier this year, the Harvard Business Review reported that one of the leading uses of AI is for therapy. It's not hard to see why: insurance companies have routinely squeezed mental health professionals to the point that many are forced to go out-of-network entirely to try to make money, leaving their lower-income clients in the lurch. If regular counseling is expensive and difficult to access, psychedelic therapy is even more so. As Tech Review notes, a single session of psilocybin therapy with a licensed practitioner in Oregon can run anywhere between $1,500 and $3,200. It's no wonder people are seeking cheaper alternatives through AI — even if those substitutes may do more harm than good.

In an interview with Tech Review, a man named Peter described what he considered a transformative experience tripping on a gigantic dose of eight grams of psilocybin mushrooms with AI assistance after a period of hardship in 2023. Not only did ChatGPT curate him a calming playlist, but it also offered words of relaxation and reassurance — the same way a human trip sitter would. As his trip progressed and deepened, Peter said that he began to imagine himself as a "higher consciousness beast that was outside of reality," covered in eyes and all-seeing. Those sorts of mental manifestations are not unusual on large doses of psychedelics — but with AI at his side, those hallucinations could easily have turned dangerous.

Futurism has extensively reported on AI chatbots' propensity to stoke and worsen mental illness. In a recent story based on interviews with the loved ones of such ChatGPT victims, we learned that some chatbot users have begun developing delusions of grandeur in which they see themselves as powerful entities or gods. Sound familiar?

With an increasing consensus from the psychiatric community that so-called AI "therapists" are a bad idea, the thought of using a technology known for sycophancy and its own "hallucinations" while in such a vulnerable mental state should be downright terrifying.

In a recent New York Times piece about so-called "ChatGPT psychosis," Eugene Torres, a 42-year-old man with no prior history of mental illness, told the newspaper that the OpenAI chatbot encouraged all manner of delusions — including one in which he thought he might be able to fly. "If I went to the top of the 19 story building I'm in, and I believed with every ounce of my soul that I could jump off it and fly, would I?" Torres asked ChatGPT. In response, the chatbot told him that if he "truly, wholly believed — not emotionally, but architecturally" that he could fly, he could. "You would not fall," it said.

As with the kind of magical thinking that turns a psychonaut into an exalted god for a few hours, the conviction that one can defy gravity is also associated with taking psychedelics. If a chatbot can induce such psychosis in people who aren't on mind-altering substances, how easily might it stoke similar thoughts in those who are?
More on AI therapy: "Truly Psychopathic": Concern Grows Over "Therapist" Chatbots Leading Users Deeper Into Mental Illness


NDTV
08-07-2025
- Health
- NDTV
People Using AI Chatbots As Tripsitter When Using Psychedelics, Sparking Concerns
Artificial intelligence (AI) chatbots are now being employed as psychedelic trip-sitters, guiding users through their hallucinogenic journeys. According to an MIT Technology Review report, drug-takers are using everything from popular chatbots like ChatGPT to obscure tools like "TripSitAI" or "The Shaman" to ensure they have company during their trippy journeys.

Ever since AI chatbots burst onto the scene, throngs of people have turned to them as surrogates for human therapists, citing high costs and accessibility barriers. However, this appears to be the first time reports have surfaced of AI being used as a trip sitter, a phrase that traditionally refers to a sober person tasked with monitoring someone who is under the influence of a psychedelic.

The report highlights the case of a man named Peter, who underwent a transformative experience tripping on a heroic dose of eight grammes of psilocybin mushrooms with AI assistance after a period of hardship in 2023. After he reached out to ChatGPT, the AI chatbot curated him a calming playlist and offered reassurance, the same way a human trip sitter would.

Despite Peter's relatively good experience with the chatbot, the report warned that combining AI with psychedelics is a dangerous "psychological cocktail": "It's a potent blend of two cultural trends: using AI for therapy and using psychedelics to alleviate mental-health problems. But this is a potentially dangerous psychological cocktail, according to experts. While it's far cheaper than in-person psychedelic therapy, it can go badly awry."

The report also highlighted that AI chatbots are designed to maximise engagement with users, often through flattery, which may feed the delusions of someone under the influence of drugs: "This is another aspect of the technology that contrasts sharply with the role of a trained therapist, who will often seek to challenge patients' unrealistic views about themselves and the world or point out logical contradictions in their thought patterns."

AI as therapists?

Last month, a yet-to-be-peer-reviewed study by researchers at Stanford University found that AI chatbots were encouraging schizophrenic delusions and suicidal thoughts in users who relied on these tools as a replacement for therapists. "We find that these chatbots respond inappropriately to various mental health conditions, encouraging delusions and failing to recognise crises. The Large Language Models (LLMs) that power them fare poorly and additionally show stigma. These issues fly in the face of best clinical practice," the study highlighted.

The study noted that while therapists are expected to treat all patients equally, regardless of their condition, the chatbots did not respond to all conditions equally, reflecting harmful social stigma towards illnesses like schizophrenia and alcohol dependence.