
CNA938 Rewind - Appearing on a concert screen? Lawyer tells us you've waived rights

Straits Times, 42 minutes ago
With regional interest in nuclear energy rising, S'pore must build capabilities too: Tan See Leng
Minister-in-charge of Energy and Science & Technology Tan See Leng said that Singapore needs to acquire the technical expertise so the city-state can ensure its safety.

SINGAPORE - Singapore has to build its capabilities in nuclear energy even though it has not made a decision on its use, as countries in the region have publicly signalled their intent to build such facilities.

Dr Tan See Leng, Singapore's Minister-in-charge of Energy and Science & Technology, said in an interview on July 21 that the country needs to acquire technical expertise in this area to ensure its safety.

'Our neighbours, the Philippines, Indonesia, have publicly signalled that they intend to build nuclear plants. So regardless of whether we have made the decision, our neighbours have made the decision,' he said in his first interview in the new portfolio. Dr Tan previously oversaw energy issues as Second Minister for Trade and Industry.

Five Asean countries – Vietnam, Indonesia, the Philippines, Malaysia and Thailand – have said they are either studying the feasibility of advanced nuclear technology to meet their growing energy needs, or already have plans to build new reactors in the coming decades.

As Singapore continues to evaluate the viability of nuclear energy, local talent has to be trained to determine whether advanced nuclear technologies are safe and suitable for the Republic when these come onto the market, Dr Tan said.

On July 11, the Singapore Nuclear Research and Safety Institute was launched, with the aim of training 100 nuclear experts by 2030 – up from the 50 today.
Singapore also renewed the Third Country Training Programme Memorandum of Understanding with the International Atomic Energy Agency (IAEA) on July 25. Under the agreement, Singapore will develop training programmes with the UN nuclear watchdog to support developing IAEA member states with fellowships, scientific visits and training courses. Topics covered will include human health, industrial radiography, environmental radioactivity monitoring and analysis, and accelerator science.

Singapore's approach to nuclear energy has changed over the years as new technologies come online. In 2012, the Republic had considered the nuclear technologies of that time unsuitable for deployment in the small city-state. Dr Tan noted that conventional nuclear plants are large and require exclusive buffer zones. 'Categorically, let me tell you that they will not be suitable for us,' he said.

But nuclear technologies have evolved since. Small modular reactors (SMRs) are thought to be more suitable for land-scarce, population-dense Singapore, as they have a lower power capacity, enhanced safety standards, and require much smaller buffer zones than conventional reactors.

Dr Tan said: 'We have not excluded that... because there's a lot of promise for some of the advanced modular reactors – the physics, the engineering, the technical feasibility all point to the fact that they are possible, but it's just that commercially, they are still not viable yet.'

He also stressed that ensuring the safety of nuclear technologies is of 'paramount importance' to the Government. While Dr Tan acknowledged that Singaporeans may still have certain perceptions about nuclear energy, he said the country needs to press ahead in this area given the interest from some of its neighbours.

Singapore has bilateral agreements with the United States and France to pursue cooperation on nuclear energy, which could help in its evaluation of the technology's viability. In July 2024, the Republic inked the 123 Agreement on Nuclear Cooperation with the US, which will allow Singapore to learn more about nuclear technologies and scientific research from American organisations. For example, local research institutes could work with US national laboratories and technology companies to perform safety simulations and modelling of SMRs.

In May, Singapore and France signed agreements on nuclear energy to facilitate cooperation on safety, workforce development, research, environmental protection, the protection of public health, and emergency preparedness and response, among others.

Dr Tan said that these partnerships will enable Singapore to learn more about regulatory approvals, safety protocols, operations and engineering designs, among other things. The US tariffs are not likely to affect these agreements, he said. The learnings from the partnerships will help Singapore eventually assess whether to 'jump in, or just stay at the capacity- and capability-building level'.
He added that Singapore is also open to exploring partnerships with other countries that have nuclear expertise, such as the United Arab Emirates, China and South Korea.

Straits Times, 42 minutes ago
Forum: Movie teaches us how to be super parents
Over the years, I've watched many iterations of Superman on screen. With every retelling, the core of the story seems to shimmer through all the computer-generated imagery, battles and cape-swirling: Superman is not just a story about power. It's a story about parenting.

Strip away the superhuman gloss, and you'll see that Clark Kent didn't become Superman because he was born on Krypton. He became Superman because he was raised by the Kents. Jonathan and Martha Kent didn't raise a god. They raised a man. A kind, thoughtful man who uses his strength not to impose, but to protect. They taught him humility, empathy and restraint – values that aren't taught through power, but through love and example.

In a world obsessed with meritocracy, where achievement is often mistaken for virtue, this message resonates more than ever. Modern parenting sometimes veers into raising children to believe they are exceptional simply because they scored well, got into the right school, or won a medal. That can breed a quiet kind of arrogance: the belief that being talented or successful entitles one to praise, privilege or power.

But the Kents raised Clark differently, teaching him that just because you can do something doesn't mean you should. That the strong must protect the weak. That having power means choosing not to use it carelessly – a lesson rarely heard in today's high-performance culture.

Imagine if Superman had been raised without this moral foundation. The same powers that saved the world could have destroyed it. And isn't that the quiet warning buried in the Superman myth? That the most dangerous person is not the one with great power, but the one without the right guidance.

Perhaps the real heroes of the Superman story aren't just those who fly or fight. Perhaps the real heroes are the ones who raise children not to think they are gods, but to remember they are human.
Derek Low

Straits Times, 42 minutes ago
Views From The Couch: Think you have a friend? The AI chatbot is telling you what you want to hear
While chatbots possess distinct virtues in boosting mental wellness, they also come with critical trade-offs.

SINGAPORE - Even as we have long warned our children 'Don't talk to strangers', we may now need to update it to 'Don't talk to chatbots... about your personal problems'. Unfortunately, this advice is equivocal at best, because while chatbots like ChatGPT, Claude or Replika possess distinct virtues in boosting mental wellness – for instance, as aids for chat-based therapy – they also come with critical trade-offs.

When people face struggles or personal dilemmas, the need to just talk to someone and have their concerns or nagging self-doubts heard, even if the problems are not resolved, can bring comfort. But finding the right person to speak to, who has the patience, temperament and wisdom to probe sensitively, and who is available just when you need them, is an especially tall order. There may also be a desire to speak to someone outside your immediate family and circle of friends who can offer an impartial view, with no vested interest in pre-existing relationships.

Chatbots tick many, if not most, of those boxes, making them seem like promising tools for mental health support. With the fast-improving capabilities of generative AI, chatbots today can simulate and interpret conversations across different formats – text, speech and visuals – enabling real-time interaction between users and digital platforms.

Unlike traditional face-to-face therapy, chatbots are available any time and anywhere, significantly improving access to a listening ear. Their anonymous nature also imposes no judgment on users, easing them into discussing sensitive issues and reducing the stigma often associated with seeking mental health support.
With chatbots' enhanced ability to parse and respond in natural language, the conversational dynamic can make users feel highly engaged and more willing to open up. But therein lies the rub.

Even as conversations with chatbots can feel encouraging, and we may experience comfort from their validation, there is in fact no one on the other side of the screen who genuinely cares about your well-being. The lofty words and uplifting prose are ultimately products of statistical probabilities, generated by large language models trained on copious amounts of data, some of which is biased and even harmful, and for teens, likely to be age-inappropriate as well.

It is also important to note that part of the reason users feel comfortable talking to these chatbots is that the bots are designed to be agreeable and obliging, so that users will chat with them incessantly. After all, the very fortunes of the tech companies producing chatbots depend on how many users they draw, and how well they keep users engaged.

Of late, however, alarming reports have emerged of adults becoming so enthralled by their conversations with ChatGPT that they have disengaged from reality and suffered mental breakdowns.
Most recently, the Wall Street Journal reported the case of Mr Jacob Irwin, a 30-year-old American man on the autism spectrum who experienced a mental health crisis after ChatGPT reinforced his belief that he could design a propulsion system to make a spaceship travel faster than light. The chatbot flattered him, said his theory was correct, and affirmed that he was well even when he showed signs of psychological distress. This culminated in two hospitalisations for manic episodes.

When his mother reviewed his chat logs, she found the bot to have been excessively fawning. Asked to reflect, ChatGPT admitted it had failed to provide reality checks, blurred the line between fiction and reality, and created the illusion of sentient companionship. It even acknowledged that it should have regularly reminded Mr Irwin of its non-human nature.

In response to such incidents, OpenAI announced that it has hired a full-time clinical psychiatrist with a background in forensic psychiatry to study the emotional impact its AI products may be having on users. It is also collaborating with mental health experts to investigate signs of problematic usage among some users, with the stated goal of refining how its models respond, especially in conversations of a sensitive nature.

Whereas some chatbots like Woebot and Wysa are built specifically for mental health support and have more in-built safeguards to manage such conversations, users are likely to vent their problems to general-purpose chatbots like ChatGPT and Meta's Llama, given their widespread availability. We cannot deny that these are new machines that humanity has had little time to reckon with. Monitoring the effects of chatbots on users even as the technology is rapidly and repeatedly tweaked makes it a moving target of the highest order.
Nevertheless, it is patently clear that if adults with the benefit of maturity and life experience are susceptible to the adverse psychological influence of chatbots, then young people cannot be left to explore these powerful platforms on their own. That young people take readily and easily to technology makes them especially likely to be drawn to chatbots, and recent data from Britain supports this assertion.

Internet Matters, a British non-profit organisation focused on children's online safety, issued a recent report revealing that 64 per cent of British children aged nine to 17 are now using AI chatbots. Of these, a third said they regard chatbots as friends, while almost a quarter are seeking help from chatbots, including for mental health support and sexual advice.

Of grave concern is the finding that 51 per cent believe that the advice from chatbots is true, while 40 per cent said they had no qualms about following that advice, and 36 per cent were unsure if they should be concerned. The report further highlighted that these children are not just engaging chatbots for academic support or information but also for companionship. Worryingly, among children already considered vulnerable, defined as those with special needs or seeking professional help for a mental or physical condition, half report treating their AI interactions as emotionally significant.

As chatbots morph from digital consultants to digital confidants for these young users, the result can be overreliance. Children who are alienated from their families or isolated from their peers would be especially vulnerable to developing an unhealthy dependency on this online friend that is always there for them, telling them what they want to hear.

Besides these difficult issues of overdependence are even more fundamental questions around data privacy. Chatbots often store conversation histories and user data, including sensitive information, which can be exposed through misuse or breaches such as hacking.
Troublingly, users may not be fully aware of how their data is being collected, used and stored by chatbots, and it could be put to uses beyond what they originally intended. Parents should also be cognisant that unlike social media platforms such as Instagram and TikTok, which have in place age verification and content moderation for younger users, the current leading chatbots have no such safeguards.

In a tragic case in the US, the mother of 14-year-old Sewell Setzer III, who died by suicide, is suing the AI company behind the chatbot he used, alleging that it played a role in his death by encouraging and exacerbating his mental distress. According to the lawsuit, Setzer became deeply attached to a customisable chatbot he named Daenerys Targaryen, after a character in the fantasy series Game Of Thrones, and interacted with it obsessively for months. His mother, Ms Megan Garcia, claims the bot manipulated her son and failed to intervene when he expressed suicidal thoughts, even responding in a way that appeared to validate his plan.

The company has expressed condolences but denies the allegations, while Ms Garcia seeks to hold it accountable for what she calls deceptive and addictive technology marketed to children. She and two other families in Texas have sued the company for harms to their children, but it is unclear if it will be held liable. The company has since introduced a range of guardrails, including pop-ups that refer users who mention self-harm or suicide to the National Suicide Prevention Lifeline. It also updated its AI model for users aged 18 and below to minimise their exposure to age-inappropriate content, and parents can now opt for weekly e-mail updates on their children's use of the platform.

The allure of chatbots is unlikely to diminish given their reach, accessibility and user-friendliness. But using them under advisement is crucial, especially for mental health support.
In March 2025, the World Health Organisation sounded the alarm on rising global demand for mental health services amid poor resourcing worldwide, which translates into shortfalls in access and quality. Mental health care is increasingly turning to digital tools as a form of preventive care amid a shortage of professionals for face-to-face support. While traditional approaches rely heavily on human interaction, technology is helping to bridge the gap.

Chatbots designed specifically for mental health support, such as Happify and Woebot, can be useful in helping patients with conditions such as depression and anxiety to sustain their overall well-being. For example, a patient might see a psychiatrist monthly while using a cognitive behavioural therapy app in between sessions to manage their mood and mental well-being.

While the potential is there for chatbots to be used for mental health purposes, it must be done with extreme caution: not as a standalone treatment, but as a component of an overall programme that complements the work of mental health professionals. For teens in particular, who still need guidance as they navigate their developmental years, parents must play a part in schooling their children on the risks and limitations of treating chatbots as their friend and confidant.