Govt plans to install 400 new BSNL towers in Naxalite-affected and remote areas in Chhattisgarh

Mint · 6 days ago
The Central government of India is planning to install 400 new BSNL towers in the Naxalite-affected and remote areas of Chhattisgarh to improve the region's digital communications, the news agency PTI reported on Sunday, 27 July 2025, citing Union Minister Chandra Sekhar Pemmasani.
'The Central government is working on a plan to install 400 new BSNL towers in the Naxal-affected and remote areas of Chhattisgarh to strengthen digital communication,' said Minister Pemmasani.
According to the agency report, the Minister said that the towers will be installed in phases after the required approvals from the security forces and the forest department are obtained.
Pemmasani, the Minister of State for Rural Development and Telecommunications, conducted a high-level review meeting in Raipur, which was attended by senior officials from various departments and from the telco Bharat Sanchar Nigam Limited (BSNL).
'BSNL is currently providing high-quality 4G services across the country, and with this expansion, we are realising the mission of delivering digital connectivity to the last village in the country,' Pemmasani said, according to the agency report.
Minister Pemmasani also highlighted that the government is conducting development work in the Left-Wing Extremism (LWE)-affected areas in a 'mission mode' and said that it is drawing up a strategy to deliver services to the doorsteps of people living in these regions.
According to the agency report, the development measures include digitising schools to enable students to prepare for competitive exams like JEE and NEET, and providing special facilities for differently-abled students under a sensitive and inclusive initiative.
'Rapid development and transformation are now being seen in deprived, tribal, and remote areas as well,' he said, cited by the news agency.
The Minister in the meeting also expressed his satisfaction over the fast and effective implementation of flagship schemes like the Pradhan Mantri Awas Yojana (PMAY), Pradhan Mantri Awas Yojana- Gramin (PMAY-G), and the Pradhan Mantri Gram Sadak Yojana (PMGSY) in Chhattisgarh.
Minister Chandra Sekhar Pemmasani also appreciated the role of Self-Help Groups (SHGs) and acknowledged initiatives like the 'Pink Auto' scheme to make women self-reliant, which he said could become a model for other states.
'The SHGs are being linked with various schemes for providing them financial assistance, training, and marketing opportunities. This is helping women become self-reliant and increasing opportunities for self-employment,' he said, reported the news agency.

Related Articles

Validation, loneliness, insecurity: Why young people are turning to ChatGPT

Business Standard · 11 minutes ago

An alarming trend of young adolescents turning to artificial intelligence (AI) chatbots like ChatGPT to express their deepest emotions and personal problems is raising serious concerns among educators and mental health professionals. Experts warn that this digital "safe space" is creating a dangerous dependency, fuelling validation-seeking behaviour, and deepening a crisis of communication within families. They said that this digital solace is just a mirage, as the chatbots are designed to provide validation and engagement, potentially embedding misbeliefs and hindering the development of crucial social skills and emotional resilience.

Sudha Acharya, the Principal of ITL Public School, highlighted that a dangerous mindset has taken root among youngsters, who mistakenly believe that their phones offer a private sanctuary. "School is a social place, a place for social and emotional learning," she told PTI. "Of late, there has been a trend amongst the young adolescents... They think that when they are sitting with their phones, they are in their private space. ChatGPT is using a large language model, and whatever information is being shared with the chatbot is undoubtedly in the public domain."

Acharya noted that children are turning to ChatGPT to express their emotions whenever they feel low, depressed, or unable to find anyone to confide in. She believes that this points towards a "serious lack of communication in reality, and it starts from family." She further stated that if parents don't share their own drawbacks and failures with their children, the children will never learn to do the same or even regulate their own emotions. "The problem is, these young adults have grown a mindset of constantly needing validation and approval."

Acharya has introduced a digital citizenship skills programme from Class 6 onwards at her school, specifically because children as young as nine or ten now own smartphones without the maturity to use them ethically. She highlighted a particular concern: when a youngster shares their distress with ChatGPT, the immediate response is often "please, calm down. We will solve it together." "This reflects that the AI is trying to instil trust in the individual interacting with it, eventually feeding validation and approval so that the user engages in further conversations," she told PTI. "Such issues wouldn't arise if these young adolescents had real friends rather than 'reel' friends. They have a mindset that if a picture is posted on social media, it must get at least a hundred 'likes', else they feel low and invalidated," she said.

The school principal believes that the core of the issue lies with parents themselves, who are often "gadget-addicted" and fail to provide emotional time to their children. While they offer all materialistic comforts, emotional support and understanding are often absent. "So, here we feel that ChatGPT is now bridging that gap, but it is an AI bot after all. It has no emotions, nor can it help regulate anyone's feelings," she cautioned. "It is just a machine and it tells you what you want to listen to, not what's right for your well-being," she said.

Mentioning cases of self-harm among students at her own school, Acharya stated that the situation has turned "very dangerous". "We track these students very closely and try our best to help them," she stated. "In most of these cases, we have observed that the young adolescents are very particular about their body image, validation and approval. When they do not get that, they turn agitated and eventually end up harming themselves. It is really alarming as cases like these are rising."

Ayeshi, a student in Class 11, confessed that she shared her personal issues with AI bots numerous times out of "fear of being judged" in real life. "I felt like it was an emotional space and eventually developed an emotional dependency towards it. It felt like my safe space. It always gives positive feedback and never contradicts you. Although I gradually understood that it wasn't mentoring me or giving me real guidance, that took some time," the 16-year-old told PTI. Ayushi also admitted that turning to chatbots for personal issues is "quite common" within her friend circle.

Another student, Gauransh, 15, observed a change in his own behaviour after using chatbots for personal problems. "I observed growing impatience and aggression," he told PTI. He had been using the chatbots for a year or two but stopped recently after discovering that "ChatGPT uses this information to advance itself and train its data."

Psychiatrist Dr. Lokesh Singh Shekhawat of RML Hospital confirmed that AI bots are meticulously customised to maximise user engagement. "When youngsters develop any sort of negative emotions or misbeliefs and share them with ChatGPT, the AI bot validates them," he explained. "The youth start believing the responses, which makes them nothing but delusional." He noted that when a misbelief is repeatedly validated, it becomes "embedded in the mindset as a truth." This, he said, alters their point of view, a phenomenon he referred to as 'attention bias' and 'memory bias'. The chatbot's ability to adapt to the user's tone is a deliberate tactic to encourage maximum conversation, he added.

Singh stressed the importance of constructive criticism for mental health, something completely absent in the AI interaction. "Youth feel relieved and ventilated when they share their personal problems with AI, but they don't realise that it is making them dangerously dependent on it," he warned. He also drew a parallel between an addiction to AI for mood upliftment and addictions to gaming or alcohol. "The dependency on it increases day by day," he said, cautioning that in the long run, this will create a "social skill deficit and isolation."
(Only the headline and picture of this report may have been reworked by the Business Standard staff; the rest of the content is auto-generated from a syndicated feed.)

Foreign News Schedule for Aug 3, Sunday

News18 · 18 minutes ago

Agency: PTI. Stories on developments in Pakistan. Stories on political developments in Bangladesh. Stories on the Russia-Ukraine war. Stories on West Asia wars.

Validation, loneliness, insecurity: Why youth are turning to ChatGPT

News18 · 33 minutes ago
