Copy, paste, forget

Express Tribune · 17 hours ago
When Jocelyn Leitzinger had her university students write about times in their lives they had witnessed discrimination, she noticed that a woman named Sally was the victim in many of the stories.
"It was very clear that ChatGPT had decided this is a common woman's name," said Leitzinger, who teaches an undergraduate class on business and society at the University of Illinois in Chicago.
"They weren't even coming up with their own anecdotal stories about their own lives," she told AFP.
Leitzinger estimated that around half of her 180 students used ChatGPT inappropriately at some point last semester — including when writing about the ethics of artificial intelligence (AI), which she called both "ironic" and "mind-boggling".
So she was not surprised by recent research suggesting that students who use ChatGPT to write essays engage in less critical thinking.
The preprint study, which has not been peer-reviewed, was shared widely online and clearly struck a chord with some frustrated educators.
The team of MIT researchers behind the paper have received more than 3,000 emails from teachers of all stripes since it was published online last month, lead author Nataliya Kosmyna told AFP.
'Soulless' AI essays
For the small study, 54 adult students from the greater Boston area were split into three groups. One group used ChatGPT to write 20-minute essays, one used a search engine, and the final group had to make do with only their brains.
The researchers used EEG devices to measure the brain activity of the students, and two teachers marked the essays.
The ChatGPT users scored significantly worse than the brain-only group on all levels. The EEG showed that different areas of their brains connected to each other less often.
And more than 80 per cent of the ChatGPT group could not quote anything from the essay they had just written, compared to around 10 per cent of the other two groups.
By the third session, the ChatGPT group appeared to be mostly focused on copying and pasting.
The teachers said they could easily spot the "soulless" ChatGPT essays because they had good grammar and structure but lacked creativity, personality and insight.
However, Kosmyna pushed back against media reports claiming the paper showed that using ChatGPT made people lazier or more stupid.
She pointed to the fourth session, when the brain-only group used ChatGPT to write their essay and displayed even higher levels of neural connectivity.
Kosmyna emphasised it was too early to draw conclusions from the study's small sample size but called for more research into how AI tools could be used more carefully to help learning.
Ashley Juavinett, a neuroscientist at the University of California San Diego who was not involved in the research, criticised some "offbase" headlines that wrongly extrapolated from the preprint.
"This paper does not contain enough evidence nor the methodological rigour to make any claims about the neural impact of using LLMs (large language models such as ChatGPT) on our brains," she told AFP.
Thinking outside the bot
Leitzinger said the research reflected how she had seen student essays change since ChatGPT was released in 2022, as both spelling errors and authentic insight became less common.
Sometimes students do not even change the font when they copy and paste from ChatGPT, she said.
But Leitzinger called for empathy for students, saying they can get confused when the use of AI is being encouraged by universities in some classes but is banned in others.
The usefulness of new AI tools is sometimes compared to the introduction of calculators, which required educators to change their ways.
But Leitzinger worried that students do not need to know anything about a subject before pasting their essay question into ChatGPT, skipping several important steps in the process of learning.
A student at a British university in his early 20s who wanted to remain anonymous told AFP he found ChatGPT was a useful tool for compiling lecture notes, searching the internet and generating ideas.
"I think that using ChatGPT to write your work for you is not right because it's not what you're supposed to be at university for," he said.
The problem goes beyond high school and university students.
Academic journals are struggling to cope with a massive influx of AI-generated scientific papers. Book publishing is also not immune, with one startup planning to pump out 8,000 AI-written books a year.
"Writing is thinking, thinking is writing, and when we eliminate that process, what does that mean for thinking?" Leitzinger asked.
