AI threatens entry-level jobs as university grads struggle to get hired


Australian workers are facing a major upheaval as artificial intelligence (AI) becomes a cheaper alternative to employing humans.
While the full impact of AI is yet to be reflected in job ads or official employment statistics, both employers and employees warn the technology is already reshaping the nation's labour market.
It took recent data science graduate Tien Hung Nguyen 30 applications and an internship to land his first full-time job.
"I feel privileged to have secured this position. I'm going to give everything I've got," he says.
Most of his friends are still looking for work — and he says artificial intelligence is a big reason why.
"Since AI appeared, for example, a team might have needed three or four juniors and a senior. Now, it's one junior and AI," Mr Nguyen explains.
In countries where AI adoption is more advanced, such as the United States, lay-offs are accelerating.
Amazon is the latest big employer to warn of looming job losses, and there are reports Microsoft is preparing to cut more staff whose tasks can be completed by AI.
There are also worrying signs for young workers in the US, as the unemployment rate for recent college graduates nears 6 per cent.
In Australia, the unemployment rate is holding steady at 4.1 per cent. However, the jobless rate for young people — which is typically higher than the overall rate — has risen slightly to 9.2 per cent.
Economist Leonora Risse says youth unemployment is a key indicator.
"Young people tend to be the group that experience the greatest volatility in the labour market," she warns.
Mr Nguyen now works at an AI start-up, where much of the low-level admin work has already been handed over to machines.
His employer, Julian Fayed, says the shift is accelerating.
"Our technology is advancing, and our AI implementations are advancing at a rate that means that our headcount isn't really growing anymore," he says.
"A lot of the lower-level tasks our team didn't enjoy doing, AI can now do 24 hours a day, seven days a week.
"No sick days. That's the slightly dark joke."
Beyond small tech start-ups, some of the country's biggest employers are also preparing for a leaner future.
Telstra CEO Vicki Brady has been up-front at several public events about how advances in AI will result in job cuts.
"We know that work is going to look very different in 2030 — and so will we," Ms Brady told a recent investor briefing.
CBA boss Matt Comyn made similar comments when he appeared at the Australian Financial Review's AI summit in Sydney this month.
"It's hard to make predictions," Mr Comyn said.
"But I think in some areas, it's reasonable to say the workforce will be smaller."
Dario Amodei — CEO of US-based AI company Anthropic — has warned that up to 50 per cent of entry-level white-collar jobs could disappear within five years.
Aaron Matrljan from recruitment agency Aura agrees junior positions will be among the first to go.
"Things that we would get juniors to be trained on — that would usually be a learning exercise for them — can now be done so much more cheaply and effectively by AI in a matter of seconds," he explains.
Mr Matrljan says his professional services clients are all talking about AI.
He expects AI-driven job cuts to become more common within the next two years, and believes slower economic conditions will only speed up the take-up of the technology.
"The next intake of graduates is going to be really interesting, and firms are going to have to work out where they're gaining those efficiencies, where they're gaining the cost savings, and how many grads do we need, how many trainees do we need to do the tasks that AI can do now so much quicker."
Businesses that don't adopt AI risk being left behind, particularly as the technology promises major productivity gains.
The optimistic view of AI is that the technology won't replace human workers but instead allow them to take on higher-level tasks.
"Productivity is about shifting our time away from the lowest value activities and the lowest value tasks that can be done by automation or AI or computers, and reallocating our time towards the most valuable uses, the most purposeful and meaningful uses," Dr Risse argues.
Dr Risse says AI can be of huge benefit to workers if the transition is managed equitably.
"If you have higher labour productivity, you have a case for a higher wage," she explains.
Some jobs will inevitably be replaced by AI, particularly routine roles that are easier to automate because they follow predictable, repetitive patterns.
The reality is, as Dr Risse says, some workers will need to find new jobs in new industries.
"The care and community sector is growing, particularly as a result of the aging nation. We need humans. We need people in those sectors," she argues.
"But for some people in areas like banking or finance, that can feel like a big leap."
As AI advances, the question is no longer if it will dramatically change the workforce, but how quickly, and whether Australia's job market can adapt in time.
Mr Fayed believes there'll always be white-collar jobs for the right candidate.
However, landing a position is likely to become more competitive.
His advice to students is blunt.
"For anyone thinking about what to study — you absolutely should be considering whether your future role is at risk from AI," Mr Fayed said.
"I do think this is going to be very, very disruptive."
There's also a risk for companies that cut too deep — they could lose the pipeline of workers who would eventually move into mid-level and senior roles.
"Firms are going to have to … work out where they're gaining those efficiencies, where they're gaining the cost savings … how many grads do we need, how many trainees do we need to do the tasks that AI can do now so much quicker," Mr Matrljan says.
"What that's going lead to is the next two to four years it's going be really interesting to see, because there's not as many juniors coming through the ranks … have we lost a lot of knowledge at that level?"
