Family of outback killer Bradley John Murdoch speaks out


The Age, 16 July 2025
'He was deeply loved. He will be deeply missed,' the family's statement concluded.
Police say Falconio was shot on a remote stretch of the Stuart Highway near Barrow Creek, about 300 kilometres north of Alice Springs, in July 2001. Falconio's blood was found where police believe he was murdered before his body was moved.
The British backpacker was travelling around the country with his girlfriend Joanne Lees, who survived Murdoch's attack. The pair, both from Yorkshire, had travelled across South-East Asia before arriving in Australia.
Lees told police that at about 7pm on July 14, 2001, the pair became aware that a car was following them as they travelled north up the Stuart Highway towards Devil's Marbles in their orange Kombi van.
Driving a white Toyota 4WD ute, Murdoch gestured at Falconio, who was driving the van, to pull over, which he did. Murdoch then told Falconio he'd seen sparks shooting out of the Kombi's exhaust.
Lees was sitting in the front of the parked van when the two men went to examine the exhaust, and she heard a loud bang. Murdoch then appeared in the front window, brandishing a silver handgun, which he pointed at Lees' head.
'I just kept thinking this was not happening to me. I couldn't believe that this was happening. I felt alone. I kept shouting for Pete and thought I was going to die,' Lees told the jury at Murdoch's 2005 trial.
'I was more scared of being raped than being shot by the man,' she said.
Murdoch moved Lees to his vehicle and tied her wrists behind her back, punching her in the head as she struggled. Murdoch then became distracted, with Lees reporting that she heard 'gravel scraping on the ground, as if he was moving something'.
Lees slid out of the vehicle, dropped to the ground and scrambled to a hiding spot behind a bush where she stayed for up to five hours in the dark. Once she was sure Murdoch was gone, she flagged down a truck that took her to Barrow Creek.
A widespread manhunt was launched, and the search for Falconio's body began. The case received intense media interest, both in Australia and the UK, with Lees facing particular scrutiny over her recounting of the attack.
The murder is cited as one of the inspirations for the 2005 Australian horror film Wolf Creek.
The first breakthrough came early in the investigation, when a tip-off named Bradley John Murdoch as the man responsible for the crimes.
Murdoch was under arrest in South Australia, facing charges over the abduction and rape of a 12-year-old girl and her mother. A DNA sample was taken in the hopes it could be linked to evidence found at the Northern Territory crime scenes.
While Murdoch has always maintained his innocence, his defence was ultimately undone by his decision to keep an elastic hair tie that belonged to Lees.
The case's lead investigator, former NT police officer Colleen Gwynne, told the ABC in 2016 that an officer had noticed the hair tie wrapped around Murdoch's holster in a search of his possessions, speculating that he was likely to have kept it as a 'trophy'.
In 2003, Murdoch was acquitted of the South Australian rape charges, then immediately rearrested and extradited to the Northern Territory, where he was charged with Falconio's murder.
In 2005, Bradley John Murdoch was convicted of murdering Falconio, and assaulting and attempting to kidnap Lees. He was serving a life sentence in Alice Springs prison with a non-parole period of 28 years when he died.
'Your conduct in murdering Mr Falconio and attacking Ms Lees was nothing short of cowardly in the extreme,' Northern Territory Supreme Court Justice Brian Martin said in his sentencing.
Murdoch never revealed the location of Falconio's body, and under the Northern Territory's 2016 'no body, no parole' laws, he might never have been granted parole.
He twice appealed to overturn his convictions, but was unsuccessful.
Born in the West Australian town of Northampton in 1958, Murdoch spent most of his life in Broome working as a mechanic.
Murdoch had a history of violent crime, serving time in a Western Australian jail in the mid-1990s for shooting at a crowd of Aboriginal football fans.
As with all deaths in custody, Murdoch's death will be investigated by the Northern Territory Coroner.
On Tuesday this week, Luciano Falconio pleaded for assistance in locating his son's body so that Peter could be buried while he and his wife are still alive.
'I still hope, yeah I still hope, but I don't know if we [will] live long enough', he told News Corp.
'I wish I could find him and make an end to it, bury him.'
In a statement, NT Police said it was 'deeply regrettable' that Murdoch had died without ever disclosing the location of Peter Falconio's remains.
'His silence has denied the Falconio family the closure they have so long deserved. Our thoughts are with the Falconio family in the United Kingdom, whose grief continues,' the statement read.
'The Northern Territory Police Force remains committed to resolving this final piece of the investigation.'
Less than a month ago, NT Police upped its cash reward to $500,000 for information that would lead to the discovery of Falconio's remains.
'We recognise the passage of time that's transpired, however it's never too late to reach out and start that conversation with police,' NT Police Acting Commander Mark Grieve told a press conference on June 25, adding that he still had hope.
'You just never know how beneficial that information that you may hold, may be – essentially, you just don't know what you know.'
The renewed bid for information was made amid reports that Murdoch was in palliative care in Alice Springs Hospital.
Grieve said Murdoch had never engaged meaningfully with police despite 'numerous approaches', including in that same week.
