
Train derails in deadly crash in Germany
Four Mexicans facing charges over the killing of two Australian surfers had yet another preliminary hearing delayed as prosecutors discuss a plea deal.
Related Articles

Sky News AU
23 minutes ago
Webjet fined $9m for misleading Aussie travellers after the ACCC took the online travel agency to Federal Court
A major Australian travel company has been fined $9m for misleading Aussies about the price of flights and booking confirmations after an investigation by the consumer watchdog.

The Australian Competition and Consumer Commission (ACCC) in November took online travel agency (OTA) Webjet to Federal Court over misleading advertisements it ran between 2018 and 2023 that excluded compulsory fees. Webjet also admitted it misled 118 customers between 2019 and 2024 by providing flight bookings for travel plans it had not actually confirmed. The OTA then asked for additional payments upwards of $2120 for customers to complete the booking. Webjet has since handed back this money.

ACCC chair Gina Cass-Gottlieb said the investigation into Webjet began after a traveller complained that a ticket advertised as 'from $18' ended up costing three times as much once fees were added.

'We took this case because we considered that Webjet used misleading pricing by excluding or not adequately disclosing compulsory fees in its ads,' Ms Cass-Gottlieb said. 'Seeking to lure in customers with prices that don't tell the whole story is a serious breach of the Australian Consumer Law.'

The OTA hit customers with a 'servicing fee' and a 'booking price guarantee' fee ranging between $34.90 and $54.90 per booking. These additional fees were not disclosed in Webjet's social media posts and varied depending on where the traveller was heading. Some users had to scroll to the fine print near the bottom of the screen in their booking to see information about the fees.

'Retailers must ensure their advertised prices are accurate,' Ms Cass-Gottlieb said. 'They should clearly disclose additional fees and charges.'

These fees made up 36 per cent of Webjet's total revenue from November 2018 to November 2023.
The consumer watchdog noted that Webjet had co-operated with the ACCC throughout the investigation, admitted liability and agreed to make joint submissions to the Court about orders, including the penalty.


The Advertiser
3 hours ago
'Why do we need that?': Push to ban AI nudity apps
Parents are being warned their kids may be exploited online for child abuse material, amid a push to criminalise the use of apps that "nudify" pictures.

Possessing nudify apps, digital platforms that allow users to insert a person's photos and use generative artificial intelligence to sexualise them, would become a criminal offence and carry up to 15 years in jail under proposed laws.

"Why do we need that in an Australian community?" International Centre for Missing and Exploited Children's Dannielle Kelly told reporters in Canberra on Monday.

One in four children has experienced sexual abuse, according to the Australian Child Maltreatment Study.

Independent MP Kate Chaney, who introduced the proposed laws, said the federal government needed to respond more nimbly to ensure it wasn't outpaced by technological developments, such as AI being used to exploit children with little consequence. The proposed laws contain small carve-outs for law enforcement and researchers.

"This is just the start, but it's something that the government could do right now," Ms Chaney said after introducing her private member's bill on Monday.

The legislation follows a roundtable on AI-facilitated child exploitation, which called for urgent action. Child safety advocates and law enforcement representatives at the roundtable called for AI literacy for young people, the use of new technology to detect child exploitation material, legal restrictions on downloading such apps and better resourcing for police to tackle the issue.

There was a consensus that AI was being weaponised to harm children, from creating deepfakes, which digitally manipulate images and video to superimpose someone's face or voice, to generating child abuse material, creating the potential for exploitation, blackmail and bullying.

MP Zali Steggall, who seconded Ms Chaney's bill, branded it every parent's worst nightmare.
"When a criminal is downloading this technology to then create this material, that's going to have a lifelong impact on children and is really damaging," the independent MP said. "We need these guardrails with urgency, we need the government to show it can act quickly.

"My concern is, amidst the paralysis of a broad review of AI, we have these very clear areas of harm that go unaddressed for months at a time ... this is a very clear area of harm identified that can be dealt with very quickly."

International Justice Mission Australia chief executive David Braga called for the government to legislate a digital duty of care, requiring platforms to actively take steps to prevent harm. "Now is the time for the Australian government to strengthen the Online Safety Act to require companies ... to detect and disrupt child sexual abuse material in all its forms on their platforms," he said.

Attorney-General Michelle Rowland said keeping vulnerable Australians safe was the government's priority, and it would consider the legislation. "Keeping young people safe from emerging harms is above politics and the government will give appropriate consideration to the private member's bill," she said in a statement to AAP.

Lifeline 13 11 14
Kids Helpline 1800 55 1800 (for people aged 5 to 25)
1800 RESPECT (1800 737 732)
National Sexual Abuse and Redress Support Service 1800 211 028


Perth Now
3 hours ago
'Why do we need that?': Push to ban AI nudity apps