
Global Renault boss quits for role at Gucci
The automaker confirmed the news in an official statement after French outlet Le Figaro reported the Italian executive's departure from the company.
'Luca de Meo has announced his decision to step down and pursue new challenges outside the automotive sector,' the company said in a statement.
'The Board of Directors … expressed their gratitude to Luca de Meo for the turnaround and transformation of Renault Group and accepted that his departure would be effective from July 15, 2025. Luca de Meo will continue to perform his duties until that date.'
According to Le Figaro, Mr De Meo – who has worked in the automotive industry for decades in roles at both Fiat and the Volkswagen Group – will become the CEO of luxury brand Kering, owner of Gucci.
The move follows recent leadership changes at other automakers including Renault alliance partner Nissan, Volvo and Stellantis, which owns several brands including Renault rivals Citroen – no longer sold in Australia – and Peugeot.
The 58-year-old Italian became Renault Group CEO in 2020, overseeing the Dacia and Alpine sub-brands as well as the broader alliance with Japanese automakers Nissan and Mitsubishi.
Dacia vehicles – which are cheaper than equivalent Renaults – are set to be offered in Australia by local Renault importer Ateco Automotive, although they will be badged as Renaults.
Meanwhile, Alpine will return to Australia after a brief absence with the Alpine A390 electric SUV in 2026.
Mr De Meo brought stability to Renault leadership after replacing Thierry Bollore, who held the role for only 12 months before being dismissed for reasons that were never made public.
Mr Bollore had been outspoken about his predecessor Carlos Ghosn, who was arrested in Japan on accusations of misleading investors and misusing company assets for personal gain, before being infamously smuggled out of the country to Lebanon, which has no extradition treaty with Japan.
During his tenure, Mr De Meo strengthened Renault's portfolio and focused on hybrid models, leaving the brand in a healthier position than when he took over the top job and earning praise from some as Renault's 'saviour'.
The admiration followed his moves to somewhat insulate the automaker from the threat of Chinese electric vehicles and significant US import tariffs.
While the Renault brand does not sell cars in the US, North America is a key market for its Mitsubishi and Nissan partners, with Nissan operating three factories in the US.
His departure may also affect the Alpine brand, whose Formula 1 and World Endurance Championship campaigns he was heavily engaged with.
Renault is represented by the Sydney-based Ateco group in Australia, where the Renault Trafic and Master commercial vans are its best-sellers. The aged Koleos mid-size SUV is its most popular passenger vehicle year-to-date.
Mr De Meo's replacement is yet to be announced, following a resignation that seemingly caught the company off guard.
'The Board of Directors has expressed its confidence in the quality and experience of the management team to continue and accelerate Renault Group's transformation strategy into this new phase,' it said in a statement.
MORE: Everything Renault