
'One of the largest retail underpayment settlements' in South Australia's history to go to regional employees
The underpayment claims were first brought to the Federal Court in 2021, after staff at South Australian Foodland supermarkets owned by Eudunda Farmers Limited (EFL) approached the Shop, Distributive and Allied Employees' Association (SDA) with their concerns.
Initially, 64 current and former EFL employees claimed they had been underpaid or misclassified and were seeking $1 million in back pay.
The legal action eventually involved more than 500 current and former workers across 23 regional supermarkets, who together with the SDA agreed to the $5.5 million settlement with the Eudunda Farmers owners.
The settlement means each worker will receive an average of $11,000.
However, entitlements vary, with one worker set to receive more than $145,000, according to the ABC.
EFL operates 23 supermarkets and retail stores in country South Australia and employs more than 700 South Australians, with a majority living in regional areas.
The case related only to EFL-owned Foodland stores. Other Foodland-branded supermarkets across South Australia, which are separately owned and operated, were not accused of underpayments and were not part of the case.
In Kingston SE, a small regional town 294km from Adelaide, Tahlia Troeth worked at her local Foodland for five years until 2022 and said the money owed to her in back pay would greatly benefit her.
'This will make a real difference for me and will help me pay off the remainder of my HECS debt,' she said.
'I worked for Eudunda Farmers part-time, mostly as a junior employee. I had no idea the underpayments were this large, I thought it was just a few missed allowances here and there.
'I'm glad that I, and many other workers, are finally getting the money that we deserved in the first place.'
The SDA argued EFL misclassified workers, incorrectly paid overtime and allowances, and breached minimum shift rules over a six-year period.
After the settlement was agreed, SDA secretary Josh Peak said he was glad to see justice done for workers.
'The SDA is proud to have secured $5.5 million in backpay and delivered wage justice for workers at Eudunda Farmers supermarkets,' he said in a statement.
'This is one of the largest retail underpayment cases in South Australian history.
'This is a massive outcome for these workers and will be life-changing for many of them.'