Burgertory owner Hash Tayeh steps down as CEO
In an Instagram post on Wednesday night, Tayeh said the decision to step down as CEO of Burgertory and food and beverage company QSR Collective was not made lightly.
'The reality is, I've been subjected to targeted attacks, politically motivated smears, and ongoing harassment, not because of any wrongdoing, but because I've dared to speak out against injustice,' Tayeh wrote.
In April this year, police launched an investigation into an alleged threat by a defence contractor to bomb one of Tayeh's Burgertory outlets.
The threat to 'park a couple of b0mb$' outside the hamburger restaurant in Tullamarine was allegedly made on LinkedIn and posted under the profile of army veteran-turned-defence industry expert Guy Langford.
In two separate incidents in 2023 and 2024, Tayeh's home and the now-closed Caulfield Burgertory outlet were firebombed.
Following the incidents, The Age revealed the alleged arsonist accused of firebombing the Burgertory restaurant told covert police officers the attack was linked to the conflict in the Middle East, contradicting authorities' repeated claims there was no racial, religious or political motive.

Related Articles

Sydney Morning Herald
4 hours ago
Hospital toilet spying investigation expanded to further sites
The investigation into a young doctor accused of filming hospital colleagues in a toilet in Melbourne's north-eastern suburbs has expanded to additional hospitals in Melbourne and regional Victoria.

A Victoria Police spokesperson told The Age that several hospitals had been identified as 'workplaces of interest' after Ryan Cho, who most recently worked at the Austin Hospital, was arrested last week and charged with stalking and use of an optical surveillance device.

'The 27-year-old worked at these hospitals located in Melbourne's CBD and regional Victoria between 2020 and 2025,' the police spokesperson said. 'Police are in the process of contacting the additional hospitals and those potentially impacted during the time of the man's employment.'

Staff at the Austin uncovered a mobile phone allegedly recording them on Thursday, July 3. The Age revealed this week that staff at the Royal Melbourne Hospital had since been informed that Cho was employed at the CBD-adjacent Parkville site between February 2024 and February 2025.

The Age has seen signage placed this year on entrances and exits to facilities at the Royal Melbourne Hospital that says: 'Please do not leave your belongings in here. Please use your staff lockers.'

A Royal Melbourne spokesperson said the hospital had been in contact with Victoria Police and would support their investigation as required. 'The RMH takes the wellbeing and safety of our staff very seriously. As this is now a police matter, it would be inappropriate to make any further comment.'


Perth Now
8 hours ago
New blow for hate speech preacher
A Muslim preacher who made 'fundamentally racist' and 'anti-Semitic' remarks in a series of speeches will be prevented from any attempt to 'bury' a court-ordered acknowledgement that he broke the law.

Wissam Haddad was found to have breached the Racial Discrimination Act following a four-day hearing in the Federal Court last month over a series of speeches he gave in November 2023, in which he described Jewish people as 'vile', 'treacherous', 'mischievous' and 'wicked and scheming'.

The Executive Council of Australian Jewry's (ECAJ) co-chief executive Peter Wertheim AM and deputy president Robert Goot AO SC launched court action against Mr Haddad over the speeches, arguing they constituted unlawful discrimination; Mr Haddad claimed he was referring to Islamic scripture in most cases.

The proceedings also extended to the Al Madina Dawah Centre (AMDC) for posting videos of the speeches online via Facebook and video-sharing platform Rumble.

Justice Angus Stewart found Mr Haddad and AMDC breached the Racial Discrimination Act in delivering and publishing the speeches online, which he said included 'fundamentally racist and anti-Semitic' age-old tropes against Jewish people that were reasonably likely to offend, insult, humiliate and intimidate Jewish people in Australia.

More than two weeks after the judgment, Justice Stewart has now moved to prevent corrective notices from being 'deliberately buried' on Mr Haddad's and AMDC's social media pages.

Mr Haddad was ordered to post a corrective notice to Instagram and Soundcloud for 30 days, which must be given prominence by being 'pinned' to the top of his Instagram profile and added as a story highlight. The notices include words stating that Mr Haddad and AMDC breached the Racial Discrimination Act, were required to remove the speeches, and must not repeat or continue the 'unlawful' behaviour. AMDC was similarly ordered to add the notice as a 'featured' post on Facebook and a featured video on Rumble.

'In short, the "pinning" and "featuring" of the posts will prevent them from disappearing from view in a short period of time, and it will prevent them from being deliberately buried by way of successive further posts,' Justice Stewart said in his judgment.

While Mr Haddad and AMDC accepted changes to the wording of the notices put forward by ECAJ, they argued the feature and pin tools on Instagram and Facebook were typically used for entrepreneurial and marketing purposes, and would therefore force them to 'advertise' and 'promote' the notice, going beyond what they considered an appropriate redress measure.

However, Justice Stewart was satisfied that pinning and featuring posts on the platforms was not 'onerous', 'complicated' or 'time consuming', and did not require the payment of any fees.

'I am also not persuaded that to require the respondents to "pin" and "feature" the corrective notices would be unduly burdensome from the perspective of dominating or cluttering their relevant accounts,' Justice Stewart's judgment stated. 'Other posts will still be able to be made, and it is proposed that the notices are required to be published for only 30 days.'
Justice Stewart noted the notice would be promoted to an extent, but said Mr Haddad and AMDC had 'promoted the unlawful lectures and it is not disproportionate to require them to promote the corrective notice'. He also rejected Mr Haddad and AMDC's argument that the lectures were not 'directly' posted to Instagram and Facebook, finding the links posted on the platforms to those lectures facilitated 'easy access for anyone interested in seeing the lectures'.

Mr Wertheim welcomed the new order in a statement on Thursday afternoon. 'We see this as an essential part of counteracting the harm that was caused by their online promotion and reproduction of Haddad's anti-Semitic speeches,' he said.

In his judgment at the beginning of July, Justice Stewart dismissed arguments by Mr Haddad and AMDC that the speeches in question were exempt under section 18D of the Racial Discrimination Act because they had a genuine purpose in the public interest, and, in AMDC's case, that the speeches were a 'fair and accurate report' on a matter of public interest. They had also submitted that the relevant sections of the Racial Discrimination Act were beyond parliamentary powers because of the implied freedom of political communication and the constitutional freedom to exercise any religion, arguments Justice Stewart also found to have failed.

Both Mr Haddad and AMDC were ordered to remove the lectures from their social media and to take all reasonable steps to request that any re-publishers also remove the speeches if they become aware of their redistribution.

Justice Stewart also moved to restrain Mr Haddad from discriminating against Jewish people in the future, barring him from causing words, sounds or images to be communicated anywhere 'otherwise than in private' that attribute characteristics to Jewish people conveying any of the disparaging imputations identified in the speeches.

Mr Haddad and AMDC were also ordered to cover the costs of the Federal Court proceedings.

ABC News
14 hours ago
Meta falsely accuses Instagram user of breaking child exploitation rules
An Australian beautician says she is frustrated and disheartened after Meta suspended her business and personal accounts, falsely accusing her of posting child exploitation material.

The morning of the ban, Madison Archer posted a video capturing her life as a mother and businesswoman, including a brief shot of her holding her daughter. She said she had never posted anything untoward on her account and did not know exactly what triggered the suspension. But shortly after the June 14 post, Ms Archer received an email saying her business's Instagram account had been suspended.

Her case reflects a growing number of complaints, in Australia and around the world, about heavy-handed account bans and a lack of thorough review when mistakes are made.

The email Ms Archer received from Meta said she had not followed community standards on child sexual exploitation, abuse and nudity.

"When I saw the email I initially thought it was a scam, so I didn't open it," Ms Archer told the ABC. "But when I logged into the Instagram app, a pop-up came up saying action was needed and I was like, 'What the hell?'

"I felt sick because I'm so conscious of protecting my daughter as it is that I would never do anything they were accusing me of."

In an effort to reinstate her account, she immediately appealed against the decision, believing the social media platform would see it was an error. An email from Meta said it would take 24–48 hours to provide a response. But within 15 minutes, another email said the appeal was unsuccessful and that the account would be permanently disabled on Instagram, leaving Ms Archer to suspect the process was handled entirely by AI.

"If a human actually did the review they would see that I'm not sexually exploiting children or haven't even looked at any of that stuff on my platform," she said.

Because all of Ms Archer's Meta platforms are linked, all her pages were suspended, including her personal Instagram and Facebook accounts. She was never given an option to escalate her appeal, and her access to the accounts was only restored after the ABC contacted Meta for comment.

When the accounts were disabled after the appeal, Ms Archer reached out to every Meta staff member whose contact details she could find in order to dispute the decision. She said the process was "incredibly difficult".

"I had to create a new Facebook page and pay for Meta verification to even get in contact with a real person.

"When I did manage to talk to someone, I was always met with the same answer: that it's a separate team and that I need to wait for the system to cool down and then hopefully I'll be given another chance to re-appeal."

This advice seemed to contradict the initial information provided to Ms Archer when her first appeal failed: that further escalation would not be possible. At one point, she was even temporarily banned from contacting the support team because it said she had reached her "limit".

"It's so disheartening that you put your trust in a platform that is designed to connect you with others, you follow the rules, and then your account can just be taken away with no proper support," she said.

In a bid to rebuild nine years' worth of work building up a customer and subscriber base, Ms Archer created a new Instagram account, but was only able to attract about 2,000 of her followers, some 9,000 fewer than before.

"A lot of people that come across my profile were inclined to book [my business] from the portfolio of work they saw on my account.
"They were also able to see the large following I had which backed my reputation. The ABC raised Ms Archer's case with Meta on July 15. Within hours, her account was reinstated. In an email sent to Ms Archer, seen by the ABC, the tech giant said: "We're sorry that we've got this wrong and that you weren't able to use Instagram for a while. Sometimes, we need to take action to help keep our community safe." Before her account was restored, Ms Archer also went through the process of paying a third party. She lost $1,500 to a scammer, who was referred to her by someone in the industry who is reputable and was successful in recovering their accounts. When asked by ABC News, Meta declined to comment on Ms Archer's case. "We take action on accounts that violate our policies, and people can appeal if they think we've made a mistake," a Meta spokesperson said. Ms Archer's social media account is not the only one to have been falsely accused of breaching community standards. Other Australians contacted the ABC about the issue, including Katie, who said her personal Facebook account was suspended for "misrepresenting your identity". Katie said she had always used her own name, profile picture, email and phone number on the account she created in 2007. "I have 18 years of contacts on Facebook, which includes people from different cities and countries I've lived in, as well as deceased families and relatives," she said. "I use Marketplace to buy, sell and swap household items and children's clothing for my young family." In a similar case to Ms Archer, Katie said the Meta support team was "incredibly unhelpful". "Sometimes I can't access any support, and then when I do, the tickets are closed without a resolution," she said. "My account is still under review and it's been two and a half weeks with no updates." More than 30,000 people have signed a petition accusing Meta's moderation system of wrongly banning accounts and giving people no functional way to appeal. Thousands of people are also in Reddit forums dedicated to the subject. Many users have also posted on social media platforms such as X about being banned. Meta has previously acknowledged a "technical problem" with Facebook groups, but has denied its platforms are more widely affected. It is understood Meta has not seen evidence of a significant increase in incorrect enforcement of its rules. A Meta spokesperson did not answer ABC questions about why it wrongly accused Ms Archer of violating its policies. But according to an X post from Korean National Assembly member Minhee Choi, Meta has acknowledged the possibility of wrongful suspensions of accounts in her country: "Meta is currently conducting a global crackdown on activities related to child and youth pornography on social media platforms, including Instagram. During this process, they have acknowledged that some user accounts are being excessively blocked and are being restored sequentially, while they are working to identify specific issues," she said. University of Melbourne deputy head of school of computing and information systems Shaanan Cohney said large companies such as Meta had been using AI to identify accounts in breach of their guidelines for many years. "But what does change from time to time, though, is the specific AI techniques that they use," Dr Cohney said. "What these large platforms are doing behind the scenes is they're collecting a whole bunch of things called signals that might be an indicator that something is wrong or dangerous about an account. 
"Even if your account is innocent, but for some reason has a lot of these signals associated with it, it might be automatically picked up by one of these algorithms." Exactly what goes into the latest version of these algorithms, how they change and when they change are not known to the public. Dr Cohney said information like this was a "trade secret". "You could say that the public should have a right to know what's in these algorithms because it can impact people pretty deeply because of the way in which these platforms are integrated into our lives," he said. "But there is a legitimate counter to that. "The effectiveness of these technologies to identify harmful behaviour is partially predicated on the ability of these companies to keep the methods secret so people can't devise tricky ways to get around them." When it comes to appealing against a breach, Dr Cohney said major social media players had fairly opaque procedures for users which generally involved some sort of semi-automated process. "It would be a very, very large undertaking to require a large company like this to provide everyone with a human appeal," he said. Meta told the ABC it used a combination of people and technology to find and remove accounts that broke its rules. Ms Archer is relieved to be posting content on her original business account again. But the feeling of uncertainty that this could happen again remains. "The fear of losing it again still sits heavy," she said. "It's hard to fully relax when you've already seen how quickly it can be taken without warning. "I'm following the rules, as I have always done, and will be keeping my backup account I created just in case."