AGC objects to Ilham Tower's bid to challenge MACC seizure of Menara Ilham

Malay Mail | 23-07-2025
KUALA LUMPUR, July 23 — The Attorney General's Chambers (AGC) has objected to Ilham Tower Sdn Bhd's application for leave to initiate a judicial review against the Malaysian Anti-Corruption Commission (MACC) over the seizure of Menara Ilham.
Senior Federal Counsel Nurhafizza Azizan informed the High Court of the objection during today's proceedings before Judge Datuk Amarjeet Singh.
The court directed both parties to file written submissions by Aug 13, with any replies to be submitted by Aug 20. The hearing is set for Sept 18.
Ilham Tower was represented by counsel Datuk Dr Gurdial Singh Nijar. It filed the ex-parte application through Messrs Raj & Sach on June 23.
Toh Puan Na'imah Abdul Khalid, widow of former finance minister the late Tun Daim Zainuddin, is listed as one of the company's directors.
Apart from the MACC itself, the other respondents named were its chief commissioner Tan Sri Azam Baki; senior officer Mohd Razi Rahhim @ Rahim; deputy public prosecutor Datuk Ahmad Akram Gharib; the Public Prosecutor; Prime Minister Datuk Seri Anwar Ibrahim; and the Government of Malaysia.
The company is seeking a declaration that the seizure order or notice issued by the second respondent (Ahmad Akram), on June 4, 2025, under Section 51(1) of the Anti-Money Laundering, Anti-Terrorism Financing and Proceeds of Unlawful Activities Act 2001, in relation to the seizure of Menara Ilham at Jalan Binjai, is unlawful and therefore null and void.
It is also seeking a certiorari order to quash the seizure notice along with all related decisions or actions.
In addition, a mandamus order is sought to compel the second respondent, or any officer, employee, agent, or person authorised by him, to cancel the said notice.
The applicant further seeks a court order to stay the enforcement of the seizure notice and all related decisions or actions pending disposal of the judicial review leave application, as well as damages and costs. — Bernama
Related Articles

Sg Kajah Land Committee again cries foul over alleged logging on NCR land

Borneo Post

A photo allegedly showing several longhouse chiefs accompanying logging company representatives to the field on July 14. — Photo from Kujat Dudang

KANOWIT (Aug 4): The Sungai Kajah Land Committee (SKLC) has again raised concerns over ongoing logging activities allegedly taking place on gazetted Native Customary Rights (NCR) land under its stewardship, despite repeated complaints lodged with the relevant authorities.

Its chairman Kujat Dudang said the activity, which began in mid-2021, was being carried out under the guise of a Licence for Planted Forests (LPF) issued over Block 1 Lot 8 of the Sepali Land District.

'The issuance and continuation of this LPF clearly contravenes Section 65(3) of the Sarawak Forestry Ordinance and Section 5(3) of the Sarawak Land Code (1958), both of which prohibit commercial activities on gazetted NCR land without proper consultation and consent,' Kujat said in a statement yesterday.

He explained that the land in question was surveyed and recognised under Section 6 of the Sarawak Land Code (Amendment) 2018 prior to the LPF's issuance.

'Despite this, the Forest Department Sarawak (FDS) approved the LPF licence, raising serious legal, ethical, and governance concerns,' he added.

Kujat also claimed that although multiple police reports had been lodged, no enforcement action had been taken to date.

In addition, the SKLC has submitted a complaint to the Malaysian Anti-Corruption Commission (MACC), backed by photographs and witness testimonies. The complaint alleged that certain longhouse chiefs were receiving monthly allowances and commission-based payments from the licence holder to influence villagers into allowing logging on NCR land.

'This blatant conflict of interest undermines community leadership and deceives landowners into giving consent under false pretences,' Kujat said.

He stressed that the continued extraction of timber has caused environmental degradation, including deforestation, water pollution, and biodiversity loss — posing threats to the ecosystem, food security, and cultural heritage of the Iban community in Sungai Kajah.

In a statement on April 21, the FDS said it was 'awaiting confirmation of NCR land status' on the matter. However, Kujat pointed out that this contradicted an earlier statement from the state Land and Survey Department (LSD), which said that LPF licences should not be issued over gazetted NCR land.

'In light of these issues, the SKLC is calling for immediate action from the FDS to revoke the LPF licence issued over the gazetted NCR land; the LSD to publicly reaffirm the Section 6 NCR status and enforce its protection; the MACC and the Royal Malaysia Police to investigate corruption allegations involving the licensee and implicated longhouse chiefs; the federal Ministry of Natural Resources and Environmental Sustainability to uphold environmental laws and indigenous land rights; and the Premier of Sarawak's Office to ensure that development does not override legally recognised customary rights,' he said.

Kujat emphasised that the people of Sungai Kajah were not against development, but urged that it be carried out transparently and lawfully. 'The government must not allow large corporations to continue exploiting our forests with impunity, nor should grassroots voices be ignored.'

Photographic evidence allegedly showing several longhouse chiefs accompanying logging company representatives to the field on July 14 has been submitted as part of the MACC complaint. The individuals involved are reportedly under investigation, according to the SKLC.

'This is not just a protest. It is a plea for justice and a call for the rule of law to be upheld,' Kujat concluded.

‘It sounded just like my brother': How deepfake voices are fuelling money scams

Malay Mail

KUALA LUMPUR, Aug 4 — Imagine receiving a voice note on WhatsApp from someone who sounds exactly like your younger brother — his voice, tone, and even the way he says your name are all spot on. He says he's stuck at work, left his wallet behind, and needs RM1,500 urgently to sort something out before his boss finds out. There's even a familiar laugh at the end, and you don't think twice because you really believe it is him.

But what if that voice was not real?

CyberSecurity Malaysia (CSM) chief executive officer Datuk Amirudin Abdul Wahab has warned about a rise in scams involving AI-generated voice cloning, where scammers use artificial intelligence to impersonate family members, friends or colleagues. In many cases, the goal is to trick victims into sending money by creating a false sense of urgency and trust.

'Scammers use AI-generated voices to mimic friends, family members, or colleagues, often via WhatsApp or phone calls, to request urgent transfers or loans.

'Since early 2024, the police have investigated over 454 such cases, with losses totalling approximately RM2.72 million,' he said when contacted by Malay Mail.

He added that in the first three months of 2025, the country recorded 12,110 online fraud cases involving scams such as fake e-commerce deals, bogus loans, and non-existent investment schemes, with total losses amounting to RM573.7 million.

Citing Bukit Aman's Commercial Crime Investigation Department (CCID), he said generative AI tools, including deepfake videos, cloned voices, fake digital identities, and chatbots, are increasingly being used to carry out these scams.

'There has also been a rise in scams involving AI-generated cloned voices. In one case, scammers mimicked the voice of a family member to simulate an emergency situation via WhatsApp voice notes, urging the recipient to urgently transfer funds,' he said.

He noted that the voice was cloned from short public TikTok videos.

Amirudin added that deepfake scams have also involved national icons like Datuk Seri Siti Nurhaliza and Datuk Lee Chong Wei, whose altered images and voices were used in fake advertisements promoting cryptocurrency and investment platforms.

'As of March 2025, CCID Bukit Aman confirmed the discovery of at least five deepfake videos impersonating both national and international personalities. Among the names falsely used were Prime Minister Datuk Seri Anwar Ibrahim, Elon Musk, Donald Trump, Teresa Kok, and a senior Petronas executive.

'The manipulated clips were widely circulated online to promote fake investment platforms, many of which falsely promised returns of up to 100 times the original amount,' he added.

He said the scams relied heavily on the authority and familiarity of well-known figures to convince unsuspecting viewers, especially on social media platforms where verification is often overlooked.

Why it poses a serious threat

Amirudin explained that the rise of deepfake technology is alarming not just for its technical sophistication, but for the far-reaching impact it can have on society.

At the individual level, he said deepfakes are being used to exploit public emotions, especially in scams that mimic the voices of family members, government officials, or well-known personalities. These tactics create a false sense of urgency, pushing victims into making quick decisions, often involving money, before they have a chance to think critically.

'Beyond personal safety, there is growing concern over the effect deepfakes have on public trust in the media. As manipulated content becomes increasingly indistinguishable from real footage or audio, it blurs the line between fact and fiction,' Amirudin said.

He also said that this erosion of trust can sow confusion, making it easier for false narratives, misinformation, and disinformation to spread, particularly on social media.

At a broader level, he highlighted that national security is also at stake, because content that convincingly imitates political leaders or high-ranking officials could be weaponised to stir panic, manipulate public sentiment, or create political instability.

How to verify and report suspicious AI-generated content

With deepfakes becoming more difficult to detect, CSM is urging the public to stay vigilant and take advantage of available resources to verify suspicious content.

He said the agency's Cyber999 Incident Response Centre supports both individuals and organisations in identifying cyber threats that involve technical components such as phishing, malware, or manipulated digital content. Members of the public can report suspicious activity through several channels:

Online form and mobile application
Email: cyber999[@]
Hotline: 1-300-88-2999 (during office hours) or +60 19-266 5850 (24/7)

'Cyber999 also provides technical analysis of suspicious emails; users are encouraged to forward the full email header and content for expert review.

'In addition, the team shares regular security advisories and best practices, helping Malaysians keep up with the latest online threats and how to avoid them,' he said.

He explained that Cyber999 handles technical cyber threats like phishing and malware, while deepfake cases without clear technical elements are usually referred to law enforcement or regulators.

For small businesses, Amirudin said CSM has developed the CyberSAFE SME Guidelines, which offer a simple checklist to help organisations detect, verify, and respond to suspicious online content.

Wrapping up in our final part: It's not just tech — it's trust. We look at why media literacy is your best line of defence in the age of deepfakes, and how you can help protect not just yourself — but your family too.

Recommended reading:

Why seeing isn't believing anymore: What are deepfakes, and how to protect yourself from AI-generated scams

AI scams are getting real: Here are the cases happening in Malaysia that you should know about

AI scams are getting real: Here are the cases happening in Malaysia that you should know about

Malay Mail

KUALA LUMPUR, Aug 4 — Scams used to be easy to spot — all it took was some bad grammar, a weird link, or a dodgy phone call. But in today's digital era, fraudsters are using artificial intelligence (AI) to impersonate people we know and trust in order to steal money or personal data.

Malay Mail has compiled some of the real-life scams behind the AI-powered fraud wave:

Voice-cloning scams via phone or WhatsApp

In May this year, a woman in Selangor lost RM5,000 after falling victim to a sophisticated voice cloning scam that used AI to mimic her employer's voice, The Rakyat Post reported.

The incident occurred during a routine workday at a local shop when the company phone rang repeatedly. On the line was someone who sounded exactly like her boss, and he requested several Touch 'n Go (TnG) PINs, claiming it was an urgent matter. It wasn't the first time he had made such requests, so she didn't hesitate.

The woman quickly went from one convenience store to another, purchasing RM5,000 worth of TnG top-up codes and sending them as instructed. Then the line went dead.

When she eventually managed to contact her real boss through a different channel, he confirmed he had never made the call. His phone had been off the entire time. Police later confirmed it was an AI-driven scam.

As of 2024, The Star had reported at least three AI voice scam cases in which victims lost thousands of ringgit.

In Kuala Terengganu, a travel agent lost RM49,800 after receiving a highly convincing phone call from someone who sounded exactly like her close friend. Believing her friend was in urgent trouble, she transferred the money without hesitation.

In Kuala Lumpur, a 26-year-old interior designer was scammed out of RM3,000 in a similar incident, where the caller impersonated a trusted contact using AI-generated audio.

In Penang, a 50-year-old housewife fell victim to the same tactic, losing RM4,800 after speaking with a familiar-sounding voice on the other end of the line.

Last year, the police investigated 454 fraud cases involving deepfake technology, with total reported losses amounting to RM2.72 million, according to Bukit Aman Commercial Crime Investigation Department (CCID) director Datuk Seri Ramli Mohamed Yoosuf.

He said these scams frequently involve the use of AI-generated voices to impersonate family members, friends, or acquaintances, often via WhatsApp voice calls or messages. Scammers typically claim to be in urgent need of help and request money through bank transfers or prepaid top-up PINs.

Deepfake video investment scams featuring VIPs

Scammers are now leveraging AI to produce highly convincing videos of politicians, business leaders, and celebrities to trick victims into bogus investment schemes.

These AI-generated deepfake videos commonly feature well-known figures, including Prime Minister Datuk Seri Anwar Ibrahim, tycoon Tan Sri Robert Kuok, former chief justice Tun Tengku Maimun Tuan Mat, and Capital A Bhd CEO Tan Sri Tony Fernandes, appearing to endorse fake investment opportunities and quick-money schemes.

Even the monarchy wasn't spared — on July 10, the Johor Royal Press Office issued a public warning after detecting an AI-generated deepfake video of His Majesty Sultan Ismail, King of Malaysia, on Facebook, falsely promoting an investment scheme. The palace reminded the public that impersonating the King is a serious offence and urged people not to fall for these scams.

On Saturday (July 5), MCA Public Services and Complaints Department head Datuk Seri Michael Chong said Malaysians lost a staggering RM2.11 billion to such scams last year, with 13,956 cases reported.

'The AI-generated videos look so real that people can't tell the difference. Anyone watching would think it is the prime minister himself asking the public to invest, unaware that it's an AI-generated fake,' Chong was quoted as saying by the New Straits Times.

He also said 85 per cent of victims were convinced to invest after watching fake promotional videos featuring seemingly genuine endorsements from public figures.
