Latest news with #JagsKandasamy

China far ahead of India on using AI for defence technology, says LatentAI CEO

Hindustan Times

a day ago


Washington: Jags Kandasamy, the founder of US-based defence tech company LatentAI, sat down with HT for an interview on India's defence tech ecosystem and the use of AI in modern warfare. Kandasamy, who is partnering with Indian firms to win Indian defence contracts, highlights the bureaucratic and procedural challenges that make it difficult for global tech firms to work with India. He points to China's progress on deploying AI for defence purposes and argues that India needs to leapfrog ahead in defence technologies.

Jags Kandasamy, the founder of US-based defence tech company LatentAI (Supplied photo)

Q. What sort of projects do you intend to work on for the Indian military?

Let me give you an example. At this year's Aero India, I met with a colonel in the Indian Army who was demonstrating an indigenous capability the Army had created: using computer vision on an automated weapon system to be deployed on India's borders. You define a zone of activity, and if there is human movement there, the machine shoots it down. In that case, they were using heavy computing, and you cannot scale that weapon system the way they had designed it because of hardware limitations. That is a critical use case that our company can solve. We worked with the US Navy on a battery-operated underwater vehicle with very limited space, so there were limitations on how much computing power it had. Our job was to compress and optimise an AI model to fit that limited hardware and allow the vehicle to run efficiently. We did that successfully. That is what we bring to the table.

The idea is that autonomous warfare is going to be the future. Ukraine has shown, for example, that drone warfare is here to stay. AI will be used for target recognition, surveillance, intelligence gathering and the like, and there will be a lot of demands on military hardware to run AI models efficiently. We're solving that problem.

Q. You're an American defence firm that has partnered with InferQ, an Indian firm. We've seen a new defence tech ecosystem come up between America and India, and the governments have pushed for it through new initiatives like INDUS-X and iDEX. How well has it worked?

Absolutely. I was in Coimbatore last summer and met with Forge Ventures, an Indian defence-focused venture firm. Vish, who heads Forge, was part of an INDUS-X cohort that came over to America, so he is clearly plugged in with India's Ministry of Defence as well as America's Defence Department over here. Vish was the one who made introductions to several companies in India, and our partner InferQ was one of them. That is the clear, direct path that helped me forge this relationship.

Q. How is it working with the Indian defence bureaucracy as a foreign firm?

Look, there are a lot of challenges that still need to be overcome. One simple thing is that we don't have a clear mechanism for obtaining clearances for non-Indian passport holders to work with the Indian Ministry of Defence. The bureaucracy in India can also be difficult. It takes 18-24 months to even move things forward, right? It is an absolute pain to work with the bureaucracy; there are no questions about that. Transparency doesn't exist. In the US, when we submit a proposal, we know all the different steps we have to go through and all the decision makers along the way. All of that is very transparent in the US ecosystem. In India, I don't get to see that as a foreign company. And even Indian companies don't get to see it. I am advising a couple of founders in the Indian defence ecosystem, and one of them almost had to shut down his company. He built a prototype the army wanted to see built and showed it to them, and it took two years, literally two years, to get it certified. And that is just the certification; it does not mean he won a contract. He then has to wait again before he gets a contract.

I actually think India can become a great proving ground for defence technology. India has a geographical variety that no other country has, from swamps to deserts to high mountain ranges and everything in between, a wonderful range of landscapes on which to test technology. How can US companies leverage that? How does India make it much easier for US companies to come and do business that way?

Q. How do you see the China challenge from the Indian perspective? Can you compare the levels of progress from the two sides on defence AI?

My personal opinion is that, when it comes to AI applications for defence, the Chinese are like a third-year PhD student while India is still in elementary school. India's defence forces have the intent to get new technologies, but I don't know if they have the mechanisms they need. I was talking to a few flag officers in India. Since India doesn't have a proper battle management system today, they were looking at hardcore maps, paper maps, to see what was happening.

I'll tell you a story. Back in 2009-2010, I was starting another company in Atlanta, and I was approached by the Chinese Communist Party to build a company in Shanghai. They offered to provide me with resources and support for three years, including picking up the payroll costs, but I would need to build the company there. Obviously I declined, but you can see the way China was approaching this issue even in the 2010s. I was a big, vocal critic of this, and nobody in the US government was paying attention. What India needs today is a quantum leap in defence technology. India has leapfrogged in the past, as with the telecom sector, and it needs to do that again.

Q. So what exactly is working in the Indian defence tech ecosystem?

I think the iDEX platform launched by the Ministry of Defence is uncovering a lot of new technologies that are homegrown. iDEX is able to expose a lot of talent and technologies, and that is important because there is a lot of talent in India; there are a lot of smart people there from a technology point of view.

Q. How could India and America work more effectively together on this front?

India and the US are the linchpins of democratic society for the next century, and there is a threat to that. Using that as a foundation, there are things India and America can do to work more closely here. For example, can there be reciprocity around clearances? If somebody is cleared over here in America, can they be given priority, with India doing its own due diligence and providing a level of clearance that gives them access, or permission, to work on defence technology? That would be my first recommendation.

Secondly, I was born in India but I'm now an American citizen. There is always this gripe from the American side about India having a lot of Russian hardware. So how does interoperability work? Because India is not going to get rid of its Russian hardware. If we are providing our technology, how does it co-exist within the Indian ecosystem, next to a Russian system? That needs to be solved. I don't think there is a clear, easy solution. But can we define interfaces that are transparent and that protect sensitive secrets for both sides? That would be something to work on.

Edge AI deployment enables enterprises to save $2.07 million

Techday NZ

26-05-2025


An economic analysis by Latent AI has identified a shift within enterprise artificial intelligence from cloud-centric models to edge deployments, highlighting significant potential cost savings and scalability improvements for organisations. The study, entitled "From Cloud-First to Edge-First: The Future of Enterprise AI," presents 2025 as a pivotal year for this transition, citing rising cloud computing costs, ongoing shortages of GPUs, and increasing energy prices as factors making edge AI deployments more financially attractive.

Latent AI, which focuses on edge AI solutions for national security and defence, conducted the research to investigate both the challenges and the opportunities in adopting edge AI within industries where timely AI adoption is regarded as a key competitive factor.

Jags Kandasamy, Chief Executive Officer and Co-founder of Latent AI, said: "As enterprise leaders scale AI deployments, they must weigh performance gains against infrastructure investments when evaluating edge versus cloud strategies. With tighter budgets and growing demands for real-time processing, organizations can no longer afford the heavy computational costs of cloud-only solutions. This is where edge-optimized AI proves transformative. From Cloud-First to Edge-First explains how an edge-first approach reduces hardware requirements by 92% while preserving model accuracy, enabling broader AI deployment and enhancing competitive advantage."

The analysis assesses both direct costs, such as hardware and energy, and indirect benefits, such as operational continuity, deployment speed, and lifecycle management. To illustrate the economic impact, the report examines a manufacturing company that initially used a cloud-based AI system with 50 GPUs to process 100 image streams for defect detection across its production line. The company incurred hardware costs of USD $224,000 per site, which rendered expansion across multiple locations impractical due to financial constraints.

Applying edge AI optimisation and advanced quantisation techniques allowed the manufacturer to reduce its GPU requirement from 50 to 4 per site. This represented a 92% reduction, lowering hardware expenditure to USD $18,000 per deployment and saving USD $207,000 per site, or a total of USD $2.07 million if scaled across ten locations. Along with the decrease in hardware, memory use fell by 73%, from 14.1GB to 3.8GB per model, while inference latency fell by 73%, cutting defect detection time from 55.2 milliseconds to 14.7 milliseconds. The report noted that model accuracy remained nearly identical, with AU-ROC scores of 0.99127 versus 1.0. The efficiencies also extended to energy, with savings between 65% and 80%, and the elimination of network transfer costs as a result of data being processed locally.

The analysis identifies additional factors driving the economic inflection point for edge AI. Among these are optimisation technologies such as mature quantisation and pruning, which enhance edge model performance and reduce resource requirements; AI accelerators designed by companies like NVIDIA that allow powerful AI processing on resource-constrained devices; and the need for reliability in mission-critical deployments, where edge AI can offer resilience and eliminate dependence on cloud infrastructure.
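The report does not describe its optimisation pipeline in detail, so the following is only a rough illustration of the kind of post-training quantisation it refers to: a minimal PyTorch sketch in which the model is a stand-in classifier head, not the manufacturer's defect detector or Latent AI's tooling.

```python
# Illustrative sketch only: post-training dynamic quantization in PyTorch,
# showing the kind of INT8 weight compression the report describes.
import io

import torch
import torch.nn as nn

# Hypothetical stand-in model (dynamic quantization targets nn.Linear layers;
# convolution-heavy vision backbones would typically use static quantization
# or quantization-aware training instead).
model = nn.Sequential(
    nn.Linear(2048, 512),
    nn.ReLU(),
    nn.Linear(512, 2),  # e.g. defect / no-defect
).eval()

# Convert weights to INT8; activations are quantized on the fly at inference.
quantized = torch.quantization.quantize_dynamic(model, {nn.Linear}, dtype=torch.qint8)

def size_mb(m: nn.Module) -> float:
    """Serialize a model to an in-memory buffer and report its size in MB."""
    buf = io.BytesIO()
    torch.save(m.state_dict(), buf)
    return buf.getbuffer().nbytes / 1e6

print(f"fp32 weights: {size_mb(model):.2f} MB")
print(f"int8 weights: {size_mb(quantized):.2f} MB")  # roughly 4x smaller

# The inference API is unchanged.
with torch.no_grad():
    print(quantized(torch.randn(1, 2048)))
```

In practice, the GPU, memory, and latency reductions reported above would also depend on pruning, compilation for the target accelerator, and batching choices, none of which are shown here.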
The research also highlights the impact of evolving data privacy regulations, including those resembling GDPR, which increasingly favour local data processing to reduce both data transmission requirements and compliance concerns. More mature development frameworks are now available, which Latent AI says simplify edge deployment and make it accessible to a broader range of organisations.

Kandasamy commented further: "Technology shifts don't happen overnight. They build momentum until a tipping point emerges. For edge AI, 2025 is widely recognized as that moment, mirroring the rise of cloud computing in the early-to-mid 2000s. We're seeing technological maturity, economic pressures, and market needs align to drive rapid adoption, offering enterprises a rare chance to gain a lasting edge."

The report makes several recommendations for organisations seeking to leverage the cost advantages of edge AI. These include investing in infrastructure tailored to edge deployments, such as servers and IoT devices designed for scalability; optimising AI models for low-latency, energy-efficient performance at the edge; and balancing the use of cloud and edge resources, with a focus on training in the cloud and inference at the edge to reduce ongoing costs.
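One common way to act on that last recommendation, training in the cloud and running inference at the edge, is to export the trained model to a portable format and serve it with a lightweight runtime on the device. Below is a minimal sketch assuming a PyTorch training environment in the cloud and ONNX Runtime on the edge device; the network and file names are placeholders, not tools prescribed by the report.

```python
# Cloud side: train the model (training omitted here) and export it once.
import torch
import torchvision

model = torchvision.models.resnet18(weights=None).eval()  # stand-in network
dummy = torch.randn(1, 3, 224, 224)
torch.onnx.export(
    model, dummy, "detector.onnx",
    input_names=["input"], output_names=["logits"], opset_version=17,
)

# Edge side: a small inference runtime, no training stack required.
import numpy as np
import onnxruntime as ort

session = ort.InferenceSession("detector.onnx", providers=["CPUExecutionProvider"])
frame = np.random.rand(1, 3, 224, 224).astype(np.float32)  # one frame from an image stream
logits = session.run(["logits"], {"input": frame})[0]
print(logits.shape)  # (1, 1000) for this stand-in classifier
```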

The Edge-First Era Begins: How AI's Future Saves Millions and Amplifies Competitive Advantage

Associated Press

21-05-2025


New economic analysis reveals 92% hardware reduction with edge AI deployment, turning budget-busting pilots into scalable, profit-driven solutions

PRINCETON, N.J., May 21, 2025 /PRNewswire/ -- Enterprise artificial intelligence (AI) is poised for a significant economic shift, moving from cloud-centric models to edge-focused deployments that deliver substantial cost savings and scalability. "From Cloud-First to Edge-First: The Future of Enterprise AI," a new economic impact analysis by Latent AI, identifies 2025 as the critical inflection point where soaring cloud costs, persistent GPU shortages, and escalating energy expenses converge to make edge AI financially advantageous. As a leader in edge AI solutions for national security and defense, Latent AI conducted this research to explore implementation challenges and opportunities in industries where rapid AI adoption is essential for maintaining competitive advantage.

"As enterprise leaders scale AI deployments, they must weigh performance gains against infrastructure investments when evaluating edge versus cloud strategies," said Jags Kandasamy, CEO and Co-founder of Latent AI. "With tighter budgets and growing demands for real-time processing, organizations can no longer afford the heavy computational costs of cloud-only solutions. This is where edge-optimized AI proves transformative. From Cloud-First to Edge-First explains how an edge-first approach reduces hardware requirements by 92% while preserving model accuracy, enabling broader AI deployment and enhancing competitive advantage."

Download the Report: From Cloud-First to Edge-First: The Future of Enterprise AI

Economic Advantages: A Data-Driven Perspective

From Cloud-First to Edge-First: The Future of Enterprise AI builds a business case for edge-first AI strategies by examining direct costs like hardware and energy alongside indirect economic benefits, including operational continuity, deployment velocity, and lifecycle management. To illustrate these advantages, the analysis examines a large manufacturing company struggling with production waste and poor yield, a common industry challenge. Initially, the manufacturer implemented a cloud-connected system using 50 graphics processing units (GPUs) to run 100 image streams concurrently for anomaly detection throughout its production pipeline. With hardware costs alone reaching $224,000 per site, scaling this solution across multiple facilities was financially prohibitive.

The transformation began when the company adopted edge AI optimization. By applying advanced quantization techniques, it reduced its GPU requirements from 50 units to just four, a 92% reduction. This slashed hardware costs to $18,000 per deployment, generating savings of $207,000 per site, or $2.07 million across ten facilities. The efficiency gains extended beyond hardware: memory utilization decreased by 73% (from 14.1GB to 3.8GB per model) and inference latency fell by 73% (from 55.2ms to 14.7ms), enabling real-time defect detection, while accuracy remained virtually unchanged (AU-ROC 0.99127 vs. 1.0). With additional benefits of 65-80% energy savings and eliminated network transfer costs, edge AI converted an unsustainable pilot into a scalable, profit-generating solution.

From Cloud-First to Edge-First details edge AI's wider economic promise.
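The headline percentages can be reproduced from the raw figures quoted in the release; the following is a quick, purely illustrative arithmetic check using only the values reported above.

```python
# Sanity check of the reported reductions; all inputs are figures quoted in the release.
gpus_before, gpus_after = 50, 4
mem_before_gb, mem_after_gb = 14.1, 3.8
latency_before_ms, latency_after_ms = 55.2, 14.7
savings_per_site, sites = 207_000, 10

print(f"GPU reduction:     {(gpus_before - gpus_after) / gpus_before:.0%}")                     # 92%
print(f"Memory reduction:  {(mem_before_gb - mem_after_gb) / mem_before_gb:.0%}")               # 73%
print(f"Latency reduction: {(latency_before_ms - latency_after_ms) / latency_before_ms:.0%}")   # 73%
print(f"Savings across {sites} sites: ${savings_per_site * sites:,}")                           # $2,070,000
```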
Additional key drivers fueling the economic tipping point include: Strategic Recommendations for Enterprises Kandasamy adds, 'Technology shifts don't happen overnight. They build momentum until a tipping point emerges. For edge AI, 2025 is widely recognized as that moment, mirroring the rise of cloud computing in the early-to-mid 2000s. We're seeing technological maturity, economic pressures, and market needs align to drive rapid adoption, offering enterprises a rare chance to gain a lasting edge.' To leverage these economic advantages, the research recommends: Learn More: About Latent AI Latent AI delivers edge AI solutions that enable rapid deployment of artificial intelligence capabilities on any device. Founded in 2018, the company's developer platform helps government and commercial organizations implement efficient, secure AI solutions at the edge. Latent AI's tools enable developers to build and update secure, adaptive models for field or laboratory use, serving defense and commercial customers. For more information, visit View original content to download multimedia: SOURCE Latent AI
