Latest news with #edgeAI

National Post
5 days ago
- Business
- National Post
Liquid AI Releases World's Fastest and Best-Performing Open-Source Small Foundation Models
Next-generation edge models outperform top global competitors; now available open source on Hugging Face

CAMBRIDGE, Mass. — Liquid AI announced today the launch of its next-generation Liquid Foundation Models (LFM2), which set new records in speed, energy efficiency, and quality in the edge model class. This release builds on Liquid AI's first-principles approach to model design. Unlike traditional transformer-based models, LFM2 is composed of structured, adaptive operators that allow for more efficient training, faster inference and better generalization – especially in long-context or resource-constrained scenarios.

Liquid AI open-sourced LFM2, introducing the novel architecture in full transparency to the world. LFM2's weights can now be downloaded from Hugging Face and are also available through the Liquid Playground for testing. Liquid AI also announced that the models will be integrated into its Edge AI platform and an iOS-native consumer app for testing in the coming days.

'At Liquid, we build best-in-class foundation models with quality, latency, and memory efficiency in mind,' said Ramin Hasani, co-founder and CEO of Liquid AI. 'LFM2 series of models is designed, developed, and optimized for on-device deployment on any processor, truly unlocking the applications of generative and agentic AI on the edge. LFM2 is the first in the series of powerful models we will be releasing in the coming months.'

The release of LFM2 marks a milestone in global AI competition and is the first time a U.S. company has publicly demonstrated clear efficiency and quality gains over China's leading open-source small language models, including those developed by Alibaba and ByteDance.

In head-to-head evaluations, LFM2 models outperform state-of-the-art competitors across speed, latency and instruction-following benchmarks. Key highlights:

- LFM2 exhibits 200 percent higher throughput and lower latency on CPU compared to Qwen3, Gemma 3n Matformer and every other transformer- and non-transformer-based autoregressive model available to date.
- The model is not only the fastest but also, on average, performs significantly better than models in each size class on instruction following and function calling (the main attributes of LLMs in building reliable AI agents). This makes LFM2 an ideal choice for local and edge use cases.
- LFMs built on this new architecture and the new training infrastructure show a 300 percent improvement in training efficiency over previous versions of LFMs, making them the most cost-efficient way to build capable general-purpose AI systems.

Shifting large generative models from distant clouds to lean, on-device LLMs unlocks millisecond latency, offline resilience, and data-sovereign privacy. These capabilities are essential for phones, laptops, cars, robots, wearables, satellites, and other endpoints that must reason in real time. Aggregating high-growth verticals such as the edge AI stack in consumer electronics, robotics, smart appliances, finance, e-commerce, and education, before counting defense, space, and cybersecurity allocations, pushes the TAM for compact, private foundation models toward the $1 trillion mark by 2035.

Liquid AI is engaged with a large number of Fortune 500 companies in these sectors.
They offer ultra-efficient small multimodal foundation models with a secure, enterprise-grade deployment stack that turns every device into an AI device, locally. This gives Liquid AI the opportunity to capture an outsized share of the market as enterprises pivot from cloud LLMs to cost-efficient, fast, private, and on-prem intelligence.
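For readers who want to try the open-sourced checkpoints mentioned above, here is a minimal sketch of loading an LFM2 model from Hugging Face with the transformers library. The repository id, prompt, and generation settings are illustrative assumptions rather than details from the announcement; check Liquid AI's Hugging Face organization page for the actual model names and any version requirements.

```python
# Minimal sketch: loading an LFM2 checkpoint from Hugging Face with transformers.
# The repo id "LiquidAI/LFM2-1.2B" and the prompt are assumptions for illustration;
# a recent transformers release may be required for the LFM2 architecture.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "LiquidAI/LFM2-1.2B"  # assumed repository id

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id)

# Generate a short continuation on CPU to mirror the on-device use case.
inputs = tokenizer("Edge AI lets devices reason locally because", return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```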
Yahoo
08-07-2025
- Business
- Yahoo
Advantech Unveils Next-Generation Edge AI Compute Solutions Powered by Qualcomm Snapdragon X Elite
TAIPEI, July 7, 2025 /PRNewswire/ -- Advantech is proud to introduce its latest suite of high-performance edge AI compute solutions powered by the Snapdragon® X Elite platform – the AOM-6731, AIMB-293, and SOM-6820. Built on this groundbreaking platform, these innovative products are engineered to meet the demanding requirements of modern industrial applications by delivering exceptional processing power, integrated AI acceleration with up to 45 TOPS of AI performance, and robust, lightning-fast 5G and Wi-Fi 7 connectivity in an industrial PC.

The solutions are powered by the 12-core Snapdragon X Elite and 10-core Snapdragon® X Plus platforms with leading Qualcomm Oryon™ CPUs, reaching speeds of up to 3.4GHz. This high-performance processing not only enables rapid data handling and seamless multitasking but also outperforms traditional x86 solutions, using 28% less power on average for everyday tasks, including Teams video calls, local video playback, web browsing, and Microsoft 365. Enhancing AI capabilities, these devices integrate the Qualcomm® Hexagon™ NPU, providing up to 45 TOPS.

The solutions built on Snapdragon X Elite platforms are equipped with LPDDR5X memory, offering a 1.3× speed boost (from 6400MT/s to 8533MT/s) while cutting power consumption by 20% compared to standard LPDDR5. In addition, the integration of UFS 3.1 Gear 4 storage dramatically increases data transfer speeds from 1,000Mbps (PCIe Gen3 NVMe) to an impressive 16,000Mbps. For even greater durability and shock resistance, UFS 4.0 storage solutions are available, ensuring optimal performance in harsh industrial environments.

For multimedia-intensive applications, the integrated Snapdragon Adreno 5th Generation VPU supports 4K60p full-duplex H.264 video encoding/decoding. Additionally, the Adreno GPU, with OpenCL, OpenGL, and Microsoft DirectX 12 support, ensures superior graphics performance for vision-centric tasks.

Advantech's products take connectivity to the next level with integrated Wi-Fi 7 and 5G technologies, delivering ultra-fast, low-latency network performance and ensuring uninterrupted data streaming and real-time communication even in the most demanding industrial settings. With Wi-Fi 7's multi-gigabit speeds and enhanced network reliability, combined with the expansive coverage and high-speed capabilities of 5G, these solutions support data-intensive AI applications and robust remote operations. The result is a truly agile and future-ready infrastructure that optimizes real-time processing and connectivity, empowering industries to harness the full potential of edge AI in today's fast-paced digital landscape.

The AI module AOM-6731, the Mini-ITX motherboard AIMB-293, and the COM Express Type 6 module SOM-6820 will be available for engineering evaluations starting in March 2025. For further details on these models, please visit the Advantech website.

SOURCE Advantech
Yahoo
07-07-2025
- Business
- Yahoo
Ambiq Micro, Inc. Announces Filing of Registration Statement For Proposed Initial Public Offering
AUSTIN, Texas, July 07, 2025 (GLOBE NEWSWIRE) -- Ambiq Micro, Inc. ('Ambiq'), a technology leader in ultra-low-power semiconductor solutions for edge AI, today announced that it has filed a registration statement on Form S-1 with the U.S. Securities and Exchange Commission (the 'SEC') relating to the proposed initial public offering of its common stock. The proposed offering is subject to market and other conditions, and there can be no assurance as to whether or when the proposed offering may be completed. The number of shares of common stock to be offered and the price range for the proposed offering have not yet been determined. Ambiq intends to apply to have its common stock listed on the New York Stock Exchange under the symbol 'AMBQ.'

BofA Securities and UBS Investment Bank will act as joint lead book-running managers for the proposed offering. Needham & Company and Stifel will act as joint book-running managers for the proposed offering. The proposed offering will be made only by means of a prospectus. When available, copies of the preliminary prospectus relating to the proposed offering may be obtained by contacting: BofA Securities, NC1-022-02-25, 201 North Tryon Street, Charlotte, North Carolina 28255-0001, Attention: Prospectus Department, or by email at or UBS Securities LLC, Attention: Prospectus Department, 1285 Avenue of the Americas, New York, New York 10019, by telephone at (888) 827-7275 or by emailing ol-prospectus-request@

A registration statement relating to these securities has been filed with the SEC but has not yet become effective. These securities may not be sold, nor may offers to buy be accepted, prior to the time the registration statement becomes effective. This press release shall not constitute an offer to sell or the solicitation of an offer to buy these securities, nor shall there be any sale of these securities in any state or jurisdiction in which such offer, solicitation or sale would be unlawful prior to registration or qualification under the securities laws of any such state or jurisdiction.

About Ambiq
Ambiq's mission is to enable intelligence (artificial intelligence (AI) and beyond) everywhere by delivering the lowest-power semiconductor solutions. Ambiq enables its customers to deliver AI compute at the edge, where power consumption challenges are the most profound. Ambiq's technology innovations, built on the patented and proprietary subthreshold power optimized technology (SPOT®), fundamentally deliver a multi-fold improvement in power consumption over traditional semiconductor designs. Ambiq has powered over 270 million devices to date.

Contact: Charlene Wan, VP of Corporate Marketing and Investor Relations, cwan@

A photo accompanying this announcement is available at
Yahoo
04-07-2025
- Business
- Yahoo
Edge AI Hardware Market worth $58.90 billion by 2030 - Exclusive Report by MarketsandMarkets™
DELRAY BEACH, Fla., July 4, 2025 /PRNewswire/ -- The edge AI hardware market is projected to reach USD 58.90 billion by 2030 from USD 26.14 billion in 2025, at a compound annual growth rate (CAGR) of 17.6%, according to a new report by MarketsandMarkets™.

A key market driver for edge AI hardware is the growing deployment of IoT devices across various industries, including smart homes, industrial automation, healthcare, and transportation. Many of these applications require real-time data processing so decision-making can occur locally rather than in the cloud. IoT deployments are also driving demand for lower latency, enhanced privacy, and stronger data security, since edge AI allows sensitive data to be processed locally on devices rather than sent elsewhere for external processing. Further, improvements in hardware architectures that enable energy-efficient designs and industry-specific solutions have significantly boosted market growth. The emergence of new high-performance processors and software platforms is altering operational workflows and broadening the scope of AI-based applications at the edge.

Download PDF Brochure:

Browse in-depth TOC on "Edge AI Hardware Market"
260 – Tables
70 – Figures
303 – Pages

Edge AI Hardware Market Report Scope:

| Report Coverage | Details |
| --- | --- |
| Market Revenue in 2025 | $26.14 billion |
| Estimated Value by 2030 | $58.90 billion |
| Growth Rate | Poised to grow at a CAGR of 17.6% |
| Market Size Available for | 2021–2030 |
| Forecast Period | 2025–2030 |
| Forecast Units | Value (USD Million/Billion) |
| Report Coverage | Revenue Forecast, Competitive Landscape, Growth Factors, and Trends |
| Segments Covered | By device, power consumption, processor, function, vertical, and region |
| Geographies Covered | North America, Europe, Asia Pacific, and Rest of World |
| Key Market Challenge | Optimizing power consumption in edge AI systems |
| Key Market Opportunities | Market potential for ultra-low-latency AI in 5G-powered edge infrastructure |
| Key Market Drivers | Rising emphasis on launching innovative AI co-processors for edge AI applications |

CPU processors to capture the most significant market share throughout the forecast period.

CPUs hold the largest market share among processors for edge AI hardware. Their versatility, scalability, and general-purpose nature allow them to handle a wide range of AI tasks in real time, as required by smart factories, autonomous devices, and industrial IoT applications. CPUs will see mass adoption due to the requirement for on-board computing that can process large data volumes and execute workloads simultaneously. In other words, CPUs are important processing engines for consumer edge AI platforms and crucial for enterprise applications requiring on-board computing. Although CPUs are not as specialized as other processors, they can support a heterogeneous mix of edge applications, whether a smartphone, a wearable device, a far more advanced industrial system, or a simple two-factor authentication application. In short, the adaptability and ubiquity of CPUs will keep them the dominant processing units for edge AI deployments, even as newer processor types such as GPUs and ASICs gain popularity for specialized applications.

Robots to witness the second-highest CAGR during the forecast period.
Driven by ongoing technological change and industry trends, robots will record the second-highest CAGR in the edge AI hardware market over the forecast period. As robotics increasingly combines artificial intelligence (AI) with edge computing, this convergence will continue to shift processing away from the cloud and toward near-real-time decision-making on the device. Most robots will leverage the growing capacity to process large data volumes locally, enabling them to make split-second decisions without depending on cloud infrastructure. Timely local choices are crucial for autonomous vehicles, industrial automation, and healthcare robotics applications, where milliseconds matter for safety and efficiency. Real-time edge AI decisions based on large datasets will allow robots to adapt to and learn from rapidly changing environments, and running advanced machine learning algorithms on the device will improve the accuracy of dynamic decision-making.

Inquiry Before Buying:

Asia Pacific to account for the most significant share of the edge AI hardware market.

Asia Pacific has the largest share of the edge AI hardware industry, driven by increasing adoption of IoT devices and significant investments in AI-driven technologies across countries such as China, Japan, South Korea, and India. The region also has a growing consumer electronics market (smartphones and wearables) that requires efficient AI processing, preferably at the edge. Major economies are turning to edge AI applications for smart homes, smart factories, healthcare, and autonomous vehicles, supported by strong government initiatives and collaborations with leading global technology companies. With the fastest CAGR and ongoing product and process innovation, the Asia Pacific region is likely to remain ahead in edge AI hardware as it caters to growing industries that value real-time analytics, data privacy, and reduced latency.

Major vendors in the edge AI hardware market include Qualcomm Technologies, Inc. (US), Huawei Technologies Co., Ltd. (China), SAMSUNG (South Korea), Apple Inc. (US), MediaTek Inc. (Taiwan), Intel Corporation (US), NVIDIA Corporation (US), IBM (US), Micron Technology, Inc. (US), and Advanced Micro Devices, Inc. (US).

Get 10% Free Customization on this Report:

Browse Adjacent Market: Semiconductor and Electronics Market Research Reports & Consulting

See More Latest Semiconductor Reports:

Embodied AI Market by Product Type [Robots (Humanoid Robots, Mobile Robots, Industrial Robots, Service Robots, Cobots), Exoskeletons, Autonomous Systems, Smart Appliances], Level of Embodiment (Level 1, Level 2, Level 3) - Global Forecast to 2030

AI Data Center Market by Offering (Compute Server (GPU-Based, FPGA-Based, ASIC-Based), Storage, Cooling, Power, DCIM), Data Center Type (Hyperscale, Colocation), Application (GenAI, Machine Learning, NLP, Computer Vision) - Global Forecast to 2030

About MarketsandMarkets™

MarketsandMarkets™ has been recognized as one of America's Best Management Consulting Firms by Forbes, as per their recent report. MarketsandMarkets™ is a blue ocean alternative in growth consulting and program management, leveraging a man-machine offering to drive supernormal growth for progressive organizations in the B2B space.
With the widest lens on emerging technologies, we are proficient in co-creating supernormal growth for clients across the globe. Today, 80% of Fortune 2000 companies rely on MarketsandMarkets, and 90 of the top 100 companies in each sector trust us to accelerate their revenue growth. With a global clientele of over 13,000 organizations, we help businesses thrive in a disruptive ecosystem.

The B2B economy is witnessing the emergence of $25 trillion in new revenue streams that are replacing existing ones within this decade. We work with clients on growth programs, helping them monetize this $25 trillion opportunity through our service lines – TAM Expansion, Go-to-Market (GTM) Strategy to Execution, Market Share Gain, Account Enablement, and Thought Leadership Marketing.

Built on the 'GIVE Growth' principle, we collaborate with several Forbes Global 2000 B2B companies to keep them future-ready. Our insights and strategies are powered by industry experts, cutting-edge AI, and our Market Intelligence Cloud, KnowledgeStore™, which integrates research and provides ecosystem-wide visibility into revenue shifts. To find out more, visit or follow us on Twitter, LinkedIn and Facebook.

Contact: Mr. Rohan Salgarkar, MarketsandMarkets™ INC., 1615 South Congress 103, Delray Beach, FL 33445, USA: +1-888-600-6441, Email: sales@

SOURCE MarketsandMarkets
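As a quick arithmetic check on the headline projection in the report summary above (USD 26.14 billion in 2025 growing to USD 58.90 billion by 2030), the short sketch below recomputes the implied compound annual growth rate. The figures come from the press release; the formula is the standard CAGR definition.

```python
# Recompute the CAGR implied by the figures quoted in the report summary:
# CAGR = (end_value / start_value) ** (1 / years) - 1
start_value = 26.14  # USD billion, 2025
end_value = 58.90    # USD billion, 2030
years = 5            # 2025 -> 2030

cagr = (end_value / start_value) ** (1 / years) - 1
print(f"Implied CAGR: {cagr:.1%}")  # ~17.6%, matching the reported growth rate
```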


Forbes
26-06-2025
- Business
- Forbes
Edge AI Applications As The Catalyst For AI PC Market Growth
Ajith Sankaran, Executive Vice President, C5i.

Despite all the buzz, adoption of high-performance AI PCs with powerful neural processing units (NPUs) has been especially sluggish. Since their launch in mid-2024, these devices have captured just 5% of AI PC market sales. This can be attributed to several factors:

• AI PCs typically command a significant price premium without clearly articulated benefits. Many users remain unconvinced that these costs translate to meaningful improvements in computing experiences.
• Compatibility concerns persist, particularly with first-generation advanced RISC machine (ARM)-based systems that may not support legacy software.
• There is a scarcity of software applications that fully harness AI PC capabilities.

According to a 2024 IDC report, the global market for personal computing devices was "set to grow 3.8% in 2024, reaching 403.5 million units." However, this growth is primarily driven by nearly double-digit growth in tablets. According to Jitesh Ubrani of IDC, 'There seems to be a big disconnect between supply and demand as PC and platform makers are gearing up for AI PCs and tablets to be the next big thing, but the lack of clear use cases and a bump in average selling prices has buyers questioning the utility.'

I believe the answer to realizing the potential of AI PCs in enterprise scenarios lies in understanding and utilizing edge AI. To understand why, let's take a closer look at how these systems operate.

Edge AI And Its Relationship With AI PCs

Edge AI represents the convergence of AI and edge computing, enabling AI algorithms to run directly on local devices rather than in remote data centers. This approach processes data where it's generated, eliminating the need to send information to the cloud for analysis and returning results almost instantaneously. AI PCs are well-positioned to serve as powerful edge AI platforms due to their unique hardware architecture. They integrate three processing components:

• A central processing unit (CPU) for general computing tasks.
• A graphics processing unit (GPU) for parallel processing workloads.
• A neural processing unit (NPU) optimized for AI computations.

This triad of capabilities allows AI PCs to handle edge AI applications with efficiency. The performance benefits can be substantial; security company CrowdStrike reported that its software's CPU consumption dropped from 35% to 1% when running on machines equipped with Intel NPUs. Global shipments of AI PCs are projected to reach 114 million units in 2025, accounting for 43% of all PC shipments. I believe that edge AI incorporating the latest advances in generative AI and agentic AI could provide tangible benefits that justify the premium pricing of AI PCs for consumers and enterprises. As more developers create software that leverages NPUs and other specialized AI hardware, the value proposition should become clearer, driving increased adoption across both consumer and enterprise segments.

Emerging Edge AI Applications Driving AI PC Demand

• Manufacturing Intelligence

Manufacturing environments are proving to be fertile ground for edge AI applications. AI systems running locally on AI PCs can monitor equipment health in real time, detecting anomalies and predicting potential failures before they occur. This can reduce costly downtime. Quality control represents another application. AI-powered cameras connected to edge computing systems can inspect products for defects with precision and consistency.
• Healthcare Innovations

The healthcare sector also stands to benefit from edge AI. Portable diagnostic devices equipped with edge AI can analyze medical images such as X-rays, MRIs, and CT scans locally, providing rapid insights without requiring cloud connectivity. This is particularly valuable in remote areas. And wearable health devices using edge AI can analyze biometric data locally, detect anomalies and alert healthcare providers without transmitting sensitive patient information to remote servers.

• Retail Transformation

In retail, edge AI applications are revolutionizing operations and customer experiences. AI-powered cameras and sensors can track inventory levels in real time, optimizing stock replenishment. The same infrastructure can analyze customer behavior patterns, enabling retailers to deliver personalized recommendations and promotions. These capabilities require significant local processing power, which AI PCs can provide to analyze video feeds and sensor data in real time.

• Security and Privacy Protection

Edge AI can deliver faster performance while keeping sensitive data local instead of sending it to cloud services. For example, Bufferzone NoCloud "uses local NPU resources to analyze websites for phishing scams using computer vision and natural language processing." Edge AI applications can enhance banking security by detecting unusual transactions and immediately alerting users.

Recommendations For Effective AI PC and Edge AI Adoption

1. Develop edge-native AI applications for real-time decision-making. Prioritize building edge-native AI applications that leverage the NPUs in your organization's AI PCs to execute machine learning models locally (a minimal sketch follows after this list). For example, manufacturing firms can deploy vision systems on AI PCs to perform real-time quality inspections directly on production lines, reducing defect rates while eliminating cloud dependency.

2. Deploy agentic AI systems for autonomous workflow optimization. Agentic AI excels at autonomously managing complex, multi-step processes. In supply chains, agentic AI systems running on AI PCs can dynamically reroute shipments based on real-time traffic data processed at the edge, reducing delivery delays. Financial institutions can also combine agentic AI with edge computing to autonomously monitor transactions for fraud patterns, triggering immediate alerts while keeping sensitive financial data localized.

3. Implement privacy-centric AI architectures for regulated industries. Consider adopting hybrid edge-cloud AI architectures to balance computational demands with regulatory compliance. For example, banks can deploy on-premise AI PC clusters to run agentic AI fraud detection systems, ensuring customer transaction data never leaves internal networks.

4. Build scalable edge AI infrastructure with modular hardware. Invest in AI-optimized hardware ecosystems that support both current and emerging workloads. For instance, consider deploying AI PCs with dedicated NPUs for employee productivity tools and pairing them with edge servers containing GPU/TPU arrays for heavy computational tasks.

5. Integrate generative AI with edge computing for adaptive systems. By fusing generative AI with edge computing, you can enable dynamic system adaptation within your company. For example, manufacturers can deploy small language models on AI PCs to generate equipment repair instructions tailored to real-time sensor data, reducing machine downtime.
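To make recommendation 1 concrete, here is a minimal sketch of running a small model locally with ONNX Runtime, preferring an NPU-backed execution provider when one is available and falling back to the CPU otherwise. The model file name, input shape, and provider preference order are illustrative assumptions; which provider actually applies (for example Qualcomm's QNN provider or DirectML) depends on the AI PC's hardware and the onnxruntime build installed.

```python
# Minimal sketch: local inference on an AI PC, preferring an NPU execution
# provider if the installed onnxruntime build exposes one. The model file
# "classifier.onnx" and its 1x3x224x224 input shape are assumed for illustration.
import numpy as np
import onnxruntime as ort

preferred = ["QNNExecutionProvider", "DmlExecutionProvider", "CPUExecutionProvider"]
available = ort.get_available_providers()
providers = [p for p in preferred if p in available] or ["CPUExecutionProvider"]

session = ort.InferenceSession("classifier.onnx", providers=providers)

# Feed a dummy input matching the model's expected shape (assumed here).
input_name = session.get_inputs()[0].name
dummy = np.random.rand(1, 3, 224, 224).astype(np.float32)
outputs = session.run(None, {input_name: dummy})
print("Ran on:", session.get_providers()[0], "output shape:", outputs[0].shape)
```

Because inference stays on the device, the sensitive inputs (camera frames, documents, transaction records) never leave the machine, which is the privacy argument the article makes for edge AI on AI PCs.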
Conclusion While initial adoption of AI PCs has been slow due to high costs, compatibility issues and a lack of applications, the emergence of edge AI use cases is beginning to demonstrate the value of local AI processing. As developers increasingly leverage NPUs to build edge-native and agentic AI solutions, I believe the value proposition of AI PCs will become more evident, driving broader adoption across consumer and enterprise markets. Forbes Business Council is the foremost growth and networking organization for business owners and leaders. Do I qualify?