
Latest news with #Axelera

Axelera AI Accelerators Smoke Competitors In Machine Vision Research Study

Forbes

08-07-2025

  • Business
  • Forbes

Axelera AI Accelerators Smoke Competitors In Machine Vision Research Study

[Image: Axelera CEO Fabrizio Del Maffeo holds the company's PCIe AI accelerator]

As AI-accelerated workloads proliferate across edge environments—from smart cities to retail and industrial surveillance—choosing the right inference accelerator has become a mission-critical decision for many businesses. In a new competitive benchmark study conducted by our analysts at HotTech Vision and Analysis, we put several of today's leading edge AI acceleration platforms to the test in a demanding, real-world scenario: multi-stream computer vision inference on high-definition video feeds. The study evaluated AI accelerators from Nvidia, Hailo and Axelera AI across seven object detection models, including SSD MobileNet and multiple versions of YOLO, to simulate a surveillance system with 14 concurrent 1080p video streams. The goal was to assess the real-time throughput, energy efficiency, deployment complexity and detection accuracy of these accelerators, all of which speak to a product's overall TCO (total cost of ownership) value proposition.

Measuring AI Accelerator Performance In Machine Vision Applications

All of the accelerators tested provided significant gains over CPU-only inference—some up to 30x faster—underscoring how vital dedicated hardware accelerators have become for AI inference. Among the tested devices, the PCIe and M.2 accelerators from Axelera showed consistently stronger throughput across every model, especially with the heavier YOLOv5m and YOLOv8l workloads. Notably, the Axelera PCIe card maintained its performance where several other accelerators tapered off, and it consistently smoked the competition across all model implementations tested.

[Chart: SSD MobileNet v2 machine vision inferencing results show Axelera in the lead]
[Chart: YOLOv5s results show the Axelera PCIe card winning hands-down, though Nvidia is more competitive]

That said, Nvidia's higher-end RTX A4000 GPU remained competitive in certain tests, particularly with smaller models like YOLOv5s. Hailo's M.2 module offered a compact, low-power alternative, though it trailed in raw throughput. Overall, the report illustrates that inference performance can vary significantly depending on the AI model and hardware pairing—an important takeaway for integrators and developers designing systems for specific image detection workloads. It also shows how dominant Axelera's Metis accelerators are in this very common AI inference use case versus major incumbent competitors like Nvidia.

Power consumption is an equally important factor, especially in edge AI deployments, where thermal and mechanical constraints and operational costs can limit design flexibility. Using per-frame energy metrics, our research found that all accelerators delivered improved efficiency over CPUs, with several using under one joule per frame of inferencing.

[Chart: SSD MobileNet v2 power efficiency results show Axelera's solutions winning in a big way]
[Chart: YOLOv5s power efficiency results show Axelera ahead, with Nvidia and Hailo closing the gap]

Here, Axelera's solutions outperformed competitors in all tests, offering the lowest energy use per frame in every AI model tested. Nvidia's GPUs closed the gap somewhat on the YOLO models, while Hailo maintained respectable efficiency, particularly for its compact form factor. The report highlights that AI performance gains do not always have to come at the cost of power efficiency, depending on the architecture, models and workload optimizations employed.
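The per-frame energy metric referenced above is simply average board power divided by sustained throughput. The short sketch below shows that arithmetic for a 14-stream workload; the device names, power draws and FPS figures are placeholders for illustration only, not results from this study.

    def joules_per_frame(avg_power_watts: float, throughput_fps: float) -> float:
        """Energy per inferred frame: power (joules/second) divided by frames/second."""
        return avg_power_watts / throughput_fps

    # Hypothetical measurements for a 14-stream 1080p workload; values are placeholders.
    measurements = {
        "pcie_accelerator": {"power_w": 12.0, "fps": 420.0},
        "discrete_gpu": {"power_w": 95.0, "fps": 610.0},
    }

    for name, m in measurements.items():
        jpf = joules_per_frame(m["power_w"], m["fps"])
        per_stream_fps = m["fps"] / 14  # throughput available to each of the 14 streams
        print(f"{name}: {jpf:.3f} J/frame, {per_stream_fps:.1f} FPS per stream")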
Beyond performance and efficiency, our report also looked at the developer setup process—an often under-appreciated element of total deployment cost. Here, platform complexity diverged more sharply. Axelera's SDK provided a relatively seamless experience, with out-of-the-box support for multi-stream inference and minimal manual setup. Nvidia's solution required more hands-on configuration due to model compatibility limitations with DeepStream, while Hailo's SDK was Docker-based but required model-specific pre-processing and compilation. The takeaway: development friction can vary widely between platforms and should factor into deployment timelines, especially for teams with limited AI or embedded systems expertise. Here again, Axelera's solutions demonstrated a simplicity of out-of-box experience and setup that the other solutions we tested could not match.

Our study also analyzed object detection accuracy using real-world video footage. While all platforms produced usable results, differences in detection confidence and object recognition emerged. Axelera's accelerators tended to detect more objects and draw more bounding boxes across test scenes, likely a result of model tuning and post-processing defaults that seemed more refined. Still, our report notes that all tested platforms could be further optimized with custom-trained models and threshold adjustments. As such, out-of-the-box accuracy may matter most for proof-of-concept development, whereas more complex deployments might rely on domain-specific model refinement and tuning. A sketch of what such a threshold adjustment looks like follows this article.

[Image: Axelera AI's Metis PCI Express card and M.2 module AI inference accelerators]

Our AI research and performance validation report underscores the growing segmentation in AI inference hardware. On one end, general-purpose GPUs like those from Nvidia offer high flexibility and deep software ecosystem support, which is valuable in heterogeneous environments. On the other, dedicated inference engines like those from Axelera provide compelling efficiency and performance advantages for more focused use cases. As edge AI adoption grows, particularly in vision-centric applications, demand for energy-efficient, real-time inference is accelerating. Markets such as logistics, retail analytics, transportation, robotics and security are driving that need, with form factor, power efficiency and ease of integration playing a greater role than raw compute throughput alone.

While this round of testing (you can find our full research paper here) favored Axelera on several fronts—including performance, efficiency and setup simplicity—this is not a one-size-fits-all outcome. Platform selection will depend heavily on use case, model requirements, deployment constraints and available developer resources. What the data does make clear is that edge AI inference is no longer a market exclusive to GPU acceleration. Domain-specific accelerators are proving they can compete, and in some cases lead, in the metrics that matter most for real-world deployments.
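As referenced above, the threshold adjustments mentioned in the accuracy discussion typically amount to filtering a detector's raw output by confidence score. The minimal sketch below illustrates that kind of post-processing; the Detection structure and the example threshold are hypothetical and not taken from any of the tested SDKs.

    from dataclasses import dataclass

    @dataclass
    class Detection:
        label: str
        confidence: float  # detector score in [0.0, 1.0]
        box: tuple         # (x1, y1, x2, y2) in pixels

    def filter_detections(detections, min_conf=0.4):
        """Keep only boxes whose confidence clears the deployment-specific threshold."""
        return [d for d in detections if d.confidence >= min_conf]

    # Raising min_conf draws fewer, higher-confidence boxes; lowering it does the
    # opposite, which is one reason out-of-the-box box counts differ between platforms.
    raw = [
        Detection("person", 0.92, (10, 20, 110, 220)),
        Detection("person", 0.31, (300, 40, 360, 180)),
    ]
    print(filter_detections(raw, min_conf=0.5))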

Axelera AI Launches Global Partner Program to Accelerate Development of Customer-Ready Edge AI Inference Solutions

Yahoo

12-06-2025

  • Business
  • Yahoo

Axelera AI Launches Global Partner Program to Accelerate Development of Customer-Ready Edge AI Inference Solutions

Broad ecosystem of leading ISVs, technology providers, systems integrators and channel partners to offer solutions based on Axelera technology

EINDHOVEN, Netherlands, June 09, 2025--(BUSINESS WIRE)--Axelera AI, the leading provider of purpose-built AI hardware acceleration technology for generative AI and computer vision inference at the edge, today launched the Axelera Partner Accelerator Network, a global partner program designed to accelerate the development of customer-ready solutions at the edge using Axelera technology. The program will provide training, co-marketing and technical support for a broad range of partners, creating a rich ecosystem of solution providers for customers in a variety of markets who want to transition proof-of-concept (POC) edge AI inference projects into full production. Founding partners include Aetina, Arduino, Astute, C&T Solution, Eurocomposant, Macnica ATD Europe, Rutronic, Seco, and Silicon Applications Group Corp (a member of WPG Holdings).

The global market for edge AI solutions is expected to reach $269.82 billion by 2032, a compound annual growth rate of 33.3%. In markets such as retail, industrial, manufacturing, security, healthcare and others, there is strong demand for high-performance, affordable edge AI solutions that can scale to deliver meaningful, near-term business impact. Axelera's Metis AI Processing Unit (AIPU) platform is now shipping and is well suited for broad adoption across industry segments, combining high performance, energy efficiency and affordability.

"Democratizing access to AI is a core principle of our company," said Axelera AI Chief Marketing Officer, Alexis Crowell. "With the launch of our Partner Accelerator Network, Axelera AI is bringing together the industry's most innovative minds to unlock the full potential of edge AI. By harnessing the network effects of collaboration, each partner's strength amplifies the others—creating a powerful ecosystem where shared innovation leads to exponential opportunity for all."

The ISVs, technology providers, system integrators, advisors and channel partners participating in the Partner Accelerator Network are well positioned to help customers scale their edge AI projects from POC to full-scale production with hardware and software solutions optimized to take full advantage of the Metis AI accelerator platform. At launch, the program will include more than 15 participants, joining an existing ecosystem of solutions from companies such as Lenovo, Dell, Advantech, Seco and Arduino.

"Our partnership with Axelera AI enables us to deliver sovereign, high-performance Edge AI technology tailored to the real-world needs of our industrial clients," says Mélanie Chupin, VP Communication & Marketing, Eurocomposant. "With direct training from Axelera, our teams support each project with agility and expertise, ensuring reliable, efficient, and production-ready embedded AI solutions."

Companies wishing to join Axelera's Partner Accelerator Network can visit Axelera AI Ecosystem | Axelera AI for more information.

About Axelera AI

Axelera AI is the leading provider of purpose-built AI hardware acceleration technology for AI inference, including computer vision and generative AI applications. Its first-generation product is the game-changing Metis™ AI platform – a holistic hardware and software solution for edge AI inference that delivers the world's highest performance and power efficiency at a fraction of the cost of alternative solutions.
Headquartered in the AI Innovation Center of the High Tech Campus in Eindhoven, The Netherlands, Axelera AI has R&D offices in Belgium, Switzerland, Italy and the UK, with more than 180 employees in 18 countries. Its team of experts in AI software and hardware hail from top AI firms and Fortune 500 companies.

Axelera AI Launches Global Partner Program to Accelerate Development of Customer-Ready Edge AI Inference Solutions

Business Wire

09-06-2025

  • Business
  • Business Wire

Axelera AI Launches Global Partner Program to Accelerate Development of Customer-Ready Edge AI Inference Solutions

EINDHOVEN, Netherlands--(BUSINESS WIRE)--Axelera AI, the leading provider of purpose-built AI hardware acceleration technology for generative AI and computer vision inference at the edge, today launched the Axelera Partner Accelerator Network, a global partner program designed to accelerate the development of customer-ready solutions at the edge using Axelera technology. The program will provide training, co-marketing and technical support for a broad range of partners, creating a rich ecosystem of solution providers for customers in a variety of markets who want to transition proof-of-concept (POC) edge AI inference projects into full production. Founding partners include Aetina, Arduino, Astute, C&T Solution, Eurocomposant, Macnica ATD Europe, Rutronic, Seco, and Silicon Applications Group Corp (a member of WPG Holdings).

The global market for edge AI solutions is expected to reach $269.82 billion by 2032, a compound annual growth rate of 33.3%. In markets such as retail, industrial, manufacturing, security, healthcare and others, there is strong demand for high-performance, affordable edge AI solutions that can scale to deliver meaningful, near-term business impact. Axelera's Metis AI Processing Unit (AIPU) platform is now shipping and is well suited for broad adoption across industry segments, combining high performance, energy efficiency and affordability.

'Democratizing access to AI is a core principle of our company,' said Axelera AI Chief Marketing Officer, Alexis Crowell. 'With the launch of our Partner Accelerator Network, Axelera AI is bringing together the industry's most innovative minds to unlock the full potential of edge AI. By harnessing the network effects of collaboration, each partner's strength amplifies the others—creating a powerful ecosystem where shared innovation leads to exponential opportunity for all.'

The ISVs, technology providers, system integrators, advisors and channel partners participating in the Partner Accelerator Network are well positioned to help customers scale their edge AI projects from POC to full-scale production with hardware and software solutions optimized to take full advantage of the Metis AI accelerator platform. At launch, the program will include more than 15 participants, joining an existing ecosystem of solutions from companies such as Lenovo, Dell, Advantech, Seco and Arduino.

'Our partnership with Axelera AI enables us to deliver sovereign, high-performance Edge AI technology tailored to the real-world needs of our industrial clients,' says Mélanie Chupin, VP Communication & Marketing, Eurocomposant. 'With direct training from Axelera, our teams support each project with agility and expertise, ensuring reliable, efficient, and production-ready embedded AI solutions.'

Companies wishing to join Axelera's Partner Accelerator Network can visit Axelera AI Ecosystem | Axelera AI for more information.

About Axelera AI

Axelera AI is the leading provider of purpose-built AI hardware acceleration technology for AI inference, including computer vision and generative AI applications. Its first-generation product is the game-changing Metis™ AI platform – a holistic hardware and software solution for edge AI inference that delivers the world's highest performance and power efficiency at a fraction of the cost of alternative solutions.
Headquartered in the AI Innovation Center of the High Tech Campus in Eindhoven, The Netherlands, Axelera AI has R&D offices in Belgium, Switzerland, Italy and the UK, with more than 180 employees in 18 countries. Its team of experts in AI software and hardware hail from top AI firms and Fortune 500 companies.

Dutch chipmaker AxeleraAI gets $66 million EU grant

Reuters

06-03-2025

  • Automotive
  • Reuters

Dutch chipmaker AxeleraAI gets $66 million EU grant

AMSTERDAM, March 6 (Reuters) - AxeleraAI, one of Europe's few companies making computer chips for artificial intelligence, has been awarded a grant of up to 61.6 million euros ($66 million) to develop a chip for use in data centres, as part of EU efforts to boost the sector.

Europe is trying to address an AI competitiveness gap with the United States and China, including by funding domestic chipmakers and building publicly funded data centres, referred to as "AI factories", accessible to European scientists, companies and startups.

"It's a moment of pride," said Axelera CEO Fabrizio Del Maffeo in a phone interview - and a chance for his company to expand its business. The Eindhoven, Netherlands-based firm won funding from EuroHPC - the agency overseeing the European Union's network of supercomputers and AI factories - to bring out a chip efficient at "inference" AI computing. Inference can be compared to the usage or "thinking" step in AI, which companies such as Google (GOOGL.O) and France's Mistral need once they have built a large model, like a brain, that is typically trained on Nvidia (NVDA.O) chips.

"We are not here to challenge Nvidia in the data centre space, in the training," Del Maffeo said. "But when the network is ready and you want to run it, we are developing a solution that can deliver extremely high performance ... we can do that."

The emergence of Chinese large AI model DeepSeek, which claimed cutting-edge performance at a lower cost, may increase demand for inference computing as AI models become more affordable.

Axelera's new Titania chip will be built on the open-source RISC-V standard, which is gaining traction in the auto industry and in China as an alternative to systems dominated by Intel (INTC.O) and Arm. Axelera's current chip, Metis, is used in "edge AI" applications outside data centres, such as inside factories analysing CCTV footage to identify safety lapses.

Axelera has previously raised $200 million from investors including Samsung since it was founded in 2021.
