
Latest news with #AlfredoRamos

Clarifai Unveils "AI Runners": Unmatched Flexibility for Local AI Deployment

Cision Canada · Business · July 8, 2025

WASHINGTON, July 8, 2025 /CNW/ -- Clarifai, a global leader in AI and pioneer of the full-stack AI platform, today announced the launch of AI Runners, a groundbreaking new offering designed to give developers and MLOps engineers uniquely flexible options for deploying and managing their AI models. AI Runners enable users to connect models running on their local machines or private servers directly to Clarifai's robust and scalable platform via a seamless, publicly accessible API, offering a "best of both worlds" solution for modern AI development.

The rise of agentic AI systems capable of "thinking for themselves," setting goals, using tools with protocols like MCP, and solving complex problems is driving an unprecedented demand for computing power. AI Runners directly address this challenge by providing a cost-effective and secure way to manage the escalating demands of modern AI workloads.

"Agentic AI is driving significant compute demands, and AI Runners provide a practical, secure solution for every developer. It's essentially ngrok for AI models, letting you build on your current setup and keep your models exactly where you want them, yet still get all the power and robustness of Clarifai's API for your biggest agentic AI ideas," said Alfredo Ramos, Chief Product and Technology Officer at Clarifai. "Our goal is to make advanced AI development – from your first line of code to a deployed application – genuinely easier and more budget-friendly for creators everywhere."

AI Runners offer distinct advantages for developers and enterprises:

  • Unmatched Flexibility & Control: Clarifai is the only platform that currently lets developers run their models or MCP tools anywhere – on a local development machine, an on-premises server, or a private cloud cluster – and connect them to the Clarifai API without complex networking. This allows users to keep sensitive data and custom models within their own environment and leverage existing compute infrastructure without vendor lock-in.
  • Supercharge Your Models with a World-Class Platform: AI Runners allow instant serving of custom models through Clarifai's scalable, publicly accessible API, enabling integration into any application (a minimal API-call sketch follows this release). Users can build complex multi-step AI workflows by chaining local models with thousands of models available on the Clarifai platform, all managed and monitored from a unified dashboard.
  • A Streamlined and Economical Path to Production: AI Runners simplify the development workflow, making powerful AI development accessible and cost-effective: start locally, then scale seamlessly to production in Kubernetes-based compute clusters on Clarifai. Onboarding is straightforward and pricing is transparent, allowing users to start small and scale their operations as needed. Customers can save even further by leveraging Clarifai's Compute Orchestration across multiple clouds and on-premises Kubernetes clusters, which enables traffic-based autoscaling to and from zero, request batching, spot instances, and GPU fractioning so more AI workloads can run on the same hardware.

Pricing & Availability: Clarifai is introducing a new Developer Plan at a promotional price of $1 per month for the first year (standard price: $10 per month). This plan empowers developers to use AI Runners and provides access to hundreds of the latest AI models available on the Clarifai platform to create sophisticated AI workloads. Clarifai AI Runners are available now. For more information and to get started, visit the Clarifai website.
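The announcement's central claim is that a model served by an AI Runner is consumed like any other model on the platform, through Clarifai's publicly accessible API. As a rough, caller-side illustration only, the sketch below sends a text prompt to a model over Clarifai's public v2 REST endpoint. The user ID, app ID, and model ID are hypothetical placeholders, and the assumption that a locally connected AI Runner model is addressed through the same /outputs endpoint as a hosted model is ours, not a detail confirmed by the release.

```python
# Minimal sketch: calling a model through Clarifai's public v2 REST API.
# Assumptions (not from the announcement): the placeholder IDs below, and that a
# model connected via an AI Runner is reachable through the same /outputs endpoint
# as any platform-hosted model.
import os
import requests

CLARIFAI_PAT = os.environ["CLARIFAI_PAT"]   # personal access token
USER_ID = "your-user-id"                    # hypothetical placeholder
APP_ID = "your-app-id"                      # hypothetical placeholder
MODEL_ID = "your-local-runner-model"        # hypothetical placeholder

url = (
    f"https://api.clarifai.com/v2/users/{USER_ID}/apps/{APP_ID}"
    f"/models/{MODEL_ID}/outputs"
)
payload = {"inputs": [{"data": {"text": {"raw": "Summarize today's build logs."}}}]}

resp = requests.post(
    url,
    headers={
        "Authorization": f"Key {CLARIFAI_PAT}",
        "Content-Type": "application/json",
    },
    json=payload,
    timeout=30,
)
resp.raise_for_status()
result = resp.json()

# For text models the generated output typically sits under outputs[0].data;
# the exact field depends on the model type.
print(result["outputs"][0]["data"])
```

The point of the sketch is only that the application side stays unchanged: the runner-side setup that connects a local machine to the platform is handled by Clarifai's own tooling and is not shown here.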
About Clarifai
Clarifai is a global leader in AI and the pioneer of the full-stack AI platform that helps organizations, teams, and developers build, deploy, and operationalize AI at scale. Clarifai's cutting-edge AI platform supports today's modern AI technologies like Large Language Models (LLMs), Large Vision Models (LVMs), Retrieval Augmented Generation (RAG), automated data labeling, high-volume production inference, and more. Founded in 2013, Clarifai is available in cloud, on-premises, or hybrid environments and has been used to build more than 1.5 million AI models with more than 400,000 users in 170 countries. Learn more at

Clarifai Joins Vultr Cloud Alliance to Deliver Scalable, Cost-Optimized, Full-Stack AI

Cision Canada · Business · May 20, 2025

WASHINGTON, May 20, 2025 /CNW/ -- Clarifai, a global leader in AI and pioneer of the full-stack AI platform, today announced it has joined the Vultr Cloud Alliance. This collaboration enables enterprises to build, deploy, and scale AI workloads with enhanced flexibility and control over performance, governance, and cost, leveraging Clarifai's platform and Vultr's global high-performance cloud infrastructure.

The collaboration brings together Clarifai's full-stack AI platform, which covers everything from model development to deployment and governance, with Vultr's global reach, security, regulatory compliance (including HIPAA, SOC 2+, and more), and operational excellence. Vultr's infrastructure includes high-performance CPUs, managed Kubernetes through Vultr Kubernetes Engine (VKE), managed databases like Apache® Kafka, scalable storage, bare metal servers, and a wide choice of the latest AMD and NVIDIA GPUs, offering optimal price-to-performance. Together, Clarifai and Vultr offer organizations the ability to run any model in any environment with complete control over performance and governance, and up to 90% cost savings.

"Combining capabilities with these leading industry partners means customers can now deploy and manage their AI workloads efficiently across Vultr's global cloud, gaining full control over costs and performance while getting access to a broader range of GPUs," said Alfredo Ramos, Chief Product & Technology Officer at Clarifai. "This is about enabling all clouds, all compute, and all AI models on one platform."

By working together, joint Clarifai and Vultr customers can save at least 70% on the cost of the NVIDIA A100 80 GB compared to hyperscalers, with potential for greater savings through a longer-term commitment. Customers can purchase A100 GPUs individually or in blocks of eight.

Key highlights of the partnership include:

  • Any AI Model, Any GPU: Users can deploy any open-source, foundation, or custom AI model, including Clarifai's own, across Vultr's extensive GPU lineup, such as AMD Instinct™ MI300X, MI325X, and NVIDIA HGX™ B200, HGX™ H100, A100 PCIe, and L40S. This allows optimization for performance, power efficiency, or cost, supporting AI workloads from inference to fine-tuning.
  • Unified Compute Orchestration: Clarifai's compute orchestration allows deploying any model in a secure, scalable, containerized environment managed via a single interface (a deployment sketch follows this release). Models are deployed across Vultr resources using managed Kubernetes clusters or bare metal servers, with dynamic provisioning and automatic scaling via Vultr Kubernetes Engine (VKE). Built-in governance provides centralized visibility over performance, cost, and access, simplifying AI operations and improving efficiency.
  • Edge AI Capabilities: Clarifai's edge AI platform enables deploying lightweight, high-performance models directly to edge devices, including air-gapped and offline environments. Combined with Vultr's global footprint of 32 data center regions reaching 90% of the global population with low latency (2-40 ms), this delivers real-time intelligence at the data source. This is particularly valuable for use cases like predictive maintenance, industrial quality control, public safety, and content moderation.

"Our partnership with Clarifai is exactly what the Vultr Cloud Alliance is all about—bringing together best-of-breed technologies to give customers real choice, real performance, and real value. Clarifai's full-stack AI platform paired with Vultr's global GPU infrastructure means organizations can build and deploy AI models faster, scale efficiently, and reduce cost. It's a practical, high-impact solution for teams looking to take control of their AI workloads—whether in the cloud, at the edge, or across hybrid environments," said Kevin Cochrane, Chief Marketing Officer, Vultr.

Joint solutions are applicable across various industries, including:

  • Energy, Aerospace, and Manufacturing: Implement predictive maintenance and improve asset management using AI for visual inspection, edge AI, and global infrastructure.
  • Media and Entertainment: Accelerate AI workloads for content moderation, metadata generation, and asset management on GPU instances. Leverage real-time image, video, and document analysis and state-of-the-art AI models for full motion video and sports analytics.
  • Defense and Public Safety: Deploy AI models for security surveillance, object detection, and domain awareness in secure, air-gapped, or edge environments.

Organizations ready to enhance their AI deployments can get started today at preferential pricing and reserve their GPU cluster via: For more information on the Vultr Cloud Alliance, visit here.

About Clarifai
Clarifai is a global leader in AI and the pioneer of the full-stack AI lifecycle and orchestration platform that helps organizations create and control AI workloads in any environment with a unified platform. With over a decade of experience supporting millions of custom models and billions of operations for the largest enterprises and governments, Clarifai pioneered compute innovations like custom scheduling, batching, GPU fractioning, and autoscaling. Clarifai empowers users to efficiently run any model, anywhere, at any scale. Learn more at

About Vultr
Vultr is on a mission to make high-performance cloud infrastructure easy to use, affordable, and locally accessible for enterprises and AI innovators around the world. Vultr is trusted by hundreds of thousands of active customers across 185 countries for its flexible, scalable, global Cloud Compute, Cloud GPU, Bare Metal, and Cloud Storage solutions. Founded by David Aninowsky and self-funded for over a decade, Vultr has grown to become the world's largest privately held cloud infrastructure company. Learn more at
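The "Unified Compute Orchestration" highlight above describes containerized models running on managed Kubernetes clusters such as VKE with dynamic provisioning and automatic scaling. As a rough sketch of what such a GPU-backed deployment looks like at the Kubernetes level, the snippet below uses the official Kubernetes Python client against a kubeconfig exported from a VKE cluster. The image name, labels, namespace, and resource figures are hypothetical, and in practice Clarifai's orchestration layer creates and scales these resources rather than requiring hand-written manifests.

```python
# Minimal sketch: a containerized, GPU-backed model deployment on a managed
# Kubernetes cluster (e.g. VKE), built with the official Kubernetes Python client.
# Image name, labels, namespace, and GPU count are hypothetical placeholders;
# Clarifai's compute orchestration would normally provision and scale this for you.
from kubernetes import client, config

config.load_kube_config()  # kubeconfig downloaded from the cluster's control panel

container = client.V1Container(
    name="model-server",
    image="registry.example.com/acme/llm-server:latest",  # hypothetical image
    ports=[client.V1ContainerPort(container_port=8080)],
    resources=client.V1ResourceRequirements(
        requests={"cpu": "4", "memory": "16Gi", "nvidia.com/gpu": "1"},
        limits={"nvidia.com/gpu": "1"},  # one GPU per replica
    ),
)

deployment = client.V1Deployment(
    api_version="apps/v1",
    kind="Deployment",
    metadata=client.V1ObjectMeta(name="llm-server", labels={"app": "llm-server"}),
    spec=client.V1DeploymentSpec(
        replicas=1,  # an autoscaler would adjust this with traffic
        selector=client.V1LabelSelector(match_labels={"app": "llm-server"}),
        template=client.V1PodTemplateSpec(
            metadata=client.V1ObjectMeta(labels={"app": "llm-server"}),
            spec=client.V1PodSpec(containers=[container]),
        ),
    ),
)

# Create the deployment in the (hypothetical) default namespace.
client.AppsV1Api().create_namespaced_deployment(namespace="default", body=deployment)
```

The GPU request ("nvidia.com/gpu") is what ties a replica to a whole accelerator; features the release attributes to Clarifai's orchestration, such as GPU fractioning and scale-to-zero, go beyond what this plain manifest expresses.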
