Latest news with #CrewAI


Forbes
6 days ago
- Business
- Forbes
Docker Unifies Container Development And AI Agent Workflows
Docker, Inc. has positioned itself as the central orchestration platform for AI agent development, standardizing how developers build, deploy and manage intelligent applications through its enhanced compose framework and new infrastructure tools.

Streamlining Agent Development Through Familiar Workflows
Docker recently extended its compose specification to include a new 'models' element, allowing developers to define AI agents, large language models and Model Context Protocol tools within the same YAML files they already use for microservices. This integration eliminates the fragmented development experience that has plagued enterprise AI projects, where teams often struggle to move beyond proof-of-concept phases. The enhancement enables developers to deploy complete agentic stacks with a single 'docker compose up' command, treating AI agents as first-class citizens alongside traditional containerized applications. This approach addresses a fundamental challenge in enterprise AI development: the disconnect between experimental AI workflows and production deployment pipelines.

Multi-Framework Integration Strategy
Docker's approach centers on supporting multiple AI agent frameworks simultaneously, rather than favoring a single solution. The platform now integrates with LangGraph, CrewAI, Spring AI, Vercel AI SDK, Google's Agent Development Kit and Embabel. This framework-agnostic strategy reflects Docker's understanding that enterprise environments require flexibility to adopt different AI technologies based on specific use cases. The integration allows developers to configure different frameworks within the same compose file, enabling hybrid agent architectures. For instance, a financial services application might use LangGraph for complex reasoning workflows while employing CrewAI for multi-agent coordination tasks.

Cloud Infrastructure and Scaling Capabilities
Docker Offload represents a significant infrastructure investment, providing developers with access to NVIDIA L4 GPUs for compute-intensive AI workloads. The service charges $0.015 per GPU minute after an initial 300 free minutes, positioning it as a development-focused solution rather than a production hosting service. The company has established partnerships with Google Cloud and Microsoft Azure, enabling seamless deployment to Cloud Run and Azure Container Apps, respectively. This multi-cloud approach ensures organizations can leverage their existing cloud investments while maintaining consistency in their development workflows.

Security and Enterprise Readiness
Docker's MCP Gateway addresses enterprise security concerns by providing containerized isolation for AI tools and services. The gateway manages credentials, enforces access controls and provides audit trails for AI tool usage, addressing compliance requirements that often block enterprise AI deployments. The platform's security-by-default approach extends to its MCP Catalog, which provides curated and verified AI tools and services. This curation process addresses supply chain security concerns that have emerged as AI components are integrated into production systems.

Implementation Challenges and Considerations
Despite the streamlined development experience, organizations face several implementation challenges. The complexity of managing multiple AI frameworks within a single environment requires sophisticated dependency management and version control practices.
Cold start latencies in containerized AI applications can introduce a few seconds of delay, requiring careful optimization strategies. Enterprise adoption also requires addressing data governance and model management practices. While Docker's platform simplifies deployment, organizations must still establish practices for model versioning, performance monitoring, observability and cost management across different AI workloads.

Key Takeaways
Docker's multi-framework approach represents a bet on ecosystem diversity rather than standardization around a single AI framework. This strategy acknowledges that enterprise AI applications will likely require multiple specialized tools rather than monolithic solutions. The platform's success depends on maintaining interoperability between different AI frameworks while providing consistent deployment and management experiences. The introduction of Docker Offload also signals Docker's expansion beyond traditional containerization into cloud infrastructure services. This evolution positions the company to capture more value from AI workloads while maintaining its focus on developer experience and workflow integration.

For technology decision-makers, Docker's AI agent platform provides a mechanism to standardize AI development practices while maintaining flexibility in framework choice. The platform's emphasis on familiar workflows and existing tool integration reduces the learning curve for development teams, potentially accelerating AI adoption timelines within enterprise environments.
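The "CrewAI for multi-agent coordination" half of the hybrid architecture mentioned above is easiest to picture with a concrete agent definition. The sketch below is illustrative only and is not from Docker's announcement: the agent roles, task wording and financial-services scenario are assumptions, and running it inside a container requires an LLM credential (for example OPENAI_API_KEY).

```python
# Minimal CrewAI sketch of the coordination piece of the financial-services
# example above. Roles and task text are illustrative; an LLM credential
# (e.g. OPENAI_API_KEY) must be available in the environment to run it.
from crewai import Agent, Task, Crew, Process

analyst = Agent(
    role="Market analyst",
    goal="Summarize risk exposure in the latest transaction batch",
    backstory="Reviews raw market and transaction data for anomalies.",
)

reviewer = Agent(
    role="Compliance reviewer",
    goal="Check the analyst's summary against internal policy",
    backstory="Flags findings that need human sign-off.",
)

analysis = Task(
    description="Analyze the latest transaction batch and list the top risks.",
    expected_output="A short ranked list of risk items.",
    agent=analyst,
)

review = Task(
    description="Review the risk list for policy violations and unclear items.",
    expected_output="An approved or annotated version of the risk list.",
    agent=reviewer,
)

crew = Crew(agents=[analyst, reviewer], tasks=[analysis, review], process=Process.sequential)

if __name__ == "__main__":
    print(crew.kickoff())
```

In a compose-based setup of the kind the article describes, a crew like this would simply run as one more service, with the model it calls declared alongside it.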


Business Wire
16-07-2025
- Business
- Business Wire
General Assembly Partners with CrewAI to Launch AI Agent-Building Workshop Series
NEW YORK--(BUSINESS WIRE)--General Assembly, the leading talent and upskilling community, and CrewAI, a multi-agent AI platform, today announced they will partner on a new workshop series designed to teach both technical and nontechnical professionals how to build and deploy AI agents. The partnership addresses the growing demand for practical AI implementation skills with a free, one-hour introductory webinar and a comprehensive three-hour hands-on workshop serving professionals across skill levels, with tailored learning tracks.

'Every business wants to use AI agents, but many lack the practical skills to implement agentic solutions effectively,' said Daniele Grassi, CEO of General Assembly. 'By combining our proven practitioner-led instructional model with CrewAI's platform, we're providing professionals with real skills that go beyond the theoretical understanding provided by most AI trainings.'

The workshop series includes two components:
- Introduction to Agentic AI with CrewAI: A free, one-hour introductory webinar will show participants how AI agents solve real business problems, with no coding required. This component is designed for professionals who want to understand how AI agents can transform workflows and will offer practical frameworks to assess organizational AI readiness.
- Build Your First AI Agent: This comprehensive workshop will take learners from AI curious to agent builders in just three hours. During the course, participants will design, build and deploy their first AI agent team using real business scenarios. Tracks will be available for both technical (Python-based) and nontechnical professionals, who will walk away with the confidence to bring AI agents to their organization.

CrewAI's platform powers systems across 60% of the U.S. Fortune 500 and is used by developers in 150+ countries, making it the ideal foundation for learning how to build AI agents.

'We're excited to partner with General Assembly to bring hands-on agentic AI education to the next generation of builders,' said João Moura, CEO of CrewAI. 'At CrewAI, we believe multi-agent systems will redefine how work gets done, and this curriculum empowers learners to design, deploy and scale real-world CrewAI workflows from day one: simple to start, reliable in results and built to scale.'

These workshops are part of General Assembly's AI Academy, a comprehensive training program to address the widening AI skills gap threatening enterprise transformation initiatives.

About General Assembly
General Assembly (GA) is the leading talent and upskilling community that helps individuals and businesses acquire the real skills required to succeed in an increasingly complex technological era. Founded in 2011 to make tech-centric jobs accessible to anyone and meet the demand of fast-growing tech companies, GA evolved into a center of excellence in training people from all backgrounds to upgrade their practical knowledge of the tech skills now required in every company and in any role. With a global presence, hands-on instruction, and a passionate alumni community, GA gives learners 360-degree support as they take the next step in their career journey. As part of the Adecco Group and partner of premier talent solutions provider LHH, GA matches the right talent to business needs. All day, every day: GA puts real skills to work.

About CrewAI
CrewAI is the leading multi-agent enterprise platform, powering systems across 60% of the U.S. Fortune 500 and used by developers in 150+ countries. The platform enables organizations to deploy sophisticated, collaborative groups of AI agents to automate real-world business workflows. CrewAI offers the infrastructure teams need to run autonomous systems in production, with the features enterprises require, including low-code tools, user management, governance and security. CrewAI integrates with all major LLMs, hyperscalers (AWS, Azure & Google Cloud), and 1,000+ enterprise applications.
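For readers curious what a "first AI agent" in the Python track might look like, here is a minimal, hypothetical CrewAI sketch. It is not General Assembly's curriculum; the role, task wording and the need for an LLM credential such as OPENAI_API_KEY are assumptions made for illustration.

```python
# A minimal, hypothetical "first agent" in CrewAI: one agent, one task, one crew.
# Not the workshop curriculum; requires an LLM credential such as OPENAI_API_KEY.
from crewai import Agent, Task, Crew

assistant = Agent(
    role="Research assistant",
    goal="Answer one well-scoped business question clearly",
    backstory="A junior analyst who writes short, sourced summaries.",
)

question = Task(
    description="Summarize three ways AI agents could speed up customer onboarding.",
    expected_output="Three bullet points, one sentence each.",
    agent=assistant,
)

crew = Crew(agents=[assistant], tasks=[question])

if __name__ == "__main__":
    print(crew.kickoff())
```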


Business Upturn
10-07-2025
- Business
- Business Upturn
Docker Brings Agentic Apps to Life with New Compose Support, Cloud Offload, and Partner Integrations
By GlobeNewswire Published on July 10, 2025, 14:30 IST

BERLIN, July 10, 2025 (GLOBE NEWSWIRE) — Docker, Inc.®, a provider of cloud-native and AI-native development tools, infrastructure, and services, today announced major new capabilities that make it dramatically easier for developers to build, run, and scale intelligent, agentic applications. By extending Docker Compose to support agents and AI models, introducing Docker Offload for cloud-scale execution, and collaborating with cloud providers like Google Cloud and Microsoft Azure and AI SDKs like CrewAI, Embabel, LangGraph, Spring AI, and Vercel AI SDK, Docker is delivering on its mission to simplify complex technology and empower developers.

'Agentic applications are rapidly evolving, but building production-grade agentic systems is still too hard,' said Tushar Jain, EVP of Engineering, Docker, Inc. 'Just like Docker democratized microservices a decade ago, we're now making agentic apps accessible to every developer by making agent-based development as easy, secure, and repeatable as container-based app development has always been. The next wave of software is powered by intelligent agents, and Docker makes it easy to turn that potential into real, running applications.'

These advancements are more than just new tools. They help solve one of the biggest challenges facing developers today: moving agentic applications from local prototypes to secure and scalable production environments.

Docker Compose Enters the Agent Era
For over a decade, Docker Compose has been the go-to tool used by millions of developers for defining and running multi-container applications. Now, Docker is extending Compose into the agent era, enabling developers to define intelligent agent architectures consisting of models and tools in the same simple YAML files they already use for microservices and take those agents to production. With the new Compose capabilities, developers can:
- Define agents, models, and tools as services in a single Compose file
- Run agentic workloads locally or deploy seamlessly to cloud services like Google Cloud Run or Azure Container Apps
- Integrate with Docker's open source Model Context Protocol (MCP) Gateway for secure tool discovery and communication
- Share, version, and deploy agentic stacks across environments without rewriting infrastructure code
This approach brings powerful agent orchestration into familiar workflows with no new languages or tools required.

'Making it just as straightforward for developers to take AI apps from prototype into production as it already is for regular code—that's the next big thing in app development,' said Torsten Volk, Principal Analyst at Enterprise Strategy Group. 'Expanding Docker Compose to give developers the same familiar, simple experience for AI deployments as they have for traditional apps is exactly what we need. Plus, the new capability to run AI models directly in the cloud—without clogging up your laptop—is another major step forward. This should make a real difference in how quickly enterprises can start adopting AI at scale.'

Introducing Docker Offload: Cloud Power, Local Simplicity
As agentic applications demand more GPU power for complex AI tasks, local machines frequently fall short of the necessary capacity, which has become a significant pain point for developers. To solve this, Docker today unveiled Docker Offload (Beta), a new capability that enables developers to offload AI and GPU-intensive workloads to the cloud without disrupting their existing workflows.
With Docker Offload, developers can:
- Maintain local development speed while accessing cloud-scale compute and GPUs
- Run large models and multi-agent systems in high-performance cloud environments
- Choose where and when to offload workloads for privacy, cost, and performance optimization
- Keep data and workloads within specific regions to meet sovereignty requirements and ensure data does not leave designated zones across the globe
Docker Offload integrates directly into Docker Desktop, preserving the familiar docker compose up experience while delivering cloud horsepower under the hood.

Built on a Thriving Ecosystem
Docker's agentic capabilities are launching alongside new integrations with leading cloud and AI platforms. Key partnerships include:
- Google Cloud: Deploy agentic applications to production via serverless environments with the new gcloud compose up command
- Microsoft Azure: Seamless deployments via Azure Container Apps, arriving soon
- Popular agent frameworks: Compose integrations now support CrewAI, Embabel, Google's ADK, LangGraph, Spring AI, Vercel AI SDK, and more

Steren Giannini, Director of Product Management, Google Cloud Run
'With Compose Spec support in Cloud Run, we're making it dramatically simpler to move sophisticated AI apps from local development straight to production. This collaboration brings the best of both worlds: Docker's local dev power combined with Cloud Run's serverless scale and reliability, all with one simple command.'

Scott Hunter, Vice President Director of Product, Azure Developer Experience
'Microsoft has been collaborating with Docker for many years, and we're pleased to see Docker extend the Compose Spec to support agent-based application development. We're working together to make agentic app deployment seamless on Microsoft Azure Container Apps—helping developers easily build and scale AI applications and agents from local dev to the cloud. These integrations help ensure developers can easily adopt Docker's agentic tooling alongside the frameworks they already use.'

Ram Venkatesh, CTO and Co-founder of
'Agent-based systems represent a transformative leap in how software interacts with the world: autonomous, goal-driven, and contextual. Docker's direction supporting agentic architectures is a major unlock for developers, making it radically easier to compose, scale, and iterate on multi-agent systems without compromising security or reinventing infrastructure. This is the kind of pragmatic innovation that accelerates agentic adoption for real-world use cases.'

Craig McLuckie, CEO and Founder of Stacklok and Co-Creator of Kubernetes
'Enterprises that want to lift up their knowledge workers and create powerful new customer experiences need to connect the right data to AI models at the right times, and that requires use of MCP. Docker has a critical role to play in facilitating adoption of MCP through familiar constructs like containers and Docker Compose. By working together, in the open, we can bring simplicity and security to MCP that will unlock real enterprise adoption.'

Availability
- Docker Compose enhancements for agentic applications are available today
- Docker Offload is available in closed beta for developers who request access
- Google Cloud Run integration is live; Azure Container Apps support is coming soon
- MCP Gateway and Docker Hub MCP Server are open source and ready for use

About Docker
Docker drives modern software development by making it easy to adopt container technology to radically boost productivity, security, testing, and collaboration at every step of the developer experience, including emerging AI workflows. Embraced by over 20 million developers worldwide, Docker's unmatched flexibility and choice make it the preferred tool for developers seeking efficiency and innovation for creating modern applications.

Disclaimer: The above press release comes to you under an arrangement with GlobeNewswire. Business Upturn takes no editorial responsibility for the same. GlobeNewswire provides press release distribution services globally, with substantial operations in North America and Europe.


Techday NZ
25-06-2025
- Business
- Techday NZ
Superwise launches AgentOps for secure & compliant AI agent management
SUPERWISE has announced the introduction of its open AgentOps platform, designed to provide real-time observability, control, and compliance for companies deploying third-party AI agents. The new solution is intended to address what the company describes as a significant gap in the industry, as businesses ramp up their deployment of AI agents without adequate measures for risk mitigation and operational oversight. The AgentOps platform seeks to centralise and secure the management of AI agents, serving companies that increasingly rely on varied and decentralised agent architectures.

Operational oversight
The AgentOps release enables enterprises to deploy, serve, and manage AI agents created using a range of proprietary and open-source development platforms. Through this initiative, SUPERWISE provides built-in capabilities for compliance, monitoring, and operational management, positioning its service as a component for responsible and scalable AI deployment. Russ Blattner, Chief Executive Officer at SUPERWISE, highlighted the current challenges in the AI landscape. "Building AI agents is only half the equation," he said. "The real challenge, and where organizations often stumble, is in managing them responsibly once they are live. This is precisely where SUPERWISE's expertise and leadership have consistently distinguished us - at the operational layer. With this launch, SUPERWISE is enabling teams to use the best open-source tools to build agents, while relying on our enterprise-grade infrastructure to govern, observe, and scale them safely."

Supporting diverse needs
The AgentOps platform is aimed at a broad spectrum of stakeholders within the enterprise. AI developers and engineers can continue using their preferred frameworks and tools while maintaining operational visibility and controlled workflows. Enterprise IT and AI leaders are provided with centralised management, allowing them to encourage innovation while avoiding dependency on single vendors. C-level executives are given tools to balance agility, governance, security, scalability, and cost. The development philosophy behind AgentOps includes support for open-source software and low-code solutions, as well as built-in integrations and community-driven tooling. According to the company, the platform currently supports the deployment and management of agents developed with the Flowise framework, with planned compatibility for additional third-party frameworks such as Dify, CrewAI, Langflow, and N8n.

Framework flexibility
Oren Razon, Senior Director of Product at SUPERWISE, commented on how the platform lets developers maximise the investment in their chosen tools. "Developers have their choices for open source frameworks. Rather than forcing them to switch in order to be governed, SUPERWISE allows developed agents to be run in our platform, which maximizes existing development investment without incurring the risks," he said. SUPERWISE claims that as the deployment of AI agents becomes more widespread, the need for integrated governance, risk management, and operational transparency will increase across industries. The AgentOps platform is positioned to offer enterprises a cohesive and extensible approach to agent oversight, aimed at supporting ongoing compliance requirements and auditability as regulatory frameworks evolve.
The company points to its experience in governance and operations as central to the new release, citing growing recognition across the industry that keeping AI systems secure and auditable is becoming as critical as building the agents themselves. SUPERWISE is recognised by analyst firms for its contributions to enterprise AI governance and MLOps, and frames its platform as offering integrated guardrails and compliance functionality for enterprises seeking to embed responsible AI within their operational processes.


Techday NZ
12-05-2025
- Business
- Techday NZ
Extend unveils open-source AI toolkit for smarter finance
Extend has released an open-source AI toolkit aimed at enhancing how businesses manage and analyse financial data. The toolkit supports multiple frameworks, including Anthropic's Model Context Protocol (MCP), OpenAI, native integration with LangChain, and compatibility with CrewAI to facilitate complex multi-agent workflows. The company states that this versatility allows businesses to incorporate Extend's API seamlessly into their existing AI-driven systems, enabling more advanced spend analysis and automated finance processes.

The toolkit is designed to offer flexibility to businesses, allowing them to interact with Extend while continuing to use their preferred banks or credit cards. Its intention is to help organisations adopt AI solutions tailored to their needs, supporting functions such as intelligent financial queries, custom reporting, and workflow automation.

Jonathan Bailey, Extend's Chief Technology Officer, commented on the motivation behind the toolkit: "When I started to explore the multitude of use cases for AI in our industry, I zeroed in on the power of 'agentic frameworks', and realised we could enable tools like Claude to interact directly with Extend via our APIs and immediately unlock extensive AI functionality for our customers."

Through the integration of these frameworks, users can query financial data using natural language, conduct advanced analytics, and generate custom reports. Automation powered by AI agents can manage tasks such as expense categorisation and budget tracking. Businesses will also be able to analyse spending patterns, identify cost-saving opportunities, and gain greater insight into areas such as cash flow, team spending, and overall budget allocations.

Andrew Jamison, Extend's Chief Executive Officer and co-founder, explained the broader company strategy: "At Extend, we believe in empowering businesses to do more with what they already have - whether that's credit lines, banking relationships, or software investments. With this AI toolkit, we're taking that mission to the next level, giving our customers the tools they need to make smarter, faster, and more informed decisions."

Extend indicated that development efforts will continue to focus on expanding AI automation features within its platform, in response to increasing demand from companies seeking more streamlined financial management solutions. Extend is a modern spend and expense management platform that helps businesses gain control over spending - without changing their existing bank or credit card programs. Thousands of companies use Extend to create and manage virtual cards, streamline payment workflows, and get real-time visibility into team and vendor spend. According to the company, Extend powers billions of dollars in transactions while partnering with the financial institutions businesses already trust.
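To make the CrewAI compatibility concrete, here is a hypothetical sketch of how an agent could call a spend-analysis tool. It does not use Extend's published toolkit: the fetch_spend_summary function, its stub data and the task wording are invented for illustration, and running it requires an LLM credential such as OPENAI_API_KEY. The tool decorator shown ships with recent CrewAI releases; in older versions it lived in the separate crewai_tools package.

```python
# Hypothetical sketch: exposing a spend-analysis lookup to a CrewAI agent.
# Extend's real API is not shown in the article; fetch_spend_summary is a
# stand-in, and the figures it returns are made up for illustration.
from crewai import Agent, Task, Crew
from crewai.tools import tool  # older CrewAI releases: from crewai_tools import tool


@tool("Spend summary")
def fetch_spend_summary(month: str) -> str:
    """Return a plain-text spend summary for the given month (stub data)."""
    # A real integration would call the finance API here; the stub keeps the
    # example self-contained and runnable without third-party credentials.
    return f"{month}: travel $12,400; software $8,950; meals $2,310"


analyst = Agent(
    role="Finance analyst",
    goal="Explain where the team's spend went last month",
    backstory="Turns raw card data into short summaries for budget owners.",
    tools=[fetch_spend_summary],
)

report = Task(
    description="Use the spend summary tool for 2025-06 and flag the largest category.",
    expected_output="Two sentences naming the largest spend category.",
    agent=analyst,
)

crew = Crew(agents=[analyst], tasks=[report])

if __name__ == "__main__":
    print(crew.kickoff())  # requires an LLM credential, e.g. OPENAI_API_KEY
```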