
Docker Unifies Container Development And AI Agent Workflows
Docker, Inc. has positioned itself as the central orchestration platform for AI agent development, standardizing how developers build, deploy and manage intelligent applications through its enhanced compose framework and new infrastructure tools.
Streamlining Agent Development Through Familiar Workflows
Docker recently extended its compose specification to include a new 'models' element, allowing developers to define AI agents, large language models and Model Context Protocol tools within the same YAML files they already use for microservices. This integration eliminates the fragmented development experience that has plagued enterprise AI projects, where teams often struggle to move beyond proof-of-concept phases.
The enhancement enables developers to deploy complete agentic stacks with a single 'docker compose up' command, treating AI agents as first-class citizens alongside traditional containerized applications. This approach addresses a fundamental challenge in enterprise AI development: the disconnect between experimental AI workflows and production deployment pipelines.
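As a rough sketch of what this looks like in practice (the service and model names here are illustrative, and the exact schema should be checked against Docker's Compose documentation), a compose file can now declare a model alongside a containerized agent:

```yaml
services:
  agent:
    build: .        # your agent application, e.g. a LangGraph app
    models:
      - llm         # binds the model defined below into this service

models:
  llm:
    model: ai/smollm2   # illustrative model reference, served locally
```

A single `docker compose up` then starts the agent container and wires it to the model endpoint, with no separate model-serving setup.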
Multi-Framework Integration Strategy
Docker's approach centers on supporting multiple AI agent frameworks simultaneously, rather than favoring a single solution. The platform now integrates with LangGraph, CrewAI, Spring AI, Vercel AI SDK, Google's Agent Development Kit and Embabel. This framework-agnostic strategy reflects Docker's understanding that enterprise environments require flexibility to adopt different AI technologies based on specific use cases.
The integration allows developers to configure different frameworks within the same compose file, enabling hybrid agent architectures. For instance, a financial services application might use LangGraph for complex reasoning workflows while employing CrewAI for multi-agent coordination tasks.
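A hybrid setup of this kind is, in compose terms, just two services that happen to embed different frameworks in their images while sharing the same model definition. A minimal sketch (service names, build paths and the model reference are all hypothetical):

```yaml
services:
  reasoning-agent:      # e.g. a LangGraph-based reasoning workflow
    build: ./reasoning
    models: [llm]
  crew-coordinator:     # e.g. a CrewAI multi-agent coordination service
    build: ./crew
    models: [llm]

models:
  llm:
    model: ai/llama3.2  # illustrative; both services share this model
```

The framework choice lives entirely inside each service's own image, which is what makes the framework-agnostic strategy workable: compose only handles wiring and lifecycle.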
Cloud Infrastructure and Scaling Capabilities
Docker Offload represents a significant infrastructure investment, providing developers with access to NVIDIA L4 GPUs for compute-intensive AI workloads. The service charges $0.015 per GPU minute after an initial 300 free minutes, positioning it as a development-focused solution rather than a production hosting service.
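Given the published rate, the development-versus-production economics are easy to estimate. A minimal sketch, using only the two figures from the article ($0.015 per GPU minute, 300 free minutes); the function name is our own:

```python
def offload_cost(gpu_minutes: int, free_minutes: int = 300,
                 rate_per_minute: float = 0.015) -> float:
    """Estimate Docker Offload GPU cost in dollars.

    Pricing figures come from the article: $0.015 per GPU minute
    after an initial 300 free minutes.
    """
    billable = max(0, gpu_minutes - free_minutes)
    return round(billable * rate_per_minute, 2)

# A 10-hour (600-minute) development session: 300 billable minutes
print(offload_cost(600))   # 300 * $0.015 = $4.50
# A session inside the free tier costs nothing
print(offload_cost(200))
```

At roughly $0.90 per GPU hour after the free tier, the pricing clearly targets iterative development sessions rather than always-on production inference.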
The company has established partnerships with Google Cloud and Microsoft Azure, enabling seamless deployment to Cloud Run and Azure Container Apps, respectively. This multi-cloud approach ensures organizations can leverage their existing cloud investments while maintaining consistency in their development workflows.
Security and Enterprise Readiness
Docker's MCP Gateway addresses enterprise security concerns by providing containerized isolation for AI tools and services. The gateway manages credentials, enforces access controls and provides audit trails for AI tool usage, addressing compliance requirements that often block enterprise AI deployments.
The platform's security-by-default approach extends to its MCP Catalog, which provides curated and verified AI tools and services. This curation process addresses supply chain security concerns that have emerged as AI components are integrated into production systems.
Implementation Challenges and Considerations
Despite the streamlined development experience, organizations face several implementation challenges. The complexity of managing multiple AI frameworks within a single environment requires sophisticated dependency management and version control practices. Cold start latencies in containerized AI applications can introduce a few seconds of delay, requiring careful optimization strategies.
Enterprise adoption also requires addressing data governance and model management practices. While Docker's platform simplifies deployment, organizations must still establish practices for model versioning, performance monitoring, observability and cost management across different AI workloads.
Key Takeaways
Docker's multi-framework approach represents a bet on ecosystem diversity rather than standardization around a single AI framework. This strategy acknowledges that enterprise AI applications will likely require multiple specialized tools rather than monolithic solutions. The platform's success depends on maintaining interoperability between different AI frameworks while providing consistent deployment and management experiences.
The introduction of Docker Offload also signals Docker's expansion beyond traditional containerization into cloud infrastructure services. This evolution positions the company to capture more value from AI workloads while maintaining its focus on developer experience and workflow integration.
For technology decision-makers, Docker's AI agent platform provides a mechanism to standardize AI development practices while maintaining flexibility in framework choice. The platform's emphasis on familiar workflows and existing tool integration reduces the learning curve for development teams, potentially accelerating AI adoption timelines within enterprise environments.
Related Articles


Forbes
Founded On Technology Innovation, AT&T Is Charting A Data And AI Future
How does an iconic American company that has been synonymous with technology innovation for nearly 150 years prepare to grow and thrive in an AI future? This is the question that I posed to Andy Markus, Chief Data and AI Officer at AT&T, a company that was essentially founded in 1876 when Alexander Graham Bell invented the telephone. For nearly a century and a half, AT&T has been a pioneer in technology innovation.

Markus joined AT&T in 2020, having held technology leadership and transformation positions at leading media companies including WarnerMedia, Turner Broadcasting and Time. In his role as Chief Data and AI Officer, Markus supports the consumer and business lines of the $122 billion (as of the end of 2024) company, as well as back-office functions ranging from finance to legal to HR. 'We're responsible for developing and executing the data and AI strategy and governance for the firm,' notes Markus. He adds, 'The big hat we wear is execution. We work across the firm horizontally to help all parts of the business. We solve their challenges with a data and AI first mindset.' The scope of responsibility of the Chief Data and AI Office is magnified by the size and scale of the data that AT&T manages.

AT&T has a long history with AI, dating back to pioneering work at Bell Labs, the former R&D arm renowned for groundbreaking innovations including the invention of the transistor in 1947. Bell Labs revolutionized modern electronics and computing and played a pivotal role in the early development of AI. 'AT&T has a very rich history with AI. I like to use the line from Hamilton – "we were in the room where it happened". AT&T was right there when the term artificial intelligence was created,' comments Markus. He adds, 'We have a rich history of technology innovation at AT&T. We recently ranked sixth in U.S. companies with AI patents and continue to turn out a considerable volume of intellectual property resulting from generative AI and agentic AI.'

As has been the case throughout its long history, AT&T continues to pioneer technology innovation, now using AI. 'AI is a core part of the AT&T mandate and how AT&T runs its businesses,' says Markus. 'We still have the spirit of Bell Labs.' He adds, 'It's remarkable that we're one degree removed from somebody that worked with John Tukey, the legendary Bell Labs mathematician and statistician that I studied in school.'

The emergence of generative AI and agentic AI in the past few years has been accelerating transformation within AT&T. Markus notes, 'We recognized that generative AI would bring AI to everyone. Instead of having AI being run exclusively by technical people, we are creating a general-purpose AI that can apply to areas where we have never used AI before.' AT&T currently runs over 600 traditional machine learning and AI models in production across the firm, cutting across many lines of business. Markus explains, 'Where we were leveraging traditional or classical AI to run the business, now we're integrating every part of the firm and reimagining the things that we can do using generative AI.'

Generative AI is also being employed by AT&T to help manage its data. 'Data functionality using generative AI is great for complex analytics. We are working with hard, complex, messy data sets,' notes Markus. He continues, 'When we apply generative AI technology to a curated data product, the accuracy skyrockets. Generative AI technology enables us to do things that are at human level or actually exceed what can be done at a human level.' Markus adds, 'We are building on a foundation of being great with data, great with classical or traditional AI, and now great with generative AI and agentic AI. Each element builds off each other and complements each other.'
He notes that AT&T has created over 2,000 generative AI use cases that have been submitted for internal review.

Delivering business value from AI is central to AT&T's business mandate. 'An AI first mindset starts with understanding the business needs,' notes Markus. He continues, 'Once we understand what these needs are, we work to automate processes and make the lift lighter for the development community so that they can do their work faster.' Markus adds, 'Our partners in the business are truly the experts at creating a business case. Whatever you do, you've got to integrate with existing systems. We help evaluate the cost of the solution, and then we work to understand the benefit. That's where we really work hand-in-hand with our business partners.'

The AI use cases that AT&T is developing cut across the firm. Markus notes that the very first thing AT&T did was to bring together the risk organizations of the company – legal, compliance, privacy and security – to develop a unified approach to govern how the company should invest and execute in AI capabilities across the organization. He continues, 'We partnered with the business units to create a transformation program for this new era of AI.' AT&T has established a transformation office which reviews each use case for its business value to the organization. Markus adds, 'We prioritize our use cases and work on those that will have the most value for the company, working closely with the CFO office.'

The result of these efforts is that AI is driving business value for AT&T across business lines. In one example, AT&T has developed a complex fraud detection system. Markus explains, 'Your phone is a very expensive piece of equipment. The bad guys want to find a way to get your information that's on it.' He continues, 'To address this, we created a very complex fraud detection system with well over 30 models, both generative AI models and traditional AI models, that protect customers from fraud.'

AT&T is also using AI to manage robocalls. Markus comments, 'When I started with AT&T, one of the top complaints from our customers was robocalls. At this point, by using AI we're detecting these earlier and blocking them.'

AT&T is also applying AI to deliver business value through its Ask AT&T platform. Markus explains, 'One of the areas where we've been successful in using generative AI to take human language and turn it into computer language and do complex analytics is with our Ask AT&T platform.' He elaborates, 'At the very beginning of generative AI, we saw that generative AI was going to touch all of our employees, so we created a formal AI policy.' Over 100,000 employees and contractors now have access to the Ask AT&T platform. Markus adds, 'We are leading the pack in using AI to drive value. We do it in a very measured way across the firm.'

Another example is dispatch optimization. AT&T operates one of the largest vehicle fleets in the United States, comprising over 50,000 vehicles and over 700 million possible routes on a given day. The company has developed an AI application that optimizes the dispatch process. Markus notes, 'The benefit of the application is good for society because by being very efficient, we're saving carbon emissions. We now have over 100 million pounds of emissions saved since we started this program by reducing the miles driven. We don't always need to send technicians to homes when people call in. We've used AI to become much smarter on how we solve issues proactively, which saves a technician dispatch in many situations.'

Managing data as a business asset is core to the success of the AI transformation taking place at AT&T. 'There is an enormous amount of data that flows over the AT&T network every day – close to 900 petabytes of data that come over the network every day,' says Markus. He explains, 'Our data must be safe and secure. We have this concept of a data product, which in our view is a curated set of raw data.'
Markus adds, 'We need to do the right thing with how we manage our customer data to fully adhere to regulations and to drive business value for the company and our customers.'

For most organizations, and particularly century-old companies, transformation and change are seldom easy. The greatest challenges these firms face almost always relate to business culture and an organization's readiness to adapt. Markus notes, 'Culture is one of the driving factors for us. It is where many organizations get stuck.' He elaborates, 'We are a 150-year-old company, so inevitably there will be some parts of the business that ask whether they can really benefit from leveraging AI.' To address some of the cultural challenges, AT&T has established AI training programs so that employees can start to understand how AI can augment their daily activities. AT&T now has employee education programs through which 50,000 employees have completed AI training, and new trainings are continually being added.

Markus notes that support for AI starts at the top, explaining, 'We have strong top-down support, beginning with our CEO, John Stankey. He has been a great leader, a coach and a person that evangelizes that AI is a technology we are going to embrace as a company.' While leadership from the top of the organization is essential, Markus notes that support at all levels of the company is required to ensure successful adoption of AI.

AT&T is preparing for an AI future. This entails staying abreast of the latest AI capabilities, including agentic AI. Markus explains, 'Agentic AI is connecting together a workflow that actually could be deterministic, to break down bigger problems into smaller problems that can be solved more accurately, while having the ability to take action as part of the workflow.' He adds, 'People often try to make agentic AI sound like more than it is, but it really is breaking down the big problem into smaller problems so you can solve these more accurately, with the ability to take action based on your decisions.'

Looking to the future of AI at AT&T, Markus observes, 'There's a lot of hype around whether there will be value coming from our AI investments. I think everyone's seeing that the technologies we're working with, such as generative AI and its evolution into agentic AI, are going to change things.' He comments, 'As we talk about the benefit of generative AI, for every dollar we invested, in that same year, we returned 2X ROI. This was free cash flow impacting ROI from multiple-year business cases. A 2X return is going to grow to a run rate that contributes to AT&T's overall run-rate savings target of $3 billion by the end of 2027.' Markus continues, 'What's different this time is that we don't see the wall as we have with other technology evolutions, where you had a general idea of where the end was going to be while you were in the middle of it. In the case of AI, I don't think we see this wall yet. The wall keeps moving, if there even is a wall. That is something that I think is super exciting to be a part of.'

He concludes, 'AT&T is an exciting place to work because the scale and complexity of our data is extremely unique. We see adoption across the board in using and reimagining how we do our work with an AI and data-driven mindset as a way to get to the next stage. We have business teams that you would never expect to be learning how to use AI that are doing so. They are chomping at the bit, knocking on our door to ask how they can be using AI to deliver better products and services to our customers. That's a great place to be!'


Forbes
Startup Morelle Markets 15-Minute-Charge E-Bike—Tech Powers Robots, Too
Morelle e-bikes promise 15-minute superfast charging.

E-bikes tend to be hooked up to power overnight or throughout the day while at work, but for the founders of startup e-bike brand Morelle, this trickle charging is too slow. The Californian company has developed a $3,000 urban e-bike that can go from flat to almost full in minutes, rather than hours. The battery technology they're using isn't bought in; it's proprietary and, later this year, is set to power humanoid robots.

Morelle's lithium-ion battery tech uses silicon instead of graphite. Compared to graphite, silicon stores up to ten times more energy, so using silicon powder instead of graphite for anodes—the part that releases electrons during discharge—can significantly improve a battery's energy density. Morelle isn't the only company using such silicon anodes. The black powder already powers the five-day battery life of the latest Whoop activity-tracking wearable, and it's the same kind of nanoscale powder that American companies such as Sila say could enable ten-minute recharges for electric cars. And it's not just a case of swapping one anode powder for another; there's a significant amount of chemical complexity built into the process.

Morelle's technology was developed by battery scientist Kevin Hays, whose PhD was in silicon anodes, and who cofounded the company with tech development specialist Michael Sinkula. They previously worked together at Ionblox, a battery company specializing in large-format pouch cells using pre-lithiated silicon-dominant anodes for electric vertical take-off and landing (eVTOL) air taxis, a fiercely competitive sector yet to scale. The pair left to found Morelle, teaming up with legendary bicycle designer and entrepreneur Gary Fisher.

Fisher, a notable San Francisco bicycle racer in the late 1960s, was one of the Marin County pioneers who, racing down a steep fire road near the forest town of Fairfax on modified 1930s Schwinn paperboy bikes known as 'clunkers', codeveloped the product and sport of mountain biking in the mid- to late 1970s with other legends such as Joe Breeze. Spotting a commercial opportunity to promote and sell clunkers more widely, Fisher coined the term 'mountain bike' in 1979 and, in the same year, cofounded MountainBikes, the first company to specialize in the manufacture of this type of 'balloon tire' off-road bicycle. Just 160 were sold in the first year. Fisher went on to found Gary Fisher Mountain Bikes, a brand later acquired by Trek Corporation. Fisher worked as a brand ambassador for Trek until 2022.

Hays and Sinkula started working with Fisher in 2023, appointing him as Morelle's chief bike designer. 'We didn't know if [Fisher] would be interested in e-bikes at all,' said Hays, 'but he was super excited about what we're doing, and was very forward thinking, and wanted to get involved with what we were proposing.'

Shrink it

And what they were proposing was an e-bike that was lighter than other e-bikes and didn't look like one, either. 'We wanted to shrink the battery to get to a point where you're not carrying around all this excess battery,' said Hays. 'We wanted the bike to feel more like a bike, and less like an e-bike. When you're worried about range and charging times, you end up putting large batteries on the bike, but that makes the bike heavy and cumbersome.'

Morelle principals Gary Fisher, left, with Michael Sinkula and Kevin Hays.

Morelle bikes will be lighter and slimmer than traditional e-bikes, said Hays. And they'll charge much faster. E-bike batteries typically charge at a rate of 100-300W, while Morelle's can charge through standard wall outlets at 1000-1200W. Using a proprietary wall charger will boost this recharge rate to 1500W, a rate that could prove attractive for fleet operators. 'We wanted to move away from the idea that you have to leave your e-bike battery charging for four to eight hours, perhaps even unattended,' said Hays. 'With our bike, you'll charge for 10 or 15 minutes, so you're not sitting there worrying about leaving a battery charging for a long time.'

Hays and Sinkula formed their own brand after being rejected by e-bike battery companies such as Bosch. 'We were hoping that the bike industry would be more forward thinking on [new technologies],' said Sinkula, who has a biomedical engineering background. 'But they were not forward-thinking at all. That pushed us to pursue this.' Self-funded so far, Morelle will be seeking venture capital, Sinkula said.

In addition to its own e-bikes, Morelle is also supplying batteries to Under Control Robotics (UCR), a US supplier of humanoid robots for deployment in challenging work environments such as construction, energy and mining. 'Performance-wise, the kind of battery pack we're putting in the e-bike is almost identical to what's required for untethered robotics,' said Sinkula. '[Makers of humanoid robots] are also somewhat restricted to volume and weight constraints, so energy density is also an important factor.'

Morelle—derived from Morty and Ellie, the names of the dogs owned by the Hays and Sinkula families—is a direct-to-consumer brand with an initial focus on the US. The company's aluminum bikes will be manufactured in Taiwan. Pre-orders are now being taken, and the first batch of Morelle's $3,000 e-bikes will be available early next year, with a limited production run of 1,000 units. Morelle's batteries will be used in UCR's robots set for commercial release in the fall.


Forbes
Are The San Antonio Spurs Gearing Up For A Trade This Coming Season?
Interim head coach Mitch Johnson of the San Antonio Spurs looks on against the Golden State Warriors in the first quarter at Chase Center on April 9, 2025 in San Francisco, California.

The San Antonio Spurs had a very strong draft, selecting Dylan Harper and Carter Bryant, both of whom could become foundational pieces moving forward. In free agency and on the trade market, however, the Spurs made some decisions that now, collectively, seem curious.

Spending big on backup centers

San Antonio first signed Luke Kornet to a four-year deal worth $41 million, although just under $24 million of it is guaranteed. They then traded for Kelly Olynyk via the Washington Wizards, taking on his $13.4 million for 2025-2026 on top of it. That's $24.4 million for both of them this season, which is a fair bit of financial cheddar when you consider neither will have a starring role, as both play behind Victor Wembanyama and Jeremy Sochan. Of course, one can argue that Olynyk in particular should be able to play alongside the star Frenchman, but it nevertheless seems optimistic to expect Olynyk to play an enormous number of minutes given that he's 34 and played just over 20 minutes per game last season. As such, it's fair to wonder if the Spurs are gearing up for a mid-season trade if they're in the hunt for a playoff spot.

Endless possibilities

The Spurs have a plethora of trade assets, both in the form of player contracts and draft selections, so it would make sense if the team is actively looking to make a move near the February trade deadline, especially by using the contracts of Kornet or Olynyk to facilitate a deal. After all, investing over $24 million in backup big men seems at best optimistic, even if it's only for one year.

It thus lends itself to the theory that the Spurs did it to keep their options open. Last trade deadline, San Antonio acquired point guard De'Aaron Fox, and it isn't inconceivable that they're planning an acquisition of similar quality to pair with Fox and Wembanyama. They can easily match salaries for most players, as Keldon Johnson's $17.5 million and Harrison Barnes' $19 million are also movable, and then align value via draft selections.

This isn't to say the Spurs are looking to dramatically accelerate their timeline. The organization has shown patience, even after the selection of Wembanyama, and seems intent on creating a long-term competitive window, which is the right play. It'll be interesting to see how they play this over the next 6-12 months.

Unless noted otherwise, all stats via PBPStats, Cleaning the Glass or Basketball-Reference. All salary information via Spotrac. All odds courtesy of FanDuel Sportsbook.