
Databricks unveils Agent Bricks to streamline enterprise AI agents
Agent Bricks works by taking a high-level task description from users, connecting it with enterprise data, and then handling the remaining steps of the agent-building process: generating synthetic data, running performance benchmarks, and optimising the resulting AI agent.
The product is built using research from Mosaic AI and is currently available in Beta. It is aimed primarily at common business needs, such as knowledge assistance, information extraction, and the orchestration of multiple AI agents working together. Databricks has included built-in governance and enterprise controls intended to enable teams to implement Agent Bricks without assembling different components from multiple vendors.
Features and automation
Agent Bricks relies on synthetic data generation and automated evaluation to streamline the tuning of AI agents. According to Databricks, the workflow starts with automatic creation of task-specific assessments and large language model (LLM) judges to measure output quality. Synthetic data that matches the user's domain is then created to train and test the agent. Various optimisation techniques are applied automatically.
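The pattern Databricks describes here, task-specific checks scored by an LLM judge, can be illustrated with a minimal Python sketch. It is not the Agent Bricks API; the call_llm helper, the judging rubric, and the model names are hypothetical placeholders.

```python
# Minimal sketch of an LLM-as-judge evaluation loop (hypothetical helpers,
# not the Agent Bricks API). call_llm stands in for any chat-completion client.
from dataclasses import dataclass

@dataclass
class EvalResult:
    example_id: str
    score: float      # 1 (poor) to 5 (excellent), as graded by the judge model
    rationale: str

def call_llm(model: str, prompt: str) -> str:
    """Placeholder for a real chat-completion call; returns a canned verdict here."""
    return "SCORE: 4\nAnswer is grounded in the reference notes."

JUDGE_PROMPT = (
    "You are grading an AI agent's answer.\n"
    "Question: {question}\nAgent answer: {answer}\nReference notes: {reference}\n"
    "Reply with a line 'SCORE: <1-5>' followed by a short rationale."
)

def judge_outputs(examples, agent_fn, judge_model="judge-llm"):
    """Run the agent on each example and have a judge model grade the output."""
    results = []
    for ex in examples:
        answer = agent_fn(ex["question"])
        verdict = call_llm(judge_model, JUDGE_PROMPT.format(
            question=ex["question"], answer=answer, reference=ex.get("reference", "")))
        score_line = next(line for line in verdict.splitlines() if line.startswith("SCORE:"))
        results.append(EvalResult(ex["id"], float(score_line.split(":", 1)[1]), verdict))
    return results

# Example: evaluate a trivial agent on one synthetic question.
if __name__ == "__main__":
    demo = [{"id": "q1", "question": "What is the return window?", "reference": "30 days"}]
    print(judge_outputs(demo, agent_fn=lambda q: "Returns are accepted within 30 days."))
```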
At the conclusion of this process, users select the iteration of the AI agent that reflects their chosen balance of quality and operational cost. Agent Bricks is designed to produce a domain-specific agent ready for use in business environments.
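In practice, that quality and cost trade-off can be read as choosing the cheapest candidate configuration that still clears a quality floor. The sketch below assumes hypothetical candidate names, scores, and per-request costs.

```python
# Hypothetical candidates from an optimisation sweep: evaluated quality score
# plus an estimated serving cost per 1,000 requests (illustrative numbers only).
candidates = [
    {"name": "large-model-few-shot", "quality": 0.92, "cost_per_1k": 4.10},
    {"name": "medium-model-tuned",   "quality": 0.90, "cost_per_1k": 1.30},
    {"name": "small-model-tuned",    "quality": 0.83, "cost_per_1k": 0.40},
]

def pick_candidate(candidates, min_quality=0.88):
    """Return the cheapest configuration that still clears the quality floor."""
    eligible = [c for c in candidates if c["quality"] >= min_quality]
    return min(eligible, key=lambda c: c["cost_per_1k"]) if eligible else None

print(pick_candidate(candidates)["name"])  # medium-model-tuned
```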
Industry use cases
The company outlined a series of use cases for Agent Bricks across different industries. In information extraction, the agent can process documents such as emails and PDFs, converting them into structured fields for easier analysis. For retail businesses, this means the ability to automate extraction of product information from supplier documents, regardless of formatting.
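Schema-guided extraction of this kind is commonly implemented by asking a model to return JSON against a fixed field list and validating the result. The sketch below is illustrative only; the field names and the call_llm helper are assumptions, not part of Agent Bricks.

```python
# Sketch of schema-guided extraction from a supplier document (illustrative only).
import json

PRODUCT_SCHEMA = {
    "product_name": "string",
    "sku": "string",
    "unit_price": "number",
    "currency": "string",
    "pack_size": "integer",
}

def build_prompt(document_text: str) -> str:
    """Ask the model to reply with JSON matching PRODUCT_SCHEMA."""
    return (
        "Extract the following fields from the supplier document and reply with JSON only.\n"
        "Fields: " + json.dumps(PRODUCT_SCHEMA) + "\n\nDocument:\n" + document_text
    )

def extract_product_fields(document_text: str, call_llm) -> dict:
    """call_llm is any function that takes a prompt string and returns model text."""
    record = json.loads(call_llm(build_prompt(document_text)))
    missing = set(PRODUCT_SCHEMA) - set(record)
    if missing:
        raise ValueError(f"Extraction is missing fields: {missing}")
    return record
```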
Other examples include knowledge assistant agents in manufacturing, which help technicians quickly find answers in technical manuals, and multi-agent orchestration in financial services to manage tasks like intent detection and compliance checks. Marketing teams can also use custom language model agents to generate content aligned with their brand's standards.
Addressing evaluation and scalability
Databricks states that Agent Bricks is a response to key barriers in deploying production-ready AI agents, especially the challenges of objectively evaluating new models for quality and cost—tasks that traditionally require manual processes and significant expertise. The automation of evaluation and data generation aims to make it possible to scale AI agent deployment without reskilling or expanding teams. "Agent Bricks is a whole new way of building and deploying AI agents that can reason on your data," said Ali Ghodsi, CEO and Co-founder of Databricks. "For the first time, businesses can go from idea to production-grade AI on their own data with speed and confidence, with control over quality and cost tradeoffs. No manual tuning, no guesswork and all the security and governance Databricks has to offer. It's the breakthrough that finally makes enterprise AI agents both practical and powerful."
Customer feedback
Several Databricks customers have provided early feedback. Joseph Roemer, Head of Data & AI, Commercial IT, AstraZeneca, said, "With Agent Bricks, our teams were able to parse through more than 400,000 clinical trial documents and extract structured data points — without writing a single line of code. In just under 60 minutes, we had a working agent that can transform complex unstructured data usable for Analytics."
Chris Nishnick, Director of AI, Lippert, commented, "With Agent Bricks, we can quickly productionise domain-specific AI agents for tasks like extracting insights from customer support calls—something that used to take weeks of manual review. It's accelerated our AI capabilities across the enterprise, guiding us through quality improvements in the grounding loop and identifying lower-cost options that perform just as well."
Roman Bugaev, CTO, Flo Health, added, "Agent Bricks enabled us to double our medical accuracy over standard commercial LLMs, while meeting Flo Health's high internal standards for clinical accuracy, safety, privacy, and security. By leveraging Flo's specialised health expertise and data, Agent Bricks uses synthetic data generation and custom evaluation techniques to deliver higher-quality results at a significantly lower cost. This enables us to scale personalised AI health support efficiently and safely, uniquely positioning Flo to advance women's health for hundreds of millions of users."
Ryan Jockers, Assistant Director of Reporting and Analytics at the North Dakota University System, said, "Agent Bricks allowed us to build a cost-effective agent we could trust in production. With custom-tailored evaluation, we confidently developed an information extraction agent that parsed unstructured legislative calendars—saving 30 days of manual trial-and-error optimisation."
Joel Wasson, Manager Enterprise Data & Analytics, Hawaiian Electric, noted, "With over 40,000 complex legal documents, we needed high precision from our internal 'Regulatory Chat Tool'. Agent Bricks significantly outperformed our original open-source implementation (built on LangChain) in both LLM-as-judge and human evaluation accuracy metrics."
Further AI platform releases
The launch of Agent Bricks is accompanied by additional tools. Databricks now provides serverless GPU support, which is intended to allow customers to fine-tune models or run deep learning workloads without having to manage underlying hardware. This provides on-demand and scalable access to computing resources for AI development and deployment.
Databricks has also released MLflow 3.0, the newest version of its open-source AI development framework. MLflow 3.0 is intended to help teams monitor, trace, and optimise AI agents across different environments, with integrated support for prompt management and evaluation. MLflow provides compatibility with existing data lakehouse architectures and continues to see substantial monthly usage figures.
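As a simple illustration of that workflow, MLflow's tracking API can record an agent evaluation run; the experiment name, agent function, and metric values below are placeholders, and the tracing decorator comes from recent MLflow releases rather than being specific to 3.0.

```python
# Sketch: logging an agent evaluation run with MLflow's tracking and tracing APIs.
# The agent function and accuracy figure are placeholders for illustration.
import mlflow

mlflow.set_experiment("agent-eval-demo")

@mlflow.trace  # captures each call as a trace in recent MLflow versions
def run_agent(question: str) -> str:
    return "placeholder answer to: " + question

with mlflow.start_run(run_name="candidate-medium-model"):
    mlflow.log_param("base_model", "medium-model-tuned")
    questions = ["What is the warranty period?", "Which parts are user-serviceable?"]
    answers = [run_agent(q) for q in questions]
    judged_accuracy = 0.90  # in practice, from an LLM judge or human review
    mlflow.log_metric("judged_accuracy", judged_accuracy)
```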

Related Articles


Techday NZ
30-06-2025
AI agents revolutionise business & cybersecurity with automation
The integration of artificial intelligence (AI) agents into business and cybersecurity operations is rapidly shifting from experimental trials to real-world deployment, as organisations across sectors seek to harness the power of automation, efficiency, and data-driven decision-making. Recent commentary from senior technology leaders underscores both the gains and the challenges facing enterprises as they embed AI agents into core processes.

In the realm of cybersecurity, the accelerating sophistication of threats driven by AI is a growing concern. Fabio Fratucello, Field Chief Technology Officer World Wide for CrowdStrike, highlights that malicious actors are increasingly using AI, including large language models (LLMs), to launch more convincing phishing scams, business email compromise, and social engineering attacks. "AI is lowering the barrier to entry for adversaries, allowing them to automate social engineering, misinformation campaigns, and credential harvesting at unprecedented speed and scale," Fratucello notes, citing findings from CrowdStrike's 2025 Global Threat Report.

This surge in AI-powered attacks is amplifying the pressure on security teams, already hampered by overwhelming alert volumes and a shortage of skilled analysts. According to Fratucello, the key to regaining the upper hand lies in deploying AI-enabled tools that automate detection and triage of threats. CrowdStrike's Charlotte AI, for instance, uses machine learning to validate and prioritise security alerts with more than 98% accuracy. By automating repetitive tasks, Charlotte AI reportedly saves security teams up to 40 hours per week, enabling them to concentrate on proactively hunting advanced threats and halting breaches before escalation.

Importantly, Charlotte AI operates within a "bounded autonomy framework," allowing organisations to set specific parameters for automated decision-making. Analysts retain control by defining thresholds and determining when human review is required, ensuring that the balance between automation and human oversight is maintained. Fratucello stresses, "This combination of machine speed and human-defined guardrails returns the AI advantage to defenders and ensures organisations can operate at the speed of threats."

Elsewhere in the enterprise technology space, companies are contending with the challenge of bringing AI agents out of the prototype lab and into large-scale, reliable production. Nick Eayrs, Vice President, Field Engineering, APJ, at Databricks, observes that while enthusiasm for AI agents is high, moving beyond proof-of-concept still eludes many organisations. "The challenge is not just building them – it is making them scalable, efficient and reliable enough to stand up in the real world," Eayrs explains. Too often, businesses rely on trial-and-error approaches and are forced into trade-offs between operating costs and performance, which undermines the requirements of enterprise-grade use cases.

In response, Databricks has launched Agent Bricks, a system designed to streamline AI agent creation using real enterprise data. Agent Bricks deploys automated evaluation benches, generates synthetic data, and leverages custom judges to test agent quality and efficiency. This reduces the manual, repetitive work typically involved in agent development, freeing AI engineers to focus on higher-value tasks.
Eayrs says that common industry uses of such AI agents already include structured information extraction, knowledge assistance, customised data transformation, and multi-agent orchestration. "Ultimately, it is about building production-grade AI agents that are useful, reliable and ready for deployment — that is what we are focused on," he adds.

The potential benefits of AI agents are also being demonstrated at the customer coalface. At the property services company hipages Group, AI agents have transformed backend operations. Jeremy Burton, Chief Technology Officer, reports that the introduction of Agentforce has reduced lead response time by 60% and automated verification processes, dramatically speeding up license and registration checks for tradespeople. "Agentforce and Data Cloud are now a core part of our ecosystem, helping us connect better with our customers and tradies," he states. By bringing disparate data and customer touchpoints onto a unified platform, hipages has streamlined work for staff and improved satisfaction levels for both customers and end users.

Hospitality firm Urban Rest has leveraged Agentforce to scale up its service operations in step with rapid growth. Jeff Baars, Chief Commercial Officer, explains that Agentforce's centralised knowledge base provides round-the-clock self-serve support for guests, enabling the business to operate without on-site staff across 800 properties in four countries. This has increased productivity by 50% for the operations team, while anticipated field service visits have dropped by up to 40%. Baars notes, "Agentforce empowers our guests with instant, self-serve access to everything they need to know about their apartment, freeing up our guest relations team to focus on support that requires a human touch."

At Kudosity, an AI agent named 'Emily' now handles up to 40% of general customer support queries, ensuring prompt responses and reducing the workload for support staff. When a query is too complex, Emily refers it to a human agent, preserving personalised service where it matters most.

The evolving landscape of AI agents in the real world demonstrates the transformative potential of the technology, balanced by the need for systematic deployment, human oversight, and ongoing adaptation to rapid changes in both business and cybersecurity domains. As enterprise adoption accelerates, the focus is squarely on building reliable, scalable, and human-aligned AI agents that deliver tangible benefits across sectors.


Techday NZ
12-06-2025
Fivetran awarded Databricks 2025 data integration partner of year
Fivetran has been named the 2025 Databricks Data Integration Partner of the Year. The award recognises the collaborative efforts between Fivetran and Databricks to provide data foundations for analytics and artificial intelligence to enterprise customers. The acknowledgement comes in light of a 40 percent year-over-year increase in the number of joint customers using Fivetran and Databricks to manage and analyse data.

Fivetran offers solutions that allow organisations to centralise data from a wide array of sources, such as SaaS applications, databases, files, and event streams, into the Databricks Data Intelligence Platform. By automating the process of moving data and streamlining pipeline management, Fivetran aims to lessen the engineering resources required by its clients while ensuring more reliable and faster access to data.

Growth and integration
The past year has seen the partnership between Fivetran and Databricks expand further, with the introduction of advanced integrations into Unity Catalog and Delta Lake. These integrations assist customers in maintaining governance requirements while making use of both structured and unstructured data. As more organisations look to refine their data operations, the combined capabilities of Fivetran and Databricks are cited as helping to reduce operational overhead, enhance performance, and expedite the transformation of raw data into actionable insights.

"Databricks continues to be a strategic partner as more companies invest in modern data infrastructure. This recognition speaks to the value we are delivering together for customers who need reliable, secure data pipelines to support production-grade AI and analytics. We are proud to help build the foundation for what comes next," said Logan Welley, Vice President of Alliances at Fivetran, underscoring the role of the partnership in supporting enterprise clients adopting artificial intelligence and analytics-driven solutions.

Launch partner initiatives
Fivetran has also been announced as a launch partner for Databricks Managed Iceberg Tables. This new feature is designed to provide customers with access to open and high-performance data formats optimised for large scale analytics and artificial intelligence purposes. Through its integration with Unity Catalog, Fivetran seeks to offer enterprises a consistent approach to data governance and efficient data accessibility as they scale their workloads and expand use cases for analytics and AI.

The solution is currently employed by a range of organisations across different industries. National Australia Bank, for example, uses Fivetran's Hybrid Deployment model to operate data pipelines within its own cloud infrastructure while utilising Databricks for processing and analytics. This structure allows the bank to adhere to stringent compliance requirements, whilst modernising its infrastructure and accelerating its artificial intelligence adoption efforts. Other companies, including OpenAI, Pfizer, and Dropbox, use Fivetran to facilitate data transfer into Databricks to support a variety of applications, from real-time analytics to machine learning in production settings. The goal for these organisations is to improve operational speed and inform decision-making processes.

Partner perspectives
"As enterprise demand for data intelligence grows, Fivetran has been an important partner for us in helping organisations move faster with data. Their focus on automation, scale, and governance aligns with what our customers need as they bring more data-driven AI applications from production to market," said Roger Murff, Vice President of Technology Partners at Databricks, highlighting the significance of the partnership in meeting evolving customer needs in the data intelligence sector.

Fivetran reports that its automated pipelines, security measures, and managed experience are intended to support compliance and facilitate AI-focused data infrastructure modernisation for its enterprise clients.


Techday NZ
12-06-2025
CData launches accelerator to simplify Databricks integration
CData Software has introduced a new integration accelerator designed to simplify and speed up enterprise data integration for organisations utilising Databricks environments. The CData Databricks Integration Accelerator aims to eliminate traditional data pipeline bottlenecks and shorten integration timelines, enabling companies to make more efficient use of their Databricks Lakehouse investments.

The Databricks Lakehouse Platform combines data warehousing and artificial intelligence in a unified system for real-time analytics and machine learning. However, integrating data from varied systems can be problematic for enterprises, particularly due to fragmented legacy ETL tools and hybrid environments spanning on-premises and multi-cloud infrastructure. The new accelerator addresses these issues by providing a no-code framework for building scalable ingestion pipelines. It is also designed to simplify transformation tasks and support compliance with data governance frameworks such as Unity Catalog.

"Integrating enterprise data into Databricks often requires extensive custom code and manual configuration, which can introduce delays and increase maintenance overhead," said Manish Patel, Chief Product Officer at CData. "Our Integration Accelerator eliminates those inefficiencies by providing prebuilt connectors, automated pipeline orchestration, and real-time data availability, enabling teams to operationalise data faster and focus on driving value through analytics and AI."

The CData Databricks Integration Accelerator is built around four toolkits, each designed to address specific integration challenges.

Delta Lake integration
The Delta Lake Integration Toolkit allows organisations to ingest data using a no-code approach and supports Change Data Capture (CDC) from over 270 sources. The toolkit provides live data access to business systems, including sales, marketing and finance, using Databricks Lakehouse Federation, while supporting governance through Unity Catalog.

Delta Live Tables
The Delta Live Tables (DLT) Extension Toolkit expands connectivity to a broader range of business applications, creating a unified SQL data model via Databricks Spark for straightforward data integration. This toolkit also offers authentication and pagination support for any application programming interface (API), as well as server-side pushdown to improve speed and efficiency.

Databricks-Microsoft connectivity
With the Databricks-Microsoft Connectivity Toolkit, users can establish standards-based connections between Databricks and the Microsoft software ecosystem. This supports direct integration with products such as SSAS, SSRS and SSIS, maintaining live data connections and Unity Catalog compatibility.

Agentic data pipelines
The Agentic Data Pipelines Toolkit focuses on automating data ingestion for agentic workloads, offering programmatic access to serverless PostgreSQL compatible with Databricks serverless deployments. The toolkit enables the instant availability of enterprise data for use by AI agents, provides programmatic orchestration of data pipelines, and processes real-time data from any source using Change Data Capture (CDC).

"Everyone needs a good way to load data into their data lake," said Eric Newcomer, CTO and Principal Analyst, Intellyx. "CData has been developing and delivering a broad and deep set of data connectors for more than a decade. Now, Databricks users can leverage not only the proven CData suite of connectors, but also the integration toolkits built on top of them, including SQL data models, Microsoft-specific connections, and MCP servers for AI agent automations."

Customer deployment and outcomes
NJM Insurance, a provider of personal and commercial insurance policies, has deployed CData's Databricks Integration Accelerator to overhaul its marketing analytics operations. The company reported a 90% reduction in integration build time and saved 66% in project costs by replacing traditional, code-heavy ETL processes with CData's no-code solution. This allowed for quicker insight generation on customer acquisition and lifetime value. "With CData, we were able to ingest our marketing data into Databricks 10 times faster, allowing us to make data-driven decisions more quickly," said Ameya Narvekar, Data Insights Supervisor at NJM Insurance.

The CData Databricks Integration Accelerator is now available for organisations seeking to reduce the time and complexity associated with enterprise data integration projects.