Latest news with #DeltaLake


Techday NZ
12-06-2025
- Business
- Techday NZ
Fivetran awarded Databricks 2025 data integration partner of year
Fivetran has been named the 2025 Databricks Data Integration Partner of the Year. The award recognises the collaborative efforts between Fivetran and Databricks to provide data foundations for analytics and artificial intelligence to enterprise customers. The recognition follows a 40 percent year-over-year increase in the number of joint customers using Fivetran and Databricks to manage and analyse data.

Fivetran offers solutions that allow organisations to centralise data from a wide array of sources, such as SaaS applications, databases, files, and event streams, into the Databricks Data Intelligence Platform. By automating data movement and streamlining pipeline management, Fivetran aims to reduce the engineering resources required by its clients while ensuring more reliable and faster access to data.

Growth and integration
The past year has seen the partnership between Fivetran and Databricks expand further, with the introduction of advanced integrations into Unity Catalog and Delta Lake. These integrations help customers meet governance requirements while making use of both structured and unstructured data. As more organisations look to refine their data operations, the combined capabilities of Fivetran and Databricks are cited as helping to reduce operational overhead, enhance performance, and expedite the transformation of raw data into actionable insights.

"Databricks continues to be a strategic partner as more companies invest in modern data infrastructure. This recognition speaks to the value we are delivering together for customers who need reliable, secure data pipelines to support production-grade AI and analytics. We are proud to help build the foundation for what comes next," said Logan Welley, Vice President of Alliances at Fivetran, underscoring the role of the partnership in supporting enterprise clients adopting artificial intelligence and analytics-driven solutions.

Launch partner initiatives
Fivetran has also been announced as a launch partner for Databricks Managed Iceberg Tables. This new feature is designed to give customers access to open, high-performance data formats optimised for large-scale analytics and artificial intelligence. Through its integration with Unity Catalog, Fivetran seeks to offer enterprises a consistent approach to data governance and efficient data access as they scale their workloads and expand use cases for analytics and AI.

The solution is currently employed by a range of organisations across different industries. National Australia Bank, for example, uses Fivetran's Hybrid Deployment model to operate data pipelines within its own cloud infrastructure while using Databricks for processing and analytics. This structure allows the bank to meet stringent compliance requirements while modernising its infrastructure and accelerating its artificial intelligence adoption efforts. Other companies, including OpenAI, Pfizer, and Dropbox, use Fivetran to move data into Databricks for a variety of applications, from real-time analytics to machine learning in production settings. The goal for these organisations is to improve operational speed and inform decision-making.

Partner perspectives
"As enterprise demand for data intelligence grows, Fivetran has been an important partner for us in helping organisations move faster with data. Their focus on automation, scale, and governance aligns with what our customers need as they bring more data-driven AI applications from production to market," said Roger Murff, Vice President of Technology Partners at Databricks, highlighting the significance of the partnership in meeting evolving customer needs in the data intelligence sector.

Fivetran reports that its automated pipelines, security measures, and managed experience are intended to support compliance and facilitate AI-focused data infrastructure modernisation for its enterprise clients.


Globe and Mail
11-06-2025
- Business
- Globe and Mail
Databricks Eliminates Table Format Lock-in and Adds Capabilities for Business Users with Unity Catalog Advancements
Unity Catalog is now the most complete catalog for Apache Iceberg™ and Delta Lake, enabling open interoperability with governance across compute engines, and adds unified semantics and a rich discovery experience for business users


Cision Canada
11-06-2025
- Business
- Cision Canada
Databricks Eliminates Table Format Lock-in and Adds Capabilities for Business Users with Unity Catalog Advancements
Unity Catalog is now the most complete catalog for Apache Iceberg™ and Delta Lake, enabling open interoperability with governance across compute engines, and adds unified semantics and a rich discovery experience for business users.

SAN FRANCISCO, June 11, 2025 /CNW/ -- Data + AI Summit -- Databricks, the Data and AI company, today extends its leadership in the unified governance category with powerful new capabilities. Unity Catalog adds full support for Apache Iceberg™ tables, including native support for the Apache Iceberg REST Catalog APIs. Now, Unity Catalog is the only catalog that enables external engines to read and write, with fine-grained governance, to performance-optimized, Iceberg managed tables, eliminating lock-in and enabling seamless interoperability.

Databricks is also introducing two new enhancements that extend Unity Catalog to business users. Business metrics and KPIs are the foundation of how companies manage their business, and can now be defined as first-class data assets with Unity Catalog Metrics. In addition, data + AI discovery is enhanced for business users with a new, curated internal marketplace that surfaces the highest-value data, AI and AI/BI assets, organized by business domain. All these assets are augmented with automated data intelligence, so every team can find, trust and act on the right data.

Unity Catalog Now Eliminates the Need to Choose Between Formats
Built on open standards, Unity Catalog is designed to work across every table format and engine. Databricks is now taking that vision further with the Public Preview of full Apache Iceberg support, uniting the Apache Iceberg and Delta Lake ecosystems with a single approach to governance. The preview adds three new capabilities. First, organizations can create Apache Iceberg managed tables that any Iceberg-compatible engine can read and write through Unity Catalog's Iceberg REST Catalog API. These Iceberg managed tables benefit from the full power of Unity Catalog: best price-performance with AI-powered Predictive Optimization, and unified governance and policy enforcement both within Databricks and across external engines, including Trino, Snowflake, and Amazon EMR. Second, Unity Catalog's pioneering Lakehouse Federation capabilities enable seamless access to Iceberg tables managed in external catalogs, so those tables can be discovered and governed alongside native tables. Third, Iceberg tables get all the benefits of the Delta Sharing ecosystem, including seamless cross-organizational sharing of Iceberg tables. These capabilities eliminate format-driven data silos — no other catalog in the industry provides these capabilities.
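To make the read/write claim concrete, the minimal sketch below shows how an external engine could reach a Unity Catalog managed Iceberg table over the standard Iceberg REST Catalog protocol, using the open-source PyIceberg client. The endpoint path, token, and catalog/table names are placeholders rather than details taken from the announcement.

```python
# Minimal sketch: reading a governed Iceberg managed table from outside
# Databricks via the Iceberg REST Catalog API, using PyIceberg.
# The URI, token, warehouse, and table identifier are all placeholders.
from pyiceberg.catalog import load_catalog

catalog = load_catalog(
    "unity",
    **{
        "type": "rest",
        "uri": "https://<workspace-host>/api/2.1/unity-catalog/iceberg",  # assumed endpoint path
        "token": "<personal-access-token>",
        "warehouse": "<uc-catalog-name>",
    },
)

# Scan the table exactly as any other Iceberg REST catalog table.
table = catalog.load_table("sales.orders")
df = table.scan(row_filter="order_date >= '2025-01-01'").to_pandas()
print(df.head())

# Writes go through the same catalog; PyIceberg's append() takes a PyArrow table.
# table.append(new_rows_arrow_table)
```

Engines such as Trino or Snowflake would typically point at the same REST endpoint through their own Iceberg catalog configuration rather than a Python client.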
A Growing Disconnect Between Data Platforms and Business Users
While data platforms have advanced rapidly for technical users, teams across the business remain disconnected from the systems that power their decisions. Technical teams centre their world on tables, files, compute and code, while business users operate in BI tools and AI chatbots and focus on KPIs and business metrics in their business domains. These fundamentally different languages leave business users unsure of what data to trust, or reliant on engineers for basic questions. Without a unified foundation for business context, organizations face duplicated work, decision paralysis and a persistent gap between data and action.

A Single Source of Truth for Metrics Across the Business
To address this need, Unity Catalog Metrics brings business metric definitions, traditionally embedded within BI tools, to the data platform. This creates consistency and accuracy in how everyone in the organization understands business performance. Unlike proprietary BI semantic layers, Unity Catalog Metrics are fully addressable via SQL to ensure that everyone in the organization can have the same view of metrics, irrespective of what tool they choose. Unity Catalog Metrics is available to all customers today as a Public Preview and will be Generally Available later this summer.

A Unified Foundation for Context: From Guided Discovery to Intelligent Insights
To make trusted data truly usable for business users, Databricks is introducing new Unity Catalog capabilities that blend intuitive discovery with built-in intelligence. A new Discover experience offers a curated internal marketplace of certified data products — organized by business domains like Sales, Marketing, or Finance and enriched with documentation, ownership, tagging and usage insights. Automated, intelligent recommendations coupled with data steward curation tools ensure the highest-value assets — metrics, dashboards, tables, AI agents, Genie spaces and more — can easily be explored, understood, trusted, and accessed through a self-serve workflow, without manual approvals or engineering support. Unity Catalog Discover is now in Private Preview.

Unity Catalog also now adds intelligence across the experience, surfacing data quality signals, usage patterns, relationships across assets, and certification and deprecation status to help users quickly assess trust and relevance. With Databricks Assistant built into Unity Catalog, users can ask natural language questions and get grounded, contextual answers based on governed metrics — turning discovery into a guided journey where data is accessible, explainable, trustworthy, and ready for use.

"We created the Unified Governance category with Unity Catalog four years ago," said Matei Zaharia, Co-founder and CTO of Databricks. "With these updates to Unity Catalog, we are now offering the best catalog in the industry for Apache Iceberg and all open table formats, and the only one that allows reads and writes to managed tables from external engines, for a truly open enterprise catalog. No matter what table format our customers choose, we ensure it's accessible, optimized, and governed. And with our expanded focus on business users, we're ensuring we deliver on the promise of democratizing data + AI to every user in the enterprise."

Customer + Partner Quotes
"At Riskified, we want to store all our data in an open format and want a single catalog that can connect to all the tools we use," said Hen Ben-Hemo, Data Platform Architect at Riskified. "Unity Catalog allows us to write Iceberg tables that are fully open to any Iceberg client, unlocking the entire lakehouse ecosystem and future-proofing our architecture."

"Unity Catalog Metrics gives us a central place to define business KPIs and standardize semantics across teams, ensuring everyone works from the same trusted definitions across dashboards, SQL, and AI applications." — Richard Masters, Vice President, Data & AI, Virgin Atlantic

"Unity Catalog Metrics presents an exciting opportunity to establish consistency, trust, and control in how business metrics are defined and consumed across Zalando. It is a promising contribution to aligned, data-driven decisions across our BI dashboards, notebooks, and other tools." — Timur Yuere, Engineering Manager, Zalando

"Unity Catalog Metrics represents an exciting opportunity for Tableau customers to leverage the value of centralized governance with Databricks Unity Catalog. Through our deep integration and expanding roadmap with Databricks, we're thrilled to help remove the friction for our customers in leveraging Databricks to define their core business metrics." — Nicolas Brisoux, Sr. Director, Product Management, Tableau

"We're excited to partner with Databricks to integrate Unity Catalog Metrics into Sigma. This gives business teams direct access to trusted, standardized business metrics within their dashboards, so everyone can make decisions based on consistent definitions, without relying on data teams for every question." — Dillion Morrison, VP of Product, Sigma Computing

Availability
Databricks is introducing a Public Preview of full Apache Iceberg support in Unity Catalog. Unity Catalog Metrics is available to all customers today as a Public Preview and will be Generally Available later this summer. Unity Catalog Discover is now in Private Preview.

About Databricks
Databricks is the Data and AI company. More than 15,000 organizations worldwide — including Block, Comcast, Condé Nast, Rivian, Shell and over 60% of the Fortune 500 — rely on the Databricks Data Intelligence Platform to take control of their data and put it to work with AI. Databricks is headquartered in San Francisco, with offices around the globe, and was founded by the original creators of the lakehouse, Apache Spark™, Delta Lake, MLflow, and Unity Catalog. To learn more, follow Databricks on X, LinkedIn and Facebook.
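The release's claim that Unity Catalog Metrics are "fully addressable via SQL" can be pictured with a short sketch. The example below uses the databricks-sql-connector package against a hypothetical metric view; the hostname, warehouse path, metric and dimension names, and the MEASURE() aggregation syntax are illustrative assumptions rather than details from the announcement.

```python
# Rough sketch: querying a centrally defined metric over plain SQL from Python.
# Connection details, the metric view name, and the MEASURE() call are all
# placeholders/assumptions, not values from the press release.
from databricks import sql

with sql.connect(
    server_hostname="<workspace-host>",
    http_path="<sql-warehouse-http-path>",
    access_token="<personal-access-token>",
) as conn:
    with conn.cursor() as cur:
        cur.execute(
            """
            SELECT region, MEASURE(net_revenue) AS net_revenue
            FROM finance.metrics.revenue   -- hypothetical governed metric view
            GROUP BY region
            ORDER BY net_revenue DESC
            """
        )
        for region, revenue in cur.fetchall():
            print(region, revenue)
```

Because the definition lives in the catalog rather than in any single BI tool, a dashboard, a notebook, and an AI assistant issuing the same query would all see the same figure.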


Tahawul Tech
19-03-2025
- Business
- Tahawul Tech
'Tableflow makes it possible to build and scale the next generation of AI-driven applications.' – Shaun Clowes, Confluent
Confluent has formally announced the general availability of Tableflow, which brings real-time business context to analytical systems to make AI and next-generation applications enterprise-ready. With Tableflow, all streaming data in Confluent Cloud can be accessed in popular open table formats, unlocking limitless possibilities for advanced analytics, real-time artificial intelligence (AI), and next-generation applications. Support for Apache Iceberg™ is now generally available (GA), and, as a result of an expanded partnership with Databricks, a new early access program for Delta Lake is now open. Additionally, Tableflow now offers enhanced data storage flexibility and seamless integrations with leading catalog providers, including AWS Glue Data Catalog and Snowflake's managed service for Apache Polaris™, Snowflake Open Catalog.

'At Confluent, we're all about making your data work for you, whenever you need it and in whatever format is required,' said Shaun Clowes, Chief Product Officer at Confluent. 'With Tableflow, we're bringing our expertise of connecting operational data to the analytical world. Now, data scientists and data engineers have access to a single, real-time source of truth across the enterprise, making it possible to build and scale the next generation of AI-driven applications.'

Bridging the Data Gap for Enterprise-Ready AI
Tableflow simplifies the integration between operational data and analytical systems. It continuously updates tables used for analytics and AI with the exact same data from business applications connected to Confluent Cloud. Within Confluent, processing and governance happen as data is generated, shifting these tasks upstream to ensure that only high-quality, consistent data is used to feed data lakes and warehouses. This is a breakthrough for AI, which is only as powerful as the data that shapes it.

Today, Confluent announces significant updates to Tableflow:
● Support for Apache Iceberg is ready for production workloads. Teams can now instantly represent Apache Kafka® topics as Iceberg tables to feed any data warehouse, data lake, or analytics engine for real-time or batch processing use cases. Expensive and error-prone table maintenance tasks, such as compaction, are handled automatically by Tableflow, giving time back to data engineers to deliver more business value. It also provides a single source of truth for one of the most widely adopted open-format storage options, enabling data scientists and data engineers to scale AI innovation and next-generation applications.
● New Early Access Program for Delta Lake is now open. This open-format storage layer, pioneered by Databricks, processes more than 10 exabytes of data daily and is used alongside many popular AI engines and tools. With this integration, customers will have a consistent view of real-time data across operational and analytic applications, enabling faster, smarter AI-driven decision-making. Apply for the Tableflow Early Access Program here.
● Increase flexibility through Bring Your Own Storage. Store fresh, up-to-date Iceberg or Delta tables once and reuse them many times, with the freedom to choose a storage bucket. Customers now have full control over storage and compliance to meet their unique data ownership needs.
● Enhance data accessibility and governance with partners. Direct integrations with Amazon SageMaker Lakehouse via AWS Glue Data Catalog (GA) and Snowflake Open Catalog (GA) enable seamless catalog management for Tableflow's Iceberg tables. They also streamline access for analytical engines such as Amazon Athena, Amazon EMR, and Amazon Redshift, and leading data lake and warehouse solutions including Snowflake, Dremio, Imply, Onehouse, and Starburst.

Additionally, Confluent has strengthened enterprise adoption for Tableflow with support from global and regional system integrators, including GoodLabs Studio, Onibex, Psyncopate, and Tata Consultancy Services (TCS).
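To make the "Kafka topics as Iceberg tables" idea concrete, here is a rough sketch of querying a table Tableflow might have materialized, using DuckDB's iceberg extension. The bucket path, table location, and column names are placeholders; a real deployment would resolve the table through the configured storage bucket or a catalog integration such as AWS Glue rather than a hard-coded path.

```python
# Rough sketch: an analytics engine (DuckDB) reading an Iceberg table that a
# streaming pipeline has materialized from a Kafka topic. The S3 location and
# schema are hypothetical; object-store credentials would be configured
# separately (e.g. via DuckDB secrets or environment settings).
import duckdb

con = duckdb.connect()
con.execute("INSTALL iceberg")
con.execute("LOAD iceberg")
con.execute("INSTALL httpfs")
con.execute("LOAD httpfs")

orders = con.sql("""
    SELECT order_id, amount, event_time
    FROM iceberg_scan('s3://my-bucket/tableflow/orders')  -- hypothetical table root
    WHERE event_time >= TIMESTAMP '2025-03-01'
""").df()
print(orders.head())
```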


TECHx
17-02-2025
- Business
- TECHx
Confluent & Databricks Unite for AI-Ready Real-Time Data
Confluent, Inc. (NASDAQ:CFLT), a data streaming company, and Databricks, the Data and AI company, have announced an expanded partnership aimed at accelerating AI-driven decision-making with real-time data. The integration of Confluent's Data Streaming Platform with Databricks' Data Intelligence Platform will enable enterprises to streamline data governance and build AI applications more efficiently.

A key highlight of this collaboration is the bidirectional integration between Confluent's Tableflow and Databricks' Unity Catalog, ensuring seamless governance across operational and analytical systems. This enhancement allows businesses to access real-time, secure, and discoverable data for AI applications.

'Real-time data is the fuel for AI,' said Jay Kreps, co-founder and CEO of Confluent. 'Together with Databricks, we're ensuring businesses can harness the power of real-time data to build sophisticated AI-driven applications for their most critical use cases.'

Databricks CEO and co-founder Ali Ghodsi emphasized the importance of integrating data, AI, analytics, and governance in a unified system: 'We are excited that Confluent has embraced Unity Catalog and Delta Lake as its open governance and storage solutions of choice, and we look forward to delivering long-term value for our customers.'

With these new capabilities, businesses will benefit from a more connected data ecosystem. Tableflow's integration with Delta Lake ensures operational data is immediately available for AI tools such as Apache Spark, Trino, Polars, DuckDB, and Daft. Additionally, automatic metadata synchronization between Tableflow and Unity Catalog enhances AI application development by making operational data easily discoverable and actionable.

This expanded partnership between Confluent and Databricks represents a major step in bridging the gap between enterprise applications, analytics, and governance, empowering organizations to drive AI innovation at scale.
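As a small illustration of the Delta Lake side of this integration, the sketch below reads a Delta table, such as one Tableflow could maintain, with Polars, one of the engines named above. The table location and column names are hypothetical.

```python
# Rough sketch: consuming a Delta Lake table with Polars (requires the
# `deltalake` package). The S3 path and schema are placeholders; storage
# credentials would be supplied via storage_options or the environment.
import polars as pl

clicks = (
    pl.scan_delta("s3://my-bucket/tableflow/clickstream")  # hypothetical location
    .filter(pl.col("event_time") >= pl.datetime(2025, 2, 1))
    .group_by("page")
    .agg(pl.len().alias("views"))
    .collect()
)
print(clicks)
```

Other engines on the list would read the same table through their own Delta Lake connectors, which is the point of landing operational data in an open format.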