
Latest news with #StewartBond

Denodo Announces DeepQuery for GenAI Research Use

TECHx

08-07-2025


Denodo has announced the private preview availability of Denodo DeepQuery, a new deep research capability designed to extend the potential of generative AI (GenAI). The company also revealed support for the Model Context Protocol (MCP) in the latest version of its AI SDK.

According to Denodo, DeepQuery moves beyond fact retrieval and enables GenAI to investigate, synthesize, and explain its reasoning. It has been developed to address complex, open-ended business questions by leveraging live, governed enterprise data from various systems, departments, and formats. Unlike traditional GenAI tools that rephrase existing content, DeepQuery is built to analyze complex queries, searching across multiple systems and sources to deliver structured, explainable responses based on real-time information. The solution also integrates external sources, including publicly available data, trading partner data, and external applications, to enrich internal data.

Denodo reported that DeepQuery will enable users to ask business-critical questions such as 'Why did fund outflows spike last quarter?' and 'What's driving changes in customer retention across regions?' The tool is designed to reduce the time analysts spend compiling reports or data exports; instead, DeepQuery connects to governed data across systems and applies advanced reasoning to deliver insights within minutes.

Denodo stated that DeepQuery will be packaged with the Denodo AI SDK, which includes pre-built APIs to support AI application development. Developers and AI teams can build, test, and integrate deep research functions into their own agents, copilots, or domain-specific applications. The company also announced that DeepQuery is available through Denodo's AI Accelerator Program, under which selected organizations will receive early access and an opportunity to provide feedback that helps shape the future of the product.

Denodo further revealed support for the Model Context Protocol (MCP). The latest Denodo AI SDK includes an MCP Server implementation, which allows AI agents and apps built on the SDK to integrate with any MCP-compliant client. The integration supports a trusted data foundation for agentic AI ecosystems that rely on open standards.

Denodo's CEO and Founder, Angel Viña, stated that DeepQuery enables GenAI to connect with governed, real-time data across distributed systems. He emphasized that this approach allows AI to reason, explain, and act with better context.

Industry analysts also weighed in. Stewart Bond, Research VP at IDC, commented that DeepQuery supports explainable AI through deep research advancements. Andrew Brust of GigaOm noted that DeepQuery enables AI to shift from basic retrieval to more complex reasoning over real-time, governed data. Nagaraj Sastry, Senior VP at Encora, added that Denodo DeepQuery allows clients to move from general AI queries to intelligent analysis, helping to accelerate decision-making and AI adoption.

Denodo stated that DeepQuery is still in private preview and will be generally available soon. The company invited interested organizations to participate in the AI Accelerator Program.
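The announcement describes the AI SDK's pre-built APIs only at a high level, so the following is a minimal sketch of how an application might submit a DeepQuery-style question over HTTP. The host, the /deepquery route, and the request and response fields are assumptions made for illustration, not documented Denodo endpoints.

```python
import requests

# Hypothetical endpoint and payload: the announcement says DeepQuery ships with
# the Denodo AI SDK's pre-built APIs, but the concrete route and fields below
# are assumptions for illustration only.
AI_SDK_URL = "https://ai-sdk.example.internal"  # placeholder host


def ask_deepquery(question: str, timeout: int = 300) -> dict:
    """Submit an open-ended business question and return the structured answer."""
    response = requests.post(
        f"{AI_SDK_URL}/deepquery",    # assumed route
        json={"question": question},  # assumed request body
        timeout=timeout,
    )
    response.raise_for_status()
    return response.json()            # assumed to include an answer plus cited sources


if __name__ == "__main__":
    result = ask_deepquery("Why did fund outflows spike last quarter?")
    print(result)
```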

Denodo unveils DeepQuery for real-time data research

Techday NZ

07-07-2025


Denodo has launched DeepQuery, a Deep Research capability aimed at addressing the limitations of conventional generative AI tools in data analysis.

DeepQuery explained

DeepQuery operates differently from retrieval-augmented generation approaches, which typically focus on rephrasing indexed content. The new capability conducts structured reasoning across live and governed data sources, enabling users to investigate distributed datasets and pose complex "why" questions, such as analysing the causes of changes in customer retention across various regions. With DeepQuery, users receive multi-step analytical answers within seconds, a process that would previously have required a significant amount of manual analysis and cross-referencing across multiple data sets.

Addressing business needs

DeepQuery is built to handle open-ended business questions by leveraging real-time access to enterprise data from a range of formats and departmental systems. In contrast to traditional generative AI solutions, which tend to rephrase pre-existing content, DeepQuery analyses open questions and searches multiple systems and sources to provide structured, explainable answers rooted in current information. The capability will also be able to incorporate external data sources, such as publicly available information, applications outside the organisation, and data from trading partners, to complement and enhance the information already held within a business.

This functionality enables users to interrogate complex, cross-functional queries that would previously take analysts days to resolve, such as understanding the underlying reasons for quarterly changes in fund outflows. Rather than manually gathering data or generating multiple reports, DeepQuery connects to governed data in real time and applies reasoning capabilities to respond swiftly to intricate questions.

Integration and development

DeepQuery is being developed as a fully extensible component of the Denodo Platform and will be distributed as part of the Denodo AI SDK, which offers pre-built APIs intended to streamline AI application development. This integration aims to allow developers and AI teams to use, experiment with, and incorporate the DeepQuery capability into various agents, copilots, or domain-specific solutions.

"With DeepQuery, Denodo is demonstrating forward-thinking in advancing the capabilities of AI. DeepQuery, driven by deep research advances, will deliver more accurate AI responses that will also be fully explainable," said Stewart Bond, Research VP, Data Intelligence and Integration Software at IDC, commenting on the platform's potential impact compared with other solutions currently available.

While large language models, business intelligence tools, and other applications are starting to introduce deep research capabilities that draw on public web data or data-lakehouse storage, Denodo's solution is anchored in enterprise data. Data delivered and managed by DeepQuery is structured, governed, and available in real time, made possible by the logical data management inherent to the Denodo Platform and its data virtualisation framework.

The private preview phase of Denodo DeepQuery is currently underway, with Denodo inviting selected organisations to participate in its AI Accelerator Programme. Participants receive early access to DeepQuery's features and have the opportunity to provide input that will help shape its future development.
"As a Denodo partner, we're always looking for ways to provide our clients with a competitive edge. Denodo DeepQuery gives us exactly that. Its ability to leverage real-time, governed enterprise data for deep, contextualised insights sets it apart. This means we can help our customers move beyond general AI queries to truly intelligent analysis, empowering them to make faster, more informed decisions and accelerating their AI journey," said Nagaraj Sastry, Senior Vice President, Data and Analytics at Encora. Model Context Protocol support Denodo also announced support for the Model Context Protocol (MCP). With the inclusion of an MCP Server in the latest version of the Denodo AI SDK, any AI agents or applications utilising the SDK can be integrated with MCP-compliant clients. This aims to establish a reliable and standardised data foundation for organisations adopting agentic AI architectures based on open standards. "AI's true potential in the enterprise lies not just in generating responses, but in understanding the full context behind them. With DeepQuery, we're unlocking that potential by combining generative AI with real-time, governed access to the entire corporate data ecosystem, no matter where that data resides. Unlike siloed solutions tied to a single store, DeepQuery leverages enriched, unified semantics across distributed sources, allowing AI to reason, explain, and act on data with unprecedented depth and accuracy," said Angel Viña, CEO and Founder of Denodo.

Confluent Cloud boosts agentic AI with enhanced data & security

Techday NZ

30-06-2025


Confluent has introduced new features in Confluent Cloud to support the development of secure and intelligent AI agents and analytics through unified access to real-time and historical data. The new capabilities are centered around enhancing data quality, improving security, and simplifying networking for organisations deploying agentic artificial intelligence systems. The company has made these announcements as part of ongoing efforts to enable enterprises to better use their data assets for more effective decision making.

Snapshot queries

One of the key features highlighted is the availability of snapshot queries in Confluent Cloud for Apache Flink. This functionality brings together real-time and batch data processing, making it possible for artificial intelligence agents and analytics systems to work with historical and up-to-date information in a single environment. The objective is to deliver smarter and more informed business decisions by ensuring complete data context is always available.

"Agentic AI is moving from hype to enterprise adoption as organizations look to gain a competitive edge and win in today's market," said Shaun Clowes, Chief Product Officer at Confluent. "But without high-quality data, even the most advanced systems can't deliver real value. The new Confluent Cloud for Apache Flink features make it possible to blend real-time and batch data so that enterprises can trust their agentic AI to drive real change."

The integration allows teams to use one product and one programming language to manage both streaming and historical datasets, removing the need to run separate tools or manual workarounds. Through seamless Tableflow integration, snapshot queries offer teams the ability to explore and analyse data comprehensively, enabling analytics and agentic AI to be informed by both past and present trends. This feature is now in early access for users.

Expert insights

"The rise of agentic AI orchestration is expected to accelerate, and companies need to start preparing now," said Stewart Bond, Vice President of Data Intelligence and Integration Software at IDC. "To unlock agentic AI's full potential, companies should seek solutions that unify disparate data types, including structured, unstructured, real-time, and historical information, in a single environment. This allows AI to derive richer insights and drive more impactful outcomes."

Agentic AI has been increasingly adopted by businesses seeking to optimise efficiency and obtain faster insights by analysing diverse sets of data. Access to both historical and streaming information is essential for use cases such as fraud detection, where banks need to examine both current activity and previous transaction records, and healthcare, where clinicians rely on real-time patient data alongside medical histories for informed decisions. By enabling one platform to handle multiple data types seamlessly, Confluent aims to simplify workflows and reduce the operational complexity associated with managing separate data systems for AI and analytics initiatives.

Networking and security improvements

Addressing connectivity and security requirements, Confluent Cloud now includes the Confluent Cloud Network (CCN) routing feature. This improvement allows teams to repurpose existing CCNs, previously created for Apache Kafka clusters, to establish secure connections for Apache Flink workloads. CCN routing is now available on Amazon Web Services in all Flink-supported regions.
This is intended to make it easier for organisations with strict internal network controls to deploy stream processing, AI agents, and analytics securely.

Confluent has also rolled out IP Filtering for Flink to provide more precise access control in hybrid environments. Many organisations have strict policies governing public data access, and this feature enables teams to restrict internet traffic to specific, allowed IP addresses while providing better visibility into unauthorised access attempts. IP Filtering is now generally available for all Confluent Cloud users.

The company's new features aim to help organisations turn their intended use of agentic AI into concrete operational benefits, offering additional options for private networking, expanded data source connectivity, and improved governance in stream processing environments.
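Confluent's exact snapshot-query syntax is not shown in the article, so the PyFlink sketch below only illustrates the underlying idea: running the same query definition once as a bounded, point-in-time job and once as a continuously updating streaming job from a single programming interface. The `orders` table and its datagen source are placeholders so the example runs locally; they are not Confluent Cloud's documented setup.

```python
from pyflink.table import EnvironmentSettings, TableEnvironment

# One table definition shared by both modes. The datagen connector is a local
# stand-in for a Kafka-backed table that would already exist in Confluent Cloud.
DDL = """
CREATE TABLE orders (
    order_id STRING,
    amount   DOUBLE,
    ts       TIMESTAMP(3)
) WITH (
    'connector' = 'datagen',
    'number-of-rows' = '1000'
)
"""

QUERY = "SELECT COUNT(*) AS order_count, SUM(amount) AS revenue FROM orders"

# Batch mode gives a point-in-time ("snapshot"-style) answer over the data
# available right now...
batch_env = TableEnvironment.create(EnvironmentSettings.in_batch_mode())
batch_env.execute_sql(DDL)
batch_env.execute_sql(QUERY).print()

# ...while streaming mode keeps the same query continuously up to date as new
# events arrive, using the same language and table definition.
stream_env = TableEnvironment.create(EnvironmentSettings.in_streaming_mode())
stream_env.execute_sql(DDL)
stream_env.execute_sql(QUERY).print()
```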

Confluent Announces New Cloud Capabilities For Data Streaming

Channel Post MEA

04-06-2025


Confluent has announced new Confluent Cloud capabilities that make it easier to process and secure data for faster insights and decision-making. Snapshot queries, new in Confluent Cloud for Apache Flink, bring together real-time and historic data processing to make artificial intelligence (AI) agents and analytics smarter. Confluent Cloud network (CCN) routing simplifies private networking for Apache Flink, and IP Filtering adds access controls for publicly accessible Flink pipelines, securing data for agentic AI and analytics.

'Agentic AI is moving from hype to enterprise adoption as organizations look to gain a competitive edge and win in today's market,' said Shaun Clowes, Chief Product Officer at Confluent. 'But without high-quality data, even the most advanced systems can't deliver real value. The new Confluent Cloud for Apache Flink features make it possible to blend real-time and batch data so that enterprises can trust their agentic AI to drive real change.'

Bridging the Real-Time and Batch Divide

'The rise of agentic AI orchestration is expected to accelerate, and companies need to start preparing now,' said Stewart Bond, Vice President of Data Intelligence and Integration Software at IDC. 'To unlock agentic AI's full potential, companies should seek solutions that unify disparate data types, including structured, unstructured, real-time, and historical information, in a single environment. This allows AI to derive richer insights and drive more impactful outcomes.'

Agentic AI is driving widespread change in business operations by increasing efficiency and powering faster decision-making, analyzing data to uncover valuable trends and insights. However, for AI agents to make the right decisions, they need historical context about what happened in the past and insight into what's happening right now. For example, for fraud detection, banks need real-time data to react in the moment and historical data to see if a transaction fits a customer's usual patterns. Hospitals need real-time vitals alongside patient medical history to make safe, informed treatment decisions. But to leverage both past and present data, teams often have to use separate tools and develop manual workarounds, resulting in time-consuming work and broken workflows. Additionally, it's important to secure the data that's used for analytics and agentic AI; this ensures trustworthy results and prevents sensitive data from being accessed.

Snapshot Queries Unify Processing on One Platform

In Confluent Cloud, snapshot queries let teams unify historical and streaming data with a single product and language, enabling consistent, intelligent experiences for both analytics and agentic AI. With seamless Tableflow integration, teams can easily gain context from past data. Snapshot queries allow teams to explore, test, and analyze data without spinning up new workloads. This makes it easier to supply agents with context from historic and real-time data or to conduct an audit to understand key trends and patterns. Snapshot queries are now available in early access.

CCN Routing Simplifies Private Networking for Flink

Private networking is important for organizations that require an additional layer of security. Confluent offers a streamlined private networking solution by reusing existing CCNs that teams have already created for Apache Kafka clusters. Teams can use CCN to securely connect their data to any Flink workload, such as streaming pipelines, AI agents, or analytics.
CCN routing is now generally available on Amazon Web Services (AWS) in all regions where Flink is supported.

IP Filtering Protects Flink Workloads in Hybrid Environments

Many organizations that operate in hybrid environments need more control over which data can be publicly accessed. IP Filtering for Flink helps teams restrict internet traffic to allowed IPs and improves visibility into unauthorized access attempts by making them easier to track. IP Filtering is generally available for all Confluent Cloud users.

Now organizations can more easily turn the promise of agentic AI into a competitive advantage. To learn more about the other new Confluent Cloud features, including the Snowflake source connector, cross-cloud Cluster Linking, and new Schema Registry private networking features, check out the launch blog.
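The fraud-detection scenario is described only in prose; as a rough illustration, the PyFlink SQL sketch below joins a live payments stream with a table of historical per-customer baselines so that incoming transactions can be flagged when they deviate from a customer's usual pattern. The table names, schemas, and the simple 5x threshold are assumptions made for the example, not part of Confluent's announcement.

```python
from pyflink.table import EnvironmentSettings, TableEnvironment

env = TableEnvironment.create(EnvironmentSettings.in_streaming_mode())

# Live transactions (placeholder datagen source so the sketch runs locally;
# in practice this would be a Kafka-backed table in Confluent Cloud).
env.execute_sql("""
CREATE TABLE payments (
    customer_id BIGINT,
    amount      DOUBLE,
    ts          TIMESTAMP(3)
) WITH (
    'connector' = 'datagen',
    'rows-per-second' = '5',
    'fields.customer_id.min' = '1',
    'fields.customer_id.max' = '100'
)
""")

# Historical per-customer baselines (placeholder bounded source standing in
# for a snapshot of past transaction behaviour).
env.execute_sql("""
CREATE TABLE customer_baselines (
    customer_id BIGINT,
    avg_amount  DOUBLE
) WITH (
    'connector' = 'datagen',
    'number-of-rows' = '100',
    'fields.customer_id.min' = '1',
    'fields.customer_id.max' = '100'
)
""")

# Flag payments that are far above the customer's historical average.
# The 5x multiplier is an arbitrary, illustrative rule.
env.execute_sql("""
SELECT p.customer_id, p.amount, p.ts
FROM payments AS p
JOIN customer_baselines AS b
  ON p.customer_id = b.customer_id
WHERE p.amount > 5 * b.avg_amount
""").print()
```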

Confluent Unites Batch and Stream Processing for Faster, Smarter Agentic AI and Analytics - Middle East Business News and Information

Mid East Info

03-06-2025


Confluent, Inc. (Nasdaq: CFLT), the data streaming pioneer, announced new Confluent Cloud capabilities that make it easier to process and secure data for faster insights and decision-making. Snapshot queries, new in Confluent Cloud for Apache Flink®, bring together real-time and historic data processing to make artificial intelligence (AI) agents and analytics smarter. Confluent Cloud network (CCN) routing simplifies private networking for Apache Flink®, and IP Filtering adds access controls for publicly accessible Flink pipelines, securing data for agentic AI and analytics.

'Agentic AI is moving from hype to enterprise adoption as organizations look to gain a competitive edge and win in today's market,' said Shaun Clowes, Chief Product Officer at Confluent. 'But without high-quality data, even the most advanced systems can't deliver real value. The new Confluent Cloud for Apache Flink® features make it possible to blend real-time and batch data so that enterprises can trust their agentic AI to drive real change.'

Bridging the Real-Time and Batch Divide

'The rise of agentic AI orchestration is expected to accelerate, and companies need to start preparing now,' said Stewart Bond, Vice President of Data Intelligence and Integration Software at IDC. 'To unlock agentic AI's full potential, companies should seek solutions that unify disparate data types, including structured, unstructured, real-time, and historical information, in a single environment. This allows AI to derive richer insights and drive more impactful outcomes.'

Agentic AI is driving widespread change in business operations by increasing efficiency and powering faster decision-making, analyzing data to uncover valuable trends and insights. However, for AI agents to make the right decisions, they need historical context about what happened in the past and insight into what's happening right now. For example, for fraud detection, banks need real-time data to react in the moment and historical data to see if a transaction fits a customer's usual patterns. Hospitals need real-time vitals alongside patient medical history to make safe, informed treatment decisions. But to leverage both past and present data, teams often have to use separate tools and develop manual workarounds, resulting in time-consuming work and broken workflows. Additionally, it's important to secure the data that's used for analytics and agentic AI; this ensures trustworthy results and prevents sensitive data from being accessed.

Snapshot Queries Unify Processing on One Platform

In Confluent Cloud, snapshot queries let teams unify historical and streaming data with a single product and language, enabling consistent, intelligent experiences for both analytics and agentic AI. With seamless Tableflow integration, teams can easily gain context from past data. Snapshot queries allow teams to explore, test, and analyze data without spinning up new workloads. This makes it easier to supply agents with context from historic and real-time data or to conduct an audit to understand key trends and patterns. Snapshot queries are now available in early access.

CCN Routing Simplifies Private Networking for Flink

Private networking is important for organizations that require an additional layer of security. Confluent offers a streamlined private networking solution by reusing existing CCNs that teams have already created for Apache Kafka® clusters. Teams can use CCN to securely connect their data to any Flink workload, such as streaming pipelines, AI agents, or analytics.
CCN routing is now generally available on Amazon Web Services (AWS) in all regions where Flink is supported.

IP Filtering Protects Flink Workloads in Hybrid Environments

Many organizations that operate in hybrid environments need more control over which data can be publicly accessed. IP Filtering for Flink helps teams restrict internet traffic to allowed IPs and improves visibility into unauthorized access attempts by making them easier to track. IP Filtering is generally available for all Confluent Cloud users.

Now organizations can more easily turn the promise of agentic AI into a competitive advantage. To learn more about the other new Confluent Cloud features, including the Snowflake source connector, cross-cloud Cluster Linking, and new Schema Registry private networking features, check out the launch blog.
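Both Confluent announcements reference Tableflow as the bridge between streaming topics and analytical tables. The sketch below shows how a team might read such a materialised table from Python with the open-source pyiceberg library, assuming Tableflow exposes the topic as an Apache Iceberg table through a REST catalog; the catalog URI, credentials, and table name are placeholders rather than real Confluent endpoints.

```python
from pyiceberg.catalog import load_catalog

# Placeholder connection details: the real catalog URI and credentials would
# come from the Confluent Cloud console and are not part of this announcement.
catalog = load_catalog(
    "tableflow",
    **{
        "type": "rest",
        "uri": "https://tableflow.example.confluent.cloud/iceberg/catalog",  # assumed
        "credential": "API_KEY:API_SECRET",                                   # assumed
    },
)

# Load the Iceberg table assumed to be materialised from a Kafka topic
# (namespace and table name are hypothetical).
orders = catalog.load_table("analytics.orders")

# Pull a point-in-time slice of historical data into pandas, e.g. for analysis
# or for grounding an AI agent's context alongside live streams.
df = orders.scan().to_pandas()
print(df.head())
```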
