
Endava to accelerate Google Agentspace enterprise AI adoption
Agentic AI refers to autonomous, goal-oriented systems capable of making decisions, learning, and collaborating across varied environments. It is seeing greater uptake in business settings as organisations seek more advanced ways to deploy artificial intelligence.
Google Agentspace is a platform designed to embed intelligent, proactive agents within enterprise workflows. The platform's core offering is a multimodal search agent that functions as a conversational interface to facilitate access to organisational knowledge.
Agentspace integrates data from structured and unstructured sources, including Google Drive, Jira, and SharePoint. This gives employees a single destination for up-to-date information, reducing time spent on system navigation and supporting quicker decision-making through proactive AI suggestions.
"As a partner to Google Cloud, Endava plays a critical role in helping organisations realise the full potential of Agentspace. With deep experience in cloud-native engineering, enterprise integration and AI solution design, we support clients in deploying intelligent agents that align with real business needs and ensure a smooth, scalable implementation journey," Andrew Rossiter, Global Senior Vice President of Google Cloud at Endava, said.
Agentspace allows organisations to implement domain-specific expert agents. These AI-powered assistants carry out multi-step tasks, provide analysis, and generate content across operational domains such as marketing, legal, finance, and engineering. The platform is intended to foster collaboration between human teams and AI, enabling the completion of complex tasks with greater efficiency and accuracy.
Enterprise productivity may be improved as these agents can automatically summarise documents, track project milestones, synthesise data-driven insights, and coordinate workflows within a secure company environment.
To support the adoption of agentic AI, Endava is making a suite of Agentspace accelerators available. These include pre-built templates, integration frameworks, and reusable software components that aim to simplify and speed up the deployment process, removing the need for businesses to build from the ground up.
The company serves clients across multiple industries, including financial services, technology, healthcare, media, retail, and more. Teams are based in Europe, the Americas, Asia Pacific, and the Middle East. As of late 2024, Endava reported over 11,600 employees worldwide.
Endava, a leading provider of next-generation technology services, continues to help businesses accelerate growth, address complex challenges, and succeed in dynamic markets. With a strong focus on innovation and an AI-native approach, the company partners with clients to deliver tailored solutions that support digital transformation and enhance decision-making across the enterprise.
Combining deep industry expertise with cutting-edge technologies, Endava collaborates with customers from ideation through production, offering support at every stage of the digital journey. Its comprehensive approach is designed to create a meaningful and lasting impact, regardless of industry, geography, or scale.
Endava's diverse client base includes organisations across payments, insurance, financial services, banking, technology, media, telecommunications, healthcare and life sciences, mobility, retail, consumer goods, and more.
Related Articles


Techday NZ
26-06-2025
Tray.ai unveils Merlin Agent Builder 2.0 for enterprise AI scale
Tray.ai has released Merlin Agent Builder 2.0, offering enterprise teams a platform that delivers AI agents capable of executing tasks beyond merely answering questions. The new version introduces features designed to address persistent challenges in enterprise AI agent deployment. Industry data indicates that while a majority of enterprises are investing over $500,000 per year in AI agents, many struggle to scale and derive value from these solutions. Key obstacles include incomplete data, session memory limitations, difficulties with large language model (LLM) configuration, and rigid deployment options.
Addressing deployment and adoption challenges
Rich Waldron, Co-Founder and Chief Executive Officer at Tray.ai, said: "Enterprise teams aren't short on ambition when it comes to AI agents - but they are short on results. This release clears the path from prototype to production by removing the blockers that stall adoption. We've built the only platform where enterprises can go from idea to working agent - fast - without compromising trust, flexibility or scale. That's how agent-led transformation actually happens."
According to Tray.ai, a significant gap exists between building AI agents and their actual usage in workplace settings. Agents that are not integrated with comprehensive, up-to-date knowledge often lose context, make unreliable decisions, and force users to repeat information, which undermines user trust and can lead to underutilisation. Meanwhile, IT and AI teams find it difficult to align LLMs with appropriate use cases, particularly where multiple agents operate in parallel, and face added complexity when deploying agents across various platforms.
To address these issues, the upgraded solution includes advancements in four key areas: integration of smart data sources for rapid knowledge preparation, built-in memory for maintaining context across sessions, multi-LLM support, and streamlined omnichannel deployment.
Smart data sources and session memory
Merlin Agent Builder 2.0 offers a new smart data sources feature aimed at simplifying the connection and synchronisation of both structured and unstructured enterprise knowledge. Through a single interface, users can link data from sources such as file uploads or Google Drive. This data is then automatically prepared and vectorised so that agents are informed with relevant, reliable information.
Alistair Russell, Co-Founder and Chief Technology Officer of Tray.ai, commented: "Merlin Agent Builder isn't a services wrapper. It's a fundamental part of our product and built for ease of use and scale. It handles chunking and embedding at the source, ensuring each data source is optimally segmented and vectorized so agents are grounded in high-signal, relevant context. That means fewer retrieval failures, more reliable decisions, and agents that reason and take action. It's how teams move fast - without trade-offs."
Addressing another common shortcoming of AI agents - context loss between interactions - Merlin Agent Builder 2.0 incorporates built-in memory capabilities. The platform enables agents to recall previous sessions, track conversation history, and manage both short-term and long-term memory requirements automatically. This reduces the need for custom solutions and enhances continuity in user exchanges, improving adoption rates.
Flexible large language model support
As organisations deploy multiple agents to handle diverse business processes, the ability to configure each agent with the most suitable LLM becomes increasingly important. Merlin Agent Builder 2.0 supports multiple LLM providers, including OpenAI, Gemini, Bedrock, and Azure. Teams can assign specific models to individual agents with tailored configurations, avoiding proprietary lock-in and supporting privacy-driven workflows where necessary.
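The "chunking and embedding at the source" that Russell describes is a standard pattern in retrieval-augmented systems: documents are split into overlapping segments, each segment is converted to a vector, and agents retrieve the most relevant segments at query time. As an illustration only (this is not Tray.ai's actual implementation or API, and the embedding below is a deterministic stand-in for a real model), the pattern might be sketched as:

```python
import hashlib


def chunk_text(text: str, chunk_size: int = 200, overlap: int = 50) -> list[str]:
    """Split text into overlapping character chunks ready for embedding.

    Overlap preserves context that would otherwise be cut at chunk boundaries.
    """
    if overlap >= chunk_size:
        raise ValueError("overlap must be smaller than chunk_size")
    step = chunk_size - overlap
    chunks = []
    for start in range(0, len(text), step):
        chunk = text[start:start + chunk_size]
        if chunk:
            chunks.append(chunk)
    return chunks


def embed(chunk: str, dim: int = 8) -> list[float]:
    """Stand-in for a real embedding model: hash bytes into a fixed-length vector.

    A production system would call an embedding model here instead.
    """
    digest = hashlib.sha256(chunk.encode()).digest()
    return [b / 255.0 for b in digest[:dim]]


# Each chunk is vectorised once at ingestion time, then stored for retrieval.
document = "a" * 500
vectors = [embed(c) for c in chunk_text(document)]
```

The overlap parameter is the key tuning knob in this pattern: too little and answers that span chunk boundaries are lost; too much and storage and retrieval costs grow without adding signal.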
Unified deployment across channels
The updated release allows teams to build an agent once and deploy it across communication and application environments such as Slack, web applications, and APIs, or for autonomous operations. The delivery configuration is incorporated directly into the agent setup process, eliminating the need for repeated setup and technical adjustments for each channel.
With these updates, Tray.ai targets what it identifies as critical needs for enterprises: simplified data onboarding, session-aware agents, flexible modelling, and consistent deployment experiences. The company states that by providing these features in a unified platform, both IT and business teams are better positioned to move from pilot projects to production-ready AI agents that are actively used by employees and customers alike.
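The two configuration ideas described above, assigning a specific LLM to each agent and declaring delivery channels at setup time, can be pictured as a small agent registry. This is a hypothetical sketch for illustration, not Merlin Agent Builder's real configuration format; all names and fields below are invented:

```python
from dataclasses import dataclass, field


@dataclass
class AgentConfig:
    """Illustrative per-agent configuration: one model, many channels."""
    name: str
    llm_provider: str                  # e.g. "openai", "gemini", "bedrock", "azure"
    model: str                         # provider-specific model identifier
    channels: list[str] = field(default_factory=list)  # e.g. "slack", "web", "api"


# Hypothetical registry: each agent is bound to its own provider/model pair,
# and its delivery channels are part of the same definition.
AGENTS = [
    AgentConfig("support-bot", "openai", "gpt-4o", ["slack", "web"]),
    AgentConfig("finance-analyst", "bedrock", "claude-3", ["api"]),
]


def agents_for_channel(channel: str) -> list[str]:
    """Return the names of agents deployed to a given channel."""
    return [a.name for a in AGENTS if channel in a.channels]
```

Keeping model choice and channel routing in one declaration is what lets an agent be defined once and surfaced everywhere, rather than being re-implemented per channel.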


Techday NZ
24-06-2025
Wild Tech hires Andrew Kirk to lead enterprise cloud growth
Wild Tech has appointed Andrew Kirk as Senior Business Development Manager to drive the company's expansion in enterprise-grade managed services and digital transformation partnerships.
Kirk brings extensive experience from previous senior roles at Telstra and IBM, along with comprehensive familiarity with the Amazon, Microsoft, and Google Cloud platforms. Wild Tech aims to leverage Kirk's expertise to assist organisations seeking to modernise operations and build robust, cloud-first environments.
Dan Whittle, General Manager of Managed Services at Wild Tech, stated, "Andrew's background working with Tier 1 enterprises makes him an exceptional fit for our next phase of growth. He has walked in the shoes of large, complex organisations and knows what it takes to implement scalable, compliant solutions that deliver real outcomes. His insight will be pivotal as we help clients transition from project-based deployments to ongoing service-led transformation."
During his tenure at Telstra, Kirk held profit and loss responsibility for Cloud Services and led the introduction of Microsoft, Amazon, and Cisco cloud offerings across Australia and the broader APAC region. Early in his career, he was involved in developing managed desktop services at Advantra, a joint venture between IBM, Lend Lease, and Telstra. More recently, Kirk helped establish Searce's Australian operations, focusing on Google Cloud and AWS solutions for the retail and mining sectors.
In joining Wild Tech, Kirk steps into a role centred on expanding the company's influence across the enterprise and upper mid-market sector, concentrating on government, financial services, and retail. He will facilitate the alignment of long-term managed services with cloud, AI, and data solutions.
"The appetite for transformation is strong—but the real challenge is productivity," Kirk said. "Wild Tech gets this. They're not just delivering tech projects, they're embedding long-term capability and service models that evolve with the client and drive the bottom line. That's exactly where I want to be."
Kirk's recruitment supports Wild Tech's strategy of linking technology delivery with operational excellence through a managed services approach tailored to enterprise requirements. Wild Tech states that its approach to transformation is rooted in a comprehensive understanding of specific industry demands. The company stresses the importance of listening to clients to stay ahead of evolving requirements, and of considering how end-to-end business processes and organisational maturity interact with each technology platform's capabilities.
The company continues to position itself as an Australian-owned and operated entity serving clients across APAC, with a focus on building the next generation of digital operating models through partnerships and established market platforms.


Techday NZ
18-06-2025
Landis+Gyr halves costs by moving Oracle workloads to Google Cloud
Landis+Gyr has optimised its total cost of ownership and operational scalability by migrating critical Oracle workloads to Google Cloud Platform using Tessell's Database-as-a-Service solution.
Operating in over 30 countries, Landis+Gyr manages millions of smart meters for utility customers, aiming to support grid performance and energy efficiency. As global energy demand and the need for real-time energy grid intelligence increase, the company identified a requirement to shift from legacy on-premises systems to a more scalable, cloud-native infrastructure.
The migration involved moving complex Oracle workloads - most notably the Oracle Head End System (HES) and Meter Data Management (MDM) applications - previously running on a Windows platform. The legacy environment carried high licensing expenses, performance limitations, and restricted scalability, prompting the need for change.
Tessell and GCP
Landis+Gyr collaborated with Tessell to execute a cross-platform migration from Windows to Linux in conjunction with adopting Google Cloud's infrastructure. Tessell's platform enabled a transition that the company states achieved real-time data ingestion with sub-second latency, over 99.99% application availability, a 50% reduction in infrastructure costs, and a 60% increase in labour efficiency for database administrators. Compliance with data residency requirements across global regions was also highlighted.
"Tessell's ability to execute complex Oracle migrations with precision allowed us to unlock significant operational and financial value," said Martti Kontula, Head of OT & Data at Landis+Gyr. "Our smart metering applications now run with greater agility, enabling us to deliver better insights and services to our customers while setting the foundation for long-term growth."
The project included a proof-of-concept phase on Google Cloud, which demonstrated that moving to a Linux-based system met the company's performance benchmarks, including the demands of real-time smart meter data ingestion and the required levels of system uptime and throughput at scale.
Operational impact
Landis+Gyr reports several outcomes from the migration. Scalability has been enhanced through Google Cloud's elastic infrastructure, allowing the ingestion and processing of data from millions of deployed smart meters while maintaining responsiveness during peak usage. The company has also seen a reduction in licensing and support costs from the shift from Windows to Linux, along with lower maintenance overheads. Automation of essential operations such as patching, updates, and lifecycle management has enabled internal personnel to dedicate more time to innovation and analytics. Landis+Gyr stated that it is on schedule to retire its legacy data centres and is adopting a cloud-first approach across its operations.
Enhancing resilience
Looking ahead, Landis+Gyr plans to extend its partnership with Tessell to improve high availability and disaster recovery capabilities. This includes deploying a multi-zone, multi-region high availability architecture on Google Cloud, automating cross-region disaster recovery with minimal data loss, and engaging in business continuity planning in line with industry standards.
"With Tessell's robust cloud platform and GCP's global scale, Landis+Gyr is well-positioned to meet the rising demands of the energy sector while supporting its mission of creating a more sustainable and intelligent energy future," said Bakul Banthia, Co-Founder of Tessell.
Landis+Gyr continues to focus on improving its cloud infrastructure to respond to the evolving requirements of energy utilities and the broader market.