
Most software executives plan custom AI agents to drive change
The report, titled Navigating Agentic AI & Generative AI in Software Development: Human-Agent Collaboration is Here, was commissioned by OutSystems in partnership with CIO Dive and KPMG. Surveying 550 global software executives, it explores how artificial intelligence is affecting the software development lifecycle (SDLC) and the workplace at large.
AI changing software development
With technology budgets scrutinised and outcomes under pressure, IT leaders are increasingly turning to agentic AI to address operational hurdles such as fragmented toolsets and siloed data. Businesses are reporting that agentic AI enables them to automate key workflows, offer more personalised digital services, and innovate rapidly - all while maintaining compliance, security, and governance standards.
Woodson Martin, Chief Executive Officer of OutSystems, commented, "The software development lifecycle is undergoing a significant transformation as organizations increase AI investments to maintain their competitive edge. Blending AI with development tools enables IT leaders to manage this shift effectively and securely. In a near future, AI agents acting as highly specialized teams will continuously monitor business needs, identify opportunities, and proactively refine software solutions, allowing developers and business leaders to play a more creative role and focus on strategic priorities. This report underscores how AI advancements are reshaping traditional roles and unlocking opportunities for innovation and collaboration between humans and technology."
Survey respondents highlighted concrete results from AI adoption: more than two thirds reported increased developer productivity and higher-quality software with fewer bugs. Additionally, 62% noted improved scalability in development, while 60% cited greater efficiency in testing and quality assurance (QA).
Impacts on the workforce
The report projects that experimentation with agentic AI and its uptake over the next 24 months will drive organisational change. According to the survey, 69% of software executives expect AI to introduce new, more specialised roles - including oversight, governance, prompt engineering, agent architecture, and agent orchestration - to adapt to AI's evolving function within companies.
Furthermore, 63% of respondents said AI will require considerable upskilling or reskilling of existing teams to meet the skills needed in this new landscape.
Where AI is being used
Almost half (46%) of executives report their organisations already integrate agentic AI into workflows, with a further 28% in the piloting stage. The most anticipated area for AI agent deployment is customer support, with 49% planning to use AI agents to handle customer inquiries and support functions autonomously.
The focus on customer service exceeds other domains such as product development (38%), sales and marketing (32%), supply chain management (28%), human resources (24%), and finance and accounting (23%).
Drivers and risks associated with AI
The primary goals for AI adoption, as expressed by over half the respondents, include improving customer experience (56%), automating repetitive tasks (55%), accelerating software development (54%), and advancing broader digital transformation objectives (53%).
However, the report also identifies significant challenges: 64% of software executives cited governance, security, and compliance risks associated with widespread AI adoption, and an equal proportion expressed concerns about the transparency and reliability of AI decisions.
The proliferation of disparate AI tools has led to new issues with oversight and increasing technical debt, with 44% identifying AI sprawl as a growing risk. Addressing these burdens will be critical in ensuring AI's long-term value for business.
Building confidence in AI tools
Michael Harper, Managing Director at KPMG LLP, noted, "A lot of organizations started with pilots a year ago or even prior to that, but now they're starting to see real efficiency gains in areas like code generation and application testing. Those activities are giving organizations more confidence in using these tools and helping them to move forward."
The survey covered executives from a range of industries and geographies, including IT consultancy, manufacturing, banking, financial services, and insurance, with data collected across the United States, United Kingdom, Japan, France, Canada, Australia, India, and Germany.

Related Articles


Techday NZ
16 hours ago
Commvault unveils Clumio Backtrack to speed DynamoDB recovery
Commvault has announced the general availability of Clumio Backtrack for Amazon DynamoDB, an offering designed to improve recovery speed and resilience for cloud-native database workloads.
Amazon DynamoDB is widely used in the development of large-scale, high-performance applications, including real-time personalisation engines, eCommerce platforms, mobile applications, and AI-driven services. The database is noted for its scalability and low-latency operations, which are critical to cloud-first development teams. However, the nature of DynamoDB's architecture has made recovering data - especially granular or point-in-time recoveries - challenging for organisations operating at scale.
Current recovery processes often require teams to restore an entire table, followed by copying the impacted items back and then deleting the new table to control operational and storage costs. This process is resource-intensive and can result in significant downtime, especially for organisations with tables consisting of billions of records or data sets reaching into the terabytes.
Near-instant rollback
Clumio Backtrack for DynamoDB has been developed to address these challenges. According to Commvault, the solution enables near-instant rollback to a prior point in time for existing DynamoDB tables, with no reconfiguration required. This functionality allows recovery at the partition level, rather than necessitating a full-table restore, helping to reduce both recovery times and associated costs.
The solution is based on an "incremental forever" backup model, which is intended to enable faster, more precise recovery operations. Commvault states that this approach can lower total cost of ownership when compared to native recovery options, which typically only support full backups.
"Organisations are building their next-generation cloud and AI applications on DynamoDB," said Woon Jung, Chief Technology Officer, Cloud Native at Commvault. "Clumio Backtrack removes the friction and risk from database recovery. Now, teams can roll back or restore their data in minutes, not days, and without needing to perform complex, multi-step manual recoveries. This is another example of Commvault leading the charge in resilience and recovery."
The increasing importance of data resilience and availability is evident as organisations rely more heavily on cloud-native databases like DynamoDB. The addition of granular, point-in-time recovery is seen as a significant development in supporting agile software development and AI-driven workloads.
"Amazon DynamoDB remains a critical cloud-native NoSQL database in production, thanks to major improvements across consistency, cost and performance. Amid rising data resilience and availability imperatives, precise, point-in-time recovery for DynamoDB is a significant advancement. This granular control reduces downtime, risks and operational costs, key needs for agile development and AI-driven applications," said Archana Venkatraman, Senior Research Director, Cloud Data Management at IDC.
Recent product evolution
Clumio Backtrack for DynamoDB follows the earlier release of Clumio Backtrack for Amazon Simple Storage Service (S3). The S3 solution enables enterprises to automate the rollback of objects or specific data pieces to defined versions based on a specific point in time. Commvault notes that with the introduction of DynamoDB support, AWS customers can benefit from increased flexibility and speed in recovery operations, aimed at reducing service disruption and operational risk.
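To make the native workflow concrete, a minimal sketch of that multi-step manual recovery - restore to a new table, copy items back, delete the copy - might look like the following, using the AWS SDK for Python (boto3). The table names, restore timestamp, and copy-everything logic are hypothetical placeholders for illustration; this is the standard DynamoDB point-in-time restore path, not Commvault's product:

```python
# Illustrative sketch only: the manual, full-table point-in-time recovery
# workflow described above, using the AWS SDK for Python (boto3).
# Table names and the restore timestamp are hypothetical examples.
# Assumes point-in-time recovery (PITR) is enabled on the source table.
from datetime import datetime, timezone

import boto3

dynamodb = boto3.client("dynamodb")

SOURCE_TABLE = "orders"             # hypothetical live table
RESTORED_TABLE = "orders-restored"  # temporary table created by the restore
RESTORE_TIME = datetime(2025, 6, 1, 12, 0, tzinfo=timezone.utc)

# 1. Restore the whole table to a new table at a prior point in time.
dynamodb.restore_table_to_point_in_time(
    SourceTableName=SOURCE_TABLE,
    TargetTableName=RESTORED_TABLE,
    RestoreDateTime=RESTORE_TIME,
)
dynamodb.get_waiter("table_exists").wait(TableName=RESTORED_TABLE)

# 2. Copy the impacted items back into the live table.
#    (A real recovery would filter to just the affected items; this copies all.)
for page in dynamodb.get_paginator("scan").paginate(TableName=RESTORED_TABLE):
    for item in page["Items"]:
        dynamodb.put_item(TableName=SOURCE_TABLE, Item=item)

# 3. Delete the temporary table to control storage and operational costs.
dynamodb.delete_table(TableName=RESTORED_TABLE)
```

At the scale the article describes - billions of records and terabyte-sized tables - the scan-and-copy step is what makes this approach slow and costly, which is the gap that partition-level rollback is intended to close.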
Availability and pricing
Clumio Backtrack for DynamoDB is now available internationally through the AWS Marketplace, with pricing structured on a consumption basis.


NZ Herald
17 hours ago
Air New Zealand inks direct partnership with ChatGPT maker OpenAI, pair create virtual customers
As well as being OpenAI CFO, Friar is a long-time member of the board of Walmart. Before returning home to run Air New Zealand, Foran was in charge of the giant retailer's US operation.
There have been several meetings between senior Air New Zealand and OpenAI leaders over the past year 'and they saw that Air New Zealand could well be the petri dish for innovation in the critical infrastructure and aviation game', Ravishankar said.
The airline's use of OpenAI's generative artificial intelligence (AI) has run from the mainstream – summarising complicated documents – to creating the aforementioned virtual focus group.
OpenAI's Sarah Friar overlapped with Air NZ CEO Greg Foran at Walmart. Photo / Getty Images
'Our customer service teams have created customer personas by feeding them all the feedback and complaints we've got – to test service improvements before we introduce them to a focus group,' Ravishankar said.
Ideas could be 'pre-tested' and honed on the virtual personas, which were based on hundreds of thousands of pieces of customer comments, helping to fine-tune ideas from a much broader pool of perspectives before putting them before a human focus group.
'AI allows you to do complex, integrated planning much more seamlessly,' Ravishankar said. 'We're starting to look at how we use AI to optimise our loyalty tier benefits, for example, and how we can optimise turnaround to improve on-time performance management.'
What about customer-facing AI?
Air New Zealand was an early adopter of Soul Machines' avatar technology, before the Kiwi firm flamed out, but switched from the homegrown solution to its in-house-developed chatbot 'Oscar'. Is Oscar about to get an AI makeover?
'As are a lot of organisations, we're doing a ton of work on whether generative AI is ready for prime time; ready to directly interface with customers to provide an, if you will, chatbot on steroids. We're doing a lot of testing, but we're just not fully satisfied it's there yet,' Ravishankar said. 'But it's maturing at a rate of knots. I'm a technologist by trade, and I've never seen anything move as quickly.'
Behind the scenes, staff are using AI to help deliver more personalised service to customers, he said. All up, the airline is using 1500 CustomGPTs to introduce efficiencies to internal workflows. CustomGPTs are set to specific tasks and can be ring-fenced to access a company's own data – addressing the dangers of an AI hallucinating or breaching privacy or commercial confidentiality.
How do you market to an AI concierge?
OpenAI recently released a ChatGPT 'agent' that can carry out autonomous tasks, such as booking travel (or 'looking' at a photo of a meal you like on Instagram, then ordering the ingredients for you from an online supermarket). The initial release was restricted to those on US$200 ($330) per month Pro plans, but it's being rolled out this month to those on cheaper Plus plans too.
So far, many early testers have found the ChatGPT agent slow and clumsy, in part because human approval is needed at various steps – including credit card purchases – so you can't walk away from your device.
'Uncharted territory'
But while the ChatGPT agent might be a while off for OpenAI's free tier, pundits see 'agentic AI' as the next big thing – and it's already starting to figure in the airline's thinking.
'We're starting to wonder what the role of a brand is when ... a customer's own concierge [AI] agent is deciding which product to put in front of the customer,' Ravishankar said.
'That's uncharted territory ... if anyone tells us they know exactly how that's going to play out, they're making it up.
'But we're paying close attention to it, and then we're seeing multiple models emerge.
'It could be we provide an Air New Zealand concierge, or our concierge interacts with a customer's own concierge. Or we just make our environment open to customers' own agents, being able to interact with us.
'It's too soon to say what pattern emerges. If I was a betting man, I'd say we'll probably see multiple models.'
Direct collaboration
As part of the collaboration, Air New Zealand will gain early access to OpenAI technologies to develop and apply use cases, and equip its people across corporate roles with secure, enterprise-grade AI tools, Ravishankar said. 'By working directly with OpenAI, we not only access leading-edge technology but we also shape how it's used in the real world.'
'Air New Zealand is taking meaningful steps to bring AI across key parts of its business using OpenAI's technology. We have been particularly impressed with how quickly they have built over 1500 CustomGPTs to introduce efficiencies to internal workflows,' OpenAI international managing director Oliver Jay said. 'Their focus on innovation and responsibility shows how the aviation sector can adopt advanced tools in practical ways that deliver value for both employees and customers.'
POSTSCRIPT: Captain's chair?
Foran resigned as CEO in May. He will depart the airline in October. The Australian recently reported that Ravishankar has the inside running to replace him.
The Herald asked Ravishankar if he wanted to take the opportunity to rule himself in or out. 'That is definitely above my pay grade, and you probably want to speak to the board about that,' he replied.
There was no immediate response from the board. Boards typically don't comment on potential candidates prior to an executive appointment.
Chris Keall is an Auckland-based member of the Herald's business team. He joined the Herald in 2018 and is the technology editor and a senior business writer.
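For technically minded readers, the 'virtual customer' pattern Ravishankar describes - grounding a persona in historical feedback and then interrogating it - can be sketched in a few lines against the OpenAI API. This is an illustrative example only, not Air New Zealand's implementation; the model name, prompts and feedback snippets are assumptions:

```python
# Illustrative sketch only: a "virtual customer" persona grounded in past
# feedback, queried via the OpenAI Python SDK. This is not Air New Zealand's
# implementation; model name, prompts and feedback snippets are assumptions.
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

feedback_snippets = [
    "Check-in kiosks were confusing for my elderly parents.",
    "Great crew, but the boarding announcements were hard to hear.",
]  # in practice, drawn from a large corpus of customer comments

persona_prompt = (
    "You are a composite airline customer. Your views are grounded only in "
    "the following real feedback; answer as that customer would.\n\n"
    + "\n".join(f"- {s}" for s in feedback_snippets)
)

response = client.chat.completions.create(
    model="gpt-4o-mini",  # assumed model, for illustration
    messages=[
        {"role": "system", "content": persona_prompt},
        {
            "role": "user",
            "content": "How would you feel if gate announcements were "
                       "replaced with app notifications?",
        },
    ],
)
print(response.choices[0].message.content)
```

Running the same question across many such personas, each grounded in a different slice of feedback, is what turns this into the kind of 'virtual focus group' the article describes.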


Techday NZ
18 hours ago
Why 'Good Enough' networks are failing AI ambitions
Artificial intelligence (AI) is on every boardroom agenda, but many organisations are quietly discovering their legacy networks just aren't up to the job. While there's no shortage of investment in data, analytics and AI talent, the underlying infrastructure - the network which moves data at scale - remains outdated. That's creating friction at precisely the time when agility, security and performance matter most: the invisible backbone is becoming a bottleneck.
The uncomfortable truth is that in an AI-driven economy, "good enough" is no longer good enough. Here are the five most common myths I encounter when working with enterprise clients to future-proof their networks.
MYTH 1: "We can get by with upgrading our current network."
Reality: Legacy networks weren't built for the demands of today's AI-driven, cloud-intensive environments.
Many Australian enterprises believe they can simply 'upgrade' their existing network, but this is a costly illusion. Legacy networks weren't designed for today's fragmented, multi-cloud architectures, which now power AI workloads bursting across public, private, SaaS and edge environments. As data flows increase exponentially across hybrid work environments, IoT and AI, the scale and speed required are pushing legacy infrastructure beyond breaking point. Rather than patching things up, enterprises need to rethink their foundations altogether, moving towards agile, platformised networks built for speed, security and scale.
MYTH 2: "Multi-cloud increases cost and complexity - we'll lose control."
Reality: Multi-cloud doesn't inherently drive up complexity or cost. In fact, failing to design for it is what leads to loss of control.
A well-architected multi-cloud environment has significant advantages: it spreads risk, creates leverage and offers cost agility. By avoiding vendor lock-in, enterprises can maximise their freedom to use the best platforms and tools, and gain improved operational resiliency through increased redundancy, performance optimisation and the ability to exercise compliance control. However, without intelligent orchestration, workload costs can balloon, performance becomes unpredictable and compliance headaches multiply. That's where a modern network-as-a-service (NaaS) model comes in - it gives CIOs visibility and control over exactly how data flows across platforms, while optimising for cost, resilience and governance.
MYTH 3: "We can upgrade our network later - right now we're focused on AI."
Reality: Without a modern network, your AI ambitions will stall before they scale.
It's easy to get caught up in the excitement of generative AI pilots, but AI workloads are performance-intensive and data-hungry. They process vast amounts of distributed data dynamically, securely and at speed - the average enterprise network simply wasn't built for that. Forward-thinking organisations adopt NaaS: a network fabric that interconnects hyperscaler and on-premise locations, enabling workloads to shift in real time to meet business demands. AI and your network are an inseparable duo; success at the former requires maturity in the latter.
MYTH 4: "SD-WAN solves everything."
Reality: SD-WAN is powerful, but it's just one piece of a much larger puzzle.
The shift from MPLS to SD-WAN was a natural evolution as enterprises moved away from on-premise infrastructure and adapted their networks for distributed workloads and remote working - a trend accelerated by the pandemic. However, many underestimated the complexity of integrating SD-WAN with legacy systems, evolving cloud architectures and growing security needs - it's not a silver bullet. SD-WAN can't fix a poor-quality underlay network, nor can it alone deliver the zero-trust, high-performance environment modern AI applications demand. Successful transformation requires an integrated approach: combining a performant underlay, SD-WAN, SASE, cloud gateways and unified visibility to create an intelligent, secure and programmable foundation that delivers on the zero-trust network vision.
MYTH 5: "Uptime is enough."
Reality: High availability is necessary, but not sufficient. Visibility, performance and agility matter just as much.
Green service levels alone don't mean your network is healthy - don't be fooled by the "watermelon effect": green on the outside, red on the inside, with slow applications, poor user experience and missing agility. In industries like mining and energy, finance, and healthcare, downtime isn't the only risk - degraded performance, including latency and inconsistency, is costly and can cause revenue and reputational harm. Organisations need to shift from traditional service levels to outcome-driven network indicators aligned to business-critical AI and cloud services. This requires deep observability, intelligent traffic steering, and dynamic responsiveness to changing workloads and threats - all built into the network infrastructure from the foundation. Only then will your network keep pace with your AI ambitions.
So what next?
Australia's AI ambitions are valid and urgent, but without secure, resilient and programmable networks even the most advanced AI strategies risk failing. It's time to stop treating the network as backend plumbing, and start seeing it for what it is: a strategic platform for innovation.