DevOps teams now better understand how AI can help them deliver faster

Dynatrace
By Rafi Katanasho*
Friday, 12 July, 2024


Australian organisations with substantial digital operations, particularly those in the financial sector, have been busy experimenting with generative AI to determine its ability to support their software engineering functions.

While the early results have been promising, they’ve often been qualitative, based on the perception of efficiency improvements, rather than specifically quantitative in nature. The reasons for this are starting to become clear: it’s not that GenAI isn’t suitable for the purpose of boosting developer efficiency and productivity, but more that its capabilities and what it’s good at — like finding or generating code snippets and test scripts — are quite narrowly focused.

These activities are just two aspects of what a full-stack developer or DevOps engineer does. Just as people in these roles are expected to bring a more well-rounded skill set to the table, so should any tooling that is designed to support them.

Organisations are realising that if they want to find efficiency improvements across the software development lifecycle, they can’t achieve it with generative AI alone. It can be achieved with AI, but through the combined use of multiple AI techniques rather than a narrow focus on just one type.

This realisation is now starting to be reflected in empirical research.

A recent survey by Dynatrace shows the number one improvement technology leaders expect AI to deliver is faster software deployment and delivery. The same research found the greatest proportion of technology leaders (61%) are increasing their investments in AI to speed up development by automatically generating code. Note that both findings refer to AI in broad terms. This shows technology leaders are banking on AI to deliver improvements, but not on generative AI alone.

Gartner similarly predicts that by 2028, 75% of enterprise software engineers will use AI code assistants, up from less than 10% in early 2023. Importantly, it defines the assistants as “collaborative” in nature, and going “beyond code generation and completion”. Again, the subtle shift in messaging is a recognition that generative AI’s utility as a standalone mechanism for improving software delivery is limited, and that teams are now looking to AI more broadly to deliver these improvements.

Where GenAI-only initiatives fall short

Organisations will encounter a range of challenges when trying to use generative AI alone to achieve software engineering efficiency improvements.

The first challenge is getting the GenAI tool to deliver a meaningful response. This requires teams to engineer prompts that contain detailed context and precision. As LLMs are not organisation-specific, and are instead trained on a wide spectrum of data, they cannot provide analytical precision and context about the state of an organisation’s systems or the root cause of any problems. Without rigour in prompting the AI, its outputs will be vague and generic, resulting in trivial and unhelpful suggestions.

To remediate this, DevOps teams must prompt for answers by providing specific context about their environment. But they must also be careful not to prompt LLMs with non-public data, which could inadvertently expose proprietary IP or violate privacy and security regulations.
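One way to balance these two demands is to redact sensitive values before they reach the model. The sketch below is purely illustrative — the field names, redaction patterns and prompt wording are assumptions for the example, not a description of any particular tool — but it shows the idea of assembling environment-specific context while scrubbing non-public data:

```python
import re

# Illustrative redaction rules (assumptions for this example only);
# a production redaction layer would be far more thorough.
REDACTIONS = [
    (re.compile(r"\b\d{1,3}(?:\.\d{1,3}){3}\b"), "<REDACTED_IP>"),
    (re.compile(r"(?i)\b(?:api[_-]?key|token)\s*[:=]\s*\S+"), "<REDACTED_SECRET>"),
]

def build_prompt(error_summary: str, service: str, recent_change: str) -> str:
    """Assemble a context-rich prompt, redacting non-public values first."""
    context = (
        f"Service: {service}\n"
        f"Recent change: {recent_change}\n"
        f"Observed error: {error_summary}\n"
    )
    # Scrub anything matching a sensitive pattern before the text
    # leaves the organisation's boundary.
    for pattern, replacement in REDACTIONS:
        context = pattern.sub(replacement, context)
    return (
        "You are assisting a DevOps engineer.\n"
        "Given the environment context below, suggest likely root causes.\n\n"
        + context
    )

prompt = build_prompt(
    error_summary="HTTP 503 spike after deploy; upstream at 10.0.4.17 timing out",
    service="checkout-api",
    recent_change="api_key=sk-live-abc123 rotated; connection pool resized",
)
print(prompt)
```

The prompt keeps the specific, environment-grounded detail the LLM needs (service name, recent change, observed symptom) while the literal IP address and credential never leave the redaction layer.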

Another challenge is understanding the intellectual property and licensing implications of using code recommended by generative AI. As these tools may have been trained on data from open source libraries, this creates a risk that teams could accidentally repurpose proprietary code in ways that contravene restrictions.

There’s also the well-known risk of generative AI hallucinating: producing statements that are inaccurate, inconsistent or even fictional. That challenge becomes especially pronounced when users create a prompt that is vague or falls outside the data the LLM has been trained on. The AI may then generate a response that is coherent but factually ungrounded. In a development context, an LLM could end up inventing syntax that doesn’t follow the rules of a programming language, resulting in broken code.

The benefits of ‘hypermodal’ AI

If generative AI isn’t the sole solution to software development and DevOps speed, what other types of AI can play an augmentative role to enable these improvements to occur?

Complementary AI technologies — such as causal and predictive AI — are increasingly being used to ground the LLMs that underpin generative AI, so the responses they produce include precise, fact-based answers.

Causal AI observes the relationships between components in a system and explains their dependencies and the reasons for their behaviour. Predictive AI can enhance this further by analysing patterns in historical data, including workload trends, seasonal user behaviour, system capacity and application health, to pre-empt future problems and suggest ways to prevent them.
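As a minimal illustration of the predictive idea, the sketch below fits a linear trend to historical request rates and projects when the workload would exceed a capacity ceiling. The numbers, function names and the simple least-squares model are all invented for the example; real predictive AI systems use far richer models and many more signals:

```python
def linear_fit(ys):
    """Least-squares slope and intercept for evenly spaced observations."""
    n = len(ys)
    xs = range(n)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    slope = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys)) / \
            sum((x - mean_x) ** 2 for x in xs)
    return slope, mean_y - slope * mean_x

def hours_until_saturation(history, capacity):
    """Project the trend forward; return None if load is flat or falling."""
    slope, intercept = linear_fit(history)
    if slope <= 0:
        return None
    # Time index at which the fitted trend line reaches capacity,
    # expressed relative to the most recent observation.
    t = (capacity - intercept) / slope
    return max(0.0, t - (len(history) - 1))

# Hypothetical hourly request rates (requests/sec) for one service.
history = [400, 420, 450, 470, 500, 530]
print(hours_until_saturation(history, capacity=800))
```

Even a toy projection like this turns raw historical data into an actionable, pre-emptive signal (“this service will saturate in roughly 10 hours”) that a generative model alone cannot derive.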

By combining generative AI with fact-based causal and predictive AI — to create a ‘hypermodal’ AI — DevOps teams can achieve what they set out to do with GenAI alone: freeing up time to focus on higher-level challenges, such as creating new features, while dramatically reducing development and testing time.

*Based in Sydney, Australia, Rafi joined Dynatrace in 2007 and has more than 20 years’ experience in the IT industry, focusing on business, application and IT service management. His experience includes working for a number of Australian-based technology startup innovators. Prior to joining Dynatrace he held a senior management position at technology startup Proxima Technology.

