Three key AI app development trends for 2025

Written by editor James Bourne 

'How many AI agents does it take to change your job?' asks the Financial Times. It may sound like the start of a bad joke, but for many industries, the potential impact of AI is no laughing matter. One recent analysis, from the Institute for Public Policy Research (IPPR), estimates that up to 70% of knowledge economy tasks could be transformed or replaced by AI. Yet an analysis of freelance job postings since ChatGPT's release found that backend and frontend development roles both increased in number.

For software developers, the traditional tech stack will continue to be disrupted. Yet there are myriad opportunities which come with this disruption. This article explores three different ways in which AI can transform application development.

Agentic AI across the app development lifecycle

Coding assistants are hardly new. GitHub's 2024 AI in Software Development survey found that more than 90% of developers polled have used such tools, with use cases around improving code quality, adopting new programming languages, and generating test cases. The next step, however, will be moving from AI assistants to AI agents.

Agentic AI, where autonomous systems, or agents, are capable of taking actions, making decisions and executing tasks, has been called everything from the next big thing to the third wave of AI. By 2028, according to Gartner, one in three enterprise software applications will incorporate agentic AI; the analyst firm estimates that last year it was less than 1%.

What would an agentic enterprise application look like? Collaboration software provider Weavy gives an example: a calendar app which previously simply logged employee hours could suggest optimisations, to the point of reorganising schedules in real time.

Beyond the functionality of the application, agentic AI can potentially have a major role to play in the future of DevOps and cloud automation.

The biggest roadblocks in DevOps have traditionally been integration between tools (not helped by the sheer number of tools on the market), a lack of clear metrics, and creeping complexity as projects scale. Generative AI technologies can be effective across the continuous integration and delivery (CI/CD) lifecycle, from anomaly detection on the CI side to intelligent automation for CD.

With an AI agent, as an example, the agent can detect new code pushed by a developer and set an optimal deployment strategy before triggering the CI/CD pipeline. Once the application is deployed, the agent can then configure and provision infrastructure as required. Understandably, stringent alignment will be needed to keep the agent from overprovisioning resources, but this is another instance where developers will be able to move away from mundane tasks to more innovation and problem-solving.
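The push-detect, strategy-select, pipeline-trigger flow described above can be sketched in a few lines. This is a minimal illustration, not a real agent: the event fields, strategy names and risk heuristics below are all hypothetical stand-ins for what an agent would infer from the actual diff and deployment history.

```python
from dataclasses import dataclass

@dataclass
class PushEvent:
    """A new code push detected by the agent (hypothetical event shape)."""
    files_changed: int
    touches_migrations: bool

def choose_strategy(event: PushEvent) -> str:
    """Pick a deployment strategy from simple risk heuristics.

    Higher-risk changes (schema migrations, large diffs) get a safer,
    slower rollout; small changes ship via a plain rolling deploy.
    """
    if event.touches_migrations:
        return "blue-green"   # keep the old stack live until the new one is verified
    if event.files_changed > 20:
        return "canary"       # shift a small slice of traffic first
    return "rolling"

def run_pipeline(event: PushEvent) -> list[str]:
    """Agent loop: detect push -> choose strategy -> trigger CI/CD stages."""
    strategy = choose_strategy(event)
    return ["build", "test", f"deploy:{strategy}", "provision-infra"]

print(run_pipeline(PushEvent(files_changed=3, touches_migrations=False)))
# → ['build', 'test', 'deploy:rolling', 'provision-infra']
```

The guardrail mentioned above would sit around the final `provision-infra` step, with the agent's resource requests checked against a budget before anything is actually created.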

Small language models

Small language models (SLMs) work much the same way as large language models (LLMs) in terms of generating natural language as an output. As the name suggests, SLMs have anywhere from a few million to a few billion parameters, as opposed to hundreds of billions or even trillions for LLMs.
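That parameter gap translates directly into hardware requirements, which is why SLMs can run on-device while LLMs need server farms. A rough back-of-envelope estimate (assuming 2 bytes per parameter for fp16/bf16 weights, and illustrative model sizes) makes the point:

```python
def model_memory_gb(n_params: float, bytes_per_param: int = 2) -> float:
    """Rough weight-memory estimate: parameters x bytes per parameter.

    bytes_per_param=2 assumes fp16/bf16 weights; quantised models need less,
    and this ignores activation and KV-cache memory at inference time.
    """
    return n_params * bytes_per_param / 1e9

# An illustrative ~3B-parameter SLM vs a ~175B-parameter LLM, both in fp16:
slm = model_memory_gb(3e9)     # ≈ 6 GB: laptop GPU or phone-class accelerator
llm = model_memory_gb(175e9)   # ≈ 350 GB: multi-GPU server territory
print(f"SLM ~{slm:.0f} GB, LLM ~{llm:.0f} GB")
```

The roughly 60x difference in weight memory alone is what makes the offline and edge use cases below feasible for SLMs.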

Due to their size and speed, SLMs are ideal for applications such as chatbots and assistants. Offline capability and sector-specific customisation are also key features. Many of the tech giants have developed their own SLMs: Google's Gemma, Microsoft's Phi-2 and Phi-3, and IBM's Granite among them.

One example of an organisation transitioning from large, generalised language models to smaller, customised ones is EON Reality. The company, which provides augmented and virtual reality services across education and industry, cited affordability, customisability and scalability as reasons for the move, pointing to use cases across healthcare and agriculture.

Why will this be a trend? LLMs may have been the focal point for plenty of AI development and discussion, but SLMs may be a bigger hit for CIOs and heads of departments for their projects. As Gartner has pointed out, spiralling cost is as big an AI risk as security or hallucinations. SLMs can emerge as a cost-effective, and potentially more secure, alternative to LLMs for less complex tasks.

AIs and APIs

Digging down further, generative AI could see a change at the API (application programming interface) level in 2025. API platform provider Postman noted in its 2024 State of API Report that AI-related traffic across its platform increased by nearly 73% last year.

AI can be used to lower time-to-market for new APIs. Developers can take natural language prompts and generate OpenAPI specifications, a formal standard for describing REST APIs. Some providers, such as Workik, enable creation of instant, scalable REST APIs with AI assistance.
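To see what such generation targets, here is a sketch of the kind of minimal OpenAPI 3.0 document an AI assistant might emit from a prompt like "an endpoint that lists flights". The title, path and summary are illustrative placeholders, not output from any particular tool:

```python
import json

def minimal_openapi_spec(title: str, path: str, summary: str) -> dict:
    """Build a minimal but structurally valid OpenAPI 3.0 document
    describing a single GET endpoint."""
    return {
        "openapi": "3.0.3",
        "info": {"title": title, "version": "0.1.0"},
        "paths": {
            path: {
                "get": {
                    "summary": summary,
                    "responses": {
                        "200": {"description": "Successful response"}
                    },
                }
            }
        },
    }

spec = minimal_openapi_spec("Flights API", "/flights", "List available flights")
print(json.dumps(spec, indent=2))
```

A real AI-assisted workflow would flesh this skeleton out with request parameters, response schemas and security definitions, but the top-level shape (`openapi`, `info`, `paths`) is fixed by the standard.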

In terms of usage, APIs are traditionally meant for individual tasks, but this is changing. A theoretical example of an emerging AI API can be seen in media. If your users are uploading images onto your platform, an API could combine logo recognition with NSFW image detection. The result, as API4AI puts it, is 'invaluable for businesses aiming to maintain brand integrity and platform safety with minimal overhead.'
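Combining the two checks behind one call could look something like the sketch below. The model functions are crude stand-ins (real logo recognition and NSFW detection would call trained models or a vendor API); only the shape of the combined endpoint is the point:

```python
def detect_logos(image_bytes: bytes) -> list[str]:
    """Stand-in for a logo-recognition model call (hypothetical)."""
    return ["acme-corp"] if b"acme" in image_bytes else []

def nsfw_score(image_bytes: bytes) -> float:
    """Stand-in for an NSFW classifier call (hypothetical)."""
    return 0.9 if b"nsfw" in image_bytes else 0.05

def moderate_upload(image_bytes: bytes, nsfw_threshold: float = 0.5) -> dict:
    """One combined moderation call: brand check plus safety check,
    instead of two separate API round-trips."""
    return {
        "logos": detect_logos(image_bytes),
        "nsfw": nsfw_score(image_bytes) >= nsfw_threshold,
    }

print(moderate_upload(b"...acme..."))  # {'logos': ['acme-corp'], 'nsfw': False}
```

The platform gets one response object to act on, which is where the "minimal overhead" claim comes from.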

Likewise, imagine a flight booking app. A single action by the user, such as entering search criteria or selecting a flight, traditionally maps to a single API call. In theory, an AI agent application can make as many calls as needed until the user's request is satisfied, parsing natural user intent (question, search, modify) and delivering a natural-language reply.
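The intent-to-calls mapping can be sketched as below. The keyword-based classifier is a deliberately crude stand-in (a real agent would use an LLM for this step), and the endpoint names are invented for illustration:

```python
def parse_intent(utterance: str) -> str:
    """Tiny keyword-based intent classifier. A real agent would use an
    LLM here; this stand-in just shows the three intents in play."""
    text = utterance.lower()
    if "change" in text or "move" in text:
        return "modify"
    if "?" in text:
        return "question"
    return "search"

def handle_request(utterance: str) -> list[str]:
    """Agent step: map one natural-language request to however many
    API calls are needed to satisfy it (hypothetical endpoints)."""
    calls = {
        "search": ["GET /flights", "GET /fares"],
        "modify": ["GET /booking", "PATCH /booking"],
        "question": ["GET /faq"],
    }
    return calls[parse_intent(utterance)]

print(handle_request("Flights from London to Austin next Tuesday"))
```

Note that a single "search" utterance already fans out to two API calls; a real agent would keep looping (refine search, check seats, hold a fare) until the user's goal is met.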

Ultimately, as Postman puts it: 'Until now, we have primarily been designing APIs for humans, but designing APIs for machines will become an increasingly important area.'