'Active listening' for chat interfaces
In AI Studio, we’ve been working on AI-enabled chat interfaces.
We recently analysed conversation logs from a classic ‘Retrieval Augmented Generation’ (RAG) system connected to all GOV.UK content. RAG is a technique for using AI to retrieve and generate answers to user questions based on the content you give it access to.
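As a rough illustration of the technique, a RAG loop retrieves relevant passages and then builds a prompt that grounds the model’s answer in that content. This is a minimal sketch only: the toy keyword scoring stands in for a real search index, and the corpus snippets are made up, not the actual GOV.UK system.

```python
# Minimal RAG sketch: retrieve relevant passages, then build a grounded
# prompt. Corpus, scoring, and prompt wording are illustrative placeholders.

CORPUS = [
    "You can claim the State Pension when you reach State Pension age.",
    "You must register to vote before you can vote in UK elections.",
]

def retrieve(question: str, k: int = 1) -> list[str]:
    """Toy keyword-overlap retrieval standing in for a real search index."""
    words = set(question.lower().split())
    scored = sorted(
        CORPUS,
        key=lambda passage: len(words & set(passage.lower().split())),
        reverse=True,
    )
    return scored[:k]

def build_prompt(question: str) -> str:
    """Constrain the model to answer from the retrieved content only."""
    context = "\n".join(retrieve(question))
    return (
        "Answer using only the content below.\n\n"
        f"Content:\n{context}\n\n"
        f"Question: {question}"
    )

print(build_prompt("When do I reach State Pension age?"))
```

The prompt, plus the user’s question, would then be sent to a language model to generate the answer.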
From this, we found that trying to meet a user’s request straight away is not always the best thing to do. Jumping straight in with answers means a chat agent can misinterpret the user’s needs, and it cannot handle multiple requests. These kinds of answers tend to include redundant information, and can lead to dead ends.
A good conversation between 2 people is based on ‘active listening’. This means concentrating on, understanding, and acknowledging what a person is saying, before offering your own thoughts.
We think a good chat interface should use the same principles of active listening to identify the user’s underlying needs or goals before deciding what to do next. And by designing the system around active listening, we reckon lots of useful conversation patterns can emerge.
Allowing user-led and AI-led conversations
We looked at 200 real question-answer pairs, and identified at least 8 ways that active listening could help the user more effectively.
Currently, these are:
1. Ask clarifying questions before trying to answer.
2. Prioritise from multiple requests.
3. If you’re not sure, double check.
4. Move the conversation from generic to specific.
5. Orient the user within their wider goal or journey.
6. Get a deeper pragmatic understanding.
7. Offer helpful alternatives.
8. Get to the point.
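One way to put the patterns above to work is to encode them as system-prompt instructions for the model. This is a hypothetical sketch, not the AI Studio implementation: the pattern wording comes from the list above, but the prompt format is an assumption.

```python
# Hypothetical sketch: turn the active-listening patterns into a system
# prompt. The prompt wording and format are assumptions for illustration.

ACTIVE_LISTENING_PATTERNS = [
    "Ask clarifying questions before trying to answer.",
    "Prioritise from multiple requests.",
    "If you're not sure, double check.",
    "Move the conversation from generic to specific.",
    "Orient the user within their wider goal or journey.",
    "Get a deeper pragmatic understanding.",
    "Offer helpful alternatives.",
    "Get to the point.",
]

def build_system_prompt() -> str:
    """Number the patterns and wrap them in an instruction to the model."""
    rules = "\n".join(
        f"{i}. {pattern}"
        for i, pattern in enumerate(ACTIVE_LISTENING_PATTERNS, 1)
    )
    return (
        "Before answering, identify the user's underlying need or goal. "
        "Apply whichever of these patterns fits:\n" + rules
    )

print(build_system_prompt())
```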
When appropriate, these patterns shift the experience from user-led to AI-led conversations. They aim to remove the burden of learning and navigating government from the user, and allow the AI system to guide users through their experience.
These behaviours also shift users’ expectations about what the system can do. The system gives signals, or ‘affordances’, that it is capable of doing more than just answering simple questions.
Some examples of active listening
Let’s look at number 5: Orient the user within their wider goal or journey. In this case, the AI can infer the broader goal behind a literal question about the State Pension age.
The AI is aware of the broader goals around retirement, as represented on the GOV.UK step by step guide for planning for retirement.

Now, let’s look at number 6: Get a deeper pragmatic understanding. If we explicitly tell the system to identify the underlying goals or needs, it can then work out what it can do to help with that goal.
This forces the AI to distinguish the semantic meaning (what the user said) from the pragmatic meaning (what the user meant). In the example below, the AI does not have access to the accounts. But it can help the user find them, which ultimately helps the user get their task done.
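This two-step approach can be sketched as two model calls: first ask what the user is trying to achieve, then answer in terms of that goal. `call_model` below is a stub standing in for a real LLM API; the prompt wording is an assumption, not the AI Studio implementation.

```python
# Sketch of separating semantic from pragmatic meaning via two model calls.
# call_model is a stub; a real system would call an LLM API here.

def call_model(prompt: str) -> str:
    """Placeholder for a real language model call."""
    return f"[model response to: {prompt[:40]}...]"

def infer_goal(question: str) -> str:
    """Step 1: ask for the pragmatic meaning (what the user is trying to do)."""
    return call_model(
        "What is the user's underlying goal, beyond the literal question?\n"
        f"Question: {question}"
    )

def answer_with_goal(question: str) -> str:
    """Step 2: answer the literal question in the context of the inferred goal."""
    goal = infer_goal(question)
    return call_model(
        f"User question: {question}\n"
        f"Inferred goal: {goal}\n"
        "Answer the question, then offer the next helpful step towards the goal."
    )
```

Splitting the two steps makes the inferred goal inspectable, so the system can decide what it can actually do to help before it answers.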

Designing by building
If we instruct the AI system to use active listening, reflecting on what the underlying goal or need is, we think it will be better able to help the user.
But working with a proactive system poses new design challenges. For example, how can we allow for free exploration and task completion in the same experience? And how might we offer better personalisation without adding more steps for a user to complete?
To approach this, we’re designing by building.
By working with AI prototypes, it’s easier for us to see where interactions work or where they break. It also helps us find implementation challenges earlier, and helps bring our multi-disciplinary team together around a shared ideation space.
Get in touch
If you’re also working on AI-enabled chat systems, we’d like to share experiences with you.
Have you seen a similar need? What is your approach to making sure users’ underlying needs are met, and how do you measure that?
You can email us at govuk-ai@dsit.gov.uk.