AI to answer your 101 calls, leaving 999 calls to humans
“It’s not unusual for us to see 5,000 calls in 24 hours,” Chief Superintendent Simon Dodds of Thames Valley Police told City AM. “Prioritising 999 over everything else is a real challenge”.
That pressure has pushed his force, along with Hampshire & Isle of Wight Constabulary, to turn to AI agents.
Their new virtual assistant, named ‘Bobbi’, went live just last week, answering non-emergency calls, filtering out routine, repeated questions, and triaging anything that might require human input.
Built on Salesforce’s Agentforce platform, the tool is designed to handle a share of the 20m non-emergency calls the forces receive each year.
In an era of rising demand, shrinking patience and tight budgets, the forces claim AI could offer something no call centre can promise: instant, consistently accurate, 24/7 responses.
“This isn’t about cutting staff”, Dodds claims. “It’s about improving service. If Bobbi can answer a question about a hedge overhanging a fence, or tell someone how to change their bail address, that means our people can focus on emergencies and complex issues”.
Already, the system has been tested by more than 200 people, from victim support specialists to scrutiny panels.
The system was fed with around 90 internal knowledge articles, the same material that human operators use.
“If we get it wrong, trust evaporates”, operations manager Tom Boyd told City AM. “One mistake in policing gets amplified very quickly”.
AI: A safer bet for low-risk queries
Bobbi’s core job is to handle the repeatable and redirect the risky.
Behind the scenes, its large language model scans for key signals – a mention of a knife, a child’s voice, a pattern resembling stalking.
In those moments, Dodds and Boyd explained, Bobbi is programmed to escalate the issue.
And it’s already happened. “We had a child mention mummy and daddy arguing”, Boyd said. “Bobbi flagged it as a child at risk. The operator recognised a domestic incident and had officers attending within ten minutes.”
In another case, a mother worried about a neighbour’s behaviour toward her daughter was quietly moved to a human operator, after Bobbi recognised signs of stalking.
Critically, the forces have limited the AI’s scope to avoid hallucinations or creative interpretations.
“We’ve locked it down”, Boyd explains. “It doesn’t pull from the internet, and it doesn’t learn from conversations. That massively reduces bias and stops it going off piste.”
All conversations are currently being reviewed in what the team calls “hyper care.”
So far, no harmful advice has been given, though the team has already tweaked answers when more detail could help.
“Just like we coach the staff”, Boyd says, “we coach Bobbi.”
The team is equally bullish on transparency. “You can clearly see when you’re talking to an AI”, Boyd adds. “We wanted the public to know exactly what they’re engaging with.”
Naturally, some residents remain uneasy about automation anywhere near policing.
Dodds acknowledges this resistance, but says usage patterns have so far shown widespread appetite.
“People are asking exactly the kind of questions we expected – lost phones, dangerous driving, local disputes…”
Scaling may be the bigger challenge, however. “Policing takes years to implement anything”, Dodds said. “By the time we do, we’re behind.”
“What’s different here”, he claims, “is speed. We set Bobbi up in days, which lets us explore other areas where AI could help us.”
These areas include automated data entry during emergency calls – a long-standing bottleneck where operators struggle to record details while listening.
“We want to let AI capture the details”, Dodds says, “and let the operators make decisions.”
A digitalised police force
Bobbi arrives as policing faces a broader, much more controversial debate around the use of technology.
The government has announced plans for a nationwide expansion of facial-recognition cameras, a policy now in a ten-week consultation and likely headed for legislation.
Officials have dubbed it “the biggest breakthrough since DNA matching”. But critics have long argued that it risks building “Big Brother Britain”.
The Home Office has admitted that the tech “interferes” with privacy rights, and wants police to be able to compare faces captured by CCTV or doorbell cameras against millions of images stored across passport and immigration databases.
In trials, it has already contributed to over 1,300 arrests, while also helping to trace missing persons.
But misidentification concerns persist, and privacy groups continue to warn of a drift toward mass surveillance.
“Facial recognition surveillance is out of control”, said Big Brother Watch’s Silkie Carlo. “It’s deeply sinister”, added Reform leader Nigel Farage.
Supporters have maintained their stance, with former counter-terror chief Neil Basu calling it a “massive step forward”, and ex-policing minister Chris Philp backing a “dramatic increase”.
Public opinion appears to walk a tightrope: broadly comfortable with facial recognition for murder and terrorism cases, far less so for everyday use.
“We know people will always want to speak to humans”, Dodds claimed, “so we must protect those channels. But there is also a growing part of society that wants to engage online, quickly, and in their own time. And we should serve them too.”