Siri-ous business: How virtual assistants are gradually becoming invaluable

I don’t use virtual assistants. Deep down, perhaps, I’m sceptical that I’m important enough to warrant an assistant, virtual or otherwise, or maybe it’s just that my experience of talking to my devices has been a bit underwhelming. From Siri and Google Now to cheerily greeting my Xbox One, I’ve been left feeling self-conscious without enough reward to make the whole rigmarole seem worthwhile.


So it was with some outsider intrigue that I attended a series of talks from Nuance, one of the world’s largest speech-recognition and transcription software manufacturers. They may not sound like a household name, but the roll call of clients they work with is so extensive that it feels quicker to list the companies they don’t work with rather than those that they do. Nonetheless, here is a far from exhaustive selection: Roku, Panasonic, LG, Samsung, Lexus, Ford, BMW, Toyota, Vodafone, BT, T-Mobile, Domino’s, Coca-Cola, Barclays, Citi, Delta, Air France, FedEx, AT&T and the NHS. In short, even if the name Nuance doesn’t ring any bells, you’ve likely dealt with them in some capacity. Their AI systems handle 14 billion customer engagements a year, in 80 different languages.

If you’re concerned that the same virtual assistant picking out toppings on your pizza is also moonlighting in the NHS, you can breathe a sigh of relief. Although there are some shared elements, the various artificial intelligences have very different functions – there is no jack-of-all-trades AI here. This is why Nils Lenke, senior director of corporate research at Nuance Communications, doesn’t have much truck with the idea that AI will take over the world any time soon, even though the company keeps a close eye on things, co-organising the Winograd Schema Challenge – the successor to the Turing Test.

“The underlying technology is very similar, but once you have trained a system, it can only do one task,” he explains. “This Go system is very good at playing Go, but it cannot recognise faces or understand speech. We [humans] solve all our problems with the same brain, but this is not what these systems do. That’s why there is a long way to mimic human intelligence – if ever.”

Indeed, it’s possible that the reluctance of other virtual assistants’ designers to accept these limitations is the root of why I’ve found my experiences to date so underwhelming. I put this to Lenke: “Exactly, they’re trying to boil the ocean, right?” If the assistant is too generic, Lenke suggests, you often find yourself unsure of what to expect next, and where its limitations lie.

Instead, it seems that limiting the scope of a virtual assistant makes it more useful. “With, say, an assistant to a driver, it’s pretty clear what a driver’s problems are. They all want gas, they all need to know what the dashboard says, they all need something to eat: it’s much easier to build something that’s useful for drivers.”


This simplification even comes down to the included vocabulary. John West, principal solution architect at Nuance, tells me that in order to keep virtual assistants’ synthesised voices sounding fluent, the dictionary is often revisited to ensure current trends are catered for. There wasn’t much call for virtual assistants pronouncing Jeremy Corbyn’s name correctly a year ago, he notes, whereas any AI filling you in on the news today had better know how to say the Labour leader’s name correctly. This, according to Lenke, is another reason why it’s preferable for a virtual assistant to be specialised: “You try to anticipate what the domain is, and you generate a voice based on samples from that domain. If you try to have a voice that can say everything from every domain, quality can deteriorate.”


Speaking of voices, something that’s always thrown me about virtual assistants is the need for them to have male or female tones at all. What place does gender have for an artificial intelligence anyway? “It’s a deliberate choice you need to make,” says Lenke, and one that each client has final say on. “You can either say you go for the illusion of a human being, or you can say ‘I want people to see it’s a robot,’ so you give it a robotic voice and there’s no persona involved.” As for gender, that often comes down to cultural differences by country – some of which have very deeply set and prescribed views of which gender is right for each task. Again, it’s the client’s choice; it’s not Nuance’s place to impose its own philosophies.

The voices themselves sound rich, fluent and natural in the demos I’m shown. Theoretically, could any clients be trying to pass them off as humans, or at least not mention they’re a bot in a lie by omission? “Let me put it this way: I personally would not advise to do that,” cautions Lenke.

But how useful are virtual assistants proving to be? Nuance has several key examples of improvements to business, from voice biometrics surpassing the password to call-centre workers’ job satisfaction rising because their tedious preliminary questions are dealt with automatically. But perhaps the most valuable of all is in healthcare, one of the company’s most significant areas.

 

“UK doctors do three-and-a-half hours’ worth of admin work a day,” explains Frederik Brabant, Nuance’s chief medical information officer. I’m shown a couple of demos: one in which a doctor completes 20 prescriptions around a third faster using voice commands, and another where a healthcare professional is given all the pertinent information about their various patients on an iPad. Thanks to its semantic understanding, the system even prompts doctors for details they may have forgotten: which type of diabetes, for example, so their notes make sense when transferred. According to Brabant, this can increase revenue by between 6 and 8% on average.
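To make that prompting idea concrete, here is a minimal, purely illustrative sketch of the kind of slot-filling check described above. The condition names, required details and note structure are all assumptions made for the sake of the example, not Nuance’s actual schema or API.

```python
# Toy slot-filling check: prompt the clinician for details a transferred note
# would need. All field names here are illustrative assumptions.

REQUIRED_DETAILS = {
    # condition -> details the note must specify before it is transferred
    "diabetes": ["type"],
    "fracture": ["site", "side"],
}

def missing_details(note: dict) -> list[str]:
    """Return prompts for any required detail the dictated note leaves out."""
    prompts = []
    for condition in note.get("conditions", []):
        for detail in REQUIRED_DETAILS.get(condition["name"], []):
            if detail not in condition.get("details", {}):
                prompts.append(f"Which {detail} of {condition['name']}?")
    return prompts

# A dictated note that mentions diabetes but not which type
note = {"conditions": [{"name": "diabetes", "details": {}}]}
print(missing_details(note))  # ['Which type of diabetes?']
```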

The aim here is to make the doctors’ jobs easier, rather than to replace or augment their professional expertise. “Doctors hate that. We don’t want to get instructions. The machine should never say ‘you studied seven years of medicine, but…’” Despite the limitations of a still largely paper-based NHS, Dragon Medical is used by more than 80% of the trusts in the UK, with varying degrees of integration.

In terms of the future, though, there are uses beyond administration. If you’re working in surgery, you can’t access important information with your hands for obvious hygiene reasons, so a voice assistant makes perfect sense.

And it’s here that the other advances the company is looking to make will really come into play: making the software more intelligent and logical. An example I’m given: if you say to your car, “Book a table at Joe’s Pizza after my last meeting, and let Tom know to meet me there,” you’re actually relying on a heady mix of knowledge, semantic routing, planning and dialogue. The system needs to consult your calendar for the last meeting time, consult a map to see where Joe’s Pizza is in relation to the meeting location, hunt down the number for the restaurant and try to book, then search for Tom’s contact details and send a message. That’s impressive as it stands, but Nuance wants the car to be able to find quality Italian alternatives if Joe’s Pizza is fully booked, and suggest times that work for everyone. Theoretically, this isn’t too far away, and likewise, a smartphone that can tell a doctor the red blood cell count of a patient on the fly is infuriatingly close. However, doctors need to lose their attachment to paper and pen and go 100% digital first.
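For a flavour of how a request like that might be decomposed, here is a minimal sketch under assumed interfaces: the calendar structure and the restaurant-finding, booking and messaging functions are hypothetical stand-ins, not any real in-car or Nuance API.

```python
from datetime import timedelta

def plan_dinner(calendar, find_restaurant, book_table, send_message):
    """Sketch of the chain of steps behind 'book a table after my last meeting'."""
    # Each calendar entry is assumed to be a dict with "end" and "location" keys
    last_meeting = max(calendar, key=lambda m: m["end"])
    # Allow an assumed 30-minute travel buffer after the meeting ends
    reservation_time = last_meeting["end"] + timedelta(minutes=30)

    # Prefer Joe's Pizza; fall back to another Italian place nearby if it's full
    restaurant = find_restaurant("Joe's Pizza", near=last_meeting["location"])
    booked = book_table(restaurant, reservation_time, party_size=2)
    if not booked:
        restaurant = find_restaurant("Italian restaurant", near=last_meeting["location"])
        booked = book_table(restaurant, reservation_time, party_size=2)

    # Only tell Tom once the booking has actually gone through
    if booked:
        send_message("Tom", f"Meet me at {restaurant} at {reservation_time:%H:%M}")
    return booked
```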


“I personally think there’s going to be a lot of opportunities for the UK in the next four or five years in the digital transition, and that new technologies like voice recognition and natural language processing will bring it to the next level,” concludes Brabant.

After leaving the event, and heading back to the office, I decide to give virtual assistants another try. “OK Google,” I say, “navigate me to Goodge Street.” And it does so in double-quick time, leaving me wondering if maybe there is a vacancy for a virtual assistant in my personal office after all.

