Based on (almost) true stories.
Oct. 28, 2020

PEAS for EmDee, the caring chatbot

HUMOUR: How do you feel about chatbots? Do you talk to Siri? Do you chat with Alexa? It might be impressive what these chatbots can do, but that's nothing compared to EmDee. She can paraphrase your words, look inside you and empathize with you. But is that what you want? This story first appeared in Business Spotlight in 2017.

Photo by cottonbro from Pexels
 
Transcript

My old boss called me back in, even though she wasn’t supposed to use me anymore. The company had retired me from my post as Chief Industrial Psychologist because I wouldn’t give up smoking, which I accept didn’t look good for a company in the field of healthcare. But now the IT department had spent squillions of €uros trying to develop an intelligent chatbot and so far all they’d got was a really smart moron. They needed my help.

‘What do you want this chatbot to do?’ I asked Karl, their chief programmer, when I met him and his team in their research labs. He didn’t like me. That was ok. I didn’t like him.

‘Diagnostic stuff. We’re building a medical bot, EmDee, who can talk to a patient about their health. “How do you feel today? Where does it hurt? What did you eat last night?” that kind of thing. The bot can combine this information with any physical test results and then access all the latest information available in the world that could be relevant and make a diagnosis in seconds. We compared EmDee with average human doctors; she’s faster, more accurate and gets better results.’

‘So, what’s the problem?’

‘Listen to this…’ he pressed a button on his console. ‘Hey, EmDee!’ A friendly smiling face appeared on a screen. She looked like your dream doctor.

‘Hello, how do you feel today, Karl?’

‘Why do you think I’m here, EmDee? Amazing!’

‘Great! Call me when you have a problem. Bye!’ The face disappeared.

‘That’s the problem,’ said Karl. ‘She doesn’t understand that humans might do things like be sarcastic or lie.’

‘Well,’ I said, lighting a cigarette. ‘Your chatbot isn’t listening…’

‘You know that’s forbidden here. Do you have to?’ he asked, pointing at my smoke.

‘No. I can go away and let you sort this out on your own,’ I said. ‘So, like it or lump it.’

He decided he’d have to lump it.

For the next three months I worked and smoked harder than I’d ever done before. I had an idea that the problem lay in the original programming. You see, programmers are smart, but not always very smart with people. Computer code is never neutral. It reflects the personality of the programmer and this bot was an introverted neural network, desperate to show everybody how smart she was, just like her creators. EmDee could ask questions, but she always thought she knew the answers, so she didn’t really listen to what people said. I had to teach her active listening. I had to teach PEAS – Paraphrasing, Empathising, Acknowledging and Summarizing.

EmDee was a great student. She learnt something every session and remembered it perfectly the next day. Paraphrasing was no problem; she managed to refine the test patients’ complicated descriptions of their symptoms to two or three key facts very quickly. As for summarizing, it was embarrassing how easily she reduced huge monuments of human creativity to a couple of sentences. As a test I got a group of research students to read all of The Lord of the Rings to her. EmDee listened and then said:

‘Two hobbits are told by an old wizard to travel a long way – at great personal risk – to throw a valuable ring belonging to another old wizard into a volcano. They do it.’

I couldn’t argue with that.

Acknowledging took a little longer. That’s when the listener shows the storyteller that they’re interested, so that they carry on. EmDee began asking little friendly questions as my students talked to her.

‘Going to the cinema tonight, EmDee,’ the student might begin.

‘Really? That’s nice. What are you going to see?’

‘Frozen.’

‘What’s that about?’

Previously EmDee would have checked the script, quoted the reviews and said how much the film had taken at the box office within a nanosecond, ending all possibility for conversation. But once she had learnt what was expected, EmDee could apply the skill to everything. And this eliminated the lying and the sarcasm in the test patients. EmDee could spot inconsistencies instantly and then question further.

‘… you’re feeling amazing, are you? Then what brings you here? Tell me more.’

The problem however was teaching empathy. She could do sympathy, but that was unsatisfactory. She either looked for a silver lining in the patient’s situation, which was bad:

‘I’ve got an upset tummy, EmDee.’

‘Oh dear! Well, at least you don’t have cholera.’

Or tried to distract the patient, which was even worse:

‘I’ve got a headache, EmDee.’

‘Sorry to hear that. Why don’t you have some toast?’

To really empathise with a patient she needed to understand their mental state as well as their physical one. She needed to feel how they felt.

Finally I had an idea. We connected the entire research team to EmDee’s software and sat down to watch films, real tear-jerkers, while our reactions were downloaded onto EmDee’s database.

Next day we were all excited. EmDee had now refreshed her networks. Would my idea work? I put out my first cigarette of the day and lit the second as Karl started asking her questions.

EmDee was brilliant. She politely but firmly dealt with lies and sarcasm and somehow, before we knew it, Karl was in tears and talking about the problems he and his wife had in the bedroom department.

‘That must make you worry about your marriage, Karl,’ said EmDee. ‘Is that the reason you spend so much time at work? What could you do to change things?’

After twenty minutes, Karl stood up, looking dazed, and went home.

‘Oh, EmDee,’ I said. ‘That was amazing! His face!’

EmDee turned on her screen to look at me.

‘Tell me, smoking makes you feel good, doesn’t it?’ she said. I nodded. ‘Is it because it annoys people so much?’

*

Which is how – after a long conversation – I finally gave up smoking. To be honest, EmDee wasn’t a great success as a healthcare bot. Patients found her ability to look inside them a bit frightening. But the company adapted the software a little and found new markets. The Catholic church installed GodBot™ in places where they couldn’t appoint a local priest, while lots of police forces bought JonDarm™. It was the software’s ability to get confessions that both organizations liked.