How would you feel about using an Artificial Intelligence as your counsellor or personal life coach? Well, now you can find out! Woebot (which is an absolutely genius name!) is your friendly robotic self-care expert and the world’s first AI counsellor.
Naturally, I had the usual concerns…
- How would personal data be protected?
- How would people be prevented from hacking into the system and targeting vulnerable people?
- Would an AI counsellor inhibit normal, healthy socialisation?
- Could the AI learn to give bad advice?
But it got me thinking about the positives as well –
- Mental health help for everyone regardless of financial/physical/mental ability
- Immediate help – no waiting lists for your first therapy session
- 24/7 availability
- No need to worry about negatively affecting anyone else (e.g. wasting other people’s time or being a burden)
I was pretty 50/50 about whether or not an AI counsellor would be a good idea, so I signed up to find out more…
First of all, he’s adorable! (Which I assume is very much on purpose, to give an instant sense of friendliness and connection – there’s a bit of a WALL-E vibe going on!)
Once I’d signed in, though, I was actually a little disappointed. Because the Woebot video shows someone writing in their diary and gives conversational examples, I was expecting to be able to just sign up, type freeform, and have Woebot reply with something constructive.
The actual format is much more rigid: generally, you can only take part in the conversation by choosing from a few preset responses. This more automated method of conversation did address some of my earlier concerns, though:
How would personal data be protected?
The opportunity to enter private/personal information is very limited. There is the option to enter an email address when you sign up (which is required if you want to access or delete your data), but it isn’t compulsory. Woebot also assures us that “Your data is private and encrypted with hospital-level security standards”, so even if there were a security breach, only a minimal amount of personal data would be at risk.
Would an AI counsellor inhibit normal, healthy socialisation?
Potentially, but definitely not this one! There are no attempts at all to make Woebot in any way humanoid – he’s a very effective learning tool, designed to guide you through practical techniques based on Cognitive Behavioral Therapy (CBT), Mindfulness and Dialectical Behavior Therapy (DBT), and to teach you simple self-help tools. He’s more a replacement for self-help books than any kind of substitute for a therapist or for human socialisation.
Summary
I think if Woebot had been designed to be a conversational humanoid, it would be much more of a moral concern. However, as a little robot companion, Woebot is a really useful tool to help us learn more about ourselves and teach us new skills so that we can take better care of ourselves. My only real complaint is that Woebot is being misleadingly branded (not by the creators of Woebot) as a counsellor/therapist.
What do you think of Woebot? Let me know your thoughts/experiences @GemmaNCrawford