
5 comments

  1. Beginning-Shock1520

    My experience with the service in the past wasn’t great, I have to say. It’s meant to be a text service you reach out to when you need someone to talk to, but sometimes there would be long gaps between my sending a message and the responder replying to it. I honestly felt like I was talking to a robot and not a real human on the other end. It didn’t help me at all – so I stopped using it.

  2. The statement from Spunout does not deny that it was an AI message.

  3. Sanimal88

    Former volunteer here: they use canned responses that are edited to fit each conversation and try to get to the bottom of things. The conversations are steered in whatever way is safest for the service user, to calm them in a hot moment and direct them to the help that suits them best. It’s really tricky trying to help someone in crisis when you have to be helpful yet detached and calm. The training is robust and a lot of people actually don’t pass it. The volunteers are closely overseen in what they say by qualified professionals – therapists, psychologists, etc. It’s quite intense in practice and conversations don’t always go as well as you’d hope. But I can confirm there wasn’t a lick of AI involved in any of the work.
