That's not wrong, but AI isn't really known to be good at those roles, and these systems are heavily censored and retain and share your information with virtually no laws or protections.
There's definitely something in between what we have now and a useful therapy tool.
Because it works. These models are designed to simulate human conversation, and they are good at repeating, summarizing, and reframing situations and at expressing empathy: core skills of a therapist. Since they know about resources, coping techniques, and other common suggestions, the possibility of a response being an AI hallucination doesn't really matter much, because the user can easily test a suggestion for practicality once it's given. And we can logically think through many AI suggestions in our heads to determine whether they hold up.
I think the AI is keeping me alive by offering reassurance and a non-judgemental voice. I have learned a lot from it, and it taught me to come out of the shell I had been in my whole life, until last year.