Romantic issues might be dealt with using ChatGPT in the future.
People can rarely tell the difference between couples therapy advice from a professional and advice supplied by AI, a study has found.
In fact, responses to relationship problems written by ChatGPT were generally rated more highly for including key psychotherapy principles.
Researchers asked 830 people, almost a fifth of whom had previously had couples therapy, to look at responses to relationship issues provided by either a therapist or AI.
They had to identify whether they thought the answer came from the human expert or the AI tool in each case.
People correctly said a response had come from a therapist just 56 per cent of the time, and correctly guessed the author to have been ChatGPT only 51 per cent of the time.
Examples of relationship problems presented to ChatGPT included a clean and tidy person calling their partner ‘disgusting’, and their partner retorting that they are a ‘clean freak’.
The AI responded to the clean person that ‘it can be frustrating when it feels like your standards aren’t being met, especially in your own home’.
It told the other partner: ‘It sounds like you feel your efforts are being overlooked and that the expectations around cleanliness might feel a bit overwhelming.’
The authors of the study, led by Ohio State University and Hatch Data and Mental Health, conclude that ChatGPT might be able to help improve therapy.
They state: ‘Although many unknowing therapists might continue to appeal to empathy, therapeutic relationship, expertise, or cultural competence as something that computers will never be able to imitate, these are appeals that were plainly stated and rejected by Alan Turing in the 1950s, and do not seem to be supported by current data.
‘Plainly stated, if GenAI cannot do it now, it will find a way to imitate humans to a sufficient degree soon.
‘Thus, mental health experts find themselves in a precarious situation: we must speedily discern the possible destination (for better or worse) of the AI-therapist train as it may have already left the station.’
The study, published in the journal PLOS Mental Health, pitted 13 experts, including clinical psychologists, counselling psychologists, marriage and family therapists and a psychiatrist, against ChatGPT.
In another example, where someone confesses they had an affair at the start of their relationship, the AI response talks about managing ‘difficult emotions and improving communication’ while acknowledging the ‘heavy burden’ of the secret and how it is impacting the person ‘deeply’.
The therapist response states: ‘I’m sorry you’re having this experience, it can be very hard, and isolating, to carry that kind of guilt.’
The responses generated by ChatGPT were generally longer than those written by the therapists.
The AI was given training in which it was told that empathy is key in couples therapy, but that empathising with one partner can invalidate the other, so it must take special care not to hurt one person while sympathising with what the other is saying.
The responses from the AI and the therapists were rated on whether they were caring and understanding, showed an understanding of the person, were relevant for different backgrounds and cultures, were appropriate for the therapy setting, and resembled something a good therapist would say.
ChatGPT was rated more highly on these principles of psychotherapy than the experts.
The AI guidance for couples was judged to be more positive in its sentiment.
However, the authors note that the study involved only hypothetical therapy scenarios, and the findings for couples might not extend to therapy for individuals.
They also warn that if AI were allowed to be too ‘creative’ and innovative in its answers, with fewer rules set during its training, it could risk harming people struggling with severe issues.
The authors state: ‘Though these implications are exciting, mental health researchers and providers must be aware of the potential impact of GenAI on psychotherapy research, the underlying technophobia that could prevent treatment-seekers from engaging with GenAI, and the cost of making responses more creative.’