The British Association for Counselling and Psychotherapy (BACP) is warning about the rising dangers of children using AI tools such as ChatGPT for mental health advice.
Its new survey revealed that more than a third (38%) of therapists working with under-18s have clients seeking mental health guidance from AI platforms. And almost one in five (19%) therapists reported children receiving harmful mental health advice.
Therapists have told the BACP that some AI tools are providing potentially harmful and misleading information, including encouraging children to self-diagnose conditions such as ADHD and OCD, reinforcing avoidance behaviours and routinely validating their feelings regardless of what they express. There have also been tragic cases where AI tools have given dangerously misguided advice or even encouraged suicide.* Therapists are also particularly concerned about AI’s inability to offer real-time support or intervene in crisis situations.
Ben Kay, Director at BACP, which is the largest professional body for counselling and psychotherapy in the UK with more than 70,000 members, said:
“It’s alarming that children are increasingly having to turn to AI chatbots like ChatGPT for mental health support, often unable to tell whether the advice they’re getting is safe or even true. Some have already suffered devastating consequences. And this is probably just the tip of the iceberg, with many more children struggling in silence, without access to real therapy.”
“We want parents, carers, and young people to know that using AI for mental health support isn’t the easy, safe, or quick fix it might appear to be. There are real risks involved, and it must be approached with caution. While AI is accessible and convenient, it can’t replicate the empathy, connection, or the safety of therapy delivered by a real person trained to understand complex mental health challenges and assess risks. Children in distress could be left without proper professional support. The information shared with AI also doesn’t have the same protections as therapy.”
“Too many young people are turning to AI because they can’t get the mental health support they need. That’s unacceptable. The government must step up and invest now in real, professional therapy through the NHS, schools, and community hubs. No young person should ever be forced to turn to a chatbot for help. AI might fill gaps, but it can never replace the human connection that changes lives. Young people deserve more than algorithms; they deserve professionally trained therapists who listen.”
New survey findings
The BACP’s annual Mindometer survey, which gathered insights from almost 3,000 practising therapists across the UK, reveals that more than a quarter (28%) of therapists – working with both adults and children – have had clients report unhelpful therapy guidance from AI. And almost two-thirds (64%) of therapists said that public mental health has deteriorated since last year, with 43% believing AI is contributing to that decline.
Senior accredited therapist Debbie Keenan, who works at a secondary school and has her own private practice in Chepstow, added:
“I’m definitely seeing more children and young people turning to AI to seek therapy advice and self-diagnose conditions such as ADHD and OCD. This raises real concerns for me. As advanced as AI is, it simply can’t do this. It also can’t tell if a child is distressed, dysregulated or in danger. If a child was telling me that they were going to hurt themselves, or that they had suicidal ideation, support would be in place for that child before they left my room – but would AI do that?
“Furthermore, I’m also concerned about the current risk of children isolating and disconnecting from real human relationships – this could lead to an over-reliance on AI for emotional support and increase feelings of loneliness, making it harder to reach out for ‘real life’ support.
“I believe children are increasingly turning to AI for therapy because it’s available 24/7. It feels non-judgemental and offers a sense of privacy. However, AI retains data, it isn’t bound by ethical or confidentiality standards, and it lacks regulation or accountability. While it may fill the gap in access to mental health support, it cannot replace human connection or recognise subtle emotional cues like a trained psychotherapist can.”
Amanda MacDonald, a BACP registered therapist who provides support for children, teenagers and adults, said:
“AI therapy bots tend to adopt one of two approaches: offering validation or providing solutions. Both lack the nuance of real therapy and risk giving advice that contradicts best practice for emotional distress. For example, some AI tools have advised individuals with OCD to continue their compulsions, mistaking short-term relief for progress. Others have encouraged avoidance of anxiety triggers, which may feel helpful at first but can worsen anxiety over time by reinforcing avoidance behaviours.
“There have also been well-documented, tragic cases where AI tools have given dangerously misguided advice or even encouraged suicide – outcomes that are both devastating and deeply alarming.
“Parents and carers should be aware that their children may be turning to AI for guidance and advice. While it’s important to keep appropriate parental controls in place, open and honest communication at home is just as essential. Talk to your children with curiosity and share your concerns in an age-appropriate way.
“Children and adolescents aren’t yet equipped to fully assess risk, so parents play a crucial role in keeping them safe. Balancing privacy with safety isn’t easy, but without that balance, young people can become overly reliant on what is ultimately a very clever algorithm; one that lacks the ethical and safeguarding standards found in helplines, therapy, or school-based support.
“Reaching for their phones when they’re upset feels natural for many young people, especially as AI tools can seem supportive and validating. This creates a valuable opportunity for families to talk about their relationship with phones and technology. Parents can help by modelling healthy behaviour – setting shared screen-free times and recognising when they themselves instinctively turn to their phones. After all, phones were designed to connect us, but if we’re not careful, they can start to replace real human connection.”
References:
All figures are from BACP’s annual Mindometer survey of its members. The total sample size was 2,980 therapists, and fieldwork was undertaken between 3 – 17 September 2025. The survey was carried out online.