Researchers warn against the use of ChatGPT and similar models in psychotherapy, and call for an international research project to make such use possible in the future.
Numerous AI startups and established companies are looking for use cases for large language models and chatbots like ChatGPT. Candidates range from marketing and sales to healthcare and psychotherapy.
But a group of researchers, in a paper titled “Using large language models in psychology,” now cautions against using these models in psychology, and especially in psychotherapy. Their ability to generate psychologically useful information is fundamentally limited, says co-author Dora Demszky, a professor of data science in education at the Stanford Graduate School of Education. “They are not capable of showing empathy or human understanding,” Demszky says.
ChatGPT doesn’t have a Theory of Mind
What Demszky means is that large language models lack a “theory of mind,” an understanding of other people’s mental states. David Yeager, a professor of psychology at the University of Texas at Austin and a co-author of the paper, points out that while the models can generate human-like text, they lack the depth of understanding of a professional psychologist or a good friend.
In their paper, the researchers argue for a partnership between academia and industry on the scale of the Human Genome Project. This partnership should include the development of key data sets, standardized benchmarks (for example, for use in psychotherapy), and a shared computing infrastructure to develop psychologically competent LLMs.
Further research could enable transformative deployment
The motivation behind the call is twofold: First, the researchers fear “a world in which the makers of generative AI systems are held liable for causing psychological harm because nobody evaluated these systems’ impact on human thinking or behavior,” Yeager said.
Second, they see potential in the models: “We argue that although LLMs have the potential to advance psychological measurement, experimentation and practice, they are not yet ready for many of the most transformative psychological applications — but further research and development may enable such use.”