One of the most prominent recent innovations in technology, ChatGPT is the latest chatbot to enter the market.
A chatbot uses artificial intelligence (AI) to process and respond to text conversations using learned knowledge and algorithms – an ability that can be applied across research, education, and many other fields.
ChatGPT was developed by OpenAI – a research company that specialises in the application and deployment of AI around the world – and was released to the general public in November 2022.
Currently, ChatGPT and similar technologies can be found across popular internet channels such as Google searches and various social media features, and both the potential benefits and the dangers of this technology can affect individuals in a wide variety of ways.
This article aims to explore the dangers of ChatGPT as a social communication channel and how this may lead to addiction in some cases, though other uses and issues are also discussed along the way.
As mentioned above, ChatGPT can be used in many different ways, and applications have already been implemented across a wide range of fields.
Because ChatGPT is a form of AI, it can be altered and tailored to suit an almost endless variety of needs and requirements in technology and software.
In general, because ChatGPT can respond to almost any human query or problem-solving situation, its possible uses are near endless.
However, this article will focus on three current and popular uses: writing documents and other forms of text, answering questions, and social communication.
These three key uses are outlined under the following subheadings, along with how each may pose a risk of addiction and misuse.
As a writing aid, or as a way to keep track of and collate different sources of information and research, ChatGPT can be incredibly beneficial.
For example, some individuals have used ChatGPT to help organise large research projects and identify relevant sources and ideas, making the process quicker and easier.
However, others may use ChatGPT’s ability to produce text as a way to create reports, essays, fiction, and other documents without writing them themselves.
In many cases this is fraudulent, as the work is often submitted as if it were the individual’s own and judged alongside genuinely human-created work, leading to legal issues, moral dilemmas, and further debate.
Some individuals may come to rely solely on ChatGPT to produce their ‘own work’, meaning that they may slowly lose the ability and skills to create this work themselves (1). In some cases, this reliance could develop into an addiction.
Another application – similar to the AI-generated summaries that now appear alongside many Google searches at the time of writing – is ChatGPT’s ability to gather information and answers from many different sources and present them as a single, summarised response to a query.
This is a useful feature for many individuals, as it provides summaries and related results for a large number of search terms, often making research quicker and easier.
However, one drawback of this use is that the answers are still generated by AI. This means that not all of the information provided will be 100% accurate, as ChatGPT draws on sources of varying reliability.
For example, disreputable or ‘joke’ sources may be included, meaning that accurate information can be displayed alongside completely false or misleading information.
Without conducting further research manually, individuals cannot know for sure whether this application of ChatGPT is reliable enough for their needs.
The final use of ChatGPT discussed in this article is social communication and conversation – an increasingly popular part of its feature set.
Although ChatGPT was not designed primarily as a social companion, this does not stop individuals from engaging with it in this way – in many cases, quite the opposite.
For example, individuals may use ChatGPT to have everyday conversations or to speak about a specific topic by typing or using speech-to-text features.
Through this use, individuals can have conversations with a computer – something which, again, has both strong benefits and strong drawbacks.
On the positive side, this way of communicating can be gratifying for individuals who do not have the means, confidence, or ability to speak with other people, as they may still be able to access ChatGPT and socialise in this manner.
However, if this becomes an individual’s only form of communication, it can also develop into an addiction because of the rewards the individual draws from it.
An addiction is the continued engagement in a behaviour despite the negative consequences it may have.
Though addiction is generally thought of in the context of substances such as alcohol and other drugs, there are also behavioural addictions, which could include communicating with AI.
When an individual is struggling with addiction, they may struggle to function in their everyday life and responsibilities without engaging in the behaviour that they are addicted to.
This means that an individual may feel as though they need to communicate with ChatGPT on a regular basis in order to maintain their lifestyle.
Though this may seem extreme, there are many reasons why an individual may develop an addiction, and when a behaviour becomes a coping mechanism it is likely to lead to further issues.
As previously mentioned, some individuals may feel unable to communicate with other people, meaning that they turn to other forms of communication and socialisation in order to meet this need.
If an individual engages with ChatGPT as a form of socialisation over a long period of time, it may become ingrained in their schedule and everyday behaviours, becoming part of how they function without them realising it.
Without this outlet for socialisation, these individuals may become agitated, aggressive, or withdrawn – feelings which are often only relieved by returning to the behaviour, or addressed through therapy.
With any addiction, it is always recommended that individuals attend rehabilitation – the process through which they learn to cope with the effects of their addiction and, in many cases, overcome it.
In general, there are three stages to rehabilitation: detoxification, rehabilitation/therapy, and aftercare, though this may look slightly different in long-term recovery (2).
Detoxification involves withdrawing from the addictive behaviour. Individuals will be expected to stop communicating with ChatGPT completely or, in some cases, to greatly reduce how often they engage in this behaviour.
Rehabilitation/therapy is the stage in which individuals take part in different treatments focusing on their mental health and how it has been affected by their addiction. These courses of treatment will differ for every individual and every case.
Aftercare refers to any further support that individuals may receive after completing the bulk of their treatment in rehabilitation. This can be as interactive or as independent as the individual requires.
To discuss any issues relating to addiction – both drug and behavioural – please get in touch with Rehab 4 Addiction today through our addiction support hotline on 0800 140 4690.
(1) Yankouskaya, A., Liebherr, M. and Ali, R., 2024. ChatGPT addiction: From support to dependence in AI large language models. Available at SSRN 4972612.
(2) Martinelli, T.F., Nagelhout, G.E., Bellaert, L., Best, D., Vanderplasschen, W. and van de Mheen, D., 2020. Comparing three stages of addiction recovery: Long-term recovery and its relation to housing problems, crime, occupation situation, and substance use. Drugs: Education, Prevention and Policy, 27(5), pp.387–396.