Reuters (Hu, 2023), the analysts note that it is hard to recall a faster growth rate for a consumer Internet application in the last 20 years.

Results

Let us consider the possibilities of using ChatGPT in the scientific and practical spheres of psychologists' activity. Note that if we were asked directly whether ChatGPT could replace the work of a psychologist, we would answer "no". ChatGPT, and similar chat tools that may be created in the future, are not capable of replacing the work of a "live" psychologist. This applies to both scientific and practical areas. Although ChatGPT and other AI tools can be used in academic and practitioner settings and can positively affect the effectiveness of that work, they cannot adequately replace a live expert in the field.

Let us consider some arguments that we believe strengthen this position.

First, we discuss using ChatGPT for psychological research.

When studying any psychological phenomenon or writing research papers, we face the problem of reviewing a large number of references. Today, it is actually unnecessary to visit libraries, as most books are digitized and modern academic periodicals are available on the Internet (in scientometric databases, repositories and search engines). ChatGPT certainly has access to much of this information, sometimes without our even realizing it. ChatGPT is also capable of selecting relevant literature on its own, reviewing it, and drawing conclusions from it. This is a great temptation for the researcher working under time constraints.

In this way, young researchers now have free and unrestricted access to information without having to analyze it independently. We did not have that in the not-so-distant past, when we had to spend all day in libraries, searching catalogs for the books we needed, ordering them one by one, and writing out citations for our dissertations.

However, the benefits that ChatBots appear to offer come at a price. A researcher who receives a "ready-made conclusion", even if ChatGPT has described the algorithm step by step, deprives himself of the opportunity to engage with the primary sources and to grasp how certain ideas, psychological concepts, theories and practices were formed. The researcher thus saves time on the one hand, but on the other hand forgoes the opportunity to develop intellectually, relying entirely on ChatBots for the validity of the literature analysis and the conclusions drawn from it. It is quite possible that such actions could jeopardize his professional reputation, and they certainly will not contribute to his personal growth.

We emphasize that borrowing other people's ideas is an unacceptable practice that can be equated to plagiarism, even if those ideas or conclusions are generated by artificial intelligence (AI).

In general, compliance with ethical principles and standards of academic integrity is an important aspect that may be underestimated, consciously or unconsciously, by a researcher using ChatBots.

It is important to note the statement of the Committee on Publication Ethics (COPE). On its website, the Committee has published its official position on authorship and the use of AI tools (COPE Council, 2021; COPE, 30 January 2023, 13 February 2023, 23 February 2023; Watson & Stiglic, 2023). A number of papers have also addressed the use of AI for scientific writing (Çalli & Çalli, 2022; Dans, 2019; Dimitriadou & Lanitis, 2023; Farahani, 2023; Singh & Sood, 2022).

The IJSA is a full member of COPE (COPE, n.d.). The Editorial Board recommends that researchers take the issues of quality and of the use of AI in manuscript writing and text citation seriously (Melnyk & Pypenko, 2021, 2023), especially since programs will soon be able to determine the extent to which AI-based ChatBots were involved in writing a research paper.

The development of programs capable of recognizing machine-generated text is evidenced by the statement of researcher Scott Aaronson of OpenAI (Hern, 2022). Turnitin has already begun work on developing an AI-based text detection tool (Chechitelli, 2023). So there is a good chance that anyone who tries to pass off text written by a ChatBot as their own will be detected.

We emphasize that the above concerns the inappropriate borrowing of ideas and text written by ChatGPT. It does not mean at all that we are strongly against the use of ChatGPT. On the contrary, the use of ChatGPT or other AI tools in a research paper is a sign that the researcher is able to use modern methods and tools correctly.

To conclude the discussion of the use of ChatGPT in psychological research, we would like to make one more argument related to the inability of AI to replace the work of a research psychologist.

We have reasonable grounds to believe that AI, even the latest modification of ChatGPT, will not be able to develop a fundamentally new psychological approach or theory, because ChatGPT's capabilities will always be conditioned by and connected to the human ability to ask questions.

At this stage of AI development, it is hard to see how ChatGPT, for example, could develop and describe something like a psychoanalytic approach, or how it could conduct psychoanalytic sessions with humans, even though most clients still naively believe that psychoanalysis is a fairly simple method of formulating questions with fairly limited involvement of the psychoanalyst.

Second, we discuss using ChatGPT for psychological practice.

Most people evaluate AI, like any digital product (Pypenko, 2019), from the consumer's point of view. For some people, technological innovation provokes technophobia, in some cases even paranoia. This has only gotten worse with the advent of AI and ChatBots. Such people tend to view a ChatBot as an evil agent spawned by computers and the Internet. In addition, there is a growing number of people with the exact opposite condition, nomophobia, which is related to the fear and despair of being disconnected from technological