AI tools can both transform and produce content such as text, images and music, and they are increasingly available as online services. One example is ChatGPT, a tool you can ask questions and receive well-informed, logically reasoned answers from – answers the tool can revise if you point out errors and ambiguities. You can interact with it almost as if you were conversing with a human.
Artificial intelligence tools, such as large language models (LLMs) like ChatGPT, are becoming increasingly powerful and useful for researchers, in ways that would have been considered science fiction in 2020. But they are not inherently ethical or responsible. Research institutions have a key role in providing professional development to their entire research communities about the proper use of such technology. This needs to include very early career researchers (such as PhD candidates), early career researchers and more experienced researchers alike. Using such technology is not automatically misconduct, but using it without disclosure and care could easily become a breach of research integrity, or potentially misconduct.
The challenge in education and research is thus to learn to use these AI tools with academic integrity. Using AI tools is not automatically cheating. Seven participants in the European Network for Academic Integrity (ENAI), including Sonja Bjelobaba at CRB, discuss the challenge in an editorial in the International Journal for Educational Integrity. Above all, the authors summarize ENAI's tentative recommendations on the ethical use of AI in academia.
An overarching aim of the recommendations is to integrate guidance on AI with other related recommendations on academic integrity. Thus, all persons, sources and tools that influenced ideas or generated content must be clearly acknowledged – including the use of AI tools. Appropriate use of tools that affect only the form of the text (such as proofreading tools, spelling checkers and thesauruses) is generally acceptable. Furthermore, an AI tool cannot be listed as a co-author of a publication, as the tool cannot take responsibility for the content.