Generative AI usage policy

The journal Research and Educational Studies adheres to the recommendations of the World Association of Medical Editors (WAME) and Elsevier regarding the use of generative artificial intelligence (AI) in scholarly publications.

Generative AI tools (e.g., ChatGPT, Gemini, Bing AI) cannot be considered authors or co-authors of scientific articles, as they are unable to assume responsibility for the content of a publication. Authorship implies accountability for the accuracy of results, participation in the revision of the manuscript, and confirmation of the absence of plagiarism—responsibilities that can only be assumed by a human researcher.

The use of AI is permitted solely as an auxiliary tool, for example, to improve the language, style, or structure of a text. In all cases, the authors bear full responsibility for the content of the submitted material. The use of such tools must be explicitly disclosed in the manuscript (for instance, in the “Acknowledgements,” “Abstract,” or “Methods” section), specifying the name and version of the tool as well as the manner in which it was used.

The use of AI for the creation or editing of images, graphs, or other illustrative materials is not permitted, except in cases where such technologies constitute part of the research methodology. In such situations, their use must be described in detail in the “Methods” section.

Editors and reviewers must likewise disclose any use of AI tools in their work with manuscripts. Uploading the text of a manuscript to external services without the authors’ consent is considered a violation of the confidentiality of the editorial process. To safeguard academic integrity, the editorial board may employ specialized tools to detect materials created or modified using AI.