Guidelines on the Use of Artificial Intelligence (AI) Tools

Artificial intelligence and its developments, in the so-called large language models (LLMs), machine learning, generative automation, and chatbots such as ChatGPT (and derivatives such as Copilot and Grok), Google Gemini, and Meta AI, have spread socially and affected scientific practices. One example is the integration of the Copilot engine into the text-editing software Microsoft Word (in its most recent version), which generates automated summaries and suggestions or modifies writing directly in the text during the typing process.

In this context, and supported by studies and recommendations from the Committee on Publication Ethics (COPE), Journal Alterjor establishes the following guidelines on the use of AI:


1. AI tools can be an auxiliary resource for the development of texts and scientific research. However, they have limitations, biases, and possible problems of objectivity and accuracy. Therefore, their use in the preparation of manuscripts must be considered, critical, and responsible: for example, authors should verify the validity of the information and references these tools provide, and check whether the queried data contains plagiarism.

2. For the sake of scientific transparency, the use of artificial intelligence tools must be clearly indicated in the submitted manuscript, with a description of the type of AI used and the justification and purpose of its use, preferably in the methodological section of the article. As with other research tools, articles may include reflections on the advantages and limitations of the resource in its different uses, such as generating initial ideas, collecting data, and preparing tables and figures, among others.

3. The authorship of scientific works and articles corresponds to the full development of an intellectual creation, which carries moral and legal responsibilities. Therefore, only human beings can assume this role, and generative automation and chatbot applications must NOT be listed or included as authors or co-authors of works submitted to the journal. If a submission is found to imitate the writing style of authors who did not participate in the manuscript, it will be automatically rejected for authorial falsification through style mimicry.

4. The evaluation of scientific works is also a human task, one that involves the confidentiality of the relationship between the journal, authors, and reviewers. Accordingly, the journal's editors and reviewers must not use artificial intelligence tools during the evaluation stages of works submitted to Journal Alterjor.

5. Journal Alterjor reserves the right to use AI-detection software (which identifies writing generated by artificial intelligence), such as Clarivate, at any time to verify and safeguard the originality of submitted manuscripts.