I'm not a chatbot – I promise! (Nature.com)

E. M. Wolkovich is an associate professor of forest and conservation sciences at the University of British Columbia in Vancouver, Canada.

Accusations of plagiarism, including alleged misuse of ChatGPT, should not be made lightly. Credit: Alexandre Rotenberg/Alamy
I have just been accused of scientific fraud. Not data fraud — no one accused me of fabricating or misleadingly manipulating data or results. This, I suppose, is a relief because my laboratory, which studies how global change reshapes ecological communities, works hard to ensure that data are transparent and sharable, and that our work is reproducible. Instead, I was accused of writing fraud: passing off ‘writing’ produced by artificial intelligence (AI) as my own. That hurts, because — like many people — I find writing a paper to be a somewhat painful process. I read books on how to write — both to be comforted by how much these books stress that writing is generally slow and difficult, and to find ways to improve. My current strategy involves willing myself to write and creating several outlines before the first draft, which is followed by writing and a lot of revising. I always suggest this approach to my students, although I know it is not easy, because I think it’s important that scientists try to communicate well.

Imagine my surprise when I received reviews on a submitted paper declaring that it was the work of ChatGPT. One reviewer wrote that it was “obviously ChatGPT”, and the handling editor vaguely agreed, saying that they found “the writing style unusual”. Surprise was just one emotion I experienced; I also felt shock, dismay and a flood of confusion and alarm. Given how much work I put into writing, it was a blow to be accused of being a chatbot — especially without any evidence.
In reality, I hadn’t written a word of the manuscript using ChatGPT. I quickly brainstormed how I might prove my case. Because I write in plain-text files (using the typesetting language LaTeX) that I track using the version-control system Git, I could show my text change history on GitHub (with commit messages including “finally writing!” and “Another 25 mins of writing progress!” that I never thought I would share). I could also try to compare the writing style of my pre-ChatGPT papers with that of my submission.
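A Git history of this kind can serve as a dated paper trail. The sketch below (with illustrative paths, file names and commit messages, not the author's actual repository) shows how each commit to a plain-text manuscript records an author timestamp, so `git log` can later display incremental writing progress:

```shell
# Sketch only: a hypothetical manuscript repository to illustrate
# how commit history documents writing progress over time.
mkdir -p /tmp/ms-demo && cd /tmp/ms-demo
git init -q
git config user.email "author@example.com"
git config user.name "Author"

# First burst of writing, committed with a timestamped message.
echo '\documentclass{article}' > manuscript.tex
git add manuscript.tex
git commit -q -m "finally writing!"

# A later editing session becomes a second dated commit.
echo 'More revised text.' >> manuscript.tex
git commit -q -a -m "Another 25 mins of writing progress!"

# The log shows a dated trail of small, human-paced edits.
git log --pretty='%ad %s' --date=short
```

Pushing such a repository to GitHub additionally gives the history a third-party timestamp, which is what makes it usable as evidence.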
Maybe I could ask ChatGPT itself whether it thought it had written my paper. But then I realized I would be spending my time trying to prove that I am not a chatbot — which seemed a bad outcome to the whole situation. What I really wanted to do was pick up my ball and march off the playground in a fury. How dare they? But first, I decided to get some perspectives from researchers who work on data fraud, co-authors on the paper and other colleagues. Most agreed with my alarm. One put it most succinctly: “All scientific criticism is admissible, but this is a different matter.”
These reviews exposed something inherently broken about the peer-review process and, more importantly to me, showed how AI could corrupt science without even trying.
E. M. Wolkovich was accused of passing off text generated by ChatGPT as her own work. Credit: T. J. Davies
People worry about AI gaining control over humanity, its potential to supercharge misinformation and how it might help to perpetuate insidious bias and inequality. Some are trying to create safeguards to prevent this. Communities are also trying to build AI that helps where it can, perhaps including as a writing aid. But, as my experience shows, ChatGPT corrupted the whole process simply by existing. I was at once annoyed at being mistaken for a chatbot and horrified that reviewers and editors were so blasé about the idea that someone had submitted AI-generated text.
So much of science is built on trust and faith in the ethics and integrity of our colleagues. We mostly trust that others do not fabricate their data, and I trust that people do not (yet) write their papers or grants using large language models without disclosing it. I wouldn’t accuse someone of data fraud or statistical manipulation without evidence, but a reviewer apparently felt no such qualms when accusing me. Perhaps they didn’t intend this to be a harsh accusation, and the editor thought nothing of passing along and echoing their comments — but they had effectively accused me of lying by deliberately presenting AI-generated text as my own. They also felt confident that they could discern my writing from that of an AI tool — but they obviously couldn’t.

We need to be able to call out fraud and misconduct in science. In my view, the costs to people who call out data fraud are too high, and the consequences for committing fraud are too low. But I worry about a world in which a reviewer can casually level an accusation of fraud, and the journal's editors simply pass the review along and invite a resubmission. It suggests not only that reviewers and editors have no faith in the scientific integrity of the submitting authors, but also that ethics are negotiable. Such a world seems easy for ChatGPT to corrupt without even trying — unless we raise our standards.
Scientific societies can start by having conversations during their meetings and conferences with the goal of more explicit, community-generated standards about when and how AI can be used in the manuscript-writing process, and how that help should be acknowledged. Such standards could help editors to develop better processes for handling accusations of AI-generated text, ideally in a way that is less demoralizing for authors.
As for me, I now plan to use Git and GitHub for all my writing from day one, and to document changes every day. It’s not an ironclad system, but it has given me some peace of mind — not to mention, a paper trail that clearly shows a manuscript written slowly and painstakingly, and without ChatGPT.
doi: https://doi.org/10.1038/d41586-024-00349-5
This is an article from the Nature Careers Community, a place for Nature readers to share their professional experiences and advice. Guest posts are encouraged.
Nature (Nature) ISSN 1476-4687 (online) ISSN 0028-0836 (print)
© 2024 Springer Nature Limited