Australian researchers use ChatGPT to assess grant applications

Occurred: June 2023


The use of ChatGPT by peer reviewers to assess grant applications at the Australian Research Council (ARC) prompted warnings about academic misconduct and breaches of confidentiality, and calls for greater transparency.

Researchers reported that some assessor feedback provided as part of the ARC's latest Discovery Projects funding round included generic wording suggesting it may have been written by artificial intelligence. One assessor report included the words 'Regenerate response' – text that appears as a button label in the ChatGPT interface.

The finding prompted affected researchers to call for greater transparency in the ARC's grant review process. It also led the ARC to remind peer reviewers of the confidentiality of the review process and to warn them of the security risks of using AI chatbots.

ARC_Tracker, an unofficial tracker of ARC grant outcomes, argued that the use of ChatGPT and similar services was likely driven by academics' unmanageable workloads and the ARC's delay in releasing a policy on AI use.

Databank

Operator: Australian Research Council
Developer: OpenAI
Country: Australia
Sector: Govt - research
Purpose: Assess grant applications
Technology: Chatbot; NLP/text analysis; Neural network; Deep learning; Machine learning; Reinforcement learning
Issue: Confidentiality; Security
Transparency: Governance