Cambridge prohibits the use of AI in assessed work, classifying it as academic misconduct. Image: Marketcomlabo via Wikimedia Commons (https://commons.wikimedia.org/wiki/File:Image-chatgpt.webp), licensed under CC BY-SA 4.0 (https://creativecommons.org/licenses/by-sa/4.0/deed.en)

Cambridge students' use of ChatGPT has increased by 14 percentage points since 2023, with over 60% of students admitting to having used the AI chatbot to assist with university work in a new survey.

A Varsity survey found that 61.3% of Cambridge students had used AI tools to assist with academic work. The survey, which received over 330 responses from Cambridge students, also found that 29.7% of students have used the artificial intelligence (AI) chatbot to help with assessed work, such as coursework or exams.

This marks a significant increase from an investigation conducted two years ago, which found that 47.3% of students had used ChatGPT for academic purposes, and nearly one in five had used it in assessed work.

Cambridge students are prohibited under University guidelines from submitting unacknowledged AI-generated content as assessed work. Attempting to pass AI-generated content off as their own work is considered to be academic misconduct, unless the assessment brief states otherwise. Guidance for non-assessed work varies by department.

The most commonly cited reason among students for using the tool was summarising texts, with 34.5% of respondents stating they had used the platform for this purpose. One student told Varsity: “Why would I read a 70 page article when ChatGPT can just sum it up for me in a paragraph or two?”

Other popular reasons for enlisting the help of ChatGPT included requesting explanations of how to tackle questions, problems, or equations (28.5%), revision (27.9%), and research (27.6%).

According to the results of the survey, students taking STEM subjects were more likely to have employed the software for university work (74.7%) than students reading humanities (52.3%). For assessed work, 46.8% of STEM students admitted to using ChatGPT, compared to only 18.8% of humanities students.

While 34.5% of all students surveyed reported never using ChatGPT to assist with university work, 21% admitted to using it multiple times a day. 14.1% said they use the tool a few times a week, with 12.6% using it less than a few times a month.

36% of STEM students revealed that they use ChatGPT multiple times a day. This figure was much lower, at 12.2%, for humanities students.

One student told Varsity: “I’m concerned by how widespread the use is. I think students have lost sight of the idea of learning how to think critically for yourself.” Another student remarked: “After dabbling with AI I realised I am much better off just doing the work and research myself.”

However, others have acknowledged the potential benefits AI tools like ChatGPT could bring.

One student said: “It is a brilliant tool for generating new ideas and offering alternative solutions when improvements to work are stagnating.” Another reflected: “It’s like Grammarly with extra steps.”

This study comes just months after Varsity reported that the University recorded its first-ever formal AI-related cases of academic misconduct in 2024.

Departments have responded to the software in various ways. Last year, the HSPS faculty announced that handwritten exams would replace online assessments for first and second-year Sociology and Anthropology students, citing a rise in AI use in exams.

By contrast, English students were told in 2023 that AI could assist with tasks such as “sketching a bibliography” or “early stages of the research process,” provided it was done under supervisor guidance.

One History student described having “a Black Mirror moment” when “a lecturer ‘simulated’ a convo [sic] between FDR [Franklin D Roosevelt] & Reagan in a lecture”.

Another student was blunt in their assessment of enforcement efforts: “The University are daft if they think they can ever police it properly. They’ll end up wasting a lot of money on detection software that’ll never be fully effective.”

A University spokesperson said: “A survey of 333 students represents around 1.4% of Cambridge’s student population. The University has strict guidelines on student conduct and academic integrity. A student using any unacknowledged content generated by artificial intelligence within a summative assessment, as though it is their own work, constitutes academic misconduct – unless explicitly stated otherwise in the assessment brief – to be dealt with under the University’s disciplinary procedures. Students are permitted to make appropriate use of AI tools to support their personal study, research, and formative work, but it is recommended they discuss this with supervisors or lecturers, to understand how best to engage with these tools.”