The programme's areas of interest include Large Language Models like ChatGPT (Annabel Davis for Varsity)

The University of Cambridge is running “AI clinics” for academics and students in a bid to enable a “new wave of scientific discovery” through the use of AI tools.

The clinics, which were recently introduced to the University’s West Hub site, are designed to assist academics and students who have “thought about using AI” in their research but “don’t know where to get started,” as well as those looking to use AI to solve “engineering problems”.

This comes as the University is attempting to explore the potential uses of AI in academia through its Accelerate Programme for Scientific Discovery, which has trained over 2,000 researchers on how best to use AI in academic work.

The programme’s areas of interest include Large Language Models like ChatGPT, which have been described as holding “remarkable possibilities for accelerating scientific discovery” when used responsibly by students and academics.

However, not all Cambridge departments have welcomed AI, with the HSPS faculty reverting to in-person exams last year after “too many” students used ChatGPT on their online assessments.

In March 2024, the HSPS faculty also issued an open letter urging students not to use generative AI, warning it could “rob you of the opportunity to learn”. They emphasised that presenting AI-generated text as one’s own would constitute academic misconduct.

This coincided with the University introducing an “AI” category for academic misconduct, after three instances of malpractice were linked to the use of artificial intelligence for the first time last year.

Commenting on these changes, a University spokesperson stated that their academic guidelines “stress that students must be the authors of their own work” and emphasised “strict guidelines” around AI. These comments came two years after Cambridge’s pro-vice-chancellor for education, Bhaskar Vira, stated that a ChatGPT ban was not “sensible” because “we have to recognise that this is a new tool that is available.”

A 2023 Varsity survey found that nearly half of Cambridge students had used AI for university work, with almost a fifth using it for assessed tasks like coursework. The University prohibits the use of AI in assessed work, classifying it as academic misconduct, but guidance for non-assessed work varies by department.



Speaking to Varsity, Katie Light, Programme Manager of the Accelerate Programme for Scientific Discovery, stated: “The programme’s AI Clinic offers expert advice to Cambridge University postgraduate students and researchers across the University of Cambridge at all stages of the research pipeline, from ideation to model implementation. Our team of expert Machine Learning Engineers can be contacted through our website at any time or through in person AI café events.”

“These events are organised in collaboration with departments across the University to connect researchers with expertise from the clinic through face to face discussion. So far, the clinic has supported over 200 researchers from 40 departments and we have run 35 in person AI café sessions,” she continued.
