The lovers and loathers of ChatGPT: Academics' opinions

ChatGPT – the artificial intelligence chatbot that has rocked the professional and academic landscape. Despite its many professional uses, ChatGPT has found its biggest audience among students. Maybe you’ve used it to complete last-minute assignments or asked the software questions about your degree. But have you ever wondered what your tutors think of ChatGPT?

We at Cybernews Academy wanted to know how ChatGPT is perceived by academics. If you’ve ever wondered whether your teachers know if you’re using the chatbot, these answers might surprise you.

We sat down with four academics to discuss their opinions of ChatGPT and how it has affected the current academic landscape.

A threat to convention?

In our previous article, we discussed the nature of ChatGPT with students. Most of these students noted that although GPT-3 models create powerful pieces of text, this software tends to leave students feeling misguided. This notion of misguidance is one of the concerns lecturers have when implementing ChatGPT into an academic environment, alongside the potential loss of academic integrity and the impact that ChatGPT may have on soft skills like critical thinking and creativity.

Dr. Peter Boxall said AI will "definitely change conventional learning methods." Dr. Boxall reluctantly accepts ChatGPT's presence within academia. When asked if AI should be eradicated, he said, "I think we're going to have to learn to live with it." Dr. Boxall's discipline requires free-thinking analysis of texts and welcomes impassioned ideas surrounding literature. Clearly, ChatGPT poses a potential threat to Dr. Boxall's subject, as he asked: "How do you defend the line between your thinking and the language being machine produced?" Some educators aren't thrilled about AI becoming a part of the education system. However, some lecturers are beginning to welcome the idea into their classrooms.

Dr. Antony Aumann doesn’t view ChatGPT as a foe; he regards artificial intelligence as a friend. He suggested that ChatGPT does exactly what academic mentors do – provide feedback, which we incorporate into our writing. He said that we don’t tell students to sit in a room alone and not talk to anyone. We are natural communicators, so we want to receive feedback on our work and talk about interesting topics. Dr. Aumann suggests that ChatGPT is no different from an academic mentor or a friend. But even if academic mentorship could be automated in this way, would it also promote plagiarism?

Is it plagiarism?

So, we’ve all been wondering whether university lecturers regard using ChatGPT as plagiarism. Dr. Antony Aumann, despite his acceptance of large language models, expressed that “some ways of using ChatGPT are just plagiarism. If the student uses it in a foolish way where they just tell it the prompt and cut and paste the answer,” then this is a poor use of the program and would be viewed as malpractice. Dr. Peter Boxall, although he has not encountered plagiarism at the hands of ChatGPT, agrees that using AI to generate or enhance your work is a form of malpractice. Professor Mitali Halder agrees: ChatGPT shouldn’t be writing your entire paper or completing your assignments for you.

However, lecturers feel differently about using AI to support their students' learning process. Dr. Antony Aumann expressed that using ChatGPT to help you learn is not plagiarism. Asking the software questions, asking for feedback, and polishing rough drafts are OK if you use ChatGPT modestly. Similarly, Professor Mitali Halder explained that using ChatGPT as a form of mentorship is a legitimate use. Although students shouldn't be entirely dependent on ChatGPT, they should know what "machine learning" is and what AI learning tools are. She added that students can use the tool, but their creativity and skill should ultimately be their own.

Catching AI

Dr. Antony Aumann told us that reports of plagiarism have risen at NMU as more and more students use AI to complete coursework. We spoke to lecturers and asked them whether they’ve caught their students using ChatGPT and how they dealt with this problem.

Dr. Olivia Ramsbottom commented on her initial thoughts surrounding the software. “My first reaction was fear,” she said. “I thought students would use this to create work, and it will be potentially tough for us to spot whether that work is their work.” Soon, her fears were realized as she identified an instance of plagiarism executed by ChatGPT. She told us that some of her students submitted work that was “too good to be true.” This was “the first prompt” that forced her to investigate the incident. The occurrence kickstarted her exploration of AI tools to determine what was available to her students. According to Dr. Ramsbottom, one telltale sign of ChatGPT is poor referencing. “They are often incorrect in these pieces of work as they are created from several sources.” These sources may be credible in their original form, but instead, they are combined to create “hybrid references.” Dr. Ramsbottom said, “If there are references from students’ work that are artificially created, you can find out whether they are real sources.” She identified that several sources within the student's assignment weren’t real. Dr. Ramsbottom acknowledged that the work submitted wasn’t valid and then went through her university's “very robust” misconduct procedure.

In a recent interview with Dr. Antony Aumann, we discussed how he caught a student handing in a paper that ChatGPT had written. Dr. Aumann had received a suspicious essay, and he decided to do something brilliant: he submitted the paper to ChatGPT itself. He told us: "I submitted the essay to ChatGPT and said, did you write this?" The chatbot responded: "There's a 99.9% chance it was written by me." ChatGPT confirmed Dr. Aumann's suspicions, and then he did something even more remarkable. "I'm going to send them something written by the chat." He sent a scolding email to the student written by the very thing that had executed the malpractice, attached the chatbot's report, and asked, "Hey, what's going on with this? And, of course, the student fessed up." Then Dr. Aumann faced a difficult decision: "Do you flunk them? Report them?" Each university has different procedures when it comes to plagiarism; it is usually regarded as malpractice, which may result in failure of the module or course. However, Dr. Aumann "just had the student rewrite the paper and said this time, don't be naughty." These are some instances where ChatGPT wasn't used appropriately. However, you can use ChatGPT to your advantage in many ways.

AI as an asset

Despite its limitations, there are a multitude of positive ways ChatGPT could influence teaching and learning. Instead of eradicating or changing traditional learning methods, Professor Mitali Halder believes that AI will be used to augment conventional modes of teaching. AI has the potential to enhance mentorship, increase productivity, and improve students' pre-existing competencies. Dr. Ramsbottom believes that ChatGPT could really open up mentorship opportunities. She said she could “see some possibilities there.”

Additionally, AI may help us complete work that could otherwise be automated. For example, Dr. Boxall thinks that ChatGPT could be used “to write an abstract, perfectly legitimately” or “you could use it to do very intelligent indexes.” Due to its robotic nature, ChatGPT can effectively write rigid texts. But when it comes to academic writing, lecturers will know whether it’s been written by a robot. Despite this claim, Professor Mitali Halder says ChatGPT is “one of the best tools for text generation.” Due to their language proficiency, AI writing tools can be a brilliant option for those struggling to formulate their ideas into fully realized sentences. Dr. Antony Aumann suggested that those who struggle with learning difficulties or are non-native English speakers may benefit from using AI to enhance their writing skills. Despite all these excellent uses for ChatGPT, the software still has its limits.

The limits of ChatGPT

Dr. Boxall expressed how traditional writing and teaching styles are more impactful than the stylings of AI. One example Dr. Boxall recalled was his first encounter with ChatGPT on a UK radio show. He told us that the Shakespearean actor Charles Dance was asked to read two sonnets – one written by Shakespeare and one by ChatGPT. Dr. Boxall remembers the actor reading the ChatGPT poem with "great flourishes." Yet that still wasn't enough to make the poem come to life. Dance then read the original Shakespearean sonnet afterward, and "the effects were electric." Dr. Boxall told us, "It made you realize how good the Shakespearean sonnet was and how impossible it is to imitate." He said, "It had the right rhyme sequence, and it was in iambic pentameter, so it was a sonnet. However, it was totally empty." ChatGPT may be good at sourcing information and structuring rigid texts, but it can never replace language containing emotion and meaning. The same goes for academic writing, as Dr. Boxall mentioned that ChatGPT "can't write good essays yet. It could probably write an undergraduate essay that would get a low third." ChatGPT can replicate the structure of a Shakespearean sonnet, but it can never replicate individuality. That's one reason why doing your own work is so important. Teachers acknowledge your imagination and the skills you honed at university – so use them to your advantage.

One concern Dr. Ramsbottom raised surrounding ChatGPT was the lack of knowledge educators have regarding the software. As educators, we have a responsibility to safeguard and protect students, she explained. So, it's difficult to safeguard against something that we don't yet fully understand. It's difficult to say whether ChatGPT is safe, as we don't know what it's capable of yet. Furthermore, she raised the issue of academic validity, suggesting that ChatGPT could potentially threaten students' academic success.

This idea of academic validity feeds into another fear: the loss of academic integrity. When asked if our lecturers had questioned the authenticity of their students’ work, surprisingly, many said they hadn’t. However, with tools like ChatGPT, more lecturers may suspect their students of cheating. This could create a poor work environment or promote poor relationships between students and teachers. Dr. Aumann commented on this idea – he said students might argue that they write like robots. Unless universities are using AI checkers, it might be difficult to differentiate the students who are using AI from those who aren’t, forging strained relationships between teacher and student.

The verdict

So, did the results surprise you? Not only do many lecturers appreciate AI, but they would also consider implementing the technology into their curriculums. AI writing tools are harder to welcome in disciplines like English, as they may promote plagiarism. However, lecturers in other fields believe AI can augment and enhance conventional teaching methods if used responsibly. This way, more students may begin actively engaging in classroom discussions and activities with the help of ChatGPT and other AI language models. Although some lecturers disagree with ChatGPT, most of the lecturers we spoke to believe that implementing ChatGPT and other artificial intelligence models into academia is our ticket to educational advancement.