
As student cheating soars, some UFV instructors crack down—while others hand out top marks

'We clamp down on one or two cases in an effort to create a deterrence, but to little avail I think,' an instructor writes

The University of the Fraser Valley has seen a surge in cheating cases in the past year. 📷 Grace Kennedy

This story first appeared in the December 19, 2024, edition of the Fraser Valley Current newsletter. Subscribe for free to get Fraser Valley news in your email every weekday morning.

It’s never been easier to cheat thanks to artificial intelligence.

But whether a university student gets away with using AI to write a paper may have less to do with the technology and more to do with their professor’s willingness to upset their pupils.

The University of the Fraser Valley recorded a huge surge in academic misconduct in the past year, but the hundreds of recorded cheating cases may be just the tip of the iceberg. Instructors say the university has largely left professors on their own to pursue AI cheaters. And in some cases, policies may discourage instructors from punishing academic misdeeds.

A surge in cheating

University of the Fraser Valley administrators investigated 407 academic misconduct cases in the 2023/2024 school year. Those cases are up 54% from the previous year, mirroring increases at post-secondary institutions across the continent. Even so, the figures may still obscure students’ rapid turn to artificial intelligence as their preferred way to try to cheat.

Historically, most academic fraud cases reported in UFV’s annual misconduct roundup have been counted as “plagiarism.” But last year, plagiarism accounted for only about one-third of all cases, while incidents broadly defined as “cheating” more than doubled. (Those two categories are, by far, the largest types of academic fraud recorded in the report.) The figures suggest that many would-be cheats have turned from copying others’ work—once the favourite way to trick a teacher into handing out a good grade—to asking artificial intelligence to write their papers for them.

The report says that while UFV hasn’t historically tracked artificial intelligence-related cheating, instructors mentioned AI in 90 different written disciplinary reports in the 2023/24 year. Two-thirds of all misconduct findings were in the university’s arts department.

But the use of artificial intelligence in universities is not only challenging instructors’ ability to detect it; it is also raising serious questions about government policies, institutional passivity, the economics of teaching, and the future of education at all levels.

A multi-pronged threat

“Cheating has always been possible, but it is much easier now,” Linda Kaastra, a sessional instructor at UFV, wrote in an email to The Current.

UFV’s academic misconduct report and the overall rise in misconduct bear that out, suggesting that artificial intelligence has changed both the nature of cheating and students’ willingness to deceive their instructors in the first place. (Another explanation for the rise in misconduct cases could be that cheaters who use AI are easier to detect than traditional plagiarists.)

Kaastra, who teaches a first-year English course at UFV, says education has reached a “critical moment.” Although some have promoted the usefulness of artificial intelligence in academic settings, Kaastra suggested that doing so overlooks major flaws in generative artificial intelligence, which assembles collections of words that may sound good but could be wildly and obviously wrong.

“AI does not think,” Kaastra said. “It is an algorithm. It is fake. Its products are distortions.”

Kaastra said she has a zero-tolerance policy for AI tools, and said she increasingly demands students centre themselves in their work to prove that they—and not a machine—are the author of what they are producing.

“It has always been hard to produce good scholarly writing, but now it is just a bit harder because each student has to prove the authenticity of their work—to demonstrate their thought process in writing and to develop their own scholarly voice,” she said.

But it is largely up to UFV’s faculty to determine just how artificial intelligence is—or isn’t—used in academic settings.

The university recently adopted a new policy on student academic misconduct. But although it defines and outlaws plagiarism and cheating, it doesn’t mention the use of artificial intelligence at all. The improper use of technology is only mentioned in reference to examinations. Nowhere does the policy explicitly outlaw the use of artificial intelligence to produce written work. (You can read the policy here.)

That’s a problem, according to another university professor.

The instructor—who we will call Instructor A because they spoke on the condition they not be named, citing the delicacy of the issue and cost-cutting at the university—told The Current that by not specifically laying out ground rules for the use of AI, the policy allows educators to ignore the abuse of the technology, if they choose to do so. That, they said, is a problem because other university policies encourage faculty to avoid displeasing students—even those who might cheat.

“If you’re well-known for going after AI and for being rigorous, then you’re less popular,” they said. And popularity matters because UFV has told teachers that if too few students sign up for their classes, they may have to teach additional classes.

Some university professors have handed out top grades to students who clearly use AI, one instructor says. 📷 Grace Kennedy

“We don’t control enrolment so unfortunately, what it’s becoming right now, is a popularity contest,” said the instructor.

A second instructor (we’ll call them Instructor B) who also didn’t want to be named because of potential ramifications, agreed with Instructor A’s concerns.

“I think the cheating issue is real—instructors may go easy on cheating to keep their numbers up,” Instructor B said.

They said faculty are fighting an uphill battle against artificial intelligence. Detecting, and especially proving, that a student is using artificial intelligence is difficult, they said, and the practice is so widespread that it is hard to root out completely.

“Most of us go easy on cheating because it is now too much work to prosecute cases,” they said. “We clamp down on one or two cases in an effort to create a deterrence, but to little avail I think.”

(The quotas aren’t just affecting instructors’ willingness to crack down on students. Both instructors also independently said that the enforcement of student quotas is deterring university professors from teaching more upper-level courses with lower enrolments.)

Although the university would not make anyone available for an interview, UFV vice-president of academics James Mandigo wrote in an email that ignoring academic misconduct “would not align with the institution's values of integrity and excellence, nor with faculty standards established by Senate. More importantly, UFV faculty are dedicated, objective and conscientious academics, who take the grading process very seriously, without linkage to personal circumstances.”

But Instructor A says some faculty are handing good grades to obvious cheaters.

“We have students at the first year who cannot write in English—like what comes out, no reasonable person can understand what this means—and I’ve seen those students get As in university arts classes,” they said.

Some of those using AI are international students who cannot carry on a conversation without using Google Translate, the instructor said. They may have come to Canada carrying the hopes of their families, only to find themselves relying on AI to try to pass classes. (The instructor noted that international students also account for some of their best students.) Other cheaters were raised locally, where the instructor said students are increasingly being encouraged to use AI by peers, technology companies and educators themselves.

Their use of AI, Instructor A said, “is literally the same as us reaching for our calculator in our day-to-day activities.”

The instructor said students, in general, are entering university with writing abilities that are markedly inferior to those of their peers a decade ago.

So the students turn to AI. And it’s up to faculty to police its use.

Detecting the use of artificial intelligence doesn’t necessarily take a background in computer science, given how blatantly some students copy and paste AI-generated content into their own work. Instructors can use a variety of AI detectors to sniff out computer-written material, but many instead ask ChatGPT to complete the assignment themselves, then compare students’ writing to the AI program’s output. Sometimes the students’ work comes complete with the bullet-point formatting preferred by the AI program. Often the language itself is clearly computer-generated and different from the student’s own voice. And not infrequently, the information is riddled with erroneous facts and details.

Indeed, Instructor A now tries to warn students about the problems with using AI.

“I’ll use ChatGPT to ask some questions about current social policy or current events, and because it’s a language model, it always will come up with lies,” the instructor said. “So I basically show the students ‘This is lying to you.’”

Kaastra expressed a similar concern.

“GenAI can produce text that approximates a summary of an article, but the summary is full of errors—the words in that summary do not reflect insight or knowledge on the basis of the original text,” she wrote. “The text is empty of meaning or knowledge.”

But although she wrote that students being encouraged to use AI “are being experimented on,” Kaastra was more positive than Instructor A about the long-term impacts on her own students. She says she now requires students to prove they are not using AI by demonstrating their thought processes in their own voice.

“In my opinion (and we need another five years to find out if this will be true), the students coming up through college now have an opportunity to place themselves at the centre of their research and write pieces no one else could produce,” she wrote. “That is raising the bar for them, to be sure, but from what I see in my classes, many students in this generation are up to the task.”

Others are more pessimistic—both about AI’s impact on students, and its effect on the university as an institution of higher learning.

Instructor A said those who are cheating, even if they are not caught, are also being primed for failure.

“Ultimately the whole point of a university degree is to get a good paying job, but if you get there and you can’t do it because you didn’t learn the skills…it’ll become apparent pretty quick,” they said. “We’re not giving them the skills and then are wondering why they cheat.”

Instructor A said the scale of cheating—and the lacklustre institutional efforts to police it—will affect the perceived value of a university education in the labour market. They said UFV’s lack of a policy dealing with artificial intelligence underscores a lack of leadership and commitment to the principles that sit at the core of post-secondary education.

The minutes of an October UFV board meeting reveal that UFV has created a taskforce led by its Chief Information Officer that will consider the use of AI. The minutes say that “Principles developed by the taskforce were brought to [the UFV Senate] that outlined ways in which UFV faculty, staff and administration can think about use of AI and how we embrace and utilize it.”

But Instructor A says UFV hasn’t yet confronted the danger posed by the rampant use of AI.

“A lot of this comes down to, what’s the moral obligation of a university? That’s kind of lost.”

“It’s almost like we’re cutting off our nose to spite our face,” they continued. “Like at the current moment, it’s much easier just to pump out degrees than deal with things like academic integrity, because it fundamentally rocks the core of what university does. At this point, there doesn’t seem to be the leadership or will to do those things; it’s just easier to stick our head in the sand.”

We couldn’t interview UFV officials about the use of AI by cheaters, but we did ask ChatGPT itself about university cheating. FVC members can read that in our weekend edition. Become a member here.
