From discrimination to vulnerabilities, the issues are as plentiful as the software is widely used.
During the coronavirus pandemic, the use of remote proctoring software has risen in universities and educational institutions worldwide. It’s designed to ensure that people don’t cheat, using their webcam feed to monitor how they act and what they do during exam conditions, and tracking their movement to make sure nothing nefarious goes on.
It’s one of the key bits of technology that were implemented into the education system as the pandemic pushed teaching and exams online – but it has serious flaws, according to a new paper by academics in the United States and Australia. The paper, which claims to be the first technical analysis of four widely used proctoring suites deployed by US law schools and in state attorney licensing exams, highlights major issues with the systems that the academics claim could disadvantage those using them.
The academics looked at four major software tools – Examplify, ILG Exam360, Exam4, and Electronic Blue Book – that are used in 93% of US law schools and 100% of remote state bar exams. By reverse engineering the suites, they found that each had “vulnerabilities of varying complexity that compromise the purportedly secure testing environments,” they write.
Poking holes in the defense
The academics probed and tested the suites from the perspective of three potential adversaries: a law student; a law student with computer science experience; and an experienced reverse engineer. Adversaries at each level, the authors discovered, would be capable of finding exploitable holes in the software.
That means users could theoretically cheat on their exams if they wanted to, according to the authors. The tools use various mechanisms to detect tampering, such as checking whether the computer, hard drive, network adapter, or BIOS vendor information contains the string ‘virtual’, or performing a rudimentary virtual CPU check, but each can be circumvented, nullifying the anti-cheating technology deployed by every software package.
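The kind of string-matching check the authors describe can be sketched roughly as follows. This is an illustrative toy, not code from any of the four products; the field names and vendor strings are hypothetical, chosen only to show why the heuristic is easy to spoof.

```python
# Naive VM-detection heuristic of the kind described in the paper:
# scan reported hardware vendor strings for the substring "virtual".

def looks_like_virtual_machine(hardware_info: dict) -> bool:
    """Return True if any reported hardware field contains 'virtual'.

    hardware_info maps component names to the vendor/model strings the
    operating system reports for them (hypothetical field names).
    """
    return any("virtual" in value.lower() for value in hardware_info.values())


# A VM reporting stock identifiers is caught by the check...
vm_fingerprint = {
    "system_manufacturer": "VMware, Inc.",
    "disk_model": "Virtual Disk",
    "bios_vendor": "VirtualBox",
}
print(looks_like_virtual_machine(vm_fingerprint))  # True

# ...but a VM configured to report physical-looking strings slips
# straight through, which is why such checks are easily bypassed.
spoofed_fingerprint = {
    "system_manufacturer": "Dell Inc.",
    "disk_model": "Samsung SSD 970 EVO",
    "bios_vendor": "Dell Inc.",
}
print(looks_like_virtual_machine(spoofed_fingerprint))  # False
```

Because the check only inspects self-reported strings, an adversary who controls the virtual machine's configuration can rename every field, so the heuristic offers little real protection.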
One other method that proctoring software uses to try and ensure cheating does not occur is to monitor who is taking the test using a computer’s webcam. That relies on facial recognition technology – which has long been highlighted as performing less well when dealing with the faces of people from ethnic minority backgrounds. That’s perpetuated here.
“We see significant variations in the performance of the different classifiers depending on the race of the subject being analysed,” the academics write. They allege that in some instances minority students may be prevented from taking the exam in the first place, or flagged to a human proctor for review at a higher rate than white students.
Perpetuating pre-existing issues
It all makes for sobering reading – and raises concerns that some people could be disadvantaged by the systems while others take advantage of them. “Where allowable, educators should design assessments that minimise the possible advantage to be gained by cheating,” the academics write. They add that “given the substantial fairness concerns with facial recognition systems, our primary recommendation is to avoid using them.”
Examplify, ILG Exam360 and Electronic Blue Book have all been approached for comment. Greg Sarab of Extegrity, which operates the Exam4 system, says: “AI remote proctoring was used on an emergency basis mainly on remote bar exams during the pandemic. It raised serious questions about fairness, equity and discrimination.”
Sarab adds: “Under conditions of constant technological change, the playing field must be kept as level as possible, but at the same time, institutions seek to build their processes around consistent, predictable systems.”