When U.S. law student Areeb Khan tried to sign into the online portal to take his practice bar exam, he was met with a strange message: “Due to poor lighting we are unable to identify your face.”
Additional lighting did not solve the issue. The 27-year-old even tried to sign in from the brightest room in his New York apartment – the bathroom.
Khan began to suspect that it was his dark skin tone that was tripping up Examplify, a test proctoring platform adopted by New York state’s law exams board during the COVID-19 pandemic. It took days of back and forth with customer service before he could sign in.
“There are so many systematic barriers preventing people like me from obtaining these degrees – and this is just another example of that,” he told the Thomson Reuters Foundation.
As COVID-19 restrictions force students to take remote exams, universities around the world are relying on proctoring software like Examplify. But many students are wary of the technology, citing concerns including mass data collection and bias in facial recognition.
“Students are already under tremendous pressure because of the global pandemic,” said Hye Jung Han, a researcher at advocacy group Human Rights Watch who specializes in technology and education.
“And now we have this invasive and unfair surveillance pushing the envelope, invading their private lives.”
Industry leaders maintain their platforms are a critical part of the infrastructure that allows students to continue learning.
“We believe that many lives have been positively impacted by being able to continue their education and careers,” said Nici Sandberg, spokeswoman for ExamSoft, which makes the Examplify platform.
“ExamSoft maintains a non-biased identification and exam delivery process to ensure that individuals of color are not disproportionately affected.”
More than 90% of countries have instituted some form of remote learning since the start of the pandemic, according to an August report by UNICEF.
This, in turn, has created a booming business for companies dealing in educational technology – or edtech – including firms that specialize in ensuring remote exams are free from cheating.
One firm, Proctorio, reported that it was proctoring more than five times as many exams this year as last year.
The remote proctoring industry offers a range of services, from basic video links that allow another human to observe students as they take exams, to algorithmic tools that use artificial intelligence (AI) to detect cheating.
But asking students to install software to monitor them during a test raises a host of fairness issues, experts say.
“There’s a big gulf between what this technology promises, and what it actually does on the ground,” said Audrey Watters, a researcher on the edtech industry who runs the website Hack Education.
“(They) assume everyone looks the same, takes tests the same way, and responds to stressful situations in the same way.”
Facial recognition systems – which some proctoring platforms use to confirm the identity of the test taker – are less accurate with dark-skinned people, noted Shea Swauger, a researcher who tracks the industry at the University of Colorado (CU) Denver.
And algorithms designed to detect suspicious movement will inevitably flag disabled students and others who do not move in the way the platforms expect, he added.
Students also are balking at allowing third-party software access to their devices, with some services requiring that students give them permission to read their computer files, monitor their keystrokes and analyze their biometrics.
“It’s not just (about) racial bias,” said Miguel Bishop, a member of the student senate at California State University in Chico, which uses the platform Proctorio for exams.
“There’s the unaccountable data collection and the damage to the student-teacher relationship,” he said.
Proctorio CEO Mike Olsen said in a phone interview that the platform is the most convenient way for schools to deal with cheating in the COVID era.
“Taking an exam in the comfort of your own home, on your own schedule, is less invasive,” he said.
Critics of the technology often misunderstand how it works, he added, emphasizing that the tools do not directly identify cheaters, but simply flag suspicious behavior to universities.
Students in various countries have been demanding their schools reconsider the use of remote proctoring software, with mixed results.
In September, the Supreme Court of India cancelled a remotely proctored admissions exam at the National Law School of India University, after a parent of a prospective student and a former university official filed a lawsuit.
They argued that the exam – which was slated to use AI to detect cheaters – was unfair to students with “lesser means and from marginalized areas” who might not have high speed internet or fast enough computers to run the exam.
Urvashi Aneja, founding director of policy and advocacy collective Tandem Research, which has been reviewing tech tools being used during the pandemic, said that “such software risks undermining students’ privacy and the agency of educators”.
“In addition, they are being introduced in a legal vacuum,” said Aneja, a speaker at the Thomson Reuters Foundation’s annual Trust Conference on Wednesday.
Students at the University of Queensland in Australia have been unsuccessfully petitioning the university to rein in what they consider to be the most invasive and potentially discriminatory aspects of ProctorU, a U.S. proctoring service the school uses.
“We’ve asked them to do things like in-house identity verification, rather than having (the platform) read our ID and do biometric analysis on our faces,” said Rowan Evans, a student representative at the school.
Jennifer Buckley, an engineering professor at the University of Delaware, decided not to use ProctorU after she heard from her students that proctors had interrupted remote exams and asked students to show they weren’t cheating.
“No thank you,” she said. “I’d rather my students not feel like they’re in a police state.”
She and her assistants use Zoom to proctor exams themselves.
ProctorU chief executive officer Scott McFarland said that “students should be reassured that all testing data is owned by their schools, not the proctoring provider.”
“Schools set the rules about what data is collected, how it’s retained and for how long,” he said in an emailed statement.
In the United States and Canada, students at dozens of universities have sent petitions criticizing exam proctoring technology, according to a tally by the Electronic Frontier Foundation, a digital rights non-profit.
Patrick Sullivan, a 19-year-old sophomore at the University of Massachusetts Lowell, created one such petition after a math professor asked him to install the remote proctoring service Respondus for an exam in September.
He said the software, which is designed to block access to certain functions on a computer while a student takes an exam, would gain access to files on his personal device. “Giving software this much access is playing with fire,” he said.
Jodi Feeney, the chief operating officer of Respondus, said that “our applications have also been closely analyzed by hundreds of customers along the way – university security teams, IT staff, students, third-party companies.”
After 1,200 students signed Sullivan’s petition demanding the University of Massachusetts Lowell ban the software, the university announced that professors could not mandate its use.
As the pandemic makes remote learning a long-term state of affairs for many schools, Swauger at CU Denver recommends that schools pause to evaluate the evidence and roll out only systems that are proven to work.
“Higher ed (schools) like to perceive themselves as serious, deliberative, evidence-based research institutions,” he said. “When it comes to these systems, that’s all been going out the window.”