Illustration by Adriana Heldiz

Even after his classes went online last spring, William Scott Molina, a 31-year-old student at San Diego State University, thought his remote learning experience was going just fine. As required in one of his online courses, he downloaded and began using the Respondus LockDown Browser, custom software that prevents students from venturing outside of their testing page to ward off cheating.

Molina didn’t think much of the company until he started using its monitoring software for a business administration course over the summer and his webcam became a device to observe, record and study him during an exam.

By the end of the semester, Molina was the subject of multiple cheating accusations that consumed his academic life.

In the swift and chaotic pivot to virtual test-taking, companies like Respondus — along with competitors including Honorlock, ProctorU and Proctorio — have stepped in to help schools keep watch on students. Because the new digital tools are required in certain courses, students are being forced to subject themselves to surveillance inside their own homes and open themselves up to disputes over “suspicious activities,” as defined by an algorithm.

According to the company website, Respondus Monitor uses “powerful analytics … to detect suspicious behaviors during an exam session,” and then flags such behaviors for professors to review once the session concludes.

At the self-described “heart” of the company’s monitoring software is Monitor AI, a “powerful artificial intelligence engine” that collects facial detection data and keyboard and mouse activity to identify “patterns and anomalies associated with cheating.”

Privacy advocates have been raising alarms about this type of technology and how easily it’s infiltrating the lives of students during a public health emergency. For starters, facial detection and facial recognition technologies have been widely criticized in recent years for inherent biases against people of color, often misidentifying the faces of Black and Brown people or failing to recognize them as accurately as White faces.

“This obviously has a disproportionate effect on Black and Brown test-takers,” said John Davisson, a legal expert and senior counsel at Electronic Privacy Information Center, a nonprofit based in Washington, D.C., whose work is focused on the privacy and algorithmic aspects of these proctoring systems. “And people who have some sort of facial disfigurement have special challenges; they might get flagged because their face has an unexpected geometry.”

To Davisson, the sheer volume of data that is collected by these proctoring technologies — which varies slightly by system — was unsettling before the pandemic forced students online. Now, Davisson said, with most students at home, the data collected are even more intimate. And students don’t have the choice to opt out.

Davisson also questioned the technology’s efficacy in catching people who cheat, despite what the companies insist.

“Many of them make pretty remarkable claims about being able to detect signs of cheating based on cues from the data they’re collecting,” Davisson said. “But it’s through opaque algorithms, it’s very difficult to evaluate whether these systems are correctly flagging signs of cheating.”

Regardless of the algorithmic ambiguity, these companies have made millions of dollars through partnerships with colleges and universities since the COVID-19 pandemic began, the Washington Post reported. SDSU is just one of many public schools that have entered into contracts with companies like Respondus as they moved the classroom online.

Molina would soon become intimately familiar with Monitor AI’s algorithmic problems. The lead professor for his business administration course was SDSU’s finance department chair, Kamal Haddad, but Molina noted that most of his correspondence was with Renee Merrill, a lecturer and assistant. In preparation for the class’s first exam, Merrill gave students explicit instructions on how to use the new monitoring software.

“You have to record your environment, you have to record the whole desk, under the desk, the whole room,” Molina recalled. “And you need to use a mirror to show that you don’t have anything on your keyboard.”

On top of that, if the wireless connection was disturbed during an exam, Molina said, students would receive an automatic zero — no excuses. This was a particular point of stress for Molina, who rents a room in a family home that he shares with his girlfriend and 3-year-old daughter. The commotion can interfere with his internet connection.

“I’m on financial aid,” Molina said. “I don’t have a lot of resources to throw out there, or to have a room strictly dedicated to learning.”

Even so, Molina was performing well in his summer course. By the time the final rolled around in mid-August, he said, he had a solid A in the class. But three days before the final, he got an email from Merrill citing “suspicious activity” during his second exam.

Merrill said he didn’t properly show the front and back of his notes, on which students were permitted to write anything they wanted to help with the exam. She also mentioned that the monitoring software had picked up Molina talking throughout the exam.

He said he didn’t realize he hadn’t sufficiently shown his notepaper to his webcam, or that his habit of talking through questions aloud would be considered suspicious.

“I try to suppress it as much as possible but sometimes it just happens without me knowing,” Molina explained in an email to his professor. “Especially when I am stuck or trying to make sense of a question that isn’t registering well with me.”

Merrill’s email served only as a warning, but Molina was all the more nervous when he sat down to take his last exam just a few days later. To his surprise, he got another email from Merrill alleging he had been flagged for cheating once again, this time during the course’s third exam.

“At the beginning of the exam, you leave the area for about one minute without explanation,” Merrill wrote in an email to Molina. She added that it looked like he was using his calculator for problems that did not require a calculation and that he solved certain problems too quickly. As a result, Molina was given an F in the course and his case was submitted to SDSU’s Center for Student Rights and Responsibilities, where he could appeal the decision.

The story wasn’t as open-and-shut as the Respondus video recording made it seem. Molina said he followed the instructions: He turned on his webcam and went through the environment check. But before he officially started the exam, he said, he heard a knock on his locked door from his daughter, who was accompanied by the special-needs child of the family he lives with. He went to check on them and waited for his girlfriend to arrive and take over so he could get back to the test.

When it came to the calculator allegations, Molina was even more frustrated. He didn’t always answer test questions in order; like a lot of students, sometimes he skipped around to different questions. He also noted that all of the exam questions could be accessed on the same page, and weren’t separated individually.

Molina appealed. But even well into the fall semester and over a month after the accusations were filed, the office had canceled his scheduled meetings twice due to coronavirus-related emergencies.

After the third rescheduling, Molina finally had the chance to explain himself. One week later, he received a letter of “no action,” meaning the university would not pursue disciplinary action against him. He forwarded the letter to his business administration professors and requested that he get the grade he deserved. He said he had already emailed the student ombudsman twice, and never received a reply. Merrill finally gave him his grade back, almost two months after he’d received an F in the class.

Throughout those two months, Molina was consumed by the cheating accusation, he said, and what an F would do to his GPA.

“From that point on, I’m stressed,” he said. “Every day I think about it. I go to bed, I think about it. I wake up, I think about it. Because you don’t know what’s going to happen.”

La Monica Everett-Haynes, a spokeswoman for the university, wrote in an email that the timelines for the school’s Center for Student Rights and Responsibilities investigations vary widely depending on a number of factors, including the delays that come along with students not showing up or needing to reschedule meetings.

“We work as judiciously as possible, but we also prioritize active student participation in this process,” Everett-Haynes wrote.

Students in the same business class this fall — who estimated that the enrollment was around 500 — have reported similar problems with the software. Neekoly Solis, an SDSU junior and first-year transfer student, said test-takers now have to verbally explain their calculations to the webcam every time they use a calculator during an exam.

“Taking an online exam in a course that you already feel like you’re struggling in is anxiety-inducing,” Solis said. To be given a set of additional instructions, he said, just adds a level of stress and complication to a difficult test-taking environment.

The research on whether proctored exams actually reduce cheating is also inconclusive, said Davisson of the Electronic Privacy Information Center.

“Even if you could demonstrate that it was necessary to collect all of this data and that the algorithms were reliable,” he said, “if all they’re doing is reproducing the exact same rates of cheating as would occur without this test surveillance software, then the whole industry is snake oil.”

Mariane Edelstein, a U.S.-born student raised in Brazil who transferred to SDSU from Mesa College last year, took the same business administration class over the summer. She said that every time she has to use Respondus Monitor for an online exam, she performs poorly.

“I can’t control external factors. I can’t control the time that my neighbor is doing his lawn. I can’t control when the Amazon delivery guy is going to stop by and ring my bell,” Edelstein said. “All this stuff gives me crazy anxiety.”

Edelstein also has concerns about her privacy. As she followed the instructions laid out by Respondus Monitor and SDSU, she filmed her testing environment. Then she had to show the camera her desk and the space underneath it, where her bulky desktop computer sits. She realized she was in a pair of shorts, and her webcam was picking up — and recording — seconds of her bare legs that could be seen by her older male professor. She was creeped out.

“You have to do a crotch shot, basically,” said Jason Kelley, associate director of research at the Electronic Frontier Foundation, a digital privacy group based in San Francisco. He recalled watching a tutorial video from another proctoring system, Honorlock, horrified as the video subject did a long pan of their body.

After she took that environment check video, Edelstein started looking into Respondus’s data retention period and privacy terms.

Respondus’s website states that the default data retention period for Respondus Monitor is five years, but the client can change that.

“That’s absurd, to say the least,” Kelley said. “What I would expect … is that they would hold on to that data as long as the teacher or the university or the student needs it to determine whether or not there was any kind of suspicious activity, which would be a week, two weeks, maybe a month?”

Respondus CEO David Smetters, however, wrote in an email that proctoring data collected by the monitoring software is under the control of the university, not the company. Everett-Haynes insisted that captured videos are only used for review purposes and aren’t kept as a part of a student’s permanent record.

But Smetters’ statement is only partially true. Like other tech companies, Respondus maintains access to the data it collects.

“Only a few Respondus engineers have the [security] credentials that would allow access to both the database and proctoring videos,” Smetters said. The system that tracks the activity cannot be modified by engineers, he added, and “attempts to access certain data would trigger alarms that are monitored by our security and data privacy teams.”

Researchers have different views on who’s ultimately responsible for rolling out an educational surveillance system. But Bruce Schneier, a security expert and public policy lecturer at Harvard University’s Kennedy School, argues that universities deserve the blame for rushing into partnerships with tech companies without sufficiently scrutinizing their claims first.

“A lot of them are doing really bad jobs here in understanding students’ needs,” he said.

While institutions have a major responsibility to understand surveillance companies’ privacy policies before partnering with them, both the company and university must work together to give students a transparent look into the data they’ll be collecting, said Linnette Attai, an expert on youth and education data privacy regulations.

“Students should be given the ability to make a well-informed choice about their decision, to understand the privacy and security practices,” Attai said. “Because there’s a good deal of data collection happening here.”

Correction: An earlier version of this post mischaracterized Mariane Edelstein’s background; she was born in the United States.
