Since the start of the pandemic, online learning has become far more common — and so have online examinations. Typically, the examination software also proctors the test, observing with the student’s own camera. Does this process violate the Constitution?
A federal judge in Ohio decided last week that, at least in some circumstances, the answer is yes. The court's impeccable reasoning should give us pause about the whole project of online proctoring.
The problem of surveillance in what's been called "the constant and expanding classroom" predates Covid-19 and is as severe in grade school as in colleges and universities. But unlike teens and toddlers, college students are adults, clothed in the full regalia of constitutional protections. And the issue isn't going away anytime soon. The global market for examination proctoring software is expected to reach $1.5 billion by 2028. The U.S. is the biggest user and developer. So while the decision, of course, applies only to public universities, the outcome matters.
The plaintiff, a student named Aaron Ogletree, alleged that Cleveland State University violated the Fourth Amendment when, before sitting for a test, he was asked to allow remote proctoring software to scan his surroundings in search of "potential impermissible study aids or notes." According to the complaint, when the email request arrived just before the test in question, Ogletree had confidential tax documents in view and no time to shield them. The scan was recorded. The vendor kept a copy, and the scan was also available to his fellow students. This process, he argued, amounted to an unreasonable search in violation of the Fourth Amendment.
The court agreed. In its decision, it rejected the university’s analogies to cases that have carved out exceptions for items that are in plain view from places where the public routinely goes. A computer’s cameras, the court wrote, “go where people otherwise would not, at least not without a warrant or an invitation.”
The school further argued that, in effect, everybody uses remote proctoring now. The judge was unmoved: “The ubiquity of a particular technology or its applications does not directly bear on that analysis.” In the court’s view, the “very core” of the Fourth Amendment is the right to be free of governmental intrusion in the home; the proctoring scan “occurred in Plaintiff’s house, in his bedroom, in fact.”
One might respond to all this by saying that if Ogletree didn’t like the proctoring policies, he should have enrolled somewhere else, or perhaps taken a course that didn’t require the scan. But the information necessary to make that decision was presented in a manner the court called “opaque.” And, of course, the pandemic left little choice in any case.
The court conceded that the school had a legitimate interest in preventing cheating but found that, on balance, the intrusiveness of the scan outweighed that interest, rendering the search unreasonable under the Fourth Amendment.
The outcome warms my libertarian heart. But even those who might disagree with the court’s constitutional conclusion have reason to be concerned about proctoring software.
Consider the most obvious: Apart from all the other things the artificial intelligence watches for — atypical facial movements or speaking, for instance (woe to many disabled students!), or people passing in the background (woe to students with children at home!) — the first task is to make sure that the person sitting for the test isn't an impostor.
That’s harder than it sounds.
For example, a longstanding criticism of AI software used for identification is that it often misperceives the gender of trans people. Why does that matter in practical terms? If the remote proctor identifies the student sitting at the keyboard as one gender and the school's files record the same student as another (or as nonbinary), the mismatch might cause the software to flag the test for possible cheating, or even to refuse access.
The problems go deeper. Commercially available facial recognition products tend to be far better at identifying men than women, and White subjects than Black. And, as one might suspect, there's a deeply troubling intersection. A much-discussed 2018 study published in Proceedings of Machine Learning Research found that the error rate in identifying "darker-skinned females" runs as high as 34.7%.(1) Dark-skinned women might even be asked by the software to shine more light on their faces to aid remote identification.
None of this is new. The biases in facial recognition software have been known for two decades. One might reasonably respond that what’s needed is better algorithms. Fair enough. But consider the possibility that as the AI grows more accurate, ethical questions might grow more complex.
That’s a topic for another day. For now, the Ogletree case stands as a reminder that the rush to online learning might be making education worse. Certainly it’s making testing worse. There are crude, ugly answers — like requiring that students, as a condition of going to school, waive their Fourth Amendment rights in case of another shutdown.
Or maybe instead schools should discard online proctoring as intrusive and unjust. Put the students on their honor. And if the fear is that with nobody watching there’ll be an epidemic of cheating, then the problem is much more serious than what the camera can and can’t see.
(1) True, facial recognition software is more likely to generate false positives than false negatives.
This column does not necessarily reflect the opinion of the editorial board or Bloomberg LP and its owners.
Stephen L. Carter is a Bloomberg Opinion columnist. A professor of law at Yale University, he is author, most recently, of “Invisible: The Story of the Black Woman Lawyer Who Took Down America’s Most Powerful Mobster.”
©2022 Bloomberg L.P.
