Jason Kelley writes:
Over the past year, the use of online proctoring apps has skyrocketed. But while companies have seen upwards of a 500% increase in usage, legitimate concerns about these tools' invasiveness, potential bias, and efficacy are also on the rise. These concerns even led to a U.S. Senate inquiry letter requesting detailed information from three of the top proctoring companies—Proctorio, ProctorU, and ExamSoft—which combined have proctored at least 30 million tests over the course of the pandemic.1 Unfortunately, the companies mostly dismissed the senators’ concerns, in some cases stretching the truth about how the proctoring apps work, and in other cases downplaying the damage this software inflicts on vulnerable students.
In one instance, though, these criticisms seem to have been effective: ProctorU announced in May that it will no longer sell fully automated proctoring services. This is a good step toward eliminating some of the issues that have concerned EFF about ProctorU and other proctoring apps. The artificial intelligence these tools use to detect academic dishonesty has been roundly criticized for its bias and accessibility impacts, and for the clear evidence that it produces significant false positives, particularly for vulnerable students. While this is not a complete solution to the problems that online proctoring creates—the surveillance is, after all, the product—we hope other online proctoring companies will also seriously consider the danger that these automated systems present.
Read more on EFF. Legal intern Haley Amster contributed to the post.