
Proctorio: distance-learning software fails to detect the faces of many Black students

Racial bias is also, and above all, woven into technology, and identifying and addressing it has become crucial. A case in point is remote and distance learning, where facial recognition software is often used. One such tool is Proctorio, an artificial-intelligence-based service that monitors students during exams to decide whether they are cheating or copying. According to a Motherboard report, this surveillance tool is built on open-source software with a documented history of racial bias: the face detection software Proctorio uses failed to recognize the faces of Black people more than half of the time.


While the software may seem useful, it has been found to flag Black students as absent from their devices far more often than white students. The problem was discovered by a student who examined how the software performs face detection and found that it often fails to recognize the faces of Black people. Proctorio, like other programs designed to keep an eye on students during exams, reports students whenever it fails to detect their faces.


Akash Satheesan decided to look into the face detection methods the software was using. Satheesan documented his research methods and findings, analyzing the code in the Proctorio Chrome browser extension and discovering that the file names associated with the tool's face detection were identical to those published by OpenCV, an open-source computer vision library that can be used to detect faces and that has had problems with racial bias in the past.
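For readers unfamiliar with OpenCV, the sketch below shows how its pre-trained face detection models are typically loaded and run. This is not Proctorio's code; the image filename is a hypothetical example, and only the OpenCV calls (CascadeClassifier, detectMultiScale) reflect the library whose model files Satheesan identified.

```python
# Minimal sketch of how OpenCV's pre-trained face detection models are
# typically used. Not Proctorio's code; it only illustrates the kind of
# cascade classifier whose files were found referenced in the extension.
import cv2

# OpenCV ships Haar cascade XML files with the library; the path is
# resolved from the installed package.
cascade_path = cv2.data.haarcascades + "haarcascade_frontalface_default.xml"
detector = cv2.CascadeClassifier(cascade_path)

def detect_faces(image_path: str):
    """Return the bounding boxes OpenCV finds in a single image."""
    image = cv2.imread(image_path)
    if image is None:
        raise FileNotFoundError(image_path)
    gray = cv2.cvtColor(image, cv2.COLOR_BGR2GRAY)
    # detectMultiScale returns (x, y, w, h) rectangles;
    # an empty result means the detector saw no face at all.
    return detector.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)

if __name__ == "__main__":
    boxes = detect_faces("student_webcam_frame.jpg")  # hypothetical file
    print(f"Faces detected: {len(boxes)}")
```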


Satheesan demonstrated for Motherboard that the face detection algorithms built into the Proctorio tool behaved identically to the OpenCV models when tested on the same set of faces. The researcher also explained that the Proctorio software not only failed to recognize the faces of Black people; it struggled with faces of every ethnicity, with a best-case success rate below 75%.
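To make the kind of comparison described above concrete, here is a hedged sketch of how one could measure a detector's success rate over folders of face images grouped by demographic label. The directory layout and folder names are assumptions for illustration only, not the dataset or methodology used in the Motherboard tests.

```python
# Hedged sketch of a disaggregated evaluation: run the same detector over a
# face dataset grouped by demographic label and report the share of images
# in which at least one face is found. Layout is hypothetical.
from pathlib import Path
import cv2

detector = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml"
)

def detection_rate(image_dir: Path) -> float:
    """Fraction of images in which the detector finds at least one face."""
    images = list(image_dir.glob("*.jpg"))
    if not images:
        return 0.0
    hits = 0
    for path in images:
        gray = cv2.imread(str(path), cv2.IMREAD_GRAYSCALE)
        if gray is None:
            continue  # skip unreadable files
        faces = detector.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
        if len(faces) > 0:
            hits += 1
    return hits / len(images)

# "dataset/<group>/" is a hypothetical layout with one folder per group.
for group_dir in sorted(Path("dataset").iterdir()):
    if group_dir.is_dir():
        rate = detection_rate(group_dir)
        print(f"{group_dir.name}: {rate:.1%} of images had a face detected")
```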

Black students have described how frustrating Proctorio's poor face detection is. Some said the software fails to recognize them every time they take a test; others fear their exams will abruptly close and disqualify them if they move out of perfect lighting. Satheesan said his findings help demonstrate a fundamental flaw in proctoring software: tools apparently built for educational purposes actually undermine education and discriminate against students. "They use biased algorithms, add stress to an already stressful process during a stressful period, and dehumanize students."


If we use learning models that replicate today’s world, racism is automated

Director Shalini Kantayya also extensively denounced the risks of algorithmic discrimination in her Netflix documentary Coded Bias, following the example of Joy Buolamwini, an MIT researcher who experienced facial recognition bias firsthand: she found that the face recognition software she was using at the MIT Media Lab did not recognize her face, showing that even the seemingly impartial world of technology is subject to racism.

Partial or unrepresentative data, such as databases used to build facial recognition software in which the faces of Black men and women are under-represented, lead to poor and often incorrect identification. This happens because machines replicate the world as it is: they do not make ethical decisions, they make mathematical ones, and at the base of an algorithm's structure there is no racism or sexism, only data, and data embodies the past. If we use learning models that replicate today's world, racism is automated and we will make no social progress.
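As a minimal illustration of the point about unrepresentative data, the sketch below counts how training examples are distributed across demographic groups in a metadata file. The CSV filename and the "group" column are hypothetical; the point is simply that under-represented groups can be spotted before a model is ever trained.

```python
# Minimal sketch: measure how training examples are spread across groups.
# The CSV layout and the "group" column name are assumptions for
# illustration, not any real dataset's schema.
from collections import Counter
import csv

def group_shares(metadata_csv: str) -> dict:
    """Share of training examples per demographic group in a metadata file."""
    counts = Counter()
    with open(metadata_csv, newline="") as f:
        for row in csv.DictReader(f):
            counts[row["group"]] += 1
    total = sum(counts.values())
    return {group: n / total for group, n in counts.items()}

if __name__ == "__main__":
    for group, share in group_shares("training_metadata.csv").items():
        print(f"{group}: {share:.1%} of training examples")
```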