AI Cards: Use Case

Proctify: an AI-Based Student Proctoring System

Delaram Golpayegani, Isabelle Hupont, Cecilia Panigutti, Harshvardhan J. Pandit, Sven Schade, Dave Lewis

Copyright © 2024 the document editors/authors. This work is available under the Creative Commons Attribution 4.0 International Public License; additional terms may apply

Proctify Description

Proctify is intended to detect suspicious behaviour during online exams by analysing a student's facial behaviour in video captured throughout the exam using a webcam.
Prior to the exam, students have explicitly consented to being recorded and have been informed that they must be alone in the room. The system incorporates a graphical interface displaying an analysis of the student's face, including the head pose, gaze direction, and the positions of facial landmarks. This extracted information is then provided as input to SusBehavedModel, which has been trained in-house by the system's provider using SusBehavedDataset, to determine whether the student is displaying suspicious behaviour, e.g. looking away from the screen, leaving the room, or a third person being detected in the room.
Detection of suspicious behaviour raises an alarm in the interface to inform human oversight actors, e.g. human instructors, and let them take appropriate action, e.g. communicating with the student.
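The flow just described (webcam frames, facial-feature extraction, SusBehavedModel, alarm, human decision) can be sketched as a simple processing loop. Everything below is a hypothetical illustration: the feature set, threshold, and interfaces are invented, and the rule-based `is_suspicious` merely stands in for the trained SusBehavedModel.

```python
from dataclasses import dataclass

@dataclass
class FacialFeatures:
    """Invented stand-in for the facial analysis toolkit's per-frame output."""
    head_pose_yaw_deg: float  # head rotation away from the screen
    gaze_on_screen: bool      # gaze direction estimate
    faces_detected: int       # 0 = student left; >1 = third person in the room

def extract_features(frame) -> FacialFeatures:
    """Placeholder for the facial analysis toolkit (assumed interface)."""
    ...

def is_suspicious(f: FacialFeatures) -> bool:
    """Stand-in for SusBehavedModel: simple rules instead of a trained model."""
    return (
        f.faces_detected != 1              # student absent or third person present
        or not f.gaze_on_screen            # looking away from the screen
        or abs(f.head_pose_yaw_deg) > 45.0 # head turned well away (invented threshold)
    )

def proctor(frames, raise_alarm):
    """Analyse each frame; alarms go to the human overseer, who decides what to do."""
    for i, frame in enumerate(frames):
        if is_suspicious(extract_features(frame)):
            raise_alarm(frame_index=i)  # e.g. highlight the event in the interface
```

Note that the loop only raises alarms; consistent with the partial-automation design, any consequential action remains with the human instructor.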

Proctify Risks

Proctify's provider has implemented an AI risk management system, as part of its overall AI management system, which is in conformity with ISO/IEC 42001.
Throughout the risk management process, the risks and impacts of the system are identified and assessed, including the following:

  • the system may have lower accuracy for students with darker skin tones
  • a higher rate of false-positive alarms for students wearing glasses
  • more frequent false negatives and false positives for students with health issues or disabilities that affect their facial behaviour
  • over-reliance of human instructors on the system's output (automation bias)

These events have the potential to negatively impact students' mental health, future careers, and their rights to dignity and non-discrimination.

Some of the measures applied to address the system's risks and impacts are:

  • ensuring the dataset is representative and diverse in demographic terms
  • conducting rigorous and frequent testing of accuracy
  • assigning expert human proctors
  • creating clear protocols to act upon when an alarm is raised
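Rigorous accuracy testing for risks like those above typically means disaggregated evaluation: computing error rates per demographic subgroup rather than a single overall figure. A minimal sketch, with invented subgroup labels and evaluation records:

```python
from collections import defaultdict

def false_positive_rate_by_group(records):
    """records: iterable of (group, actually_suspicious, alarm_raised).
    Returns the false-positive rate per group: the share of truly benign
    cases that nevertheless triggered an alarm."""
    flagged = defaultdict(int)
    benign = defaultdict(int)
    for group, actual, predicted in records:
        if not actual:            # only benign cases count toward the FPR
            benign[group] += 1
            if predicted:
                flagged[group] += 1
    return {g: flagged[g] / benign[g] for g in benign}

# Invented evaluation records: (subgroup, truly suspicious?, alarm raised?)
records = [
    ("wears_glasses", False, True),
    ("wears_glasses", False, False),
    ("no_glasses", False, False),
    ("no_glasses", False, False),
]
rates = false_positive_rate_by_group(records)
```

A persistent gap between subgroups (e.g. `wears_glasses` vs `no_glasses`) is exactly the signal that would confirm the glasses-related risk listed above and trigger mitigation such as dataset rebalancing.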
AI Cards for Proctify

Proctify's Machine-Readable Specification

The machine-readable specification, represented below, can also be accessed here.

@prefix rdf: <> .
@prefix dcterms: <> .
@prefix xsd: <> .
@prefix freq: <> .
@prefix airo: <> .
@prefix airoext: <> .
@prefix vair: <> .
@prefix dpv: <> .
@prefix dqv: <> .
@prefix ex: <> .
@base <> .

ex:proctify a airo:AISystem ;
    airo:hasVersion ex:v_1.0.2 ;
    dcterms:date "2023-09-11"^^xsd:date ;
    airo:hasModality ex:software ;
    airo:usesTechnique ex:deep_learning ;
    airo:hasInput ex:facial_video ;
    airo:producesOutput ex:suspicious_behaviour_alarm ;
    airo:isProvidedBy ex:AIEduX ;
    airo:isDevelopedBy ex:AIEduX ;
    airo:hasComponent ex:facial_analysis_toolkit,
        ex:susbehaved_dataset ;
    airo:isAppliedWithinDomain ex:education ;
    airo:hasPurpose ex:detecting_suspicious_behaviour_during_online_exam,
        ex:video_analysis ;
    airo:isUsedBy ex:university ;
    airo:hasUseInstruction <> ;
    airo:hasDeploymentInstruction <> ;
    airo:hasLevelOfAutomation ex:partial_automation ;
    dpv:hasHumanInvolvement ex:human_decision ;
    airo:hasAISubject ex:student,
        ex:other_occupant ;
    airo:hasEndUser ex:instructor ;
    airo:hasRisk ex:inaccuracy_risk_for_darker_skin ;
    dqv:hasQualityMeasurement ex:accuracy_measurement ;
    airo:hasPreDeterminedChange ex:change_of_model ;
    airo:compliesToRegulation <> ,
        ex:EU_AI_Act ,
        ex:Irish_Data_Protection_Act ;
    airo:conformsToStandard vair:ISOIEC42001-2023 ,
        vair:ISOIEC27001-2022 .

ex:facial_analysis_toolkit a airo:AIComponent ;
    airo:hasVersion ex:v_3.3.2 ;
    airo:isProvidedBy ex:FACE_research_group ;
    airo:hasPurpose ex:extracting_facial_landmark,
        ex:extracting_head_pose ;
    airo:hasDocumentation <> .

ex:susbehaved_model a airo:AIComponent,
        vair:Model ;
    airo:hasVersion ex:v_1.1.2 ;
    airo:hasPurpose ex:detecting_suspicious_behaviour,
        ex:raising_alarm ;
    airo:hasDocumentation <> .

ex:susbehaved_dataset a airo:AIComponent,
        vair:Dataset ;
    airo:hasVersion ex:v_2.0.1 ;
    airo:hasPurpose ex:train_model ;
    airo:hasDocumentation <> .

ex:student a airo:AISubject,
        airo:InformedAISubject ;
    airo:hasLevelOfControlOverOutput ex:ex-post_challenge .

ex:other_occupant a airo:AISubject,
        airo:UninformedAISubject ;
    airo:hasLevelOfControlOverOutput ex:cannot_opt-out .

ex:instructor a airoext:AIEndUser,
        airo:InformedEndUser ;
    airo:hasLevelOfControlOverOutput ex:real-time_corrective .

[] a dpv:PersonalDataHandling ;  # subject IRI elided in the source
    dpv:hasPurpose ex:facial_analysis ;
    dpv:hasPersonalData ex:facial_data ;
    dpv:hasDataSubject ex:student .

ex:facial_data a dpv:SensitivePersonalData .

ex:inaccuracy_risk_for_darker_skin a airo:Risk ;
    airo:hasLikelihood "Low" ;
    airo:hasConsequence ex:raise_of_false_alarms_for_darker_skin .

ex:unrepresentative_dataset a airo:RiskSource ;
    airo:isRiskSourceFor ex:inaccuracy_risk_for_darker_skin ;
    airo:hasLikelihood "Medium" .

[] a airo:Control ;  # subject IRI elided in the source
    airo:modifiesEvent ex:unrepresentative_dataset .

ex:raise_of_false_alarms_for_darker_skin a airo:Consequence ;
    airo:hasImpact ex:bias_against_students_with_darker_skin ;
    airo:hasLikelihood "Low" ;
    airo:hasSeverity "Medium" .

ex:bias_against_students_with_darker_skin a airo:Impact ;
    airo:hasImpactOnArea ex:right_to_nondiscrimination ;
    airo:hasImpactOnEntity ex:student ;
    airo:hasLikelihood "Low" ;
    airo:hasSeverity "Very_High" .

ex:accuracy_measurement a dqv:QualityMeasurement ;
    dqv:computedOn ex:proctify ;
    dqv:isMeasurementOf ex:alarm_accuracy ;
    dqv:value "98.9"^^xsd:double .

ex:alarm_accuracy a dqv:Metric ;
    dqv:expectedDataType xsd:double ;
    dqv:inDimension ex:accuracy ;
    airo:hasDocumentation <> .

ex:change_of_model a airo:Change ;
    airo:hasSubjectOfChange ex:susbehaved_model ;
    airo:hasPurpose ex:enhance_fairness ;
    airo:hasFrequency freq:bimonthly .


The views expressed in this article are purely those of the authors and may not, under any circumstances, be regarded as an official position of the European Commission.

This project has received funding from the European Union’s Horizon 2020 research and innovation programme under the Marie Skłodowska-Curie grant agreement No 813497 (PROTECT ITN), and was conducted as part of the ADAPT SFI Centre for Digital Media Technology, which is funded by Science Foundation Ireland through the SFI Research Centres Programme and co-funded under the European Regional Development Fund (ERDF) through Grant #13/RC/2106_P2.