Post Written By: Shazeda Ahmed
On February 10th, the Algorithmic Fairness and Opacity Working Group (AFOG) hosted Sava Saheli Singh, a Postdoctoral Fellow in the Department of Criminology at the University of Ottawa. Saheli Singh screened three short fictional films that she and colleagues at the Surveillance Studies Centre at Queen’s University produced about the unexpected social consequences of surveillance technologies for people’s mental health, economic opportunities, and interpersonal relationships.
In Blaxites, a young Black college student, Jai, is preparing for midterm exams when her doctor notifies her that her insurer is cutting off her supply of anti-anxiety medication. The doctor reveals that the insurer scanned her social media and found images of her drinking wine at a party, a behavior they identify as non-compliant with her prescription. Jai’s doctor offers her two options: pay the uninsured price of the medication, or wear a wristband that will track her vitals and other, unstated activity to prove to the insurer that she can comply with their stringent regulations and receive her medication at an affordable price again. Panicked after reluctantly accepting the latter option, Jai follows a friend to a shadowy clinic where devices are not allowed. In a flash forward to a follow-up doctor’s appointment, it is revealed that the insurer’s wristband detected Jai’s intake of the medication she acquired illegally. She is devastated by this invasive surveillance of her body, which ultimately prompts her to delete a variety of social media, meditation, and productivity apps she is seen using throughout the film.
Coauthored by a teacher who noticed her students struggling with mental health and family issues that interfered with their academic performance, Blaxites is a speculative commentary on the increasingly punitive nature of institutions that track young people in ways they may not fully comprehend. The group discussion noted thematic resonance with the work of an earlier AFOG speaker, Professor Desmond Patton, who studies police scrutiny of the social media activity of suspected gang members, a practice that casts a broad net and catches many young people who are unaware that they are being surveilled. The film sparked a conversation about how scope creep places the onus on young people to possess “digital literacy” within systems that are further complicated by dark patterns in app design. Saheli Singh discussed how framing this as a responsibility of users can deflect critiques away from the institutions that wield enormous power over individuals, a dynamic she wanted to foreground by giving the film a highly specific narrative about these systems’ capacity to further traumatize people in need.
The second film in the series, Frames, follows one morning in the life of a woman who lives in a city blanketed with snow and ubiquitous closed-circuit television (CCTV) surveillance cameras. From the moment we see her washing her face inside her apartment up until the film’s final outdoor shot, the unnamed protagonist’s every move is captured within the camera feeds’ blue bounding boxes, which frame every person onscreen. Much of the text accompanying these boxes is incomprehensible, but at certain moments individual actions, such as money transfers, are labeled. When the protagonist sees someone unable to pay for bus fare, she anonymously donates a generous $750 to the person’s account. The surveillance system identifies this act as one of “community cohesion,” and allots the protagonist points toward an unexplained score. Later the protagonist leaves a $5,000 tip for a waitress in a cafe. The system flags this as potential fraud before verifying the payment and again crediting it as a contribution to “community cohesion.” Yet when we see the protagonist next, she slowly walks to the edge of a canal until she is barely visible in the falling snow, and when she jumps over the edge, her blue-framed box frantically glitches before disappearing altogether. She has died by suicide, recasting the moments of “community cohesion” in a somber light.
Saheli Singh touched upon the complicated nature of portraying a system that might prompt some to suggest it could have been “improved” had it been able to flag instances of “community cohesion” as the troubling signs of personal distress they were later revealed to be. The group discussion reflected upon the divergent dangers that would arise from a surveillance system that could identify emotionally troubled individuals, as opposed to the risks that played out in the film under a system unable to spot these signs. Saheli Singh pointed to the lack of interpersonal interaction in the film as a contrast to how “community cohesion” was defined, and discussed the film’s role in underscoring that this type of top-down behavioral monitoring can and does happen in Western democracies, despite the common belief that only authoritarian states seek such a granular level of control.
The final short, A Model Employee, charts the trajectory of a high school student and restaurant employee named Neeta. One day her boss chides her about being an inattentive worker, and asks her to enroll in an employee monitoring program that will track her productivity on and off the job while potentially offering discounts and other rewards. She cautiously accepts the wristband and expresses mild exasperation when it sends her frequent notifications about tasks to complete at work. After her shift she heads to her DJ set at a club, where the Model Employee notifications transition to a mix of warnings about being in a high-crime area, alerts about exposure to high noise levels, and even an advertisement for a DJ she might like. When her boss compares her performance from the always-on data tracking system to that of her peers and finds it to still be inadequate, Neeta concocts the idea of having her studious, scholarship-seeking sister wear the wristband for her when she goes out in the evenings. Unfortunately, when Neeta’s sister heads to the restaurant one night after closing time to kindly deliver a plant to the owner, an alarm system alerts the owner that an “employee” is attempting to break into the business. Neeta arrives to find her sister in handcuffs, the arrest jeopardizing her sister’s future scholarship prospects.
A Model Employee was written by journalist Tim Maughan, author of the speculative novel Infinite Detail. Saheli Singh’s discussion of the film brought up the well-meaning intentions of the employer, who thought adoption of the Model Employee system could give his young employee a leg up in the working world, and the ways in which all parties’ partial comprehension of the system ultimately brought Neeta’s sister into contact with the criminal justice system. She also cited research on the affective labor of retail workers under conditions of workplace surveillance as partial inspiration for the film. The group responded to Saheli Singh’s description of prior screenings where older viewers made the case that before the rise of CCTV, they were surveilled by their neighbors—an argument that, she noted, understates the scale of today’s consequences, including arrest and the resulting records that can erode future economic and social opportunities. Discussants addressed the question of which groups of people society has historically accepted as tolerable to surveil, drawing upon examples including a recent documentary series detailing the US government’s multi-agency surveillance of Malcolm X leading up to his death.
Educators and others interested in screening these three short films can find them at screeningsurveillance.com. Media packs and screening guides are also available, and Saheli Singh has offered to join screenings by video call. Her suggested reading following screenings includes research on the intersection of critical race studies and technology from scholars Simone Browne, Safiya Umoja Noble, and Ruha Benjamin. When asked what viewers can do to combat the spread of surveillance systems, she advocated for joining pre-existing efforts to push back against and mitigate these harms, rather than attempting to forge seemingly novel solutions—technological or otherwise.