Post written by: Zoe Kahn and Emma Lurie

On February 24th, the Algorithmic Fairness and Opacity Working Group (AFOG) hosted Madeleine Clare Elish, who leads the AI on the Ground Initiative at Data & Society. Elish shared two pieces of work: the paper “The Human Body is a Black Box”: Supporting Clinical Decision-Making with Deep Learning and an early draft of an upcoming Data & Society report. The discussion at AFOG centered on the integration of the Sepsis Watch system into a hospital emergency department. Sepsis, a leading cause of inpatient mortality in hospitals, is notoriously difficult to diagnose. Sepsis Watch uses deep learning to help clinicians identify hospital patients who are at risk of developing sepsis. As an anthropologist, Elish joined the Duke University project team as an external collaborator, conducting observations and interviews in the hospital to study Sepsis Watch’s integration into hospital practices.

The Sepsis Watch project challenges our notions of why algorithmic interpretability is important. Because there is no universally accepted definition or explanation of sepsis, it is challenging even to conceptualize an interpretable model that identifies patients at high risk of sepsis. Interpretability is often emphasized as a mechanism for fostering trust in a technical system. Instead, the Sepsis Watch team established trust in the system through “rigorous documentation and institution-specific validation and evaluation” (Sendak et al., 2020), as well as deep engagement with hospital stakeholders throughout the system’s design, development, and integration. Elish and her co-authors point out that “too often, the focus has been on the technical properties of the model, and in turn, the potential solutions are intrinsically technical” (Sendak et al., 2020). Elish’s work emphasizes the importance of understanding and designing for the sociotechnical.
In her public talk, Repairing Innovation: The Labor of Integrating New Technologies, Elish challenged the audience to think deeply about sociotechnical system design. She argues that taking a sociotechnical approach requires moving away from the rhetoric of deploying technologies toward one of integrating technologies into existing social contexts. While deploying has its roots in the military and conjures up images of temporarily ‘dropping in’ to a place, integrating requires technologists to grapple with the complex social systems within which technologies reside. Taking the sociotechnical approach leads us to ask different questions. In the case of Sepsis Watch, this meant integrating into a hospital dynamic chock-full of social complexities, including longstanding power differentials between doctors, residents, and nurses, some of whom had never met and were physically separated across multiple hospital floors. While Sepsis Watch is one example in the ‘high stakes’ context of medical diagnosis, Elish drew on previous research (part of AI in Context) on family-owned farms and grocery stores to provide other instances of integrating technologies into social systems.

Sepsis Watch is one of the first deep learning models to be integrated into a clinical setting. Elish attributed the success of the system partly to the ‘repair’ work done by nurses. Elish and her co-authors also describe the importance of prioritizing communication to establish trustworthiness, using a variety of strategies and metrics. The Sepsis Watch system was designed specifically for the Duke University hospital system, which ensured the active engagement of hospital stakeholders and meant the model was trained exclusively on data from that hospital system. Elish views the inclusion of hospital stakeholders and local context as integral to the success of Sepsis Watch. However, this raises additional questions about how well the Sepsis Watch system would perform in different hospital systems.
This brings us back to a question we wrestled with throughout Elish’s time with AFOG: does it matter how much of Sepsis Watch’s success was due to its technical components versus the team’s deep understanding of the hospital environment (a complex sociotechnical system)? As the Sepsis Watch team faces discussions about the system’s scalability, this issue seems to be of the utmost importance.

In the working group, there was discussion of the challenges of integrating technologies into social contexts. In particular, integrating technologies can disrupt existing organizational processes. Elish argued that sometimes disruption is bad; other times it is not. When disruption is bad, people must do work to ‘repair’ the social context so that processes continue to function. In the case of Sepsis Watch, the decision support system disrupted longstanding power dynamics between doctors and nurses. While this increased the autonomy of some nurses and elevated their expertise, many nurses felt uncomfortable with their new interactions with doctors. To overcome these new points of friction, nurses developed a range of strategies to alter their interactions with doctors in ways that ‘repaired’ the disruption caused by the integration of Sepsis Watch. One working group participant asked whether the work of ‘repairing’ these difficult interactions was experienced by the nurses as ‘repair,’ or whether it was actually experienced as itself ‘disruptive.’ This question, among others, remains open as Elish and her co-authors develop the forthcoming paper. Importantly, the ability to grapple with the nuances of integrating technical systems into social contexts is guided by the questions raised in Elish’s public lecture: What are we integrating into? And for whom?