What does it mean to live well in a neighborhood? The Dutch concept of leefbaarheid, or (neighborhood) livability, can be understood as the interplay between the social and physical characteristics of the environment (Hart et al., 2002). Drawing on the notion of positive safety (Schuilenburg & van Steden, 2016), I conceptualize neighborhood livability as a balance between care and control: like positive safety, it involves not only the repression of unwanted behavior, but also the fostering of positive social dynamics, such as care-giving, social support and community-building.
Housing corporations play a key role in neighborhood livability, but they operate within a fragmented network of responsibilities, alongside public and private actors, including the local police, municipalities, support and welfare services, local business owners, and residents themselves. This network is also shaped by technological mediation: for example, digital communication between housing corporations and tenants, automated flagging systems for rent deferral, and camera surveillance (whether installed by housing corporations or by residents themselves through "smart" video doorbells).
Fragmented responsibilities in socio-technical contexts raise several well-documented accountability challenges. In essence, accountability is a relationship: it involves an actor, an individual or organization, being answerable to a relevant forum (or fora), which evaluates their conduct and may impose negative or positive consequences (Bovens, 2007). A key challenge with fragmented responsibilities is the "problem of many hands," where no single individual or organization can reasonably be held accountable for an overarching result, because many people contributed in small and overlapping ways (Doorn, 2012; Thompson, 1980; Van de Poel et al., 2012). Digital mediation introduces additional complexities: digital technologies can be used to offload accountability, "computer says no" (Nissenbaum, 1996), or to outsource moral responsibility, "blame it on AI" (Constantinescu et al., 2022). Additionally, as digital technologies facilitate connections across space and time, they can create moral distance, running the risk that users feel detached from the individuals their decisions affect (Villegas-Galaviz & Martin, 2024).
The article I am currently writing as part of my PhD on the legal aspects of AI and public safety explores (the challenges of) accountability networks in socio-technical contexts, through a case study on the role of housing corporations in neighborhood livability in the Netherlands. A key focus is a current legislative proposal that would enable (sensitive) data-sharing between housing corporations and other organizations involved in neighborhood livability and resident well-being. The proposal is driven by housing corporations' experience that tenants with rent arrears typically face multiple complex problems. By combining different data sources, such as rent arrears and medical histories, housing corporations want to gain a more holistic understanding of tenant vulnerabilities, allowing them to connect tenants to appropriate support services, rather than merely enforcing rent collection. With data-sharing as the foundation for automation, the proposal presents an opportunity to design and automate with care. One conceptual answer to prevent (prospective) automation from reinforcing punitive approaches and obscuring accountability comes from care ethics, which considers relationships, (inter)dependency and vulnerability as the basis of existence, consequently invoking duties of care. Imagining automated systems through the lens of care shifts the focus from risk assessment and control to relational accountability and patterns of vulnerability. Technology for social support rather than social control.
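To make the contrast with purely punitive automation concrete, the routing logic such a care-oriented system might embody can be sketched as follows. This is a minimal, entirely hypothetical illustration: the data fields, threshold, and function names are my own assumptions, not part of the legislative proposal or any existing housing corporation system. The point is that the flag routes toward a support referral and records a named human reviewer, keeping accountability traceable.

```python
from dataclasses import dataclass

# Hypothetical tenant record combining data sources that sharing might
# make available to a housing corporation (all fields illustrative).
@dataclass
class TenantSignal:
    tenant_id: str
    months_in_arrears: int
    known_support_contact: bool  # already in touch with welfare services?

@dataclass
class Referral:
    tenant_id: str
    action: str
    reviewer: str    # named human answerable for the decision
    rationale: str   # recorded so the decision can be accounted for later

def care_oriented_flag(signal: TenantSignal, reviewer: str) -> Referral:
    """Route a flagged tenant toward support rather than enforcement,
    and record who is answerable for the decision."""
    if signal.months_in_arrears >= 2 and not signal.known_support_contact:
        action = "offer_support_referral"
        rationale = "arrears without an active support contact"
    else:
        action = "no_action"
        rationale = "no indication of an unmet support need"
    return Referral(signal.tenant_id, action, reviewer, rationale)

# Example: two months of arrears and no welfare contact yields a
# support referral, attributed to a (hypothetical) named reviewer.
referral = care_oriented_flag(TenantSignal("t-001", 3, False), "J. Jansen")
print(referral.action)  # offer_support_referral
```

Even in this toy form, the design choice is visible: the system's output is a relational act (a referral, with a rationale and an accountable reviewer) rather than an enforcement trigger.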
References
Bovens, M. (2007). Analysing and Assessing Accountability: A Conceptual Framework. European Law Journal, 13(4), 447–468.
Constantinescu, M., Vică, C., Uszkai, R., & Voinea, C. (2022). Blame it on the AI? On the moral responsibility of artificial moral advisors. Philosophy & Technology, 35(2), 35.
Doorn, N. (2012). Responsibility Ascriptions in Technology Development and Engineering: Three Perspectives. Science and Engineering Ethics, 18(1), 69–90.
Hart, J., Knol, F., Maas-de Waal, C., & Roes, T. (2002). Zekere banden: Sociale cohesie, leefbaarheid en veiligheid [Secure ties: Social cohesion, livability and safety]. Sociaal en Cultureel Planbureau.
Nissenbaum, H. (1996). Accountability in a computerized society. Science and Engineering Ethics, 2(1), 25–42.
Schuilenburg, M., & van Steden, R. (2016). Positieve veiligheid. Een inleiding [Positive safety: An introduction]. Tijdschrift over Cultuur & Criminaliteit, 6(3), 3–18.
Thompson, D. F. (1980). Moral Responsibility of Public Officials: The Problem of Many Hands. American Political Science Review, 74(4), 905–916.
Van de Poel, I., Nihlén Fahlquist, J., Doorn, N., Zwart, S., & Royakkers, L. (2012). The problem of many hands: Climate change as an example. Science and Engineering Ethics, 18, 49–67.
Villegas-Galaviz, C., & Martin, K. (2024). Moral distance, AI, and the ethics of care. AI & Society, 39(4), 1695–1706.
More information
While the article is still in development, I will present and discuss an early version at the upcoming AI-Imaginations and Public Safety Symposium, organized by (among others) AI-MAPS colleagues Majsa Storbeck and Marc Schuilenburg, on June 19 and 20. I hope to see you at this inspiring event!