Institutions are investing in responsible research and innovation efforts to address anxieties around the uncertainties of sociotechnical innovation and to responsively take care of the future (Bevilacqua, 2017). Rights- and risk-based frameworks attempt to ensure the protection of values of perceived importance, such as privacy, independence, and autonomy, centering people as autonomous and independent individuals who can make informed decisions about technology use given sufficient information. This perspective places impossible responsibilities on people as they make decisions about technology (Dourish, 2004; Stewart et al., 2008). Despite increasing work on the development of ubiquitous computing systems as holistic care services (Key et al., 2021; Light & Akama, 2014), care remains a difficult term to define. When ambient sensing environments are designed to provide care, notions of care relations and care networks move into the focus of technological development, acknowledging the fluidity of relations between actors.
In contrast, the logic of care positions people in relation to each other: entangled in networks of varying needs, relying on each other to make decisions, and embedded in time (Mol, 2008). Designing algorithmic systems which purport to care also means taking on the political dimensions of care, as such systems judge whom to care for, in which ways, and to what extent (Martin et al., 2015). In order “to imagine a world organized to care well” (Weiser, 1991), context-aware systems must take account of uncertainty, tensions, and conflict.
Conceptual Thinking
From a critical algorithm studies perspective, algorithms are viewed as social and cultural artefacts that mediate social interactions, shape individual behaviour, and influence societal outcomes (Kitchin, 2017). In turn, the outcomes of these mediations lead to bottom-up understandings of the agency and potential of algorithmic systems (Seaver, 2017). To research how diverse notions of what algorithms are and what they can do become shared and adopted, researchers are increasingly relying on ethnographic methods to approach everyday situations where algorithmic systems are discussed, imagined, and appropriated (Gran et al., 2021).
Some of these approaches include: algorithmic folk theories, a method to collect lay understandings of algorithms constructed through negative or positive experiences with algorithmic platforms (Eslami et al., 2016); stories about algorithms (Schellewald, 2022) and algorithmic gossip (Bishop, 2019), methods to document ideas of algorithms shared by users of the same platform; algorithmic personas, a method to visualise algorithmic functions through stereotypical human characters (Büchi et al., 2021); and algorithmic imaginaries, an approach to capture what algorithms mean for people at the intersection of use, gossip, and reflection (Bucher, 2017).
We build on these approaches to ask:
i) Through which everyday practices do users of aging-in-place technologies discuss and imagine the agency and intention of these technologies?
ii) How do these imaginaries shape the relationships users develop with aging-in-place technologies?
ECOSYSTEM
Algorithms as Care-Takers
According to social learning theory (Bandura, 1977), humans adopt behavioural patterns by observing how other people behave and what consequences follow. Diverse actors in our environment influence which information we learn and reproduce; these actors are broadly classified according to their social and cultural relevance into vertical (parents to children), oblique (non-parental adults to children), and horizontal (peer to peer) hierarchies. The influence of these actors changes over time, with actors replacing or displacing each other alongside changes in culture and context.
In the context of ageing and care practices, older adults have mostly relied on their children (vertical) and medical professionals (oblique), yet with the accelerating integration of algorithmic systems into caring practices it is relevant to ask: which actors might be replaced or displaced as algorithms increasingly take on caring roles, and with what consequences? We address this question by using a social learning lens to examine how the imaginaries of aging-in-place technology users become positioned within their caring networks.
REFERENCES
Bandura, A. and Walters, R.H. 1977. Social Learning Theory. Vol. 1. Prentice Hall, Englewood Cliffs, NJ.
Bevilacqua. 2017. Tesla Car With Self-Driving Features Strikes and Kills UK Cyclist. https://www.bicycling.com/news/a20034037/cyclist-killed-by-tesla-car-with-self-driving-features/. Accessed: 2022-09-12.
Bishop, S. 2019. Managing visibility on YouTube through algorithmic gossip. New Media & Society. 21, 11–12, 2589–2606.
Bucher, T. 2017. The algorithmic imaginary: exploring the ordinary affects of Facebook algorithms. Information, Communication & Society. 20, 1, 30–44.
Büchi, M., Fosch-Villaronga, E., Lutz, C., Tamò-Larrieux, A. and Velidi, S. 2021. Making sense of algorithmic profiling: user perceptions on Facebook. Information, Communication & Society. 1–17.
Dourish, P. 2004. What we talk about when we talk about context. Personal and Ubiquitous Computing. 8, 1 (Feb. 2004), 19–30. DOI: https://doi.org/10.1007/s00779-003-0253-8.
Eslami, M., Karahalios, K., Sandvig, C., Vaccaro, K., Rickman, A., Hamilton, K. and Kirlik, A. 2016. First I “like” it, then I hide it: folk theories of social feeds. Proceedings of the 2016 CHI Conference on Human Factors in Computing Systems. 2371–2382.
Gran, A.B., Booth, P. and Bucher, T. 2021. To be or not to be algorithm aware: a question of a new digital divide? Information, Communication & Society. 24, 12, 1779–1796.
Key, C. et al. 2021. Proceed with Care: Reimagining Home IoT Through a Care Perspective. Proceedings of the 2021 CHI Conference on Human Factors in Computing Systems (Yokohama, Japan, May 2021), 1–15.
Kitchin, R. 2017. Thinking critically about and researching algorithms. Information, Communication & Society. 20, 1, 14–29.
Light, A. and Akama, Y. 2014. Structuring future social relations: the politics of care in participatory practice. Proceedings of the 13th Participatory Design Conference on Research Papers – PDC ’14 (Windhoek, Namibia, 2014), 151–160.
Martin, A. et al. 2015. The politics of care in technoscience. Social Studies of Science. 45, 5 (Oct. 2015), 625–641. DOI: https://doi.org/10.1177/0306312715602073.
Mol, A. 2008. The Logic of Care: Health and the Problem of Patient Choice. Routledge.
Schellewald, A. 2022. Theorizing “stories about algorithms” as a mechanism in the formation and maintenance of algorithmic imaginaries. Social Media + Society. 8, 1.
Seaver, N. 2017. Algorithms as culture: some tactics for the ethnography of algorithmic systems. Big Data & Society. 4, 2.
Stewart, J. et al. 2008. Accessible contextual information for urban orientation. Proceedings of the 10th International Conference on Ubiquitous Computing – UbiComp ’08 (Seoul, Korea, 2008), 332.
Weiser, M. 1991. The computer for the 21st century. ACM SIGMOBILE Mobile Computing and Communications Review. 3, 3, 3–11. DOI: https://doi.org/10.1145/329124.329126.