In recent years, educators have started to notice something curious about their Gen Z (born roughly 1997-2012) students. Millennial and Gen X computer users grew up with the idea that you save files in a particular folder, and when you want to find the file, you go back to that location. Gen Z users, it seems, don’t think of files as stored in specific locations at all; instead, they think in terms of one big bucket from which you simply call up whatever information you need. Questions like “Where did you store the file?” don’t make sense to these users. Different age cohorts are using the same technology, but guided by very different structuring metaphors.
Haunted By Bots
I’ve been thinking about this a lot lately, partly because it makes me feel old, but also because it relates to work my colleague Adam Buben and I are currently doing on the phenomenon variously called ‘deathbots,’ ‘griefbots,’ ‘ghostbots,’ ‘thanabots,’ etc. (Our own preferred term is ‘Interactive Personality Constructs of the Dead,’ or IPCDs.) These are AI-driven reuses of ‘digital remains’: the traces the dead leave behind are used to train bots that can interact with the living in ways that mimic the character, concerns, and linguistic style of the deceased. As both Katarzyna Nowaczyk-Basińska and Elaine Kasket have recently noted on this blog, access to this technology has expanded at a dramatic rate. Technologies that were until recently available only to a well-resourced and technologically sophisticated few are rapidly democratizing.
Much has been written about the risks these technologies pose to the grieving processes of the living (e.g. Lindemann 2022). In our earlier work (e.g. Buben 2015; Stokes 2021), Adam and I have argued that IPCDs also pose at least two different kinds of risk to the dead. Firstly, there’s a risk that we reduce the dead to mere resources – what Heidegger called ‘standing reserve’ – for the living. Secondly, there’s a risk that we’ll use those resources to replace, rather than commemorate, the dead. Replacement may make the bereaved feel better, perhaps, but it comes at a steep moral cost: treating the dead, and indeed the living as well, as replaceable.
Sceptical Responses
However, these worries are premised on the idea that IPCDs will be able to replace the dead on an experiential level – that is, that we can experience sufficiently well-formed chatbots as being the dead. When I describe this scenario to people, many are deeply sceptical. Perhaps, they say, we can indeed develop bots that would be convincing in all the relevant ways. Yet we wouldn’t forget that the dead are dead – indeed, we cannot forget that the dead are dead – no matter how lifelike the bots we make of them are. We are not going to be like the (apocryphal) 1890s audiences seeing cinema for the first time and fleeing in terror from footage of an approaching train.
There is certainly empirical evidence for this objection. Tal Morse, writing in this blog, describes recent work showing that the Israeli public is resistant to the idea of embracing such technologies (Morse 2024). We can reasonably expect that these responses aren’t confined to Israeli society either.
Adapting to New Technologies
Yet it’s also the case that we humans adapt and re-embody ourselves to new technologies in ways that do not leave our concepts unchanged. A generation that grew up with telephones came to hear the voice of the other person through the earpiece, and not simply a mechanical reproduction thereof, and the first generation to grow up using Zoom regularly can be expected to have fewer qualms about telepresence than their forebears.
Elsewhere (Stokes, 2021), I’ve used David Oderberg’s notion of ‘telic possibility’ (Oderberg 2012) to describe the way in which we experience digital assistants like Siri and Alexa as if they might as well be real people. It’s not that we think Siri is a real person; it’s that for the purposes for which we talk to Siri, it simply doesn’t matter. The question is whether all our interactions with bots, including ghostbots, could become like that. Like those Gen Z students who no longer think in terms of file locations, it’s entirely possible that generations that have grown up around synthetic agents will no longer use the conceptual distinction between the dead person themselves and their digital simulacrum.
The concern here, then, is not so much that we might come to treat deathbots as if they really are the dead person rather than a replica. The real danger is that we will stop caring about the difference at all.
References
Buben, Adam (2015), 'Technology of the Dead: Objects of Loving Remembrance or Replaceable Resources?', Philosophical Papers, 44 (1), 15-37.
Lindemann, Nora Freya (2022), 'The Ethics of ‘Deathbots’', Science and Engineering Ethics, 28 (6), 60.
Morse, Tal (2024), 'Digital necromancy: users’ perceptions of digital afterlife and posthumous communication technologies', Information, Communication & Society, 27 (2), 240-56.
Oderberg, David S. (2012), 'Disembodied Communication and Religious Experience: The Online Model', Philosophy & Technology, 25 (3), 381-97.
Stokes, Patrick (2021), Digital Souls: A Philosophy of Online Death (London: Bloomsbury).