As a PhD student who has spent over 24 months immersed in the varying concepts of ‘digital death’ (my PhD ponders whether or not some manifestations of it, like the ‘grief tech industry’, should be governed), I was recently tasked with thinking about my research abstractly, through the medium of music, in preparation for a PhD workshop at DORS#7, facilitated by Dr Dorthe Refslund Christensen and Dr Anu Harju.
This was hard but surprisingly helpful.
It was hard because no songs came to mind that directly referred to digital death and governance, but helpful because it made me take stock of my progress and of the direction I need to take to complete my doctoral research.
After forgoing songs about death or loss that I knew by Nick Cave, Stevie Nicks, Kylie Minogue and George Michael, I chose Marilyn Monroe’s ‘Diamonds Are a Girl’s Best Friend’.
There are two lines Monroe sings that made me think about my research:
- …but I prefer a man who lives and gives expensive jewels.
- …these rocks don’t lose their shape.
The starting point of my research was Digital Immortality, a term that I have criticised but grown to understand better. While there is no formal definition, meanings of Digital Immortality range from persistent online remains of those who once lived, to efforts in life extension via mind cloning. My perception is that Digital Immortality has been appropriated by Transhumanist and Posthumanist ideologies; they believe that digital clones can be imported into synthetic bodies. This association has become more prominent following recent advancements in Artificial Intelligence (AI) and is diverting attention away from sociological framings.
Our increasing reliance on technology is gradually encroaching on matters beyond life, such as leaving behind a digital legacy. There are undoubtedly valid matters here that deserve policymakers’ attention, but paying undue attention to dystopian views of AI risks diminishing the standing of digital death as a serious policy topic.
Further, the desires of the technological elite to move into the Metaverse or to biologically merge with technology suggest that what we have around us physically is insignificant and not worth maintaining or sustaining.
I expressed at the workshop that I would rather have a ‘real’ diamond than a synthetic or virtual one. I then analogised this to physical touch: I would prefer the warmth of someone’s body from a hug than the simulation of it via haptic technology, although I have recently been informed that there are now efforts to recreate body temperature virtually.
Also, humans naturally ‘lose their shape’. For me, when the body of a loved one is no longer here, they become diamond-like, in terms of their value and worth in memories. However, when some Transhumanists among the technological elite wish to eradicate death or grief or both, are they potentially diminishing the value and worth of our loved ones?
A crux of the matter brought to light by this musical exercise was the question of what is ‘real’.
Though the term could be deemed conservative in both a limiting and a traditional sense, I prefer ‘real’, despite being aware of its sensitivity within digital death. For example, suggesting that the simulation of someone who once lived is not ‘real’, or perhaps even a ‘fake’, could be harmful to a loved one interacting with it to manage their grief. Simultaneously, there may be a distinction that needs to be maintained: suggesting that the simulation is the person could also be harmful (a point raised at digital death events I attended after DORS#7, including ‘Death and the Digital Realm’ at Leiden University and the 2024 Digital Legacy Conference in Bern, Switzerland, where philosophers spoke of ontological objections to considering synthetic humans ‘real’). This is one of many ethical quandaries relating to governance and digital death.
However, ‘fake’ may not be so unpalatable a term after all if one thinks about deepfakes, i.e., AI-manipulated ‘real’ images, including visual and audio recordings. These are becoming a governance concern, particularly in relation to dis- and misinformation. Further, with deepfakes of murdered children being created without the awareness of surviving loved ones (e.g. Vallance, 2024), there have been calls to embed ethics and morals in the use of AI.
Since the workshop, I have also been reminded of the idea of the simulacrum. Unlike a simulation, which imitates or replicates something or someone, the simulacrum is linked to the creation of a reality that is more real than what it replicated, i.e., hyperrealism. Disneyland is considered an example (Baudrillard, 1994).
Therefore, perhaps we need to rethink what we consider the digital or virtual human-like entity to be. Maybe the simulation is neither ‘real’ nor ‘fake’ but of a different status altogether. If so, this needs to be agreed upon socially, and it perhaps requires new modes of understanding, recognition and conduct within an increasingly technologically mediated world. However, this cannot be left to the technological elite to dictate.
Baudrillard, J. (1994), Simulacra and simulation (S.F. Glaser, Trans.), University of Michigan Press (Original work published 1981).
Grimshaw-Aagaard, M. (2013), The Oxford Handbook of Virtuality, Oxford University Press.
Jacobsen, M.H. (2017), Postmortal Society: Towards a Sociology of Immortality, Routledge.
Meikle, G. (2023), Deepfakes, Polity Press.
Stokes, P. (2021), Digital Souls: A Philosophy of Online Death, Bloomsbury.
Strub, et al. (2024), La mort à l’ère numérique: Chances et risques du Digital Afterlife [Death in the digital age: opportunities and risks of the Digital Afterlife], vdf Hochschulverlag AG.
Vallance, C. (2024), ‘“Sickening” Molly Russell chatbots found on Character.ai’, BBC, 20 October [Online]. Available at https://www.bbc.co.uk/news/articles/cg57yd0jr0go [Accessed 11 November 2024].