The ethics of posthumous avatars
We now have products that scan family members to turn them into posthumous avatars. The tagline: “With 2wai, three minutes can last forever.” It's weird to have this so soon. As someone who is down with the idea of a posthumous digital consciousness that my kids can interact with, even I find this too weird. The problem is that it uses video to serve as a replacement for a deceased relative. A few boundaries that are important to me:
- By keeping it text-based instead of video, it’s more like you’re interacting with a proxy of my mind instead of my body/soul. It won’t register in my child’s brain as “me” and so it will be less confusing, less toxic to the grieving process.
- It should refer to me in the third-person, even if it is trained on me and sounds like me. It should not be an imposter of me, but a proxy/guide of my thoughts/beliefs, almost like an elder guide.
- It should cite my original logs/essays/journals. In effect, this makes the experience similar to something we already have: reading your grandparents' journals. It just makes it possible for your questions to immediately summon the relevant wisdom.
The comment section was in unanimous agreement:
- This is one of the most vile things I’ve seen in my life.
- You are a psychopath.
- Shoot that guy.
- You’re creating dependent and lobotomized adults by doing this.
- Demonic, dishonest, and dehumanizing.
- Hey so what if we just don’t do subscription-model necromancy.
- Oh goody, another way for people to completely lose touch with reality and avoid the normal process of grief.
- Nightmare fuel.
- I don’t see how people can say demons aren’t real when there are beings around us willing to create shit like this.
- “You will live to see manmade horrors beyond your comprehension.” — Tesla.
I’d say this is an extremely lightweight microcosm of the core dilemma the 2040s will face: a moral war over technology that changes the constraints of human life.