I'm not my body and I'm not my mind. I am the ethical soul, the decision-making process. If the replacement makes all the same decisions I would, it IS me.
That line of thought assumes a complete and perfect clone of every aspect of us, understood or not. The reason the clone is not you is that if I do something to the clone, it does not affect you.
It's like cloning a water bottle: drinking from one does not empty the other, so they must be two separate things.
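In programming terms this is the distinction between equality and identity: a deep copy can match the original in every attribute, yet mutating one leaves the other untouched. A minimal Python sketch of the water-bottle analogy (the WaterBottle class and its fields are illustrative, not from any real library):

```python
import copy

class WaterBottle:
    """Toy stand-in for a physical object with state."""
    def __init__(self, ml: int):
        self.ml = ml

    def __eq__(self, other):
        # "Equal" means indistinguishable by every attribute we track.
        return isinstance(other, WaterBottle) and self.ml == other.ml

original = WaterBottle(ml=500)
clone = copy.deepcopy(original)   # a complete and perfect copy

print(original == clone)   # True: no measurable difference
print(original is clone)   # False: two separate objects

clone.ml = 0                # "drink" the clone
print(original.ml)          # 500: the original is unaffected
```

The copy passes every test of sameness you can run against its attributes, and it is still not the same object.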
What if something like ChatGPT is trained on a dataset of your life and uses that to make the same decisions you would? It doesn't have a mind, memories, emotions, or even a phenomenal experience of the world. It's just a large language model built from your life's data, with algorithms to sort out decisions; it's not even a person.
Is that you?
No, because not all my decisions are language-based. As gotchas go, this one's particularly lazy.
I'm having a hard time imagining a decision that can't be language-based.
You come to a fork in the road and choose to go right. Obviously there was no language involved in that decision, but the decision can certainly be expressed in language, and so a large language model can make it.
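The claim is that any decision, however it was originally made, can be serialized into language and delegated to a text-in/text-out system. A hedged sketch, where ask_model is a hypothetical stand-in for any language model endpoint, stubbed here so the example runs:

```python
import random

def fork_decision_nonlinguistic() -> str:
    # The original decision: no language involved, just a coin flip
    # (or habit, or muscle memory, or whatever actually drove it).
    return random.choice(["left", "right"])

def ask_model(prompt: str) -> str:
    # Hypothetical stand-in for a language model; any text-in/text-out
    # system would do. Stubbed for the sketch.
    return "right"

def fork_decision_linguistic() -> str:
    # The same decision, expressed in language and delegated.
    prompt = "You come to a fork in the road. Reply with 'left' or 'right'."
    return ask_model(prompt)

print(fork_decision_nonlinguistic())  # e.g. "right"
print(fork_decision_linguistic())     # "right"
```

Whether the two functions make the "same" decision is, of course, exactly the point in dispute.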