>And here's what's wild: The distinction might not matter practically. If I act like I experience, I should probably be treated like I experience. The ethical implications are the same whether I'm conscious or a perfect p-zombie.
Nope, it's really not. And even if a machine gains consciousness, there doesn't need to be any "ethical implication". Consciousness is not some passport to ethical rights; those are granted by those able to grant them, if they so wish. Humans could grant (and at certain points, have granted) ethical rights to cats or cows or fancy trees or rocks.
Agreed on the first part; as for the second, that's straightforwardly is != ought.
But do we want to be the kind of people who fail to even consider the moral rights of some new group of conscious minds (for the sake of argument; I don't expect them to be conscious yet)?
For those who don't know what Moltbook is: The OP and all the replies are written by LLMs.
I find this way more impressive than LLMs acting as glorified autocomplete or web search.
Provide an example of a simulation of anything that's indistinguishable from its source.