I do not disagree that life should be given more respect. However, where do we draw the line? What counts as a precious life? Is a worm's life precious? If so, what about a machine that emulates the "consciousness" of a worm? Is it now precious and worth protecting? Is it no longer a "machine"? There is an interesting convergence between the question posed here and the post I link to. There are going to be some interesting ethical questions for us moving forward.
Those are good questions that deserve some deep thinking and very honest answers. I think the easiest way to arrive at them is to look at the motivations behind the ending of a life, whatever life it may be. Why are we killing the worm? Is it for a noble cause or a selfish one? I believe the motivation behind an action should be the measure of whether that action was worth its cost. In the case of the conscious killing of a being, however small, you have to ask: what have we gained? Was the sacrifice worth it? Most importantly, was it necessary?
Would you apply the same reasoning and questions to the ending of a machine that emulates the consciousness of a worm?
That question is about consciousness though, not about life. And certainly not about suffering or the right to exist or anything we associate with terminating an existing thing.
I admit this isn't something I have dedicated much time to researching or thinking about. I'm far more concerned about reducing currently existing suffering. But my first impression is: why not? If we value humans because of our arguably higher consciousness, I don't see why we should hold a double standard for AI.
Because we don't understand it? You have nerves in your hands and feet (I don't know the exact number, but suppose it's more than in the entire worm under discussion), yet amputations are not considered destruction of consciousness. Hell, if consciousness is the result of any large enough spatial pattern, the particles in a liquid may be enough: thoughts created and destroyed. A crowd of enough humans may be enough to form a primitive "higher" consciousness. The same question extends to the electrons in a computer circuit. They may be enough to represent a consciousness, or they may not be. No one has a clue.
Fair point. I don't rule out the possibility that consciousness may exist outside of the brain. In that case it may be wise to apply the precautionary principle and approach all of existence with respect, until we figure out what this thing we call consciousness actually is.