This is an interesting ethics exercise. Given today's dominant view of the relationship between humanity and nature, I'm sure few will see any dilemma in this question: for most people, a human life is worth far more than any number of lives of other species. The interesting part of the conundrum is to ask why we feel this way. What makes a human being's life worth more than that of any other being? Why do we feel superior? Are we worth more because we're more self-aware or more conscious (assuming we can prove scientifically that this is actually true)? Or is anthropocentrism simply a hard-wired survival instinct, the self-preservation of the species? I know that if I had to choose between the life of a loved one and the life of an animal, I'd of course choose the human animal.

The truth of the matter is that this particular example is an extreme and rare situation. It covers a tiny fraction of all animal suffering, and in reality it's not something that (fortunately) many of us have to worry about. Like all extreme and rare cases, perhaps it should be treated as the exception. On the other hand, if we're going to evaluate animal rights honestly, wouldn't it make sense to start with the primary source of suffering? You know, the subjugation, farming, and slaughtering of millions of beings for meat, fur, leather, and cosmetics, none of which is any longer necessary for the survival of 21st-century Homo sapiens (or "wise" man). There are plenty of lifestyle choices we make daily that may or may not contribute to very real animal suffering going on right now. What I eat, what I wear, what entertainment I choose: these are simple decisions everyone can make based on individual ethical awareness and concern. If we first tackle the easier ethics problem, justifying the main source of animal abuse as a means to human comfort and pleasure, perhaps we'll be better equipped to deal with the exceptions, such as the case where a human life depends on it.
I don't disagree that life should be given more respect. However, where do we draw the line? What counts as a precious life? Is a worm's life precious? If so, what about a machine that emulates the "consciousness" of a worm? Is it now precious and worth protecting? Is it no longer a "machine"? There is an interesting convergence between the question posed here and the post I link to. There are going to be some interesting ethical questions for us moving forward.
Those are good questions that deserve deep thinking and very honest answers. I think the easiest way to arrive at the answers is to look at the motivations behind the ending of a life, whatever life it may be. Why are we killing the worm? Is it for a noble cause or a selfish one? I believe the motivation behind every action should be how we judge whether that action was worth it. In the case of the conscious killing of a being, however small, you have to ask: what have we gained? Was the sacrifice worth it? Most importantly, was it necessary?
Would you apply the same reasoning and questions to the ending of a machine that emulates the consciousness of a worm?
That question is about consciousness though, not about life. And certainly not about suffering or the right to exist or anything we associate with terminating an existing thing.
I admit this isn't something I have dedicated much time to researching or thinking about; I'm far more concerned with reducing the suffering that already exists. But my first impression is: why not? If we value humans because of our arguably higher consciousness, I don't see why we should have double standards for AI.
Because we don't understand it? You have nerves in your hands and feet (I don't know the exact number, but suppose it's more than in the entire worm under discussion), yet amputations are not considered destruction of consciousness. Hell, if consciousness is the result of any large enough spatial pattern, the particles in a liquid may be enough for thoughts to be created and destroyed there. A crowd of enough humans may be enough to form a primitive "higher" consciousness. The same question extends to the electrons in a computer circuit: they may be enough to represent a consciousness, or they may not be. No one has a clue.
Fair point. I don't rule out the possibility that consciousness may exist outside of the brain. In that case it may be wise to apply the precautionary principle and approach all of existence with respect, until we figure out what this thing we call consciousness actually is.