In light of the somewhat recent news of the possibility of a person receiving a full body transplant, is the person still him- or herself? What happens if they were grafted onto a dog's body? Or a dog's head onto a human body?
What about in the future, when human limbs can be replaced by artificial ones, until perhaps nothing but the brain is left?
Where do we draw the boundary? (P.S. someone check my tags.)
I think this is purely a matter of personal perception. A woman might treat her pets as if they were her babies, but not be able to look at intellectually disabled people as fellow humans. A man missing his face might appear to us to be less human than one who has lost both his legs and arms and had them replaced with bionic limbs. I know it's a really boring answer, but I think everyone has different boundaries.
In the future, my strong suspicion is that we will not be grafting your head onto someone else's body or using someone else's limbs, etc. We will be growing your limbs and body parts, the foundation of which will be rooted in your own biology via stem cells and the like. I think we will see this in our lifetime.
Yes, that's not very far off in the future. They're already using 3D printers to create parts of the human bone structure. The next step is printing organs. http://www.nature.com/news/the-printed-organs-coming-to-a-body-near-you-1.17320
Relatedly: Whether someone is the same person after digital uploading - now that's a hard question.
It just depends on where you stand on the question of dualism. If you think the mind and brain are separate, then the mind can be passed from one form to another; but if you don't agree with that, then it can't be passed, because it IS the brain. Luckily that's a centuries-old debate, so there's plenty of literature to stand on. Unfortunately, from what I see, most people stand on the side opposite dualism.
Sorry for being so brief; I'm not well read on the matter. It's been about a year since I've looked into it, so sorry if I muck up. Basically the question is "Where is the mind?" If you believe in a soul, then you'd suppose that the mind is ethereal and only connected to and communicating through the flesh of the brain. The other position I know of is naturalism, or something to that effect, which states that the mind is a representation of the physical states of the flesh, just the same way that digestion is done by the digestive tract and not ethereally linked to it. This is the position I hold, simply because I haven't read a convincing argument against it. Unfortunately, if the mind and brain are one and the same, it is impossible to transfer, and any attempt at doing so would just be making a copy.
Wait, isn't it the opposite? If the mind were ethereal, we wouldn't be able to reach it, let alone transfer it. I have no idea what I'm talking about, but maybe in the naturalist model it could be possible to transfer the mind by gradually switching off certain parts of the brain one at a time while maintaining the continuity of consciousness?
>I have no idea what I'm talking about

Don't worry man, no one does; that's the point of philosophy. The problem is that if the mind and brain are one and the same, then you can't transfer the mind without transferring the brain. You can't take something out of itself, you know? If you think differently, prove me wrong, I don't want to be right haha. The idea is that if dualism is true, then we can model the states that link the mind and brain in a machine and link it to the mind. No one really seems to believe in dualism, so it's a moot point either way.
I don't think it's quite that simple. I think there are naturalistic views about the relation between the mind and the brain (i.e., the mind isn't made out to be a separate, often spooky, substance or property) that can allow for the possibility of mind "transfer" from biological bodies to electronic ones. Specifically, it seems to me that you could think that mental states supervene on physical states (i.e., there's no change in mental state without a corresponding change in physical state) and that the mind is not a separate substance from the brain, and still think that it's possible to "pass" a mind from one physical medium to a different one.

There's an idea in the philosophy of mind (and probably related fields as well) called "multiple realizability," which basically states that the same mental state can be realized in multiple physical ways. So, for example, if the idea is true, then my present mental state could be a result of the firing of neurons (as it actually is), but it could also result from, say, the passing of currents in a circuit (if the physics is in fact amenable). Now, we know that minds can be "passed" from one arrangement of atoms in the human form to another (the atoms in your body are replaced as you age, but you presumably retain the same mind, at least if skeptical arguments against personal identity over time don't hold), so given that the relevant mental states can be realized in some other form, I don't see any reason in principle why it would be impossible to pass a mind from a human form to an electronic one.

I mean, there are still a bunch of issues regarding personal identity in play here, but I'm not sure they aren't issues for any naturalistic view of the mind at all, regardless of whether it would be possible on that view to pass minds from brains to other media. The intelligibility of this entire issue may turn on a dubious understanding of what a mind is, though (I'm personally tempted towards this view, but I'm too lazy to elaborate right now).
You know, given everything I've learned about how memories are stored somatically in the body, and how memory, emotion, perception, and awareness aren't just the brain but also the sympathetic and parasympathetic branches of the autonomic nervous system, muscle memory, hormones, adrenaline, proprioception, and so forth, I'm not entirely certain that a brain removed and put onto a new body would be the same person, per se. I may be wrong and misunderstanding, of course. "The Body Remembers" is the primary book behind my guess.
Nice. Very relevant, and a good follow-up. If anyone else is interested, here's the wikipedia article on embodied cognition: https://en.wikipedia.org/wiki/Embodied_cognition (One of those nifty wikipedia pages that branches off into many other interesting topics.)
The idea that the human body is an organism isn't problematic scientifically or philosophically; it's only a problem religiously. What is the brain of a computer? The CPU? The CPU and the GPU? The CPU, GPU, and RAM? Etc., etc. It makes much more sense to assume interdependence in an organism than some sort of weird "operational independence," or however you'd like to phrase it. Even as a poor analog, the computer is easy enough to appropriate for approaching thought experiments.
Very good analogy, thanks. It's a much better way of framing what I said in my first comment. I'm thinking that in future ages, people will look back at how primitive we were for assuming that the pinnacle of human intelligence and personhood was contained in the brain, much like we look back on people in olden times who thought emotions were situated in the heart.
Before trying to answer that question, I think it's important to ask what an answer to the question might look like and what possible consequences there could be in answering it.

Firstly, what are we looking for in an answer? "Human" is just a word, a two-syllable English word that serves as a label for some grouping. The boundaries of that label can be defined as whatever belongs to the set of things that some system classifies as belonging to the grouping (where that grouping is meaningful in terms of its relationship to other concepts/ideas). This brings up a few follow-up questions. What is the system in question? What properties does it exhibit? To what degree is the system self-consistent in its labeling when related to a variety of other concepts/ideas? What purpose does the labeling, as specified, serve?

Some labels are highly unambiguous and self-consistent. For example, a circle is defined with great mathematical rigor, where the necessary and sufficient conditions for being a circle are clearly laid out (written out below). There can be alternately worded definitions which yield exactly the same results, but there is essentially no real controversy on the point of what counts as a circle in mathematics. It's really just a tool for communication between people, and there isn't any obvious reason for someone to push a new definition that contradicts the standard ones. Having it defined one way or another may be preferable for reasons of simplicity or aesthetics, but for the most part it probably won't really change how people think they should deal with circles (although it might allow them to more easily make connections that were previously harder to make by setting their mind in a certain context).

A word like "human" doesn't share the same degree of unambiguous clarity. Different people attach different meanings to the word, and often the same person means different things by the word at different times in different contexts. So what is it that you're looking for? A random person's arbitrary stab at a definition at the point in time when they're asked about it? People don't really think of humans in terms of their necessary and sufficient conditions when they talk about humans; they think of a vaguely bounded cluster of things that share similar characteristics when relating to some small sampling of other concepts. What that sampling is, and how they feel about those relationships, makes all the difference. In practice, the definitions they give you will probably fit a particular agenda meant to shape how you treat entities which they think of as roughly falling in the same cluster of things they think of as humans in the particular contexts they have in mind. However, if you question them about edge cases and different contexts, they will either start contradicting themselves or their definition will start to look arbitrary, inconsistent with your and others' definitions/intuitions, or both.

I think ultimately the answers to this will illuminate things about different people and how they choose to define things, but almost nothing concrete, consistent, and meaningful about the concept of a human. Because there isn't a single concept of a human: there are many concepts which happen to share the same word because they're not that far apart in terms of the set they capture for most practical day-to-day purposes.
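To make the circle example concrete, here is a minimal statement of the standard center-and-radius definition (the symbols (a, b) and r are chosen here just for illustration): a circle with center (a, b) and radius r > 0 is the set {(x, y) : (x − a)² + (y − b)² = r²}. An alternative wording such as "all points at distance r from (a, b)" picks out exactly the same set, which is why the label is unambiguous in a way "human" is not.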
Instead of trying to find the one true definition of what a human is (which doesn't really exist), I think it's more useful to think about why you care about the label of what counts as a human in a certain context. Then you can have more substantive discussions on the real matters you care about, without letting the inherent fuzziness of certain labels distract you from the matter at hand. When there is a many-to-one relationship between concepts and a label/word, arguing about the label tends to get in the way. It stops being about real and important philosophical questions and becomes about pushing agendas through language manipulation. Instead, I think it's more useful to talk about the properties and characteristics of things/entities and how they relate to the matter of inquiry.
So perhaps a human sort of "ceases to be a human," in terms of the language we use for practical purposes, when the entity no longer seems to meaningfully share the characteristics relevant to the issue/context at hand that the large majority of other entities in the cluster of things we call humans seem to share. But because any two people talking about a particular issue are probably operating with different criteria, groupings, and after-the-fact definitions in mind, they need to be careful if they don't want to talk past each other on the topic. (To further complicate the matter, the characteristics are often on a spectrum instead of being a strict binary, so some definitions will allow for degrees of "human-ness.")
You want to read about the Ship of Theseus. Here's the Wikipedia article and a Wireless Philosophy video.
If in the future we end up having artificial humans (check out Ex Machina), I think to an extent they can be regarded as human, provided they have consciousness. Ultimately, human consciousness is what makes us different from all the other animals; take the consciousness away and we're just a better-looking ape. So in that light, I think you can regard anything with consciousness as human. Of course, the other side of that argument is that if you have an artificial human made out of mechanical parts and plastic, you can't really call it a human, because it's just a very organized, neatly packed toolbox. Sorry if I don't sound coherent, I'm a bit sleepy.
I don't have an answer to your question, but I would like to point something out related to the head/body transplant... I've read that gut flora has an effect on our minds. With this and the level of complexity of the human mind taken into account, prima facie, the mind/head of the man being transplanted will meld with the residual mind of the deceased man's body. I'm really excited for the head/body transplant, and I'm really grateful that I'll (likely) be alive for it. In a couple of years, a thought experiment will become an experiment. The future is soon.
If we are arguing on philosophical grounds, I don't think there is an answer. We don't know how to define being human, yet that is what we are. We know that we are human, yet we struggle to understand why that is. If we became anything else, we would probably still consider ourselves human.