You're right - my question was "what is it." That's a philosophical question, however. The practical question, if we're arguing about assigning human rights to software, is "how do we distinguish it?" My degree is in engineering, and my work has always been practical. Abstractions and metaphors are great for understanding, but when you need to build something you have to start with lumber and screws (metaphorically). Philosophers love to go "I know it when I see it" without recognizing that that sort of definition only enables totalitarianism; if you want any hope of egalitarianism whatsoever, you need everyone to agree on rules that can be applied and standards that can be measured.

This is fundamentally the problem with the whole TESCREAL movement, which has an unnerving overlap with the Hacker News posse: I reject your concrete morality of today for an abstract morality of my own choosing tomorrow. Sam Bankman-Fried founded Alameda Research with a loan from fellow Effective Altruists on the premise that if they all got rich they could help more people. The EAs, of course, extended Sam terms at 50% APR, and Sam told Michael Lewis that he needed "infinity dollars" to help everyone he wanted to help - that's why they stole from customers despite pulling in $250m a month in revenue. So in the end he lost a billion, stole eight more, and ended up giving a whopping $90m to Democrats (and $10m to Republicans) without so much as donating to a food bank.

I have no beef with the concern others show for hypothetical beings... until it becomes an excuse to disregard actual ones. An actual being needs an actual evaluation. The ASPCA has managed this without any difficulty, as have generations of politicians. We inherently understand when our fellow creatures are suffering, but there's been a lot of resistance to the idea that the suffering of any potential synthetic creature must also be quantifiable. Otherwise, we just have to take Marc Andreessen's word for it.