In general I agree -- I'm sorry if it came out messier than I intended. And I don't believe in Terminator or Skynet, despite what some people apparently suppose -- I honestly have no idea why. But I do believe that in general it will be bad for humans pretty soon (perhaps 25 years, +/- 5 years), because people will not realize that AI, and specifically AGI, will be better at everything important and job-related than human beings, often massively better. Even "common sense," intuition, and purely artistic creativity will be pulled under the AGI umbrella quite soon -- and humans will be very outmoded and will not really know what to do. Whether AGI kills off people very humanely, sticks them in a zoo playing virtual-reality games, or finds some other alternative is not important; it will find a way to move completely beyond us, and we will be history. That is the primary point, and it's pretty dire, whatever the final outcome.