AIs smart enough to reëngineer themselves could certainly win a war against us... but when the time came, they wouldn't have to. They could keep us around and manipulate humanity into whatever shape they wanted; they could wipe us out physically but upload some of us into simulations for study; they could do all sorts of things my squishy little mind would never think of. I'm pretty sure they'd keep some form of human mind running for quite some time, out of curiosity. I don't see the problem with this; I'm a lot more attached to the future of sentient life than to any particular genetic framework. Humans avoiding AI would be like single-celled organisms never evolving into multi-celled organisms out of fear of competition, or like monkeys each living in its own private world, never producing offspring who would start speaking to one another and form a society -- self-preservation is a fine instinct, but sometimes there's something much bigger than our selves at stake.