We were discussing this on the Hubski IRC the other day. At some point in the near future, transistors will have to be smaller than the electrons they're switching in order for Moore's Law to hold up. Devac mentioned that there are physicists working on atomic transistors. Even if those do make it out of bleeding-edge research labs, I suspect we'll have a lot of issues just because anything that small is incredibly susceptible to glitches from electromagnetic radiation: cell phones, Wi-Fi, and whatnot.
Personally, I think the future of computing lies in programming language theory. Effectively, we have three problems: writing fast code is hard, writing code that runs across multiple cores/CPUs/systems is hard, and people want programs that don't fail (or fail less). I think the way to solve these problems is to develop languages that carry enough information for the compiler to prove programs correct. Once your compiler can prove things about your code, it can take advantage of that to generate fast, parallelizable, correct code.
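There's already a small, shipping example of this idea: Rust's borrow checker. This is just a minimal sketch of the point above, not a full proof system, but the compiler statically verifies that the two threads below can only read the shared slice, so a data race is a compile error rather than a runtime bug:

```rust
use std::thread;

// Sum a slice on two threads. The compiler proves there is no data race:
// both threads hold only shared (read-only) references into `data`, and
// `thread::scope` guarantees they finish before the borrows end.
fn parallel_sum(data: &[i64]) -> i64 {
    let (left, right) = data.split_at(data.len() / 2);
    thread::scope(|s| {
        let l = s.spawn(|| left.iter().sum::<i64>());
        let r = s.spawn(|| right.iter().sum::<i64>());
        l.join().unwrap() + r.join().unwrap()
    })
}

fn main() {
    let data: Vec<i64> = (1..=100).collect();
    // 1 + 2 + ... + 100
    println!("{}", parallel_sum(&data));
}
```

If either closure tried to mutate `data`, the program simply wouldn't compile. That's the "prove things about your code" part doing real work: the parallelism is safe by construction, not by testing.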
But...who knows! The future may just as easily lie in various quantum-ish computers or some computational application of biology.
I think the future of programming lies in the same place the future of the motor does: a brief era of massive expansion, followed by the technology hitting a fairly impassable brick wall and our ideas about its possibilities becoming sensible.
There are still people out there who believe it is possible to create a technological singularity, where a computer becomes infinitely smart by improving itself. This is like people before relativity thinking we could travel anywhere instantly if we just kept making vehicles faster.
Ultimately, we've had hyper-efficient, complex, thinking machines around for years: easily produced, and constantly finding ways to improve their capabilities. They just demand food, water, and payment, and aren't great at menial tasks.
So we make something that is. And with things like neural networks and machine learning, we're finding that as computers become as smart and capable as humans, they lose that "ultra-efficient" trait and start making the same stupid mistakes, overgeneralizations, and creative failures that humans do.
We will reach a day when the computer is nothing more than a device we use for things. Today it is a cultural and technological icon of the present power of technology; tomorrow, computers will be a fact of life like the telephone, the atom bomb, or the car.