We were discussing this on the Hubski IRC the other day. At some point in the near future, transistors will have to be smaller than the electrons they're switching in order for Moore's Law to hold up. Devac mentioned that there are physicists working on atomic transistors. Even if those do make it out of bleeding-edge research labs, I suspect we'll have a lot of issues simply because anything that small is incredibly susceptible to glitches from electromagnetic radiation: cell phones, wifi, and whatnot.
Personally, I think the future of computing lies in programming language theory. Effectively, we have three problems: writing fast code is hard, writing code that runs across multiple cores/CPUs/systems is hard, and people want programs that don't fail (or at least fail less). I think the way forward is developing languages that give the compiler enough information to prove that programs are correct. Once your compiler can prove things about your code, it can exploit those proofs to generate fast, parallelizable, correct code.
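You can already see a small version of this idea in Rust, where the borrow checker proves at compile time that threads never alias mutable data. Here's a minimal sketch (the `parallel_sum` function is just my own illustration, nothing standard): the compiler verifies that the two halves of the slice are disjoint, so the threaded version can't have a data race. If you tried to mutate the same data from both threads, it simply wouldn't compile.

```rust
use std::thread;

// Sum a slice on two threads. The borrow checker proves the two
// halves are disjoint borrows, so no data race is possible; code
// that aliased the same data mutably from both threads would be
// rejected at compile time.
fn parallel_sum(data: &[u64]) -> u64 {
    let (left, right) = data.split_at(data.len() / 2);
    thread::scope(|s| {
        let l = s.spawn(|| left.iter().sum::<u64>());
        let r = s.spawn(|| right.iter().sum::<u64>());
        l.join().unwrap() + r.join().unwrap()
    })
}

fn main() {
    let data: Vec<u64> = (1..=1_000).collect();
    // 1 + 2 + ... + 1000 = 500500
    assert_eq!(parallel_sum(&data), 500_500);
    println!("sum = {}", parallel_sum(&data));
}
```

That's the principle in miniature: the more the compiler can prove about your code, the more aggressively it (and you) can parallelize without breaking correctness.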
But...who knows! The future may just as easily lie in various quantum-ish computers or some computational application of biology.