If this is accurate (Memory storage density - Wikipedia), you can fit a very large amount of data in a very small space. One square inch is roughly the size of your thumbprint, and 3 exabytes is (also according to Wikipedia) something like 20 times the data stored in all the printed material in the world. (That's ignoring all the hardware around the actual storage medium that you'd need to read it, though.)
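As a rough back-of-the-envelope check, here's that comparison in a few lines. Both figures are assumptions: the 3 EB/in² density comes from the Wikipedia claim above, and the ~0.15 EB estimate for all printed material is a commonly cited rough number.

```python
# Back-of-the-envelope check of the storage-density claim.
# Both constants are assumptions taken from the text / rough Wikipedia figures:
DENSITY_EB_PER_SQ_INCH = 3    # claimed areal density, exabytes per square inch
PRINTED_MATERIAL_EB = 0.15    # rough estimate of all printed material, in EB

EB = 10**18  # one exabyte, in bytes

bytes_per_sq_inch = DENSITY_EB_PER_SQ_INCH * EB
ratio = DENSITY_EB_PER_SQ_INCH / PRINTED_MATERIAL_EB

print(f"{bytes_per_sq_inch:.1e} bytes per square inch")
print(f"~{ratio:.0f}x all printed material in one square inch")
```

With those (rough) inputs, the claimed ratio of about 20x falls straight out of the division.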
The speed limit on processors is a more difficult question. On one hand, you have to move data around, and that movement is limited by the speed of light like everything else (and in fact signals in metal propagate somewhat slower than that). On the other hand, with multicore processors and parallel processing in general, you can still get work done while data is in transit. But it's no free lunch: writing programs that take advantage of parallel processing has been largely unsuccessful outside of certain "embarrassingly parallel" tasks (and yes, that's a technical term).
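To see why the speed of light already matters at these scales, here's a quick sketch: at a few GHz, light covers only about 10 cm per clock cycle, and on-chip signals are slower still. The 3 GHz clock and the 0.5c signal fraction below are illustrative assumptions, not measurements.

```python
# How far can a signal travel in one clock cycle?
C = 299_792_458        # speed of light in a vacuum, m/s
CLOCK_HZ = 3e9         # a typical ~3 GHz clock (assumption)
SIGNAL_FRACTION = 0.5  # signals in wires travel slower than c;
                       # 0.5c is just an illustrative assumption

cycle_time = 1 / CLOCK_HZ                 # seconds per clock cycle
light_per_cycle = C * cycle_time          # distance light covers per cycle
signal_per_cycle = light_per_cycle * SIGNAL_FRACTION

print(f"light travels {light_per_cycle * 100:.1f} cm per cycle")
print(f"a slower on-chip signal might cover ~{signal_per_cycle * 100:.1f} cm")
```

That ~10 cm per cycle is on the order of the size of the chip package itself, which is one reason moving data is as big a deal as crunching it.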
The basic trend in computing has been smaller and faster, and while there are UI issues with making interfaces too small, the chips that run those interfaces don't have to worry about that particular problem. There's also the growing issue of heat and heat dissipation (the cooling unit on my CPU dwarfs the CPU itself). Eventually someone will work out a good way to deal with that, maybe using room-temperature superconductors if anyone ever discovers one (though there are other ways around the issue), but thermodynamics is against the problem ever going away entirely.
As for the size limits of the chips themselves, we're already bumping up against those. Features are getting small enough that electrons are starting to quantum-tunnel between pathways, degrading the chip's performance. We're not at the theoretical limit just yet, but you at least need some way to get electrons or photons from one end of the chip to the other, though you could imagine shrinking that down to a single string of atoms.
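To get a feel for how close "a single string of atoms" is, here's a rough count of silicon atoms spanning a small transistor feature. Both numbers are assumptions for illustration: the 5 nm figure is a nominal feature size (marketing "node" names no longer match physical dimensions), and 0.235 nm is the approximate Si-Si bond length.

```python
# Roughly how many atoms span a small transistor feature?
FEATURE_NM = 5.0       # assumed feature size in nm (illustrative)
SI_SPACING_NM = 0.235  # approximate Si-Si bond length in nm (assumption)

atoms_across = FEATURE_NM / SI_SPACING_NM
print(f"~{atoms_across:.0f} atoms across a {FEATURE_NM:.0f} nm feature")
```

A couple dozen atoms isn't much room left to shrink before you really are down to counting individual atoms.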
All of that doesn't change the basics of computing, though. There are still problems (like the Travelling Salesman Problem) that will take too long for any traditional computer. But then there's quantum computing. If it lives up to its promise, it could radically change the kinds of things a computer can do, in ways whose consequences are hard to predict.
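To make the Travelling Salesman point concrete: an exact brute-force solver has to check on the order of (n-1)!/2 tours, which blows up factorially no matter how fast the hardware gets. The tiny 4-city distance matrix below is made up purely for illustration.

```python
import itertools
import math

def tsp_brute_force(dist):
    """Exact TSP: try every tour starting and ending at city 0."""
    n = len(dist)
    best = None
    for perm in itertools.permutations(range(1, n)):
        tour = (0,) + perm + (0,)
        length = sum(dist[a][b] for a, b in zip(tour, tour[1:]))
        if best is None or length < best:
            best = length
    return best

# A tiny 4-city symmetric distance matrix (made up for illustration):
dist = [
    [0, 2, 9, 10],
    [2, 0, 6, 4],
    [9, 6, 0, 8],
    [10, 4, 8, 0],
]
print("shortest tour:", tsp_brute_force(dist))

# The catch: the number of distinct tours grows factorially with city count.
for n in (10, 20, 30):
    print(n, "cities ->", math.factorial(n - 1) // 2, "distinct tours")
```

Four cities means checking only three distinct tours, but at 30 cities you're past 10^30 of them, which is why brute force stops being an option almost immediately and why a fundamentally different kind of machine is so interesting.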