OK, I presented this "thought experiment" maybe a year ago and got a couple of responses that gave me more to contemplate (which I will include here), but I am still stumped:

Imagine we have two identical rockets whose engines produce enough thrust to accelerate them at 11 g. We put them on the surface of the Earth (or better yet, on a planet with the same mass as the Earth but no atmosphere) and launch them. The first rocket's engine burns for one second before shutting off, and the second rocket's burns for two seconds. As I understand it, each would accelerate upward at approximately 320 ft/s/s (11 g of thrust minus 1 g of gravity leaves a net acceleration of about 10 g).

The paradox for me is that while the second rocket would reach four times the height of the first before falling back to the planet, it seems it would only use twice the amount of fuel. I reason this from Newton's famous equation F = ma: if the mass of both rockets is the same and the force from both engines is the same, the acceleration should be the same, especially with no friction from an atmosphere. There is nothing in F = ma that qualifies a rocket's acceleration based on its speed.

In my previous post, someone clued me in to the equation W = Fs, or work equals force times distance. Using that equation, I think the second rocket might only go up twice as high; however, I am not able to reconcile the two equations with each other.
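In case it helps, here is a rough sketch in Python of the kinematics I have in mind. It assumes a constant net upward acceleration of 10 g during the burn (11 g of thrust minus 1 g of gravity), ignores the mass of fuel burned, and splits each flight into a powered phase and a coasting phase:

```python
# Rough sanity check of the two-rocket thought experiment, assuming a
# constant net upward acceleration of 10 g during the burn and ignoring
# the change in mass as fuel is burned.
g = 32.0        # gravitational acceleration, ft/s^2
a_net = 10 * g  # net upward acceleration during the burn, ft/s^2

def peak_height(burn_time):
    v = a_net * burn_time                # speed at engine cutoff, ft/s
    h_burn = 0.5 * a_net * burn_time**2  # altitude gained during the burn
    h_coast = v**2 / (2 * g)             # extra altitude gained while coasting
    return h_burn + h_coast

h1 = peak_height(1.0)       # 160 + 1600 = 1760 ft
h2 = peak_height(2.0)       # 640 + 6400 = 7040 ft
print(h1, h2, h2 / h1)      # the ratio comes out to exactly 4
```

Under those assumptions, doubling the burn time really does quadruple the peak altitude, which is exactly what I can't square with only twice the fuel being used.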

Any help or further insights into this quandary of mine would be greatly appreciated.