The thing is that you are essentially using the waterfall method as the basis for your reasoning, and seemingly making the false assumption that all development follows that paradigm - which is not the case. That is understandable, mind you - the waterfall paradigm is in many ways the most immediately obvious, because you have a problem, you work on a solution, and then you test that solution in a strictly sequential manner - like a waterfall, hence the name. The issue is that as the problem grows in complexity, the method becomes increasingly cumbersome, because you have many more possible points of error to address in retrospect, and each of those errors may or may not require altering a significant amount of what you have already done.
In response to these kinds of problems, a great deal of thought and research has been put into inventing other paradigms that avoid the problems the waterfall method faces with increasing complexity. Several accomplish this by dividing the initial problem into many smaller problems, which are then solved one or a few at a time - in my field, such methods are called "agile" methods.
I will not bore you with the entirety of the details (not to mention this is already a significant digression, and I would rather not stray too far from the topic at hand), but suffice it to say that agile paradigms take a very Darwinian approach to development, creating a new iteration (or "generation", if you will) of the thing being developed at rather short intervals - usually every one or two weeks. Each iteration is implemented, tested and observed at the end of its cycle, and the next iteration then builds on that version, implementing something slightly more complicated, until the project arrives at the desired result. Essentially, you end up with a model where the very first version of the X you are creating is built in one or two weeks, but the final version that is ready for release can easily take up to 18 months (or, at worst, several years!) to complete.
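The iterative loop described above can be caricatured in a few lines of code. This is purely a toy sketch to illustrate the shape of the process; the backlog features and the function name are invented for illustration and do not refer to any real framework:

```python
# Toy model of iterative ("agile") development: each iteration builds a
# slightly more complete version ("generation") on top of the previous one.
# All names here are illustrative, not a real methodology or tool.

def develop(backlog, weeks_per_iteration=2):
    versions = []
    product = []                        # version 0: nothing exists yet
    for feature in backlog:             # one small problem per iteration
        product = product + [feature]   # next generation extends the last
        versions.append(list(product))  # "release", test and observe it
    return versions, len(backlog) * weeks_per_iteration

backlog = ["login", "search", "profiles", "messaging", "moderation"]
versions, total_weeks = develop(backlog)
# A first working version exists after one iteration (two weeks),
# but the complete product only after all five (ten weeks).
```

The point of the sketch is the asymmetry: the first usable generation appears after a single short cycle, while the finished product takes many times longer - the same asymmetry the argument above rests on.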
Essentially, this kind of development model, where the time required to improve X is vastly longer than the time required to create X, (no problem with that Sir!) is the process that has created the web browser you are reading this on, the operating system that said web browser is running on, the HTTP protocol with which the web browser communicated with the server that hosts this forum, and the HTML standard with which this text is encoded. Basically, if your abstraction were right in a general sense, I very much doubt you would be able to read this reply right now.
I would like to end this significant digression (it's interesting reading; you are hereby invited to construct more of the sort) with a question, though, in the hopes that it brings my post somewhat back on the relevant track. Namely - considering how, when we attempt to mimic an evolutionary development paradigm for human projects, we end up with a situation where creating an initial version of X
is vastly swifter than improving upon X, does it not also reasonably follow that evolution itself would follow a similar sort of time curve? (I'm not comfortable using the concept "evolution itself")
(edit: reference links to certain concepts
Waterfall model - Wikipedia, the free encyclopedia
Agile software development - Wikipedia, the free encyclopedia)