Q: As far as I can tell, in spite of the countless millions or billions spent on OOP education, languages, and tools, OOP has not improved developer productivity or software reliability, nor has it reduced development costs. … In general, though much lip service is given to talk of "re-use" the reality is that unless a piece of code does exactly what you want it to do, there's very little cost-effective "re-use." … The real world isn't "OO," and the idea implicit in OO—that we can model things with some class taxonomy—seems to me very fundamentally flawed. … So what am I missing here?
What are the benefits and drawbacks of OOP?
There's a rigid structure inherent in OO development that demands consistency, attention to detail, and a specific separation of concerns. However, by the time the GoF book came out, software engineering was no longer the domain of dedicated and experienced software engineers. Today, all coders are expected to have a grasp of engineering even though, in my experience and that of my students, most receive no education beyond basic syntax and module assembly paradigms. In other words, very few are actually teaching proper engineering. The result is exactly what you describe.
But you also challenge the very nature of OO programming, and it deserves to be challenged. The entire family of C-based languages and their offshoots completely rely on this noun-based thinking, which, as you pointed out, needs to be built properly in order to truly be reusable. Verb-based functional programming is seeing a resurgence lately and attempts to correct these issues. However, functional programming also seems to be more challenging to learn, even if it is theoretically more intuitive than OO.
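To make the noun/verb contrast concrete, here is a minimal Python sketch (the `Invoice` name and figures are purely illustrative): the OO style attaches behavior to a class, while the functional style passes plain data through a standalone function.

```python
from dataclasses import dataclass

# Noun-based (OO) style: data and behavior are bundled into a class,
# and callers invoke methods on an instance.
@dataclass
class Invoice:
    amounts: list

    def total(self) -> float:
        return sum(self.amounts)

# Verb-based (functional) style: plain data flows through a free function
# with no object identity or hidden state.
def total(amounts: list) -> float:
    return sum(amounts)

print(Invoice([10.0, 20.0]).total())  # 30.0
print(total([10.0, 20.0]))            # 30.0
```

Both compute the same result; the difference is where the behavior lives, which is exactly what makes OO reuse hinge on getting the class design right.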
So, what's the fundamental problem? Is OO itself flawed? Is functional programming too much of a mind-bender? I venture to say that once one has absorbed and processed enough syntax, computer science, and engineering, software development becomes more of an art than a science. That's the real reason there's so much generalization and variation in coding best practices, and why preference for one paradigm over another is personal and highly subjective. Challenge ten professional engineers to build the same application and you'll get ten working applications, each done very differently from the others. Why? Because at that level coders are more like artists with their own idioms, preferences, and styles.
When you ask why all the time and money has failed to make software any better, perhaps consider yourself a critic of an artistic era and remember that eras, as in art, come and go. Innovators and early adopters will seem prescient, but years later they'll face the same demand for something new and better. Be assured, though, that plenty of computer scientists are constantly pushing the boundaries, trying to create a better programming language. Maybe you're one of them?
And here's a thought: people are currently writing machine-learning algorithms so that computers can write their own code. This entire conversation of ours may be obsolete at some point in the near future. I'd truly miss the art, if you ask me.