He mentions the importance of automated testing once or twice, I think, but it isn't hammered on the way a modern Agilist would. But remember, 1970 was a different world. You may literally not have had room to include unit tests in your code, or been unable to afford the sort of abstractions that enable unit testing, for instance. Other parts of that book talk about the difficulty of squeezing bytes out of pages. A lot of modern Agile practices were not possible back then, or were so different as to be entirely different things. TDD would probably have been laughed at by all, for instance: "What, you want me to waste my precious timeshare time to run tests I know are going to fail?"
I don't have the reference, but I read somewhere several years ago when digging through old journals that regular testing was part of the Project Mercury? Gemini? software development process.
I would not be surprised if someday we understand why agile works for some projects, and find that it basically comes back to the "vision" observation in The Mythical Man-Month.
Of course, this really just betrays that I want that to happen. So... take it for what it's worth.
It's the politics that change. Sometimes we use a different tool to prove a point. What we lose in productivity we sometimes make back in social progress (ex: yanking control away from certain corporations).
I do hate the reinvention of the wheel too, but it's not all pointless.
I don't think hardware has anything to do with it. I think the most important factor in this evolutionary cycle is that every 4-6 years, new Computer Science/Software Developer STUDENTS enter the market, spend a few years catching up, and then try to re-invent everything, rediscovering territory they'd never really learned, territory the industry had already moved through at a much greater pace than Academia.
No, I mean hardware is genuinely progressing. Software is going in circles. It's a culture thing. Legacy code should mean "battle tested code" not "boring, let's replace it!" We are so busy replacing perfectly working code with the latest fashion that we never advance.
Yeah, and I mean that it's not being driven by the hardware folks; it is in fact a cultural phenomenon arising from academia and industry running in different gears.
Yes, hardware is progressing at an amazing pace (though we've reached a few limits), and we've pushed technology into quite a lot of cracks and crevices in the last 40 years (at least, that's as long as I've been in the computer scene)... but what I observe is that, every 3-6 years, there are cycles of entropy and creation which are a result of the endless churn of 1. Industry -> 2. Academia -> GOTO 1.
Today's Rubyist is tomorrow's Cobol'er. It goes on and on, and this is - in my opinion - a people thing. Not a hardware thing.