The Waterfall Model was a straw man argument from the very beginning

Over the years, I've read a lot of books on software engineering. Agile books in particular like to refute the "Waterfall Model". Every time I read about the Waterfall Model, I think to myself: I'm sure there are companies out there that have tried this approach, but I have a hard time believing any software book ever seriously proposed it as the best way to write software. It must be the bogeyman of software engineering methodologies.

I was reading The Practice of Cloud System Administration: Designing and Operating Large Distributed Systems, and I came across this quote:

Royce’s 1970 paper, which is credited with “inventing” the model, actually identifies it so Royce can criticize it and suggest improvements. He wrote it is “risky and invites failure” because “design iterations are never confined to the successive step.” What Royce suggests as an alternative is similar to what we now call Agile. Sadly, multiple generations of software developers have had to suffer through waterfall projects thanks to people who, we can only assume, didn’t read the entire paper (Pfeiffer 2012) [p. 175].

I think this quote lends credence to my theory that the "Waterfall Model" has been a straw man argument from the very beginning. I know there were companies that used it, and that some still do. However, it strongly suggests that no one ever wrote a book proposing the "Waterfall Model" as the best way to write software.

Comments

Anonymous said…
FYI. Seems like an interesting update by Royce at: http://www.nasa.gov/pdf/293180main_56571main_wroyce_east_west_0301.pdf
jjinux said…
I think he could have simplified that document by simply saying, "I told you so." ;)
I'm convinced that Royce is not the real source of Waterfall. He even promotes a two-cycle loop to learn from the first cycle's findings.
The Waterfall model IMHO comes from Taylorism. It's a serial assembly line that follows a strict division of labor: management that thinks and workers that execute, etc. Having worked with professionals outside software development, I can say they don't call it "Waterfall," but it is everywhere.
