A few months ago, I joined a new startup. Almost all of our work is backend processing that doesn't even touch a database until nearly the end. Our only Web application is a Web services API. Since I was mostly writing the code from scratch, my boss and I agreed that taking a Unix approach was best. Hence, we have a bunch of simple, standalone tools. Writing such tools is so refreshing. You know exactly what you need to do. They're only a few hundred lines long. You build a nice command-line interface using optparse, you write some tests using nose, etc. It's all very straightforward and linear. Wanna know how to do a Unix-style mashup? You use a pipe.
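To make that concrete, here's a minimal sketch of the kind of standalone tool I mean: read lines on stdin, filter them, write stdout, so it can sit in the middle of a pipeline. The tool itself and its `-s` option are hypothetical, not one of our actual tools.

```python
import sys
from optparse import OptionParser

def matching_lines(lines, substring):
    """Keep only the lines that contain the given substring."""
    return [line for line in lines if substring in line]

def build_parser():
    """Build the tiny optparse-based command-line interface."""
    parser = OptionParser(usage="usage: %prog -s SUBSTRING < input")
    parser.add_option("-s", "--substring", default="",
                      help="keep only lines containing this substring")
    return parser

def main(argv=None):
    options, _args = build_parser().parse_args(argv)
    for line in matching_lines(sys.stdin, options.substring):
        sys.stdout.write(line)
```

Wire it up with an `if __name__ == "__main__": main()` and it drops straight into a pipeline: `cat app.log | python keep.py -s ERROR | sort | uniq -c`. That's the whole mashup story.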
Besides implementation concerns, the Web is simply complex these days. Have you read all the papers on session fixation, cross-site scripting, SQL injection, and cross-site request forgery attacks? Do you remember the HTTP response codes for SEE_OTHER and TEMPORARY_REDIRECT? Do you know when you should use each? I see tons of books about various Web frameworks and libraries, but where is there a really good book on how to be good at plain old Web development?
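For the record, those two codes under the names Python's stdlib gives them (httplib back then, http.HTTPStatus today). The gloss on when to use each is mine, not the post's: 303 after a successful POST so a refresh re-issues a harmless GET (Post/Redirect/Get), 307 when the client should repeat the *same* method at the new URL.

```python
from http import HTTPStatus

# 303: "your POST worked, now go GET the result over there"
SEE_OTHER = HTTPStatus.SEE_OTHER

# 307: "this resource moved for now; retry the same method there"
TEMPORARY_REDIRECT = HTTPStatus.TEMPORARY_REDIRECT
```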
How do you deal with logins? OpenID hasn't really taken off yet, and not everyone can depend on Facebook for authentication. That means you'll need account services. What if the user forgets his password? Did you know that if you URL-encode something that's been base64-encoded and then send it in an email, it might not make it through Hotmail in all cases? However, if you make the link too long, users will get confused when their email client breaks it into two lines.
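The base64 trap is avoidable, for what it's worth: standard base64 uses `+`, `/`, and `=`, which is exactly what gets mangled once the token is URL-encoded and run through some mail systems, while `base64.urlsafe_b64encode` substitutes `-` and `_`. A sketch of a forgot-password token along those lines (the token scheme and URL are illustrative, not what we actually shipped):

```python
import base64
import os

def make_reset_token(nbytes=24):
    """Return a short, email-safe token for a password-reset link."""
    raw = os.urandom(nbytes)
    # urlsafe_b64encode avoids '+' and '/'; stripping the '=' padding
    # removes the last character mail clients love to chew on.
    return base64.urlsafe_b64encode(raw).rstrip(b"=").decode("ascii")

# Keep nbytes modest so the resulting link stays on one line.
link = "https://example.com/reset?token=" + make_reset_token()
```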
By the way, concerning standards bodies--you know, the ones who spent so much time creating XHTML--I'll remind you that the Mozilla Web Author FAQ still says that "Serving valid HTML 4.01 as text/html ensures the widest browser and search engine support." I.e., use HTML, not XHTML. The WebKit guys say pretty much the same thing.
The Web is a strange place where the HACKS become the standard by which you get stuff done.
Hello, who are you? Oh, you have a cookie? Let me see if I know anything about you. Oh, the memcache server says it has a session for you. Let me talk to the database to see if he can tell me more. Ok, here's a form. Get back to me when you're ready.

And let's not forget that you're simultaneously carrying on about a hundred such conversations at any given time. You have all the drawbacks of multithreaded coding, except you really can't count on anything being in memory because you're spread across several servers. It's the worst of both worlds.
Hello, who are you? Oh, you have a cookie? Let me see if I know anything about you...
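That little interrogation, as code: cookie, then cache, then database, because nothing can be assumed to still be in this particular server's memory. Plain dicts stand in for memcached and the database here; the names are mine, not from any real app.

```python
SESSION_CACHE = {}                     # stands in for memcached
USER_DB = {"u42": {"name": "Alice"}}   # stands in for the database

def lookup_user(cookies):
    """Answer 'who are you?' for one request, or None if we give up."""
    session_id = cookies.get("session_id")
    if session_id is None:
        return None        # no cookie: "Hello, who are you?"
    user_id = SESSION_CACHE.get(session_id)
    if user_id is None:
        return None        # session expired, or lives on another box
    return USER_DB.get(user_id)   # "Let me talk to the database"

SESSION_CACHE["abc123"] = "u42"
user = lookup_user({"session_id": "abc123"})
```

And every one of the hundred simultaneous conversations starts from scratch at the top of that function.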
The nice thing about Unix tools is that once you get them working, you don't need to think much about them anymore. When was the last time you worried much about cut, uniq, or sort? On the other hand, with a Web app, plan on rewriting it five years from now. Oh, and it'll be even harder and more messed up by then.
Of course, all my complaints just don't matter because the Web has too many good properties. It's vendor and OS neutral. You can run millions of different applications, and the only thing you need to download is a browser. (Oh wait, you already have one? An ancient version of IE? No worries, we can support that too!)
Yet again, we are reminded that worse is better. Apparently, much, much worse is also much, much better.