Web development is popular because it's fast, versatile, and relatively inexpensive -- and it's certainly easy to find developers. But that doesn't mean the alternatives don't have advantages of their own, and in some cases the Web's weaknesses might outweigh its strengths. In the interest of healthy debate, here are five reasons why concentrating your development efforts on browser-based apps might not be the best idea.
1. It's client-server all over again.
Web applications encourage a thin-client approach: the client handles UI rendering and user input, while the real processing happens on servers. What sense does that make when any modern laptop packs enough CPU and GPU power to put yesterday's Cray supercomputer to shame?
Concentrating computing power in the datacenter is fine if you're a Google or a Microsoft, but that approach puts a lot of pressure on smaller players. Scaling small server farms to meet demand can be a real challenge -- just ask Twitter.
Furthermore, security vulnerabilities abound in networked applications, and the complexity of the browser itself seemingly makes bugs inevitable. Why saddle your apps with that much baggage?
2. Web UIs are a mess.
The Web's stateless, mainly forms-based UI approach is reliable, but it's not necessarily the right model for every application. Why sacrifice the full range of real-time interactivity offered by traditional, OS-based apps? Technologies such as AJAX merely simulate in the browser what native applications could do all along.
And while systems programmers are accustomed to building apps with consistent UI toolkits such as the Windows APIs, Apple's Cocoa, or Nokia's Qt, building a Web UI is too often an exercise in reinventing the wheel. Buttons, controls, and widgets vary from app to app. Sometimes the menus are along the top, other times they're off to the side. Sometimes they pop down when you roll over them, and sometimes you have to click. That inconsistency hurts your development budget, but it hurts usability more.
3. Browser technologies are too limiting.
HTML and CSS are clearly deficient when it comes to rich interactivity -- witness the proliferation of multimedia plug-ins such as Flash, QuickTime, and Silverlight. Relying on these outside dependencies increases the complexity and support cost of your applications. Why bother? These tricks wouldn't be necessary if you weren't trying to shoehorn interactivity into the browser instead of sticking to the desktop.
4. The big vendors call the shots.
Recently, Sun Microsystems CEO Jonathan Schwartz described the browser as "hostile territory" for independent developers. It's a world divided between giants, he said, with Microsoft's Internet Explorer on one side and Google's stake in Chrome and Firefox on the other.
Schwartz's statements may be self-serving, but he does have a point. Increasingly, the evolution of Web standards is being driven by major browser vendors -- new features are implemented first and standardized later. Independent developers have little genuine input into the future direction of the Web. And that's to say nothing of the ongoing bickering between the various vendors. Does it make sense to rely on client-side software that's such a moving target?
5. Should every employee have a browser?
At one point, a computer on an employee's desk was for work. [Ed: Oh, the irony!] Today, every Web-enabled PC is a gateway to shopping, TV and movies, games, music, online chat, and countless other diversions -- up to and including illicit activities such as porn and copyright infringement -- to say nothing of leaving employees vulnerable to phishing and malware attacks.
Can we please go back to green screen applications now?