Tuesday, July 22, 2008

Coupling, the enemy of web applications

Recently I encountered yet another example of a good website hampered by very bad architecture. The once-standard practice of decoupling the application layer from the data was completely ignored, and over time this has resulted in a spider's web of dependencies that the operations group can barely manage. Although the site performs adequately, it consumes several orders of magnitude more infrastructure than would normally be required for its operation.

I have seen it time after time. An initial concept site (in this case e-commerce) is well designed and deployed. But over time, additional applications are added by multiple developers, resulting in hundreds of inter-dependencies that make maintenance a nightmare. Now add the complexity of moving it all to a new data center, and the task becomes nearly impossible.

The most effective way to avoid this is to isolate the knowledge of the location of the data from the applications and move it to a middle tier, or middleware. This enables the migration of applications and data to a new location in distinct pieces, and also makes it easier to identify the interrelationships between the various applications that act on the data. If each application has direct access to the data, then the applications and data must move in one large step: risky, expensive, and unlikely to succeed on the first try.
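The idea can be sketched in a few lines. This is a minimal illustration with hypothetical names (DataService, PrimaryDataService, and so on are mine, not from any real site): applications code against an abstract data interface, so the physical location of the data can change without touching application code.

```python
# Sketch of isolating data location behind a middle tier.
# All names here are illustrative; the point is the dependency direction:
# applications depend on the interface, never on a physical data store.

from abc import ABC, abstractmethod


class DataService(ABC):
    """The only interface applications are allowed to use."""

    @abstractmethod
    def get_product(self, product_id: str) -> dict:
        ...


class PrimaryDataService(DataService):
    """Backed by the original data center (stubbed in-memory here)."""

    def __init__(self):
        self._rows = {"sku-1": {"name": "widget", "price": "9.99"}}

    def get_product(self, product_id):
        return self._rows[product_id]


class MigratedDataService(DataService):
    """Same contract, new location; swapping this in needs no app changes."""

    def __init__(self, source: DataService):
        # In a real migration this would point at the replicated copy
        # in the new data center; here it just delegates.
        self._source = source

    def get_product(self, product_id):
        return self._source.get_product(product_id)


def render_product_page(service: DataService, product_id: str) -> str:
    # Application code: knows nothing about where the data lives.
    p = service.get_product(product_id)
    return f"{p['name']}: ${p['price']}"
```

Because render_product_page only sees the interface, each application can be repointed at the new location independently, which is exactly what makes a piecewise migration possible.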

Instead, a second environment must be built to shadow the first, perpetuating all the bad design and complex relationships. The applications must be duplicated to the new location, and the data must be replicated. Depending on the volatility of the data and the applications' tolerance for downtime, this can range from backup and restore from tape to synchronous replication. The coupling of the data has increased the cost of a move by at least an order of magnitude, and perhaps more. Not to mention the added complexity of maintaining a spider's web of applications and infrastructure.
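That spectrum of replication choices can be framed as a simple decision on two numbers: how much data you can afford to lose (recovery point objective) and how long you can afford to be down (recovery time objective). A rough sketch, with illustrative thresholds of my own choosing rather than any standard:

```python
# Hypothetical decision helper: map recovery-point and recovery-time
# objectives (in minutes) to a replication approach. The cutoffs are
# illustrative, not industry-standard numbers.

def replication_strategy(rpo_minutes: float, rto_minutes: float) -> str:
    if rpo_minutes == 0:
        # Cannot lose any committed data: replicate synchronously.
        return "synchronous replication"
    if rpo_minutes <= 15:
        # Minutes of loss acceptable: asynchronous replication keeps up.
        return "asynchronous replication"
    if rpo_minutes <= 24 * 60 and rto_minutes <= 24 * 60:
        # Up to a day of loss/downtime: periodic snapshots or log shipping.
        return "periodic snapshots / log shipping"
    # Anything looser can fall back to the cheapest option.
    return "backup and restore from tape"
```

The point is not the exact thresholds but that each tightening of the objectives moves you to a more expensive option, and tight coupling forces the whole tangle onto the strictest (and priciest) tier at once.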
