David Suydam | November 15, 2012
Our “POV” series features insight and opinions from Architech’s thought leaders on the software development industry, technology issues and trends.
As President Barack Obama enters his second term in office, he may well be thinking a lot about the legacy he'll leave and how he'll be remembered as the 44th President of the United States.
At Architech, we also think a lot about what we do, and what we can be proud of. We want to look back on our years and be able to say that we accomplished some incredible things. There's a big difference, though, between building a legacy and building a legacy system. Some time ago I heard a fantastic definition of a legacy system that really resonated with me:
“A legacy system is one without automated testing.”
By automated testing, I'm not talking about traditional user-interface test automation tools, where record-and-playback test scripts are created after the application is built. Rather, I'm talking about following test-driven development (TDD) and creating code-level tests with xUnit frameworks, automated by chaining together modern toolsets like Selenium, Thucydides and several others.
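To make the distinction concrete, here is a minimal sketch of the kind of code-level xUnit test I mean, using Python's `unittest` as a stand-in for any xUnit framework. The function and its behaviour are hypothetical; in TDD, tests like these are written first and the implementation is written to make them pass.

```python
import unittest

def apply_discount(price, percent):
    """Hypothetical function under test: price after a percentage discount."""
    if not 0 <= percent <= 100:
        raise ValueError("percent must be between 0 and 100")
    return round(price * (1 - percent / 100), 2)

class ApplyDiscountTest(unittest.TestCase):
    """Code-level tests that double as executable documentation of intent."""

    def test_typical_discount(self):
        self.assertEqual(apply_discount(100.00, 20), 80.00)

    def test_zero_discount_leaves_price_unchanged(self):
        self.assertEqual(apply_discount(59.99, 0), 59.99)

    def test_out_of_range_discount_is_rejected(self):
        with self.assertRaises(ValueError):
            apply_discount(100.00, 150)

if __name__ == "__main__":
    unittest.main()
```

Because these tests live at the code level, the whole suite runs in seconds on every change, which is what makes the regression safety net described below affordable.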
So why does a lack of automated testing create a legacy system? Here’s how this happens:
- Higher effort to change. Without proper automated testing, the effort to test the application is dramatically higher because it requires manual work. Every change involves a manual regression test suite that can take days or weeks to complete. With each subsequent release, the overall effort (i.e., cost) of manual regression testing quickly overtakes the effort that would have been required to create proper automated testing in the first place.
- Harder to change. Software that wasn't built with proper automated testing in mind wasn't designed for testability. A lack of test-driven development often leads to a high-dependency architecture (tight coupling, low cohesion), where changes in one area affect others with unpredictable results.
- Knowledge is lost. Over the years, the team that built the application often moves on or might leave the organization. New developers can be trained, but this is expensive and knowledge is inevitably lost for any reasonably complex application.
- Change is avoided. Because of the high cost, the difficulty, and a fear born of unfamiliarity with the code base, people stop making changes to the application. Over time it becomes brittle. You now have a legacy system.
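The "harder to change" point above is easiest to see in code. Below is a hypothetical sketch (the class names and scenario are mine, not from any real system) contrasting a tightly coupled design with one built for testability through dependency injection:

```python
class SmtpMailer:
    """Stands in for a real mail server dependency."""
    def send(self, to, body):
        raise RuntimeError("no mail server available in a test environment")

class OrderServiceCoupled:
    """Tightly coupled: constructs its own dependency internally,
    so testing place_order requires a live mail server."""
    def place_order(self, order_id):
        SmtpMailer().send("ops@example.com", f"order {order_id} placed")
        return order_id

class OrderService:
    """Designed for testability: the dependency is injected,
    so a test can substitute a fake and verify behaviour in isolation."""
    def __init__(self, mailer):
        self.mailer = mailer

    def place_order(self, order_id):
        self.mailer.send("ops@example.com", f"order {order_id} placed")
        return order_id

class FakeMailer:
    """Test double that records sent messages instead of sending them."""
    def __init__(self):
        self.sent = []
    def send(self, to, body):
        self.sent.append((to, body))

# A fast, deterministic code-level test needs no infrastructure:
fake = FakeMailer()
service = OrderService(fake)
service.place_order(42)
assert fake.sent == [("ops@example.com", "order 42 placed")]
```

Writing the test first, as TDD prescribes, is what pushes a design toward the second shape rather than the first.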
In the long run, the best way to reduce costs, make change easy, and retain that knowledge is proper code-level automated tests. They provide a quick and easy way to run regression test suites, give developers confidence in their ability to change the application (and quickly see what breaks when they do, so they can fix it), and keep a documented account of the original intent of the code.
Most enterprise software teams are in the business of building legacy systems, because most of these teams are not building proper automated tests.
“But we’ve been doing automated testing for years…”
I frequently hear from executives and IT managers within banks, telcos and governments that they already do automated testing and have been doing it for years. The testing they're doing is not what I'm talking about. There are now incredible tools and techniques that a new breed of companies is using to drive rapid change, lower costs and deliver better user experiences – companies like Amazon, British Airways, Facebook, GE, Google and Salesforce.
We can show you how. Drop us a line and ask for a free demo.
As Architech's President, David's focus on creative solutions is based on a strong belief that traditional software development practices are flawed, and his team routinely demonstrates a better approach with open source, Lean and Agile methods. Connect with David on Twitter or LinkedIn.