Revisiting Refactoring


a safety-net of tests - are full of duplicate code: developers cut and paste code instead of modifying the design because of the prohibitive costs and dangers of doing so.

So it seems we're damned if we do and damned if we don't. What is the solution? Since designs do become stale and we do tend to over-generalize, we should always have a safety-net of tests that lets us respond to change over time. At the same time, we need to move away from pure YAGNI. There are times when we know, with a large degree of certainty, the requirements of the application upfront - and the greater our certainty in the overall requirements, the more initial design is warranted.
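The safety-net need not be elaborate. A handful of characterization tests that pin down current behavior is often enough to make a later design change safe. A minimal sketch (the `order_total` function and its pricing rules are hypothetical, chosen only to illustrate the idea):

```python
# Hypothetical pricing rule we may want to refactor later.
def order_total(quantity, unit_price):
    total = quantity * unit_price
    if quantity >= 10:          # bulk discount kicks in at 10 units
        total *= 0.9
    return round(total, 2)

# Safety-net: pin down today's behavior before touching the design.
def test_no_discount_below_threshold():
    assert order_total(9, 10.0) == 90.0

def test_bulk_discount_at_threshold():
    assert order_total(10, 10.0) == 90.0

test_no_discount_below_threshold()
test_bulk_discount_at_threshold()
```

With these tests in place, the internals of `order_total` can be restructured freely; any change that alters observable behavior fails fast.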

Different Viewpoints
Here are some current viewpoints on the effectiveness of refactoring and TDD:

Jim Coplien, in "Religion's Newfound Restraints on Progress," takes aim at testing and test-driven development. [iii] Since refactoring is an integral part of TDD, his comments on TDD implicitly apply to refactoring. Coplien states:

Integration and system testing have long been demonstrated to be the least efficient way of finding bugs. Recent studies (Siniaalto and Abrahamsson) of TDD show that it may have no benefits over traditional test-last development and that in some cases has deteriorated the code and that it has other alarming (their word) effects. The one that worries me the most is that it deteriorates the architecture. And acceptance testing is orders of magnitude less efficient than good old-fashioned code inspections, extremely expensive, and comes too late to keep the design clean.

... TDD, about engaging the code. TDD is a methodology. TDD is about tools and processes over people and interactions. Woops. ...

James Shore, in "The Agile Engineering Shortfall," acknowledges that there is a shortfall when TDD is taken by itself as a practice. However:

There is an agile engineering shortfall. Short planning horizons necessitate alternative engineering practices. The problem isn't with the practices, though--good agile engineering practices exist and work well. The problem is with agile methods that don't provide agile engineering practices, and with teams that adopt a small subset of the agile engineering practices (typically, just TDD). It's unfortunate, but no surprise, that they run into trouble as a result.

Kent Beck, in his new book Implementation Patterns, writes:

Three values that are consistent with excellence in programming are communication, simplicity, and flexibility. While these three sometimes conflict, more often they complement each other. The best programs offer many options for future extension, contain no extraneous elements, and are easy to read and understand.

... Flexibility can come at the cost of increased complexity.

... Choose patterns that encourage flexibility and bring immediate benefits. For patterns with immediate costs and only deferred benefits, often patience is the best strategy. Put them back in the bag until they are needed. Then you can apply them in precisely the way they are needed.

Of course, to add the new functionality later, when it is needed, you will need to change the design - that is, refactor.
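Beck's "put them back in the bag" advice plays out concretely when a second variation finally arrives: only then is the simple code refactored toward a flexible design, applied precisely where it is needed. A minimal sketch with hypothetical names, using a strategy-style refactoring as one illustrative pattern:

```python
# Before: one shipping rule, no speculative flexibility.
#
# def shipping_cost(order_weight):
#     return 5.0 + 0.5 * order_weight

# After a second carrier appears, refactor toward interchangeable strategies.
class FlatRateShipping:
    def cost(self, order_weight):
        return 5.0 + 0.5 * order_weight

class ExpressShipping:
    def cost(self, order_weight):
        return 12.0 + 0.75 * order_weight

def shipping_cost(order_weight, strategy):
    # The pattern is introduced only when the second variant demands it.
    return strategy.cost(order_weight)

print(shipping_cost(10, FlatRateShipping()))   # 10.0
print(shipping_cost(10, ExpressShipping()))    # 19.5
```

The cost of the extra indirection is paid exactly when its benefit arrives, not before.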

Refactoring is a greedy algorithm: it reaches local minima. Large refactorings are very expensive - often prohibitively so. Pure reliance on refactoring, without some upfront design and architecture, is therefore costly and suboptimal.

At the same time, no matter how good an upfront design and architecture is, it will grow out of date. The development community is not a fortune-telling community; traditional upfront design and architecture over-generalizes in the wrong places. Generalization and flexibility by design

AgileConnection is a TechWell community.