Kent Beck
2005-01-26 07:36:09 UTC
Invest in the design of the system every day. Strive to make the design of
the system an excellent fit for the needs of the system that day. When your
understanding of the best possible design leaps forward, work gradually but
persistently to bring the design back into alignment with your
understanding.
I was taught exactly the opposite of this strategy in school: "Put in all
the design you can before you begin implementation because you'll never get
another chance." The intellectual justification for this strategy came from
a Barry Boehm study of 1960s defense contracts showing that the cost of
fixing defects rose exponentially over time. If the same data also hold for
adding features to today's software, then the cost of large-scale design
changes should rise dramatically over time. In that case, the most
economical design strategy is to make big design decisions early and defer
all small-scale decisions until later.
For an assumption that shaped software development orthodoxy for decades,
the increasing cost of change over time received little scrutiny. This
assumption may no longer be valid. Do changes also increase in cost, the
same way defects do? Even assuming changes do increase in cost sometimes,
are there conditions under which the cost of changes does not increase? If
changes do not grow increasingly expensive, what does that imply about the
best way to develop software?
XP teams work hard to create conditions under which the cost of changing
the software doesn't rise catastrophically. The automated tests, the
continual practice of improving the design, and the explicit social process
all contribute to keeping the cost of changes low.
XP teams are confident in their ability to adapt the design to future
requirements. Because of this, XP teams can meet their human need for
immediate and frequent success as well as their economic need to defer
investment to the last responsible moment. Some of the teams who read and
applied the first edition of this book didn't get the part of the message
about the last responsible moment. They piled story on story as quickly as
possible with the least possible investment in design. Without daily
attention to design, the cost of changes does skyrocket. The dire
predictions of the critics come true: poorly designed, brittle,
hard-to-change systems.
The advice to XP teams is not to minimize design investment over the short
run, but to keep the design investment in proportion to the needs of the
system so far. The question is not whether to design; the question is
when to design. Incremental design suggests that the most effective time to
design is in the light of experience.
If "small, safe steps" is how to design, the next question is where to
design. The simple heuristic I have found helpful is to eliminate
duplication. If I have the same logic in two places, I work with the design
to understand how I can have only one copy. Designs without duplication tend
to be easy to change. You don't find yourself in the situation where you
have to change the code in several places to add one feature.
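A tiny, invented example shows the heuristic at work. The tax rule below
started out copied into two methods; a small design step gives it a single
home, so the next change to the rule touches one place instead of two. The
class and method names are made up purely for illustration.

    // Before: the same tax rule appears in two places (hypothetical example).
    class InvoiceBefore {
        double totalWithTax(double subtotal) {
            return subtotal + subtotal * 0.08;   // tax rule, copy #1
        }
        double lineItemWithTax(double price, int qty) {
            double subtotal = price * qty;
            return subtotal + subtotal * 0.08;   // tax rule, copy #2
        }
    }

    // After: one copy of the rule. A change to the tax calculation now
    // happens in exactly one place.
    class Invoice {
        double totalWithTax(double subtotal) {
            return addTax(subtotal);
        }
        double lineItemWithTax(double price, int qty) {
            return addTax(price * qty);
        }
        private double addTax(double amount) {
            return amount + amount * 0.08;       // the single copy of the rule
        }
    }

The behavior is identical before and after; only the structure changed, which
is what makes the step safe to take at any time.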
As a direction for improvement, incremental design doesn't say that
designing in advance of experience is horrible. It says that design done
close to when it is used is more efficient. As your expertise grows in
making changes to a running system in small, safe steps, you can afford to
defer more and more of the design investment. As you do so, the system will
get simpler, progress will start sooner, tests will be easier to write, and
because the system is smaller there will be less to communicate with the
team.
As more teams invest in daily design, they notice that the changes they
are making are similar regardless of the purpose of the system. Refactoring
is a discipline of design that codifies these recurring patterns of changes.
These refactorings can occur at any level of scale. Few design decisions are
difficult to change once made. The result is systems that can start small
and grow as needed without exorbitant cost.
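One of these recurring patterns at the smallest scale is Extract Method:
a coherent chunk of a method is pulled out and given a name of its own,
with a check run before and after to confirm that nothing observable
changed. The report-formatting code below is invented to illustrate the
shape of the step, not taken from any particular system.

    // Hypothetical sketch of the Extract Method refactoring, one of the
    // recurring design changes described above. Behavior stays the same;
    // only the structure improves.
    public class ReportFormatter {
        // Before the step, the header formatting was inlined here:
        //   String header = "== " + title.toUpperCase() + " ==";
        //   return header + "\n" + body;

        // After the step, the header logic has a name and a single home.
        String format(String title, String body) {
            return header(title) + "\n" + body;
        }

        private String header(String title) {
            return "== " + title.toUpperCase() + " ==";
        }

        // A tiny check run before and after the step confirms the
        // behavior did not change.
        public static void main(String[] args) {
            String out = new ReportFormatter().format("status", "all green");
            if (!out.equals("== STATUS ==\nall green")) {
                throw new AssertionError("behavior changed during the refactoring");
            }
            System.out.println(out);
        }
    }

Because each such step is small and protected by tests, it can be taken
whenever experience shows the design no longer fits, rather than only at
the beginning of the project.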