Hitchhiker's Guide to Software Architecture and Everything Else - by Michael Stal

Saturday, January 12, 2008

Architectural Entropy

Several years ago I introduced the term entropy in the context of software architecture. Entropy denotes the number of concepts appearing in a software architecture. If your design consists of two simple components associated by one unidirectional relationship, it probably has low entropy. I say "probably" because it also depends on whether the internals of the components are complex. If, however, you increase the number of components or relations, or their internal complexity, entropy inevitably increases as well. High entropy is often an indicator of high complexity, and high complexity always implies high entropy. But high entropy does not imply high complexity: a system consisting of a vast number of simple components with no relationships between them has high entropy but not high complexity. The other direction also deserves consideration: a system that consists of one single god component may still be very complex. Hence, entropy cannot be measured by addressing only one view or abstraction level (e.g., the component view) but must take all views and abstraction levels into account.
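To make this notion a bit more tangible, here is a minimal sketch of such a concept count in Python. The model (components carrying an internal complexity score, plus unidirectional relations) and all names in it are my own illustrative assumptions, not a metric defined in this post.

    from dataclasses import dataclass, field

    @dataclass
    class Component:
        name: str
        internal_complexity: int = 1  # rough count of concepts hidden inside the component

    @dataclass
    class Architecture:
        components: list = field(default_factory=list)
        relations: list = field(default_factory=list)  # unidirectional (source, target) pairs

        def entropy(self) -> int:
            # Toy measure: every component, every relation, and every internal
            # concept counts as one concept appearing in the architecture.
            return (len(self.components)
                    + len(self.relations)
                    + sum(c.internal_complexity for c in self.components))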

Obviously, incremental architecture design inevitably leads to higher entropy, because we constantly add things. One of the fundamental issues in achieving high architecture quality is therefore minimizing entropy. Activities such as software architecture refactoring do exactly this. Quality factors such as loose coupling, symmetry, conceptual integrity, and orthogonality strive to reduce entropy, or at least to keep it small.
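As a hypothetical illustration with the toy measure sketched above: a refactoring towards loose coupling that removes redundant relations leaves the components in place but lowers the concept count.

    # Hypothetical before/after comparison using the toy entropy measure above.
    a, b, c, d = (Component(name) for name in "ABCD")

    tangled = Architecture(
        components=[a, b, c, d],
        relations=[(a, b), (a, c), (a, d), (b, c), (b, d), (c, d)],  # everything talks to everything
    )

    refactored = Architecture(
        components=[a, b, c, d],
        relations=[(a, b), (b, c), (c, d)],  # loosely coupled chain after refactoring
    )

    print(tangled.entropy())     # 14 = 4 components + 6 relations + 4 internal concepts
    print(refactored.entropy())  # 11 = same components and internals, fewer relations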

The art of software architecture is controlling entropy.

