- we could define the basic infrastructure and let complex parts be integrated into this infrastructure,
- we could define a set of rules to dynamically build and evolve such an infrastructure, as well as describe integration constraints for parts interacting with and over the infrastructure, or
- we could combine the aforementioned approaches by defining a small set of rules together with smaller predefined infrastructural parts.
These rules are not only axiomatic rules as in a language grammar; they also define constraints and requirements for the parts that are integrated into the infrastructure. In such an approach, the non-functional properties must be derived from those rules and constraints. They cannot be evaluated in advance for the reasons already introduced. Obviously, to cope with future challenges, such a system must also provide extension rules and hooks so that even the infrastructure itself can adapt to changing requirements.
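To make this a bit more concrete, here is a minimal sketch in Java of what such a rule-driven, hook-extensible infrastructure could look like. All names (Part, IntegrationRule, Infrastructure) and the concrete constraints (statelessness, response time) are hypothetical illustrations, not something the approach prescribes: the infrastructure integrates only parts that satisfy its current rule set, and the rule set itself is an extension hook that can evolve over time.

```java
import java.util.ArrayList;
import java.util.List;
import java.util.function.Predicate;

// Hypothetical sketch: parts expose the properties the rule set inspects.
interface Part {
    String name();
    boolean isStateless();          // example constraint input
    int maxResponseTimeMillis();    // example constraint input
}

// An integration rule is simply a named constraint over parts.
record IntegrationRule(String description, Predicate<Part> constraint) {}

// The infrastructure owns a small, evolvable rule set. Parts are only
// integrated if they satisfy every rule; new rules can be hooked in later,
// so the infrastructure itself can adapt to changing requirements.
class Infrastructure {
    private final List<IntegrationRule> rules = new ArrayList<>();
    private final List<Part> integratedParts = new ArrayList<>();

    // Extension hook: the rule set is open for evolution at runtime.
    void addRule(IntegrationRule rule) {
        rules.add(rule);
    }

    boolean integrate(Part part) {
        for (IntegrationRule rule : rules) {
            if (!rule.constraint().test(part)) {
                System.out.println(part.name() + " rejected: " + rule.description());
                return false;
            }
        }
        integratedParts.add(part);
        System.out.println(part.name() + " integrated");
        return true;
    }
}

public class GenerativeArchitectureDemo {
    public static void main(String[] args) {
        Infrastructure infra = new Infrastructure();
        // A small initial rule set; the rules, not the parts, are predefined.
        infra.addRule(new IntegrationRule("parts must be stateless",
                Part::isStateless));
        infra.addRule(new IntegrationRule("parts must respond within 100 ms",
                p -> p.maxResponseTimeMillis() <= 100));

        Part cache = new Part() {
            public String name() { return "cache-service"; }
            public boolean isStateless() { return true; }
            public int maxResponseTimeMillis() { return 20; }
        };
        infra.integrate(cache); // satisfies all rules, so it is integrated
    }
}
```

The point of the sketch is that no global structure is designed up front: the system's shape emerges from which parts happen to satisfy the (evolving) rules, which mirrors the generative approach described above.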
If you think about it: why could we come up with a technology such as the Internet, with all of its complexity? Internet technologies already fulfil the properties mentioned above. The Internet consists of smaller infrastructures, e.g., Internet backbones. Technologies such as IP, DHCP, or DNS define a rather simple (but not simplistic!) rule set for extending and evolving the infrastructure as well as for integrating servers and clients into it. All of these very simple technologies combine into a highly evolvable, dynamic, large-scale system that is open for extension. Basically, HTTP and the Web are only one of those extensions.

As a real-life example, take the brain: it consists of rather simple constituents and connections between those constituents, yet it reveals the most complex emergent behavior we have ever dealt with. From my viewpoint, we should direct our research activities to such generative architectures, especially when dealing with large-scale software systems. Emergent behavior is the key to mastering complexity in such systems.