Decomplexity - a simple business


What is it?

The Decomplexity™ process is an organizational development technique which decentralizes decision making: decisions are made as autonomously and as far down the organization as possible, consistent with maintaining coherence within the business. Specifically, an organization’s decision making is decoupled to a phase-transition point (colloquially the ‘edge of chaos’). Decoupling is not uniform: rather than insulating every decision-making point from the others to the same degree, each point is decoupled by an amount determined by the likely impact of its changes on others, using Highly Optimized Tolerance. In businesses which have implemented comprehensive, tightly-coupled transaction and planning processes, decoupling might nevertheless lead to instability, with volatile areas of the business adversely impacting others. Such instability is damped by creating organization buffers between decision-making areas, using one of five design patterns originally developed for object-oriented system design.


Responsive business

Businesses typically want to be poised to respond to their markets quickly. This implies decision making which is not slowed down by the need to consult widely and follow a lengthy bureaucratic process to gain approval from many different parts of the business (colloquially ‘corporate treacle’). Individuals need sufficient freedom to act, but, at the same time, there needs to be control over the impact of their decisions on the rest of the business; without it, the business would lose all coherence as freewheeling individuals pulled it in different directions. The decomplexity process proposes that a business can balance individual freedom with corporate control through a mechanism which reduces the connections within each decision maker’s own area (speeding up decision making) while at the same time cutting the connections between that area and the rest of the business (containing the impact of its decisions). These two steps must happen together. If the first happens before the second, the business would be unstable. If the second happens before the first, the business would simply remain a slow-to-react decentralized business.

The decomplexity process puts this in more precise terms. Evolutionary biologist Stuart Kauffman popularised a ‘language’ (NKCS) to describe collections of ‘things’ evolving together (coevolving systems), and the decomplexity process uses NKCS to define the two steps described above. In NKCS terms, reducing connections within a decision maker’s area (speeding up decision making) is reducing K. Cutting the connections between his or her business area and others (containing the impact of decisions on the rest of the business) is reducing C. But a business of zero K and zero C would not exist as a business: it would be a collection of individuals acting autonomously. This raises the obvious question of whether there are optimal values of K and C such that the business is as responsive as possible while still hanging together as one business and not becoming so unstable that it effectively disintegrates. The decomplexity process proposes that this optimal point (the values of K and C) is a phase transition colloquially known as the ‘boundary between order and chaos’ or ‘edge of chaos’. The process does not posit that decoupling to this phase-transition point is a precise science, or that a sudden transition always exists, or that if one exists it exists throughout a business; rather, it supplies a procedure and a language to help manage organization development. The process may seem to imply that the responsiveness a business gains from decentralized decision making precludes common business processes, central IT systems or other economies of scale. This is untrue: with some limitations, it is possible to have both at the same time.
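In NKCS terms, K counts the epistatic connections within a decision maker’s own area. A minimal sketch of Kauffman’s NK model (the N and K part of NKCS) shows why reducing K speeds things up: with K = 0 the fitness landscape has a single peak which local improvement finds quickly, while higher K makes the landscape rugged with many local optima. The parameters below are illustrative choices, not values from the decomplexity texts.

```python
import itertools
import random

def nk_fitness_tables(n, k, rng):
    # Each locus i depends on its own bit plus its k cyclic neighbours.
    # Its fitness contribution is a random lookup keyed by those k+1 bits.
    return [{bits: rng.random() for bits in itertools.product((0, 1), repeat=k + 1)}
            for _ in range(n)]

def fitness(genome, tables, n, k):
    total = 0.0
    for i in range(n):
        bits = tuple(genome[(i + j) % n] for j in range(k + 1))
        total += tables[i][bits]
    return total / n

def count_local_optima(n, k, seed=0):
    """Count genomes which beat every single-bit-flip neighbour."""
    rng = random.Random(seed)
    tables = nk_fitness_tables(n, k, rng)
    optima = 0
    for genome in itertools.product((0, 1), repeat=n):
        f = fitness(genome, tables, n, k)
        if all(f > fitness(genome[:i] + (1 - genome[i],) + genome[i + 1:], tables, n, k)
               for i in range(n)):
            optima += 1
    return optima

print(count_local_optima(8, 0))  # K = 0: a single smooth peak
print(count_local_optima(8, 4))  # K = 4: rugged, many peaks
```

With no epistatic connections every locus can be optimised independently, so hill-climbing cannot get stuck; as K rises, the number of traps grows and search slows, which is the NK analogue of decision making bogged down by internal connections.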


Background

The decomplexity process had its origins in evolutionary biology and the physics of avalanches (phase transitions), but was heavily influenced by problems encountered in the design of large and complex artefacts, notably computer systems and buildings. Computer software systems historically suffered from three distinct logical problems which were unrelated to the applications they were being used for: they could be very large; they could be heterogeneous, spanning many specialist areas; and they could contain large numbers of unmanaged cross-connections between their parts.
It was the third problem which was most difficult to address: if something were large, more developers were brought in; if something were heterogeneous, more specialists in those areas were brought in; but if something such as a computer operating system had many unmanaged cross-connections, it was inherently fragile because of unpredictable side effects. These days, operating systems and large business applications are designed and programmed in small parts whose communication with other parts is strictly controlled. Each part is a black box: parts cannot ‘see’ inside others and cannot (must not) know how other parts work. This decomposition reduces side effects and leads to manageable systems. In IT parlance, these parts are objects, and a systems development methodology called object-oriented design (or object-oriented programming) has been created around them. Many of the ideas behind the use of autonomous objects came originally from two sources.

Business processes

In business organizations, some departments (Sales, Marketing or Commodity Trading, for example) are volatile in the short term because of their close relationship with the outside world and the relative lack of a legal framework constraining how they operate. Other departments, such as Asset Accounting, are less likely to be impacted by external changes and are inherently more stable in the short term. If a large business manages to achieve what most are still trying to achieve (wall-to-wall business processes which are common and well defined, with an organization structure closely aligned to them) it nevertheless has a potential problem when a process changes (e.g. a fundamental change in trading relationships in response to new legislation): the business may behave in a brittle and over-reactive way, like a de-stocked logistics pipeline. The decomplexity process uses patterns to insert buffers into an organization structure. Architect Christopher Alexander invented the concept of a pattern (a design template), which was later adopted for object-oriented design, and a few of the patterns defined by Erich Gamma et al. (Adapter, Façade, Mediator, Chain of Responsibility and Bridge) are used by the decomplexity process to design these buffers, which form points of extra resilience (controlled decoupling) between business areas.
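The Mediator, one of the Gamma et al. patterns named above, can be sketched in code as an organizational buffer. The department names and the ‘price change’ event below are hypothetical examples invented for illustration, not taken from the decomplexity publications; the point is only that the volatile area (Sales) never calls the stable area (Asset Accounting) directly, so changes in one cannot ripple into the other as side effects.

```python
from collections import defaultdict

class Mediator:
    """Buffer between business areas: areas publish changes here rather
    than calling each other directly, so one area's churn is routed and
    absorbed instead of propagating unpredictably."""
    def __init__(self):
        self._subscribers = defaultdict(list)

    def subscribe(self, topic, handler):
        self._subscribers[topic].append(handler)

    def publish(self, topic, payload):
        for handler in self._subscribers[topic]:
            handler(payload)

class Sales:
    """Volatile area: it announces changes but knows nothing about
    which other areas react to them."""
    def __init__(self, mediator):
        self.mediator = mediator

    def reprice(self, product, price):
        self.mediator.publish("price_change", {"product": product, "price": price})

class AssetAccounting:
    """Stable area: it reacts to announcements without Sales ever
    holding a direct reference to it."""
    def __init__(self, mediator):
        self.ledger = {}
        mediator.subscribe("price_change", self.on_price_change)

    def on_price_change(self, event):
        self.ledger[event["product"]] = event["price"]

m = Mediator()
sales = Sales(m)
accounting = AssetAccounting(m)
sales.reprice("widget", 9.99)
print(accounting.ledger)  # {'widget': 9.99}
```

Replacing the Mediator with a Façade or Adapter changes which side of the boundary absorbs change, but the decoupling principle is the same.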


Buffers for over-responsiveness

There remains the question of how the decomplexity process determines where to place these buffers. Implementers need to identify just those areas which are most likely to be hit by external events or perturbed by internal decisions: if buffers are placed between all business processes, the business transaction flow slows down unnecessarily, so a mechanism is needed to identify the volatile areas and buffer them more than the others. The problem is mathematically akin to that of where best to place firebreaks in a forest; the simplistic answer is that they are placed where ignition from a spark is likely, i.e. where the trees are more densely planted, the likelihood of a spark is higher, or both. This forest fire model is often used as an example of an avalanche phase transition: when the trees are few and far between, any fire dies out with little effect, but when the tree density becomes high enough, a fire can destroy a large part of the forest. The forest thus self-organizes its tree density to a critical value (theoretically around 59% in a simple square forest). If, however, the foresters are allowed to build firebreaks, the yield of the forest increases (with tree density rising to as much as 76% in this simple example) even though firebreaks have a cost (the valuable trees which would otherwise grow in the firebreak spaces, plus the labour needed to keep the firebreaks clear). And if some areas (picnic sites, for example) are more vulnerable to sparks than others, this information can be used to place more firebreaks around those areas and fewer elsewhere. In other words, instead of allowing a forest to self-organize, design can be used together with a knowledge of risk to optimize yield.
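The forest fire model above can be simulated in a few lines. This is a minimal sketch of site percolation on a square grid (the grid size, seed and densities are arbitrary choices made for illustration): below the roughly 59% threshold a fire started at one edge stays local, while above it the fire sweeps most of the forest.

```python
import random

def burned_fraction(density, size=60, seed=1):
    """Plant a size x size forest at the given tree density, ignite every
    tree on the left edge, and spread fire to orthogonally adjacent trees.
    Returns the fraction of all trees burned."""
    rng = random.Random(seed)
    forest = [[rng.random() < density for _ in range(size)] for _ in range(size)]
    trees = sum(sum(row) for row in forest)
    burning = [(r, 0) for r in range(size) if forest[r][0]]
    burned = set(burning)
    while burning:  # flood fill through connected trees
        r, c = burning.pop()
        for nr, nc in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            if 0 <= nr < size and 0 <= nc < size and forest[nr][nc] \
                    and (nr, nc) not in burned:
                burned.add((nr, nc))
                burning.append((nr, nc))
    return len(burned) / trees if trees else 0.0

# Well below the threshold the fire stays near the ignition edge;
# well above it, one connected cluster spans most of the forest.
print(burned_fraction(0.35))
print(burned_fraction(0.75))
```

The abrupt change between the two densities is the avalanche phase transition the text refers to; the 59% figure corresponds to the site-percolation threshold on a square lattice.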
The mathematics of this Highly Optimized Tolerance (HOT) was published from 1998 onwards (the mathematics of the simpler self-organizing forest is similar to that of the well-known self-organizing “sandpile”, published ten years earlier), and HOT is exactly what the decomplexity process needs to decide where to place buffers. HOT allows a business either to have its decision making decoupled (by reducing both K and C in the NKCS model) past the usual edge-of-chaos point without becoming unstable or, alternatively, to become more robust to disturbances while remaining at the same level of decentralization.
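A toy one-dimensional version of the firebreak argument illustrates the HOT idea (this is a deliberately simplified sketch, not the Carlson and Doyle formulation; the cell counts, break positions and spark probabilities are invented for illustration): given the same number of firebreaks, placing them around the high-risk ‘picnic site’ cells leaves more surviving trees on average than spacing them evenly.

```python
def expected_yield(n, breaks, spark_prob):
    """Cells 0..n-1; cells listed in `breaks` are given over to firebreaks.
    A spark landing in a segment burns every tree in that segment; sparks
    landing on a firebreak burn nothing. Returns the expected number of
    trees surviving one spark."""
    breaks = set(breaks)
    segments, current = [], []
    for cell in range(n):
        if cell in breaks:
            if current:
                segments.append(current)
            current = []
        else:
            current.append(cell)
    if current:
        segments.append(current)
    trees = n - len(breaks)
    expected_burn = sum(sum(spark_prob[c] for c in seg) * len(seg)
                        for seg in segments)
    return trees - expected_burn

n = 100
# Sparks are concentrated in the 'picnic site' cells 40-59, which
# together attract 90% of the spark probability.
p = [0.9 / 20 if 40 <= c < 60 else 0.1 / 80 for c in range(n)]

uniform = expected_yield(n, [24, 49, 74], p)   # evenly spaced breaks
designed = expected_yield(n, [40, 49, 59], p)  # breaks ring the risky area
print(uniform, designed)
```

Both layouts sacrifice the same three cells, but the risk-aware layout confines the likely sparks to short segments, which is exactly the trade HOT makes: tolerance is concentrated where disturbances are expected rather than spread uniformly.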

Decentralize and grow faster

The effects of business organization complexity on business growth, using published data from a basket of US public companies, were first analyzed by a group based at Boston University and MIT under the leadership of Gene Stanley in the mid-1990s. They found that the more decentralized the businesses were, the larger their growth within their size sector.


Find out more

Our main publications may still be available from booksellers but were recently delisted by Amazon. The Pattern Organization is out of print, but copies may still be available from stock. In addition, the text (but not all the illustrations) of the books listed below can be viewed online or downloaded for personal use free from this site.
The Controlling Chaos article (below) is material edited out of The Robust Organization before the latter was printed. It is reproduced here because of the prevailing instability in the financial markets – exacerbated by high-frequency trading – and consequent interest in the mechanics of controlling such unstable systems.


Downloads

--NEW--
Controlling Chaos (A4 page format)
Controlling Chaos (US LTR page format)
--NEW--

The Coevolving Organization (A4 page format)
The Coevolving Organization (US LTR page format)

The Robust Organization (A4 page format)
The Robust Organization (US LTR page format)

The Pattern Organization (A4 page format)
The Pattern Organization (US LTR page format)


Who are we?

Decomplexity Associates is a business consultancy based in the UK. Since 1999 we have undertaken assignments in the US, Canada, Switzerland, Asia Pac and most EU countries.

Our trapeze artist logo is © Microsoft Corporation. Decomplexity™ process is a trademark of Decomplexity Associates.


You can contact us by email using this form:


Your name:


Your own email address:


The area of our business you wish to email:


Your message:




Where else to look

Most of the theory behind the decomplexity process was published in peer-reviewed academic journals between 1988 and 1995, although work continues in areas such as highly optimized tolerance for systems robustness. The list below is selective: for a more comprehensive list see The Coevolving Organization, and for reviews see The Robust Organization and The Pattern Organization.


MAJOR REFERENCES
 
Self-organization

Bak, P., Tang, C. and Wiesenfeld, K. Self-organised criticality (Phys Rev A. 38 No 1 July 1988)

 

Flyvbjerg, H. and Lautrup, B. Evolution in a rugged fitness landscape (Phys Rev A 46 15th November 1992)

 

Bak, P., Flyvbjerg, H. and Lautrup, B. Coevolution in a rugged fitness landscape (Phys Rev A 46 15th November 1992)

 

Paczuski, M., Maslov, S. and Bak, P. Avalanche dynamics in evolution, growth and depinning models (Phys Rev E January 1996)

 

Kauffman, S.A. The Origins of Order – self-organization and selection in evolution (Oxford University Press 1993 – ISBN 0-19-505811-9)

 

Kauffman, S.A. At home in the Universe (Oxford University Press 1995 – ISBN 0-14-017414-1)

 

HOT

Carlson J.M. and Doyle J. Highly optimized tolerance: Robustness and power laws in complex systems (Condensed Matter 9812127 8 Dec 1998)

 

Carlson J.M. and Doyle J. Highly optimized tolerance: A mechanism for power laws in designed systems (Phys. Rev E 60 1412-1427 1999)

 

Carlson J.M. and Doyle J. Power laws, highly optimized tolerance, and generalised source coding (Phys. Rev. Lett. 84 5656 12 June 2000)

 

Jen, E. (ed.) Robust Design: A Repertoire of Biological, Ecological, and Engineering Case Studies (Oxford University Press 2005  –  ISBN 0-19-516533-0)

 

Patterns

Alexander, C. The timeless way of building (Centre for Environmental Structure Series – OUP) (1979)

 

Gamma, E., Helm, R., Johnson, R. and Vlissides, J. Design Patterns - Elements of Reusable Object-Oriented Software (Addison-Wesley 1994  –  ISBN 0-201-63361-2)

 

Decomposition

Alexander, C. Notes on the synthesis of form (Harvard University Press 1964  –  ISBN 0-674-62751-2)

 

Decomplexity

Stewart, M. The Coevolving Organization: poised between order and chaos (Decomplexity Associates 2001 –  ISBN 0-9540062-0-8)

 

Stewart, M. The Robust Organization: highly optimized tolerance (Decomplexity Associates 2003  –  ISBN 0-9540062-3-2)

 

Stewart, M. The Pattern Organization: designed for change (Decomplexity Associates 2004  –  ISBN 0-9540062-6-7)

 

Size vs growth

Amaral, L.A.N., Buldyrev, S.V., Havlin, S., Leschhorn, H., Maass, P., Salinger, M.A., Stanley, H.E. and Stanley, M.H.R. Scaling behaviour in economics: I. Empirical results for company growth (J. Phys. I. France, 7 1997)

 

Buldyrev, S.V., Amaral, L.A.N., Havlin, S., Leschhorn, H., Maass, P., Salinger, M.A., Stanley, H.E. and Stanley, M.H.R. Scaling behaviour in economics: II. Modelling of company growth (J. Phys. I. France, 7 1997)