Systems (Three Different Kinds)
Editor | 22 Sep 2019
Ask ten systems engineers to define what a system is and you’ll get at least eleven answers. Worse: they’re probably all correct. Worst: the differences, valid as they might be, create enormous potential confusion, and as a result most attempts to change ‘the system’ end in failure. Here are some of the more common definitions of the word:
To place together
Latin systema, from Classical Greek – from synistanai, to place together, from syn-, together + histanai. https://www.yourdictionary.com/
A set of elements in interaction
(Bertalanffy 1968) https://www.sebokwiki.org/wiki/
The cohesive interactions between a set of parts
(Hitchins 2009, 59-63) https://www.sebokwiki.org/wiki/
A system is an interconnected set of elements that is coherently organized in a way that achieves something (function or purpose)
(Donella Meadows, Thinking In Systems) http://donellameadows.org/systems-thinking-resources/
A set of variables selected by an observer
(W.Ross Ashby) https://cepa.info/fulltexts/892.doc
A viable system is any system organised in such a way as to meet the demands of surviving in the changing environment
(Stafford Beer) https://en.wikipedia.org/wiki/Viable_system_model
A multitude of interconnected elements that possesses a common property which is not reduced to the properties of these elements
A. Bogdanov, Universal Organizational Science. Tectology. Book 1. Moscow, 1989, p. 48.
A way of doing things
A portion of the physical universe chosen for analysis. Everything outside the system is known as the environment. The environment is ignored except for its effects on the system
A way of working, organizing, or doing something which follows a fixed plan or set of rules
A set of things working together as parts of a mechanism or an interconnecting network; a complex whole
A set of principles or procedures according to which something is done; an organized scheme or method
A group of interacting, interrelated, or interdependent elements forming a complex whole
An organism as a whole, especially with regard to its vital processes or functions
A group of physiologically or anatomically related organs or parts
A group of interacting mechanical or electrical components
A network of structures and channels, as for travel, communication or distribution
An organized set of interrelated ideas or principles
A social, economic, or political organisational form
An arrangement or configuration of classification or measurement
An organised and co-ordinated method; a procedure
A naturally occurring group of objects or phenomena
The prevailing social order
A group or combination of interrelated, interdependent, or interacting elements forming a collective entity; a methodical or coordinated assemblage of parts, facts, concepts, etc
Any scheme of classification or arrangement
A network of communications, transportation, or distribution
Orderliness; an ordered manner
A particular set of actions for doing something
A group of organizations that work together for a particular purpose, or have similar activities
An organized, purposeful structure that consists of interrelated and interdependent elements (components, entities, factors, members, parts etc.)
A set of detailed methods, procedures and routines created to carry out a specific activity, perform a duty, or solve a problem
From a TRIZ perspective, strange as it might seem, founder Genrich Altshuller did not give a definition of systems. His focus, of course, was the “Technical System”. It becomes clear from the context of his descriptions of technical systems that he means a system pertaining to technology and technical objects. The three laws formulated by Altshuller then give an indirect definition of what a Technical System is:
- The law of completeness of system’s parts.
- The law of “energy conductance” of a system.
- The law of coordination of system parts (‘similar to an orchestra or a sports team, which is good when all “parties” play organically, harmoniously’)
From our perspective – where there’s a certain sense of guilt that we’ve also never really taken the time to offer up a definitive definition – the simplest way to define what a system is would look something like:
A collection of elements that do something
A collection of elements that effect change
When we then take into account the complexities of the world, taking up Altshuller’s lead, we can refine these definitions further by introducing the idea of a ‘minimum’ system. In fairness to Altshuller and team, at this point it is worth noting that mankind’s knowledge of complex systems was somewhat limited when the original TRIZ research was being undertaken. Most ‘technical systems’ in the 1950s and 60s were what we might today think of as ‘complicated’ rather than ‘complex’. Today, by contrast, we experience complexity almost everywhere we go, and it is rare to encounter problems that are anything but complex. The moment we cross the line from complicated and enter the realm of complex, we need to think about three distinctly different kinds of system.
System 1 – ‘Controlled’
Let this first type of system be the ‘collection of elements that do something’… ‘useful and in a coordinated fashion’; i.e. the sort of system that we are able to design in order to achieve a desired useful outcome. This is the sort of ‘system’ that, when we are designing it, needs to follow the rules described by the modern form of the Law Of System Completeness:
This is the picture we use almost all the time when we’re talking about systems, and as such it probably doesn’t merit much further discussion here, other than to say that if we have a requirement to achieve a desired, useful outcome – like, for example, ‘educate children’ – then we only have a hope of delivering that outcome if each of the six minimum elements is present.
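For readers who find a concrete model helpful, the six-element requirement can be sketched in a few lines of Python. This is purely our own illustration: the element names follow the Law Of System Completeness as used in this article, but the set representation and the check itself are assumptions, not standard TRIZ notation.

```python
# The six minimum elements of the (modern) Law Of System Completeness,
# as named in this article. A designed system must contain all of them
# before it can reliably deliver a desired, useful outcome.
COMPLETE_SYSTEM_ELEMENTS = {
    "Engine",        # the source of energy (the 'field', in S-Field terms)
    "Transmission",  # connects the energy source to the tool
    "Tool",          # acts upon the interface
    "Interface",     # the thing being acted upon
    "Sensor",        # observes what outcome is actually being produced
    "Coordination",  # steers the system toward useful outcomes
}

def missing_elements(present):
    """Return whichever of the six minimum elements a candidate system lacks."""
    return COMPLETE_SYSTEM_ELEMENTS - set(present)

# A hypothetical education system designed with all six elements in place:
school = {"Engine", "Transmission", "Tool", "Interface", "Sensor", "Coordination"}
print(missing_elements(school))  # set() - nothing missing, the system is complete
```

The point of the sketch is simply that completeness is a checkable property: any element left out of the set comes back as a named gap rather than a vague sense that something is wrong.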
System 2 – ‘Uncontrolled’
Oftentimes, when we design a ‘controlled’ system (i.e. System 1) that is expected to operate in a complex environment, in addition to delivering the useful outcomes we desired to achieve, we also find ourselves with a number of unexpected non-useful outcomes. We might, for example, set up an education system and find that, while it does serve to educate children, we also end up with a highly undesirable situation where there is a substantial achievement gap. Nobody designed the system with the intention of creating this gap, but we got one anyway. The fact that an outcome was produced, however – and this is a critical idea if we’re to actually solve the problems created as a result of these kinds of unexpected outcomes – means that it was produced by ‘a system’. A system that we might think of as ‘uncontrolled’. Which in turn is a system that looks like this:
Per what the S-Field part of TRIZ tells us, the minimum requirement to achieve a function is two substances and a ‘field’. If we stick with the Law Of System Completeness block diagram, we can see that the two ‘substances’ are the Tool and the Interface, and the ‘field’ is the ‘Engine’. The ‘engine’, however, only gets to do its job if there is a ‘Transmission’ to connect the source of energy to the tool.
What’s missing from this ‘uncontrolled’ system are the Coordination and Sensor elements – the two things required in order to ensure that we obtain good outcomes and not bad ones. This is not to say that a system without Coordination or Sensor elements can’t generate useful outcomes, but rather that we have no control over whether it does or not.
The implications of this kind of uncontrolled system are quite profound when it comes to solving a problem like an ‘achievement gap’ in schools. If there is an achievement gap, then, by definition, that outcome was produced by an Engine, Transmission, Tool and Interface. Which in turn means that if we wish to eliminate said achievement gap, we need to find and eliminate at least one of the four elements.
Or, alternatively, add appropriate Coordination and Sensor elements in order to bring the uncontrolled outcome under control.
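The System 1 / System 2 distinction can be captured in the same sketch style: four elements are enough to produce an outcome, but only the full six let us steer it. As before, the names and the classification logic are our own illustrative assumptions rather than canonical TRIZ.

```python
# The four elements that suffice to produce *some* outcome (good or bad)...
FUNCTION_ELEMENTS = {"Engine", "Transmission", "Tool", "Interface"}
# ...and the two that determine whether that outcome can be steered.
CONTROL_ELEMENTS = {"Sensor", "Coordination"}

def classify(present):
    """Classify a candidate system per the System 1 / System 2 distinction."""
    present = set(present)
    if not FUNCTION_ELEMENTS <= present:
        return "not a system"   # cannot produce an outcome at all
    if CONTROL_ELEMENTS <= present:
        return "controlled"     # System 1: outcomes can be steered
    return "uncontrolled"       # System 2: outcomes happen, wanted or not

# The 'achievement gap' producer has the four function elements but no control:
print(classify({"Engine", "Transmission", "Tool", "Interface"}))  # uncontrolled
```

Read this way, the two remedies above map directly onto the code: either remove one of the four function elements (the result is no longer a system, so no outcome), or add the two control elements (the system becomes a System 1).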
System 3 – ‘Human’
Our most frequent definition of what makes a system complex talks about ‘two or more humans’. Something we tend to emphasize somewhat less is a parallel belief that each of us is already ‘two or more’ people. This is not to say that everyone on the planet is schizophrenic, but rather that, as has become ‘normal’ in today’s thinking thanks to Daniel Kahneman’s classic book, ‘Thinking, Fast and Slow’, the way we all think is controlled by two different parts of our brain: crudely, our limbic ‘system’ and our prefrontal cortex. Our version of the same idea is another of our frequently used expressions: ‘a person makes a decision for two reasons, a good one and a real one’. The ‘good’ one being all the (slow) conscious stuff our prefrontal cortex comes up with to rationalize our (fast) limbic-generated ‘real’ reason decisions.
The implication of these two brain parts is that, whether we like it or not, any ‘system’ that includes human beings in effect possesses two ‘Coordination’ elements. If we generalize that a step further to encompass the idea of systems that comprise multiple humans, it means that all six of the minimum elements that make up the Law of Completeness are also doubled up. The Law of (Human) System Completeness, in other words, looks like this:
The implications of this, too, are potentially quite profound. Sticking with the ‘system’ that somehow manages to generate an ‘achievement gap’ outcome: rather than being a Type 2 ‘uncontrolled’ system, it might actually be a Type 3, Human system. Which perhaps then leads to the somewhat more depressing conclusion that there are humans operating in the education system who, despite the fact they might say (good reason) they don’t want an achievement gap, have limbic systems making decisions that will ensure an achievement gap is the generated outcome.