Systems Flashcards
General Definition of a System:
A set of connected things or parts forming a complex whole, or working together as parts of a mechanism or an interconnecting network.
Scale
One of the most important principles of systems is that they are sensitive to scale: properties, behaviors, or outcomes tend to change when a system is scaled up or down (when quantities or levels change). In studying complex systems, we must always be roughly quantifying – in orders of magnitude, at least – the scale at which we are observing, analyzing, or predicting the system. Systems work differently when their constituents are changed in quantity, intensity, etc.
Example: One police officer may be able to uphold law and order in a small village (small scale), but if the number of residents increases, the society may descend into chaos and become dysfunctional. One officer will not yield the same results on a large scale.
Law of Diminishing Returns
Related to scale, most important real-world results are subject to an eventual decrease of incremental value. A good example would be a poor family: Give them enough money to thrive, and they are no longer poor. But after a certain point, additional money will not improve their lot; there is a clear diminishing return of additional dollars at some roughly quantifiable point. Often, the law of diminishing returns veers into negative territory – i.e., receiving too much money could destroy the poor family.
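A minimal numerical sketch of the idea, assuming a concave (square-root) "well-being" function purely for illustration: each additional block of money adds less value than the previous one.

```python
import math

# Hypothetical concave "well-being" function: utility grows with money,
# but each extra dollar adds less than the one before (diminishing returns).
def utility(dollars):
    return math.sqrt(dollars)

previous = 0.0
for dollars in range(0, 50_001, 10_000):
    gain = utility(dollars) - previous   # incremental value of the last 10,000
    print(f"at ${dollars:>6}: total utility {utility(dollars):6.1f}, marginal gain {gain:6.1f}")
    previous = utility(dollars)
```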
Pareto Principle
Named for Italian polymath Vilfredo Pareto, who noticed that 80% of Italy’s land was owned by about 20% of its population, the Pareto Principle states that a small amount of some phenomenon causes a disproportionately large effect. The Pareto Principle is an example of a power-law type of statistical distribution – as distinguished from a traditional bell curve – and is demonstrated in various phenomena ranging from wealth to city populations to important human habits.
Examples include the relationship between the amount of work done and the quality of the outcome, or between the number of customers and their share of revenue; in 2002, Microsoft reported that 80% of the errors and crashes in Windows were caused by 20% of the bugs involved.
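A rough numerical illustration, using a randomly drawn power-law (Pareto) sample rather than real wealth data: the top 20% of draws typically account for a large majority of the total.

```python
import random

random.seed(42)

# Draw 10,000 values from a Pareto-type power-law distribution.
# A shape parameter alpha of roughly 1.16 reproduces an approximate 80/20 split.
alpha = 1.16
values = sorted((random.paretovariate(alpha) for _ in range(10_000)), reverse=True)

top_20_percent = values[: len(values) // 5]
share = sum(top_20_percent) / sum(values)
print(f"share held by the top 20%: {share:.0%}")
```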
Feedback Loops
All complex systems are subject to positive and negative feedback loops whereby A causes B, which in turn influences A (and C), and so on – with higher-order effects frequently resulting from continual movement of the loop. In a homeostatic system, a change in A is often brought back into line by an opposite change in B to maintain the balance of the system (via negative feedback loops), as with the temperature of the human body or the behavior of an organizational culture. Automatic feedback loops maintain a “static” environment unless and until an outside force changes the loop. A “runaway feedback loop” (positive loop) describes a situation in which the output of a reaction becomes its own catalyst (auto-catalysis).
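A toy simulation of a negative (homeostatic) feedback loop, loosely modeled on a thermostat; the parameter values are invented for illustration.

```python
# Negative feedback: the further the temperature drifts from the set point,
# the harder the system pushes it back, so the loop converges.
set_point = 37.0          # target "body temperature"
temperature = 40.0        # start out of balance
gain = 0.5                # strength of the corrective response

for step in range(10):
    error = temperature - set_point
    temperature -= gain * error          # correction opposes the deviation
    print(f"step {step}: {temperature:.2f}")

# A positive (runaway) loop would instead amplify the deviation,
# e.g. temperature += gain * error, and the value would diverge instead of settling.
```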
Chaos Dynamics / Butterfly Effect
In a world such as ours, governed by chaos dynamics, small changes (perturbations) in initial conditions have massive downstream effects as near-infinite feedback loops occur; this phenomenon is also called the butterfly effect. This means that some aspects of physical systems (like the weather more than a few days from now) as well as social systems (the behavior of a group of human beings over a long period) are fundamentally unpredictable.
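The logistic map is a standard minimal demonstration of this sensitivity: two trajectories that start a millionth apart diverge completely after a few dozen iterations.

```python
# Logistic map x -> r*x*(1-x) in its chaotic regime (r = 4.0).
r = 4.0
a, b = 0.200000, 0.200001   # initial conditions differing by one part in a million

for step in range(1, 41):
    a = r * a * (1 - a)
    b = r * b * (1 - b)
    if step % 10 == 0:
        print(f"step {step:2d}: {a:.6f} vs {b:.6f}  (difference {abs(a - b):.6f})")
```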
Network Effects
A network tends to become more valuable as nodes are added to it: this is known as the network effect. An easy example is contrasting the development of the electricity system and the telephone system. If only one house has electricity, its inhabitants have gained immense value, but if only one house has a telephone, its inhabitants have gained nothing of use. Only with additional telephones does the phone network gain value (value is positively correlated with the number of users).
These effects can be observed, for instance, in means of communication, social platforms, markets, Uber, PayPal, currencies, etc.
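One common back-of-the-envelope way to quantify this (often called Metcalfe's law, used here only as a rough model): a network's value grows with the number of possible pairwise connections, n(n-1)/2, which rises much faster than the number of users itself.

```python
# Number of possible pairwise connections among n users: n*(n-1)/2.
def possible_connections(users):
    return users * (users - 1) // 2

for users in (1, 2, 10, 100, 1_000):
    print(f"{users:>5} users -> {possible_connections(users):>8} possible connections")
```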
Preferential Attachment / Cumulative Advantage
A process in which some quantity or value, typically some form of wealth or credit, is distributed among a number of individuals or objects according to how much they already have – “the rich get richer.” Another example is the attractiveness of a large market or a large social platform: both are more attractive to people than their smaller counterparts (due to network effects), and this increases their numbers even further.
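A Pólya-urn-style sketch of “the rich get richer,” with arbitrary numbers: each new unit of wealth goes to an individual with probability proportional to what they already hold.

```python
import random

random.seed(1)

# Five individuals start with one unit of wealth each.
wealth = [1, 1, 1, 1, 1]

# Distribute 1,000 additional units; each unit goes to individual i
# with probability proportional to wealth[i] (preferential attachment).
for _ in range(1_000):
    winner = random.choices(range(len(wealth)), weights=wealth)[0]
    wealth[winner] += 1

print(sorted(wealth, reverse=True))  # typically one or two holders dominate
```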
Emergence
The condition of an entity having properties its parts do not have, due to interactions among the parts. Higher-level behaviors or results tend to emerge from the interaction of lower-order components. The result is frequently not linear – not a matter of simple addition – but rather non-linear, or exponential. An important resulting property of emergent behavior is that it cannot be predicted from simply studying the component parts. For instance, the phenomenon of life as studied in biology is an emergent property of chemistry.
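Conway's Game of Life is a textbook illustration: each cell follows trivial local rules (count your eight neighbours, live or die accordingly), yet moving structures such as the “glider” emerge at the level of the whole grid. A compact sketch:

```python
from collections import Counter

# Conway's Game of Life: a cell is alive next generation if it has exactly
# three live neighbours, or exactly two and is currently alive.
def step(live_cells):
    counts = Counter((x + dx, y + dy)
                     for (x, y) in live_cells
                     for dx in (-1, 0, 1)
                     for dy in (-1, 0, 1)
                     if (dx, dy) != (0, 0))
    return {cell for cell, n in counts.items()
            if n == 3 or (n == 2 and cell in live_cells)}

cells = {(1, 0), (2, 1), (0, 2), (1, 2), (2, 2)}   # a "glider"
for _ in range(4):
    cells = step(cells)
print(sorted(cells))   # the same glider shape, shifted one cell diagonally
```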
Irreducibility
The concept of a system only working, or maintaining its properties, if no constituents are removed. Most systems have irreducible quantitative properties, such as complexity, minimums, time, and length. Below the irreducible level, the desired result simply does not occur. One cannot get several women pregnant to reduce the amount of time needed to have one child, one cannot reduce a successfully built automobile to a single part, and one cannot reduce water to its fundamental parts (hydrogen and oxygen) without it losing its properties. These results are, to a defined point, irreducible.
Tragedy of the Commons
This concept, introduced by the ecologist Garrett Hardin, states that individual users acting independently according to their own self-interest behave contrary to the common good of all users by depleting or spoiling a shared resource through their collective action. This occurs in a system where a common resource is shared, with no individual responsible for its wellbeing. The tragedy is reducible to incentives: unless people collaborate, each individual derives more personal benefit than the cost that he or she incurs, and therefore depletes the resource for fear of missing out. Good examples are over-fishing, climate change, pollution, and littering – all of which are combated through regulations and laws.
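A toy model of the incentive problem, with made-up numbers: a pasture regrows a fixed amount each season, but each of several herders grazes more than their sustainable share, so the stock collapses.

```python
# Shared pasture: regrows 100 units per season, shared by 5 herders.
# The sustainable share per herder is 20 units, but each takes 25 because the
# personal benefit of the extra grazing exceeds their share of the damage.
stock = 500.0
regrowth_per_season = 100.0
herders = 5
take_per_herder = 25.0   # more than the sustainable share of 100 / 5

season = 0
while stock > 0 and season < 20:
    stock += regrowth_per_season
    stock -= herders * take_per_herder
    season += 1
    print(f"season {season:2d}: stock {max(stock, 0):6.1f}")
```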
Gresham’s Law / Copernicus Law
This law, named after the financier Thomas Gresham, states that in a system of circulating currency, “bad” money (debased or forged) tends to drive out “good” money, as the good money is hoarded and the bad money is spent. If two forms of commodity money are in circulation and are accepted by law at a similar face value, the more valuable commodity (the one with the higher intrinsic value) will gradually disappear from circulation. This happened with silver and copper coins when alternatives made from cheaper metals were introduced. We see a similar result in human systems, as with bad behavior driving out good behavior in a crumbling moral system, or bad practices driving out good practices in a crumbling economic system. Generally, regulation and oversight are required to prevent results that follow Gresham’s Law.
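A minimal agent-level sketch of the hoarding mechanism, with invented quantities: when both coin types must be accepted at the same face value, an agent spends the debased coins first and keeps the full-value ones, so only “bad” coins keep circulating.

```python
# One agent holds a mix of "good" (full-silver) and "bad" (debased) coins.
# Both must be accepted at the same face value, so when paying, the agent
# always hands over bad coins first and hoards the good ones.
wallet = {"good": 50, "bad": 50}
circulating = {"good": 0, "bad": 0}

for payment in range(60):          # 60 payments of one coin each
    kind = "bad" if wallet["bad"] > 0 else "good"
    wallet[kind] -= 1
    circulating[kind] += 1

print("coins hoarded:    ", wallet)        # mostly good coins remain hoarded
print("coins circulating:", circulating)   # circulation dominated by bad coins
```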
Algorithms
While hard to precisely define, an algorithm is generally an automated set of rules or a “blueprint” leading through a series of steps or actions that result in a desired outcome, often stated in the form of a series of “If → Then” statements. Algorithms are best known for their use in modern computing, but are a feature of biological life as well. For example, human DNA contains an algorithm for building a human being.
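A classic small example (Euclid's algorithm for the greatest common divisor), chosen here simply to show the “If → Then” form: a short, fixed series of steps that always terminates with the desired outcome.

```python
# Euclid's algorithm: if the remainder is not zero, then keep reducing.
def gcd(a, b):
    while b != 0:
        a, b = b, a % b
    return a

print(gcd(48, 36))    # 12
print(gcd(270, 192))  # 6
```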
Fragility, Robustness, and Antifragility
Popularized by Nassim Taleb, the sliding scale of fragility, robustness, and antifragility refers to the responsiveness of a system to incremental negative variability. A fragile system or object is one in which additional negative variability has a disproportionately negative impact, as with a coffee cup shattering from a 6-foot fall but receiving no damage at all (rather than 1/6th of the damage) from a 1-foot fall. A robust system or object tends to be neutral to the additional negative variability, and an antifragile system benefits from it: if there were a cup that got stronger when dropped from 6 feet than when dropped from 1 foot, it would be termed antifragile.
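A rough numerical sketch using made-up response functions: the fragile object's damage grows disproportionately with shock size (like the dropped cup), the robust one responds roughly in proportion, and the antifragile one actually improves under larger shocks. Only the shapes of the curves matter, not the exact formulas.

```python
# Compare how three hypothetical systems respond to shocks of growing size.
def fragile(shock):      # damage accelerates: a 6x shock does far more than 6x harm
    return -(shock ** 3)

def robust(shock):       # roughly proportional, no disproportionate effect
    return -shock

def antifragile(shock):  # gains from stress, and gains more from bigger shocks
    return shock ** 1.5

for shock in (1, 2, 4, 6):
    print(f"shock {shock}: fragile {fragile(shock):7.1f}, "
          f"robust {robust(shock):5.1f}, antifragile {antifragile(shock):5.1f}")
```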
Backup Systems / Redundancy
A critical model from the engineering profession is that of backup systems. A good engineer never assumes the perfect reliability of the components of a system; he or she builds in redundancy to protect the integrity of the total system. Without the application of this robustness principle, tangible and intangible systems tend to fail over time. Systems with redundancy are resilient to adverse conditions: if one element fails, there is spare capacity or a backup.
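The standard reliability arithmetic behind this: if each independent component works with probability p, a system that needs only one of n redundant copies fails only when all n fail, so it works with probability 1 - (1 - p)^n. A small sketch with an assumed component reliability of 90%:

```python
# Probability that a system with n redundant, independent components works,
# when any single working component is enough (parallel redundancy).
def system_reliability(p_component, n_copies):
    return 1 - (1 - p_component) ** n_copies

p = 0.9   # each component works 90% of the time
for n in (1, 2, 3, 4):
    print(f"{n} component(s): system works with probability {system_reliability(p, n):.4f}")
```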