Systems Flashcards

1
Q

General Definition of a System:

A

A set of connected things or parts forming a complex whole, or working together as parts of a mechanism or an interconnecting network.

2
Q

Scale

A

One of the most important principles of systems is that they are sensitive to scale: properties, behaviors, or outcomes tend to change when you scale a system up or down (change its quantities or levels). In studying complex systems, we must always be roughly quantifying – in orders of magnitude, at least – the scale at which we are observing, analyzing, or predicting the system. Systems work differently when their constituents are changed in quantity, intensity, etc.

Example: One police officer may be able to uphold law and order in a small village (small scale), but if the number of residents increases, that society may descend into chaos and dysfunction. One officer will not yield the same results on a large scale.

3
Q

Law of Diminishing Returns

A

Related to scale, most important real-world results are subject to an eventual decrease of incremental value. A good example would be a poor family: Give them enough money to thrive, and they are no longer poor. But after a certain point, additional money will not improve their lot; there is a clear diminishing return of additional dollars at some roughly quantifiable point. Often, the law of diminishing returns veers into negative territory – i.e., receiving too much money could destroy the poor family.
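
A minimal numerical sketch of the idea, assuming a logarithmic "utility of money" curve (the specific function and dollar amounts are illustrative assumptions, not something the card specifies): each additional dollar buys less and less improvement.

```python
import math

def utility(dollars):
    # Assumed logarithmic utility curve, a common stand-in for diminishing returns.
    return math.log(1 + dollars)

previous_amount, previous_utility = 0, utility(0)
for amount in (100, 1_000, 10_000, 100_000):
    current = utility(amount)
    per_dollar = (current - previous_utility) / (amount - previous_amount)
    print(f"{previous_amount:>7} -> {amount:>7} dollars: "
          f"utility gained per extra dollar = {per_dollar:.6f}")
    previous_amount, previous_utility = amount, current
```

The gain per extra dollar keeps shrinking as the totals grow, which is the diminishing-return pattern described above.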

4
Q

Pareto Principle

A

Named for Italian polymath Vilfredo Pareto, who noticed that 80% of Italy’s land was owned by about 20% of its population, the Pareto Principle states that a small amount of some phenomenon causes a disproportionately large effect. The Pareto Principle is an example of a power-law type of statistical distribution – as distinguished from a traditional bell curve – and is demonstrated in various phenomena ranging from wealth to city populations to important human habits.

Examples include the relationship between the amount of work done and the quality of the outcome, or the number of customers and their share of revenue; in 2002, Microsoft even reported that 80% of the errors and crashes in Windows were caused by 20% of the bugs involved.
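
A small simulation of the power-law idea, assuming a Pareto distribution with shape parameter alpha ≈ 1.16 (an illustrative choice, since that is the shape for which the classic 80/20 split roughly emerges):

```python
import random

random.seed(42)

# Draw 100,000 "wealth" values from a Pareto distribution.
population = sorted((random.paretovariate(1.16) for _ in range(100_000)),
                    reverse=True)

top_20_percent = population[: len(population) // 5]
share = sum(top_20_percent) / sum(population)
print(f"Share of total held by the top 20%: {share:.0%}")  # typically close to 80%
```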

5
Q

Feedback Loops

A

All complex systems are subject to positive and negative feedback loops whereby A causes B, which in turn influences A (and C), and so on – with higher-order effects frequently resulting from continual movement of the loop. In a homeostatic system, a change in A is often brought back into line by an opposite change in B to maintain the balance of the system (via negative feedback loops), as with the temperature of the human body or the behavior of an organizational culture. Automatic feedback loops maintain a “static” environment unless and until an outside force changes the loop. A “runaway feedback loop” (positive loop) describes a situation in which the output of a reaction becomes its own catalyst (auto-catalysis).
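
A minimal sketch of both loop types, using a crude thermostat-style correction (the set point, gain, and step counts are illustrative assumptions):

```python
# Negative feedback: each step, a correction opposes the deviation from the set point.
set_point = 37.0
temperature = 40.0   # perturbed starting value
gain = 0.5           # fraction of the deviation corrected per step

for step in range(1, 9):
    temperature += gain * (set_point - temperature)
    print(f"step {step}: temperature {temperature:.2f}")

# Positive (runaway) feedback: the output feeds its own growth (auto-catalysis).
value = 1.0
for step in range(1, 9):
    value += 0.5 * value
print(f"after 8 runaway steps: value {value:.1f}")
```

The negative loop pulls the temperature back toward 37, while the positive loop amplifies itself each step.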

6
Q

Chaos Dynamics / Butterfly Effect

A

In a world such as ours, governed by chaos dynamics, small changes (perturbations) in initial conditions have massive downstream effects as near-infinite feedback loops occur; this phenomenon is also called the butterfly effect. This means that some aspects of physical systems (like the weather more than a few days from now) as well as social systems (the behavior of a group of human beings over a long period) are fundamentally unpredictable.
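
A quick illustration of sensitivity to initial conditions, using the logistic map – a standard toy model of chaotic dynamics chosen here for illustration, not something the card specifies:

```python
# Logistic map x -> r*x*(1 - x) in its chaotic regime (r = 3.9).
r = 3.9
x_a, x_b = 0.500000, 0.500001   # initial conditions differing by one millionth

for step in range(1, 41):
    x_a = r * x_a * (1 - x_a)
    x_b = r * x_b * (1 - x_b)
    if step % 10 == 0:
        print(f"step {step:>2}: {x_a:.6f} vs {x_b:.6f} (gap {abs(x_a - x_b):.6f})")
```

Within a few dozen steps the two trajectories bear no resemblance to each other, even though they started one millionth apart.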

7
Q

Network Effects

A

A network tends to become more valuable as nodes are added to the network: this is known as the network effect. An easy example is contrasting the development of the electricity system and the telephone system. If only one house has electricity, its inhabitants have gained immense value, but if only one house has a telephone, its inhabitants have gained nothing of use. Only with additional telephones does the phone network gain value (value is positively correlated with the number of users).

These effects can be observed, for instance, in means of communication, social platforms, marketplaces, Uber, PayPal, currencies, etc.
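
One rough way to quantify this is Metcalfe's law – an assumed model, not stated on the card – in which a network's value scales with the number of possible pairwise connections between users:

```python
def possible_connections(users: int) -> int:
    # Number of distinct pairs among `users` nodes: n * (n - 1) / 2.
    return users * (users - 1) // 2

for users in (1, 2, 10, 100, 1_000):
    print(f"{users:>5} users -> {possible_connections(users):>7} possible connections")
```

One telephone allows zero connections; each additional user adds more potential links than the one before.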

8
Q

Preferential Attachment / Cumulative Advantage

A

A process in which some quantity or value, typically some form of wealth or credit, is distributed among a number of individuals or objects according to how much they already have – “the rich get richer”. Another example is the attractiveness of a large market or a large social platform: both are more attractive to people than their smaller counterparts (due to network effects), and this increases their numbers even further.
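
A minimal simulation of the rich-get-richer dynamic (the starting holdings and number of rounds are illustrative assumptions): each new unit is awarded with probability proportional to what the recipient already holds.

```python
import random

random.seed(1)

# Ten individuals start with one unit each.
holdings = [1] * 10
for _ in range(1_000):
    winner = random.choices(range(len(holdings)), weights=holdings)[0]
    holdings[winner] += 1

print(sorted(holdings, reverse=True))  # a few large holders typically dominate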

9
Q

Emergence

A

The condition of an entity having properties its parts do not have, due to interactions among the parts. Higher-level behaviors or results tend to emerge from the interaction of lower-order components. The result is frequently not linear – not a matter of simple addition – but rather non-linear, or exponential. An important resulting property of emergent behavior is that it cannot be predicted from simply studying the component parts. For instance, the phenomenon of life as studied in biology is an emergent property of chemistry.

10
Q

Irreducibility

A

The concept of a system only working or maintaining its properties if no constituents are removed. Most systems have irreducible quantitative properties, such as complexity, minimums, time, and length. Below the irreducible level, the desired result simply does not occur. One cannot get several women pregnant to reduce the time needed to have one child, one cannot reduce a successfully built automobile to a single part, and one cannot reduce water to its constituent elements (hydrogen and oxygen) without it losing its properties. These results are, to a defined point, irreducible.

11
Q

Tragedy of the Commons

A

This concept, introduced by the ecologist Garrett Hardin, states that individual users acting independently according to their own self-interest behave contrary to the common good of all users by depleting or spoiling a shared resource through their collective action. This occurs in a system where a common resource is shared, with no individual responsible for the wellbeing of the resource. The tragedy is reducible to incentives: unless people collaborate, each individual derives more personal benefit than the cost he or she incurs, and therefore depletes the resource for fear of missing out. Good examples are overfishing, climate change, pollution, and littering – all of which are combated through regulations and laws.

12
Q

Gresham’s Law / Copernicus Law

A

This law, named after the financier Thomas Gresham, states that in a system of circulating currency, forged currency will tend to drive out real currency, as real currency is hoarded and forged currency is spent. This also means that if two forms of commodity money are in circulation, which are accepted by law as having similar face value, the more valuable (with the higher intrinsic value) commodity will gradually disappear from circulation. This has happened with silver and copper coins when alternatives made from cheaper metals were introduced. We see a similar result in human systems, as with bad behavior driving out good behavior in a crumbling moral system, or bad practices driving out good practices in a crumbling economic system. Generally, regulation and oversight are required to prevent results that follow Gresham’s Law.

13
Q

Algorithms

A

While hard to define precisely, an algorithm is generally an automated set of rules or a “blueprint” leading through a series of steps or actions that results in a desired outcome, often stated in the form of a series of “If → Then” statements. Algorithms are best known for their use in modern computing, but are a feature of biological life as well. For example, human DNA contains an algorithm for building a human being.
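
A toy algorithm expressed as a series of “If → Then” rules (the example itself is hypothetical, chosen only to illustrate the form):

```python
def classify_water(temperature_celsius: float) -> str:
    # If at or below freezing, then ice; if at or above boiling, then steam.
    if temperature_celsius <= 0:
        return "ice"
    if temperature_celsius >= 100:
        return "steam"
    return "liquid"

for t in (-5, 20, 120):
    print(f"{t} °C -> {classify_water(t)}")
```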

14
Q

Fragility, Robustness, and Antifragility

A

Popularized by Nassim Taleb, the sliding scale of fragility, robustness, and antifragility refers to the responsiveness of a system to incremental negative variability. A fragile system or object is one in which additional negative variability has a disproportionately negative impact, as with a coffee cup shattering from a 6-foot fall, but receiving no damage at all (rather than 1/6th of the damage) from a 1-foot fall. A robust system or object tends to be neutral to the additional negative variability, and of course, an antifragile system benefits: if there were a cup that got stronger when dropped from 6 feet than when dropped from 1 foot, it would be termed antifragile.
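
A toy comparison of the three responses to shock size; the specific curve shapes are assumptions for illustration, not drawn from Taleb. Negative numbers mean harm, positive numbers mean benefit.

```python
def fragile(shock):      # harm grows disproportionately with shock size
    return -(shock ** 2)

def robust(shock):       # roughly indifferent to shock size
    return 0.0

def antifragile(shock):  # benefits from larger shocks, within limits
    return shock ** 0.5

for shock in (1, 2, 4, 6):
    print(f"shock {shock}: fragile {fragile(shock):>6.1f}, "
          f"robust {robust(shock):>4.1f}, antifragile {antifragile(shock):.2f}")
```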

15
Q

Backup Systems / Redundancy

A

A critical model of the engineering profession is that of backup systems. A good engineer never assumes the perfect reliability of the components of the system. He or she builds in redundancy to protect the integrity of the total system. Without the application of this robustness principle, tangible and intangible systems tend to fail over time. Systems with redundancies are resilient to adverse conditions: if one element fails, there is spare capacity or a backup.
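
A small sketch of why redundancy pays off, assuming independent components that each work with probability p = 0.9 and a system that only needs one of them to function (both figures are illustrative assumptions):

```python
def system_reliability(p: float, n: int) -> float:
    # The system fails only if all n independent redundant components fail.
    return 1 - (1 - p) ** n

for n in (1, 2, 3, 4):
    print(f"{n} redundant component(s): reliability {system_reliability(0.9, n):.4f}")
```

Each added backup multiplies the chance of total failure by the single-component failure rate, so reliability climbs quickly.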

16
Q

Margin for Error

A

Closely related to redundancies, a margin for error is an amount (usually small) that is allowed for in case of miscalculation or change of circumstances. In an unknown world, driving a 9,500-pound bus over a bridge built to hold precisely 9,600 pounds is rarely seen as intelligent. Thus, on the whole, few modern bridges ever fail. In practical life outside of physical engineering, we can often profitably give ourselves margins as robust as the bridge system.

17
Q

Murphy’s Law

A

This is an adage stating that anything that can go wrong will go wrong.

While this is not objectively true in reality, remembering this law helps us calculate risk and implement margins of error or redundancies, especially when the system's function is very important. The reason the law often seems to be true is that the law of large numbers means events predicted by Murphy's Law will occasionally occur, and selection bias ensures that those occurrences are remembered while the many times the law did not hold are forgotten.
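
A quick sketch of the law-of-large-numbers point, assuming a failure mode that occurs once in 10,000 uses (an illustrative figure): over enough uses, seeing it at least once becomes near-certain.

```python
p_failure = 1 / 10_000

for uses in (100, 10_000, 1_000_000):
    # Probability of at least one failure across independent uses.
    at_least_once = 1 - (1 - p_failure) ** uses
    print(f"{uses:>9} uses -> P(at least one failure) = {at_least_once:.3f}")
```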

18
Q

Via Negativa – Omission/Removal/Avoidance of Harm

A

In many systems, improvement is at best, or at times only, a result of removing bad elements rather than of adding good elements. This is a credo built into the modern medical profession: First, do no harm. Similarly, if one has a group of children behaving badly, removal of the instigator is often much more effective than any form of punishment meted out to the whole group. The same principle can be observed in the profits of a business; sometimes cost reduction by removal of unnecessary parts can yield a better result than a sales increase.

19
Q

The Lindy Effect

A

The Lindy Effect is a concept implying that the future life expectancy of some non-perishable things, like a technology, an idea, or a religion, is proportional to their current age, so that every additional period of survival implies a longer remaining life expectancy. If an idea or object has lasted for X number of years, it would be expected (on average) to last another X years. While this is obviously not true for organisms, non-perishables lengthen their life expectancy as they continue to survive. A classic text is a prime example: if humanity has been reading Shakespeare's plays for 500 years, it can be expected to read them for another 500. Things that have been in existence for a long period of time can be considered more robust/antifragile, i.e., more likely to continue to survive, than new things that haven't passed the test of time.

20
Q

Complex Systems

A

Complex systems are systems of dynamic networks of interactions whose relationships are not aggregations of the individual static entities, i.e., the behavior of the ensemble is not predicted by the behavior of the components. Examples are cells in organisms, molecules formed out of atoms, and the formation of weather. One feature of complex systems is therefore emergence; another is non-linearity, meaning they may respond in different ways to the same input depending on their state or context.

21
Q

Complex Adaptive Systems

A

A complex adaptive system, as distinguished from a complex system in general, is one that can understand itself and change based on that understanding. Complex adaptive systems are social systems. The difference is best illustrated by thinking about weather prediction contrasted to stock market prediction. The weather will not change based on an important forecaster’s opinion, but the stock market might. Complex adaptive systems are thus fundamentally not predictable. Other examples are markets, politics, or societies.

22
Q

Complicated vs. Complex

A

A complicated system is one that may have many components, making it difficult to understand, but the components can be separated and dealt with in a systematic and logical way that relies on a set of static rules or algorithms.

A complex system, on the other hand, is one for which there are no rules, algorithms, or natural laws. Things that are complex have no such degree of order, control, or predictability. A complex system is different from the sum of its parts, because its parts interact in unpredictable ways.

Therefore, complicated problems like building a reusable (SpaceX) rocket can be solved with enough logic and resources, while complex ones like predicting the evolution of viruses, the weather, or the stock market cannot.