Chapter I Flashcards
inverse thinking
Inverting your thought process or goal:
Instead of being right more, be wrong less.
Instead of trying to make the most money investing, try to lose the least money.
antifragile
Concept from Nassim Taleb, opposite of fragility.
Beyond resilience or robustness: getting stronger from shocks, volatility, and randomness
first principles thinking
thinking from the bottom up, using basic building blocks of what you think is true to build sound conclusions
de-risking
testing the component assumptions of a larger assumption to gain confidence that your conclusion is correct
mvp
Minimum Viable Product
Prototyping an idea early and often
“If you’re not embarrassed by the first version of your product, you’ve launched too late.” - Reid Hoffman
Ockham’s Razor
The simplest explanation is the most likely.
Break down assumptions into component assumptions, ask yourself:
“What evidence do I have that this is true? Is this a false dependency?”
Conjunction fallacy
judging a specific conjunction of events as more likely than one of its components alone, latching onto unnecessary assumptions:
Lisa is concerned with social justice and majored in philosophy. She is outspoken at anti-nuclear demonstrations. Is it more likely that she is:
A. a bank teller
B. a bank teller and is active in the feminist movement
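(A is more likely: the probability of two conditions holding together can never exceed the probability of one of them alone.) A minimal sketch of that rule, using made-up probabilities chosen purely for illustration:

```python
# Conjunction fallacy sketch. The numbers below are hypothetical,
# chosen only to illustrate the rule P(A and B) <= P(A).
p_teller = 0.05                 # assumed P(Lisa is a bank teller)
p_feminist_given_teller = 0.30  # assumed P(active feminist | bank teller)

# Probability of the conjunction "teller AND active feminist"
p_both = p_teller * p_feminist_given_teller

# The conjunction can never be more probable than either conjunct,
# no matter what numbers you plug in.
assert p_both <= p_teller
print(p_both)
```

Whatever values you substitute, the assertion holds, which is why option B can never be more likely than option A.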
overfitting
when you use an overly complicated explanation when instead a simpler one will do
frame of reference
Heeding the influence of perspective, realizing that people/events can be unknowingly influenced by context.
framing
The way you present a situation or explanation. Think of how Fox News and CNN may present the same story
nudging
Giving soft cues to push someone in a certain direction
anchoring
relying too heavily on first impressions when making decisions
availability bias
a distortion that creeps into your objective decision-making due to information recently made available to you
filter bubble
filtering out information that is unfamiliar or opposed to your viewpoints, placing you in a bubble
third story
the story that a third, impartial observer would recount; forcing yourself to think as an impartial observer
most respectful interpretation
explaining a person’s behavior in the most respectful way possible
hanlon’s razor
never attribute to malice that which is adequately explained by carelessness
fundamental attribution error
making errors by attributing others’ behavior to their internal, fundamental characteristics rather than to external factors (e.g., concluding that someone IS mean or thoughtless, rather than considering that they may be tired or stressed by a client)
self serving bias
when you are the actor, you often have self-serving explanations for your behavior, but when you are the observer you tend to blame others’ intrinsic qualities
veil of ignorance
imagining ourselves ignorant of our particular place in the world, preventing us from knowing who we are when making decisions that affect others
just world hypothesis
the belief that people get what they deserve, good or bad, because of their actions alone, not accounting for luck or randomness (you reap what you sow)
victim-blame
victims of circumstance are blamed for their circumstances, with no accounting for other factors of randomness, birth lottery
learned helplessness
tendency to stop trying to escape difficult situations because we have gotten used to difficult conditions over time
paradigm shift
a fundamental change in the accepted way of thinking within a field; such shifts often come slowly because entrenched thinkers resist them (science ‘advances one funeral at a time’)
semmelweis reflex
the tendency to immediately reject new evidence or explanations that are out of line with conventional thinking, before they are thought through
confirmation bias
human tendency to gather and interpret new information in a biased way to confirm pre-existing beliefs
backfire effect
digging in further on a position when faced with clear evidence that disproves it
disconfirmation bias
where you impose a stronger burden of proof on ideas you don’t want to believe
cognitive dissonance
stress felt by holding two contradictory beliefs at once
thinking grey
Idea attributed to Steven Sample: instead of thinking about issues in terms of black and white, recognize that the truth often lies somewhere in the middle, a shade of grey
intuition-based decision making
realizing that trusting your gut can lead to anchoring, availability bias, framing, and other pitfalls; slow down and think deliberately instead.
You can use your intuition to guide investigation, but the investigation itself should be done with clear, thoughtful awareness and first principles thinking
proximate cause
the thing that immediately caused an outcome (where people tend to look and assign blame), as opposed to the root cause (what actually set the problem in motion):
Ex: Challenger explosion - the O-ring seal failure that led to the explosion (proximate cause) was itself caused by endemic internal mismanagement (root cause)
postmortem
examination of a prior situation to understand what happened and how it could go better next time
5 whys
keep asking ‘why did that happen’ until you reach root causes (the number of whys isn’t all that important, can be more or less than 5)
optimistic probability bias
where you want something to be true so badly that you fool yourself into thinking it is likely true
Feynman’s warning
“You must not fool yourself, and you are the easiest person to fool.” - Richard Feynman