Information theory Flashcards
what is the main description of information theory?
- about noise (anything causing ambiguity or interference in communication) relative to communication
- information is what we communicate
what kind of theory is information theory?
- normative theory, interested in the efficiency of the process
what is information theory sensitive to? (2)
- time (sequence of events, time series, stochastic processes, probability in time)
- behaviour changes in time and space (dynamic and stochastic)
- context (noise, other stimuli)
- Ex. coyotes don’t howl on windy nights
what is the focus of information theory?
- focus=transmission
- also interested in reception
what is information in information theory? (7)
- choice/difference → information offers a choice in stimuli produced
- order → communication often not orderly
- pattern
- entropy (disorder, mostly due to noise)
- redundancy (e.g. English works well in a noisy channel because its redundancy aids error correction)
- uncertainty/probability → flexibility; adding degrees of freedom allows change
- omission → information can be defined by its absence
Terrence Deacon
- definition of information by omission
- the absence of details can be as telling as their presence
- info becomes evident when we don’t have it
omission in regards to information
- info becomes evident when we don’t have it
- the amygdala picks up on missing information at an unconscious level (animals likely have theory of mind) → even when aware, we may not be able to tell what is going on
- ability to pick up the difference
difference between subcortical and cortical parts of brain in animals
- subcortical parts of the brain are similar across animals, yet cortical parts differ greatly (social behaviors may be regulated by basic subcortical processes)
- evidence of divergent evolution
- bird and mammal brains evolved separately from a shared reptilian foundation
what did Walter Freeman suggest about information?
- information is good, but sometimes there is too much
- what is important is the meaning/ relevance of the information
what kind of system is used within information theory? describe what it is?
- a binary system (0s and 1s)
- each position holds either a 1 or a 0; sequences of bits represent the corresponding symbols
- strips out irrelevant info and keeps only what is decided to be relevant
- each digit is a bit (binary digit) → analogous to a coin toss
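A minimal sketch (my own illustration in Python, not from the notes) of the "bit = coin toss" analogy: identifying one of N equally likely symbols takes log2(N) coin-toss-like decisions.

```python
import math

# Each fair coin toss carries one bit: it decides between two
# equally likely alternatives (heads/tails, 1/0). Picking one
# symbol out of n equally likely symbols takes log2(n) such decisions.

def bits_needed(n_symbols: int) -> float:
    """Bits required to identify one of n equally likely symbols."""
    return math.log2(n_symbols)

print(bits_needed(2))   # a coin toss: 1.0 bit
print(bits_needed(26))  # one letter of a 26-letter alphabet: ~4.7 bits
```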
entropy
- = amount of information you have
- look at the symbols used in the message (how much you have to process) and the symbols available in the language/code (e.g. size of the alphabet)
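A sketch of the point above (my own Python example, message and probabilities assumed): entropy can be estimated from the symbol frequencies actually used in a message, and compared with the maximum for the whole alphabet available in the code.

```python
import math
from collections import Counter

def message_entropy(msg: str) -> float:
    """Entropy (bits/symbol) estimated from symbol frequencies in msg."""
    counts = Counter(msg)
    n = len(msg)
    return sum(-(c / n) * math.log2(c / n) for c in counts.values())

msg = "aabbaabb"
print(message_entropy(msg))  # only 'a' and 'b', equally frequent -> 1.0 bit
print(math.log2(26))         # maximum for a 26-letter alphabet: ~4.7 bits
```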
Information for Shannon
- refers to entropy; receiving a message reduces the uncertainty over the possible alternatives
- unexpected messages give more information
- average logarithm of the improbability of the message; a measure of unexpectedness/surprise
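The "average logarithm of the improbability" can be written out directly (a sketch; the probabilities below are made up for illustration): the surprisal of a message with probability p is log2(1/p), and entropy is the probability-weighted average of surprisal.

```python
import math

def surprisal(p: float) -> float:
    """log2 of the improbability: rarer messages are more surprising."""
    return math.log2(1 / p)

def entropy(probs: list[float]) -> float:
    """Average surprisal, weighted by each message's probability."""
    return sum(p * surprisal(p) for p in probs if p > 0)

# A fair coin: two outcomes, each p = 0.5 -> 1 bit of entropy.
print(entropy([0.5, 0.5]))   # 1.0

# A biased coin is more predictable, so it carries less information.
print(entropy([0.9, 0.1]))   # ~0.469 bits
```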
how does redundancy affect information?
- redundancy is the antidote to confusion (if confused, we repeat)
- more redundancy = less efficiency but better error correction
- the more regular/ordered a message is, the more predictable and redundant it is
- more redundant = contains less info (but reduces confusion)
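The efficiency/error-correction trade-off can be sketched with a simple repetition code (my own illustration, not from the notes): sending each bit three times triples the cost, but lets the receiver outvote a single flipped bit.

```python
def encode(bits):
    """Repetition code: send each bit three times (3x less efficient)."""
    return [b for b in bits for _ in range(3)]

def decode(received):
    """Majority vote over each group of three corrects any single flip."""
    return [1 if sum(received[i:i + 3]) >= 2 else 0
            for i in range(0, len(received), 3)]

msg = [1, 0, 1]
sent = encode(msg)   # [1, 1, 1, 0, 0, 0, 1, 1, 1]
sent[4] = 1          # noise flips one bit in the channel
print(decode(sent))  # [1, 0, 1] -- the error is corrected
```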
how do uncertainty and information relate?
- more information= less uncertainty
- uncertainty increases with number of possible alternatives
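Both points above can be shown in a few lines (my own sketch, numbers assumed): uncertainty grows with the number of equally likely alternatives, and each bit of information received halves the alternatives that remain.

```python
import math

# "More information = less uncertainty": each bit received
# halves the number of remaining equally likely alternatives.
alternatives = 16
print(math.log2(alternatives))  # 4.0 bits of uncertainty

# Receiving 1 bit (e.g. "the answer is in the top half") halves the options:
alternatives //= 2
print(math.log2(alternatives))  # 3.0 bits -- uncertainty dropped by 1
```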