SSR Exam 4 Flashcards
global network analysis
looks at the overall structure of the network
local network analysis
looks at the differences between nodes in terms of connectivity, centrality, etc.
nodes
set of entities
edges
the connections through which nodes are linked
hysteresis
in highly connected networks (e.g., a vulnerability network), more is needed to get back to the normal state after experiencing a traumatic life event
degrees
number of connections
betweenness
how often a node lies on the shortest path between two other nodes
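a minimal sketch (assuming the networkx library; the symptom network and its node names are made up for illustration) of how degree and betweenness can be computed:

```python
# illustrative symptom network: nodes are symptoms, edges are associations
import networkx as nx

G = nx.Graph()
G.add_edges_from([
    ("worry", "insomnia"),
    ("insomnia", "fatigue"),
    ("fatigue", "concentration"),
    ("worry", "irritability"),
    ("irritability", "insomnia"),
])

print(nx.degree_centrality(G))       # number of connections (normalised)
print(nx.betweenness_centrality(G))  # how often a node lies on shortest paths
```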
network theory
mental disorders are alternative stable states in a symptom network
small world structure
high level of clustering, small average path lengths.
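a minimal sketch of the two ingredients of a small-world structure, using networkx and a Watts-Strogatz graph (the parameter values are illustrative):

```python
import networkx as nx

# ring lattice with a fraction p of edges rewired at random
G = nx.connected_watts_strogatz_graph(n=100, k=6, p=0.1, seed=1)

print(nx.average_clustering(G))            # high clustering
print(nx.average_shortest_path_length(G))  # short average path length
```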
goal of psych assessment
to characterise an individual's standing on an individual-differences construct of clinical relevance
latent trait models
posit the presence of one or more underlying continuous distributions
latent class models
based on the supposition of a latent group (class) structure for a personality construct's distribution; they are typically evaluated via latent class analysis
hybrid models
these models combine the continuous aspects of the latent trait models with the discrete aspects of latent class models
zones of rarity
locations along the dimension that are occupied by few or no individuals
quasi-continuous
a construct bounded at the low end by zero, i.e., a complete absence of the quality corresponding to the construct
discrimination
measure of how strongly the item taps into the latent trait
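a minimal sketch of discrimination in a two-parameter logistic item response function (the 2PL form and all parameter values are illustrative assumptions, not taken from the flashcards):

```python
import numpy as np

def p_endorse(theta, a, b):
    """Probability of endorsing the item given trait level theta,
    discrimination a, and difficulty b."""
    return 1.0 / (1.0 + np.exp(-a * (theta - b)))

theta = np.linspace(-3, 3, 7)
print(p_endorse(theta, a=2.0, b=0.0))  # steep curve: item discriminates strongly
print(p_endorse(theta, a=0.5, b=0.0))  # flat curve: item discriminates weakly
```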
assumption of conditional independence
classes are defined by patterns of item endorsement across individuals, assuming that inter-item correlations solely reflect class membership
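a minimal sketch of conditional independence in a latent class model: given class membership, the probability of a response pattern is the product of the item endorsement probabilities (all numbers are made up for illustration):

```python
import numpy as np

class_probs = np.array([0.7, 0.3])       # P(class)
endorse = np.array([[0.9, 0.8, 0.2],     # P(item endorsed | class 0)
                    [0.1, 0.3, 0.7]])    # P(item endorsed | class 1)
pattern = np.array([1, 0, 1])            # one observed response pattern

# P(pattern | class) factorises over items by conditional independence
p_given_class = np.prod(np.where(pattern == 1, endorse, 1 - endorse), axis=1)
p_pattern = np.sum(class_probs * p_given_class)  # marginal over classes
print(p_pattern)
```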
self organisation
a process whereby some form of overall order arises from local interactions between parts of an initially disordered system. the process can be spontaneous when sufficient energy is available, not needing control by any external agent.
bimodality
under exactly the same circumstances, two stable states are possible
hysteresis
is the dependence of the state of a system on its history
finite mixture models
the distribution of the data (e.g., length) is not described by a single distribution (e.g., a normal distribution) but by a weighted sum of distributions
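a minimal sketch of a finite mixture: the density is a weighted sum of two normal densities (the weights, means, and standard deviations are illustrative):

```python
import numpy as np
from scipy.stats import norm

x = np.linspace(120, 220, 5)          # e.g., length in cm
weights = [0.4, 0.6]
components = [norm(loc=155, scale=7), norm(loc=180, scale=8)]

# mixture density = weighted sum of the component densities
density = sum(w * comp.pdf(x) for w, comp in zip(weights, components))
print(density)
```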
latent classes
both observed and latent variables are categorical
wilcoxon signed rank test
a non-parametric alternative to the paired-samples t-test. it assigns + or − signs to the differences between two repeated measures and ranks their absolute values
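a minimal sketch of the test in SciPy, with made-up repeated-measures data:

```python
from scipy.stats import wilcoxon

pre  = [12, 15, 9, 20, 17, 14, 11, 18]
post = [11, 13, 6, 16, 12, 8, 18, 10]

# signs and ranks the paired differences pre - post
stat, p = wilcoxon(pre, post)
print(stat, p)
```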
reflective latent variable model
the attribute is seen as the common cause of observed scores: neuroticism causes worrying about things going wrong
formative latent variable model
observed scores define or determine the attribute
subjective probability
degree of conviction we have in a hypothesis.
P(H|D) is proportional to P(D|H) x P(H) says that
your posterior is proportional to the likelihood times the prior
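a minimal sketch of "posterior proportional to likelihood times prior", using a grid approximation for a binomial proportion (the data, 7 successes in 10 trials, and the flat prior are illustrative):

```python
import numpy as np
from scipy.stats import binom

theta = np.linspace(0, 1, 101)          # grid of candidate population values
prior = np.ones_like(theta)             # flat prior
likelihood = binom.pmf(7, 10, theta)    # P(D | H) for each candidate value

posterior = likelihood * prior          # proportional to the posterior
posterior /= posterior.sum()            # normalise over the grid
print(theta[np.argmax(posterior)])      # posterior mode, here 0.7
```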
likelihood
if you want to update your personal probability in a hypothesis, the likelihood tells you everything you need to know about the data. it captures all support for a hypothesis provided by the data
likelihood principle
the notion that all the information relevant to inference contained in data is provided by the likelihood
probability density distribution
used when the dependent variable can be assumed to vary continuously (that is, the values do not come in steps)
p-value
the probability of obtaining a test statistic at least as extreme as the one observed, given that the null hypothesis is really true
bayes theorem
says that posterior is proportional to likelihood times prior
flat prior/uniform prior
a prior with infinite standard deviation; you think all population values are equally likely
bayes factor
the Bayesian equivalent of null hypothesis significance testing; the ratio of the probability of the data under one hypothesis to the probability of the data under another
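a minimal sketch of a Bayes factor for a binomial test, assuming H0 fixes the success probability at 0.5 and H1 places a flat prior on it (the data are made up):

```python
from scipy.stats import binom
from scipy.special import comb, beta

k, n = 7, 10
p_data_h0 = binom.pmf(k, n, 0.5)                 # likelihood under the point null
p_data_h1 = comb(n, k) * beta(k + 1, n - k + 1)  # marginal likelihood under a flat prior

bf10 = p_data_h1 / p_data_h0  # evidence for H1 relative to H0
print(bf10)
```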
what is bayes statistics?
using probability to represent uncertainty in all parts of a statistical model; a flexible extension of maximum likelihood
bayesian data analysis is a method for figuring out unknowns that requires three things
data, a generative model, and priors (what information the model has before seeing the data)
P(X)
degree of belief that X is true.
probability
is a measure of the degree of belief or confidence one has in the truth of a proposition
bayesian program steps
find a way of assigning a number to a person’s degree of belief, show that a rational betting strategy must satisfy the rules of probability theory, adopt bayes’ rule as a general principle for how to learn from experience
p value
the probability of encountering a test statistic at least as extreme as the one observed, given that the null hypothesis is true
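a minimal sketch of this definition for a two-sided one-sample t-test (the data and the test value of 5.0 are made up):

```python
import numpy as np
from scipy.stats import t, ttest_1samp

x = np.array([5.1, 4.8, 5.6, 5.0, 5.4, 4.9, 5.3, 5.2])
tstat = (x.mean() - 5.0) / (x.std(ddof=1) / np.sqrt(len(x)))
p = 2 * t.sf(abs(tstat), df=len(x) - 1)   # tail area beyond the observed statistic

print(p)
print(ttest_1samp(x, 5.0).pvalue)         # same value from SciPy directly
```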
confidence interval
an X% confidence interval for a parameter θ is an interval (L, U) generated by an algorithm that, in repeated use, has an X% chance of capturing the true value of θ
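a minimal simulation sketch of the repeated-use reading: across many simulated samples, roughly 95% of the intervals should capture the true mean (all settings are illustrative):

```python
import numpy as np
from scipy.stats import t

rng = np.random.default_rng(1)
true_mean, n, reps = 100.0, 25, 10_000
crit = t.ppf(0.975, df=n - 1)

hits = 0
for _ in range(reps):
    x = rng.normal(true_mean, 15, size=n)
    half = crit * x.std(ddof=1) / np.sqrt(n)     # half-width of the 95% interval
    hits += (x.mean() - half <= true_mean <= x.mean() + half)

print(hits / reps)  # close to 0.95
```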
classical definition of probability
the proportion of occurrences when a particular experiment is repeated infinitely often (under different circumstances)