R code Flashcards

1
Q

emmeans()

A

provides the estimated marginal means for each group (emmeans package)
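E.g. a minimal sketch, assuming a fitted model object mod with a factor predictor group (hypothetical names) and the emmeans package loaded:
emmeans(mod, ~ group)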

2
Q

lm()

A

fits a linear model
syntax: lm(DV ~ IV, data = dataset)
multiple predictors are separated by +
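A minimal sketch, assuming a hypothetical data frame df with columns score, group and age:
mod <- lm(score ~ group + age, data = df)
summary(mod)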

3
Q

slice()

A

shows a specified slice (subset of rows) of the data set (dplyr)
e.g. rows 1 to 6 when you specify slice(data, 1:6)

4
Q

summary()

A

provides a summary output of the model
gives the coefficients, R^2, F test and degrees of freedom

5
Q

round()

A

rounds values to a specified number of decimal places

6
Q

confint()

A

provides confidence intervals for the model coefficients

7
Q

z_score()

A

standardises coefficients

8
Q

tab_model()

A

produces a formatted full results table (sjPlot package)

9
Q

head()

A

shows the first rows of the dataset (six by default)

10
Q

t.test()

A

runs a t-test (one-sample, independent samples, or paired)
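E.g. a sketch, assuming a hypothetical data frame df with a numeric score and a two-level factor group:
t.test(score ~ group, data = df)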

11
Q

contrasts()

A

shows or sets the contrast (dummy) coding used when you have a factor variable in the model
by default, the first level of the factor is taken as the baseline

12
Q

contr.treatment()

A

specifies dummy (treatment) coding

13
Q

in dummy coding in R, what does base = specify?

A

the number of the level you want as the baseline, e.g. base = 2
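A minimal sketch, assuming a hypothetical factor df$group with three levels, taking the second level as the baseline:
contrasts(df$group) <- contr.treatment(3, base = 2)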

14
Q

how do you specify interactions in R?

A

with * (main effects plus the interaction) or : (the interaction term only)
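E.g. a sketch with hypothetical names y, a, b and df:
lm(y ~ a * b, data = df)   # same as lm(y ~ a + b + a:b, data = df)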

15
Q

which of the following does not give full model results: *, :, +, or + and *?

A

: (on its own, it includes only the interaction term, not the main effects)

16
Q

what is scale = FALSE used to do?

A

mean-centre a variable (centre without standardising)
scale(variable, scale = FALSE)
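E.g. a sketch, assuming a hypothetical continuous variable df$age:
df$age_c <- as.numeric(scale(df$age, scale = FALSE))   # centred but not standardised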

17
Q

probe_interaction()

A

runs a simple slopes analysis (interactions package)
only works for categorical * continuous and continuous * continuous interactions

18
Q

cat_plot()

A

visualises categorical interactions (interactions package)

19
Q

geom_smooth(method = "loess")

A

adds a loess (locally weighted smoothing) line to a plot
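A minimal sketch, assuming ggplot2 and a hypothetical data frame df with columns x and y:
library(ggplot2)
ggplot(df, aes(x = x, y = y)) + geom_point() + geom_smooth(method = "loess")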

20
Q

crPlots()

A

component-plus-residual plots (car package)
used for models with multiple predictors
also known as partial residual plots

21
Q

hist()

A

plots a histogram
e.g. the frequency distribution of the model residuals
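E.g. a sketch, assuming a fitted model object mod:
hist(resid(mod))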

22
Q

residualPlot()

A

plots the residuals against the fitted (predicted) values (car package)

23
Q

rstudent()

A

studentised residuals: each residual is scaled using a model fitted with that case excluded (used in outlier diagnostics)

24
Q

rstandard()

A

standardised residuals: scaled with the case included in the model (used in outlier diagnostics)

25
Q

hatvalues()

A

hat values - assesses leverage

26
Q

cooks.distance()

A

the average distance the predicted y values will move if a given case is removed

27
Q

influence.measures()

A

gives DFFit, DFbeta and DFbetas values (influence measures)

28
Q

covratio()

A

gives COVRATIO values - a case's influence on the standard errors

29
Q

vif()

A

quantifies the extent to which standard errors are increased by predictor correlations
gives a VIF value for each predictor

30
Q

anova()

A

applies the F-test for model comparison - evaluates the statistical significance of the improvement in variance explained when further predictors are added (incremental F test)

31
Q

AIC() / BIC()

A

compares specific models by comparing their values; choose the model with the smaller value

32
Q

contr.sum()

A

changes the contrast scheme from the default (dummy coding) to sum-to-zero coding

33
Q

plot()

A

plots the model (for an lm object, produces diagnostic plots)

34
Q

levels()

A

provides the levels of a factor in the order they are stored in

35
Q

contrast()

A

tests the effects (contrasts) you have specified

36
Q

pairs()

A

pairwise comparisons - compares each level of a given predictor with every other level
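E.g. a sketch, assuming the emmeans package and a fitted model mod with a factor group (hypothetical names):
pairs(emmeans(mod, ~ group))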
37
Q

adjust =

A

adjusts the p-value and compares the adjusted p-value to the original alpha

38
Q

Boot()

A

takes the fitted model
f = which bootstrap statistic to compute on each bootstrap sample (default is f = coef, returning the regression coefficients)
R = how many bootstrap samples to compute
ncores = number of cores used to perform the calculations in parallel (default ncores = 1)
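A minimal sketch, assuming the car package and a fitted lm object mod (hypothetical name):
library(car)
boot_mod <- Boot(mod, f = coef, R = 1000)
summary(boot_mod)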
39
Q

glm()

A

fits a generalised linear model

40
Q

family =

A

in glm(), specifies the family of probability distribution you want for the DV (i.e. what type of variable it is)

41
Q

family = binomial

A

used for a binary outcome variable
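A minimal sketch, assuming a hypothetical data frame df with a 0/1 outcome and a predictor x:
mod <- glm(outcome ~ x, family = binomial, data = df)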
42
Q

exp()

A

exponentiates the coefficients (converts log-odds into odds ratios)
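E.g. a sketch, assuming a fitted glm object mod:
exp(coef(mod))      # odds ratios
exp(confint(mod))   # confidence intervals on the odds-ratio scale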
43
Q

test = "Chisq"

A

in anova(), this performs a likelihood ratio test
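E.g. a sketch, assuming two nested glm objects mod1 and mod2 (hypothetical names):
anova(mod1, mod2, test = "Chisq")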
44
Q

pwr.t.test()

A

power calculation for a t-test
can be directional (alternative = "less" or "greater") or two-sided (alternative = "two.sided")
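A minimal sketch, assuming the pwr package and an assumed effect size of d = 0.5, solving for the required n:
library(pwr)
pwr.t.test(d = 0.5, sig.level = 0.05, power = 0.8, alternative = "two.sided")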
45
Q

pwr.r.test()

A

power calculation for correlations

46
Q

pwr.f2.test()

A

power calculation for linear models

47
Q

code for the coefficients of a model

A

coefficients(model), coef(model), model$coefficients or model$coef

48
Q

how do you treat data as categorical?

A

factor()

49
Q

how do you treat data as continuous?

A

as.numeric()

50
Q

plotMod$simplesslopes

A

provides the simple slopes data and a Johnson-Neyman plot

51
Q

plotMod$interactplot

A

provides a simple slopes plot for categorical * continuous or continuous * continuous interactions

52
Q

how do you set the control group as the reference level?

A

data$group <- relevel(data$group, 'control')

53
Q

geom_line()

A

connects the points belonging to the same group (colour) on a cat_plot

54
Q

predict()

A

can be used to get the predicted values of y from a model object

55
Q

how do you get the residuals from a model?

A

model$residuals, resid(model), residuals(model), or the observed y values minus predict(model)

56
Q

group_by()

A

groups the data by the grouping variables you want to summarise by (dplyr)
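A minimal sketch, assuming dplyr and a hypothetical data frame df with columns group and score:
library(dplyr)
df %>% group_by(group) %>% summarise(mean_score = mean(score))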