Chapter 11 Flashcards
This allows for an assessment of the influence of a change in X on a change in Y once the effects of other
factors (e.g., A, B, and C) are considered.
MULTIVARIABLE ANALYSIS
Statistical models that have one outcome variable, but more than one independent variable.
MULTIVARIABLE MODELS
Multivariable analysis helps us to understand the relative importance of different independent variables for explaining the variation in a dependent (_______) variable (y) when they act alone and when they work together (_______).
OUTCOME, INTERACTION
For example, in the first two decades of life, age and height predict body weight, but age and height are usually correlated. During the growth years, height and weight increase with age, so age can be considered the ___ ___ ___ and height can be viewed as an ___ ___ influencing weight.
UNDERLYING EXPLANATORY VARIABLE, INTERVENING VARIABLE
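A minimal sketch of this kind of multivariable model, using simulated data and the hypothetical variable names age, height, and weight (statsmodels assumed; not part of the original card):

import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Simulated growth-years data: height is correlated with age, and both relate to weight
rng = np.random.default_rng(0)
n = 200
age = rng.uniform(2, 20, n)                                   # years
height = 80 + 5 * age + rng.normal(0, 5, n)                   # cm, correlated with age
weight = 2 + 0.5 * age + 0.3 * height + rng.normal(0, 3, n)   # kg

data = pd.DataFrame({"age": age, "height": height, "weight": weight})
model = smf.ols("weight ~ age + height", data=data).fit()
print(model.params)   # each coefficient shows a predictor's effect adjusted for the other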
Sometimes randomization is impossible, however, or important factors may not be adequately controlled by randomization alone. One way to remove the effects of these
unwanted factors is to control for them by using ___ ___ ___.
MULTIVARIABLE STATISTICAL ANALYSIS
Several important assumptions underlie most multivariable methods commonly used in medical research. Most methods of regression analysis assume that the average value of
the outcome can be represented in terms of a __ __ __ __ __ (__ __ __).
LINEAR FUNCTION OF THE COVARIATES (ASSUMPTION OF LINEARITY)
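A compact statement of the linearity assumption in standard regression notation (the beta symbols are conventional, not from the original card):

E[\, y \mid x_1, \ldots, x_k \,] = \beta_0 + \beta_1 x_1 + \beta_2 x_2 + \cdots + \beta_k x_k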
The effects of independent variables are assumed to be ____ (__ __ __), and if not, testing of interaction is warranted. This involves entering a term in a multivariable equation that represents the interaction between two of the independent variables.
INDEPENDENT (ASSUMPTION OF INDEPENDENCE)
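A minimal sketch of testing an interaction by adding a product term to the multivariable equation (simulated data, hypothetical names x1 and x2; statsmodels assumed):

import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
n = 300
x1 = rng.normal(size=n)
x2 = rng.normal(size=n)
y = 1 + 0.8 * x1 + 0.5 * x2 + 0.4 * x1 * x2 + rng.normal(0, 1, n)

data = pd.DataFrame({"y": y, "x1": x1, "x2": x2})
# "x1:x2" enters the interaction term alongside the two main effects
model = smf.ols("y ~ x1 + x2 + x1:x2", data=data).fit()
print(model.pvalues["x1:x2"])  # a small p-value suggests the two effects are not independent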
The assumption of
___ refers to homogeneity of variance across all levels of the independent variables. In other words, it is assumed that the variance of the error term is ____ across the range of values
for a given variable in the equation.
HOMOSCEDASTICITY, CONSTANT
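A minimal sketch of checking this assumption with the Breusch-Pagan test, whose null hypothesis is constant error variance (simulated data; statsmodels assumed):

import numpy as np
import statsmodels.api as sm
from statsmodels.stats.diagnostic import het_breuschpagan

rng = np.random.default_rng(2)
x = rng.uniform(0, 10, 200)
y = 3 + 2 * x + rng.normal(0, 1 + 0.5 * x)   # error spread grows with x (heteroscedastic)
X = sm.add_constant(x)
model = sm.OLS(y, X).fit()

lm_stat, lm_pvalue, f_stat, f_pvalue = het_breuschpagan(model.resid, model.model.exog)
print(lm_pvalue)  # a small p-value suggests the constant-variance assumption is violated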
Presence or absence of other disease.
COMORBIDITY
A weighting factor measuring a variable's relative
importance in predicting prognosis.
COEFFICIENT
Because the error term is unknown at the outset,
the statistical analysis tries various values for the coefficients and the regression constant,
combining them with the observed x values to predict the value of y; this predicted value is called ___.
Y-HAT (ŷ)
a is the starting point, usually called the __ __.
REGRESSION CONSTANT
In multivariable analysis, the error term e is often called a ___.
RESIDUAL
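Putting the last three cards together, the fitted multivariable equation and its residual can be written in standard notation as:

\hat{y} = a + b_1 x_1 + b_2 x_2 + \cdots + b_k x_k, \qquad e = y - \hat{y}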
The values of a and the several b's that, taken together, give the smallest value for the sum of the squared error terms are the best estimates
that can be obtained from the set of data. Appropriately enough, this approach is called the ___ ___, because the process is stopped when the sum of squares of
the error term is least.
LEAST-SQUARES SOLUTION
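A minimal sketch of obtaining the least-squares solution directly (simulated data; numpy assumed; the column of ones estimates the regression constant a):

import numpy as np

rng = np.random.default_rng(3)
n = 100
X = rng.normal(size=(n, 2))                               # two independent variables
y = 1.5 + 2.0 * X[:, 0] - 0.7 * X[:, 1] + rng.normal(0, 0.5, n)

A = np.column_stack([np.ones(n), X])                      # design matrix: constant plus covariates
coef, rss, rank, sv = np.linalg.lstsq(A, y, rcond=None)
print(coef)   # [a, b1, b2] chosen to minimize sum((y - y_hat)**2)
print(rss)    # the residual sum of squares at that minimum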
ANOVA
ANALYSIS OF VARIANCE