Exam 1 Flashcards

1
Q

What are the Oxford Centre for Evidence-Based Medicine levels of evidence?

A
  • Level 1: systematic review of RCTs
  • Level 2: Randomized trial or observational study with dramatic effect
  • Level 3: Non-randomized controlled cohort/follow-up study
  • Level 4: Case-series, case-control studies, or historically controlled studies
  • Level 5: Mechanism-based reasoning (formerly known as expert opinion)
2
Q

What does critical analysis of research reports do?

A
  • Determines validity of the report
  • Determines applicability for clinical decisions

3
Q

What do the guidelines for reporting of studies do?

A
  • CONSORT statement
  • Enables reader to better assess validity of the results
  • Many others (Ex. STROBE)
4
Q

Describe evaluating research reports

A
  • Critical analysis of research report
  • Guidelines for reporting of studies
  • Success of evidence-based practice dependent on incorporating research findings into clinical decision making
5
Q

How do you judge the quality of a journal?

A
  • When evaluating scientific merit of an article, consider journal’s reputation
  • Peer-reviewed/refereed journals
    • Content experts
    • Accepted based on recommendation of reviewers
    • Processes ensure that articles meet standards (importance of study, originality, methods, interpretations/conclusions)
6
Q

What questions should be asked when evaluating components of a study?

A
  • What is the study’s intent?
  • Is the study sound in its methodology?
  • Are results meaningful?
  • Can the results be applied to my patient?
7
Q

What is the question ‘what is the study’s intent’ looking at?

A

The problem under investigation

8
Q

What should be considered when asking ‘is the study sound in its methodology’?

A
  • If not, results may not be valid
  • Details of subjects (how selected/inclusion/exclusion criteria)
  • Random assignment? Blinding?
  • Reliable and valid measures?
  • Equal treatment of groups (apart from intervention)?
9
Q

What should be considered when asking ‘are results meaningful’?

A
  • Was there an effect of the intervention?
  • Clinically significant and statistically significant

10
Q

What should be considered when asking ‘can the results be applied to my patient’?

A
  • Depends if your patient is similar to the patients studied
  • Is treatment feasible in my clinic?
  • Is treatment feasible for my patients based on their preferences?
11
Q

What are the characteristics of clinical research?

A
  • Structured and systematic
  • Objective process
  • Examines clinical conditions and outcomes
  • Establishes relationships among clinical phenomena (ex. how strength affects balance)
  • Provides evidence for clinical decision making
  • Provides impetus for improving practice
12
Q

What are examples of clinical phenomena?

A

Manual muscle testing, ROM, propensity for falls/balance, balance confidence
*Things we can document and keep track of over time

13
Q

How did clinical research shift in the 20th century and what influenced the shift?

A
  • Shifting research priorities influenced the change
  • Focus on outcomes research to document effectiveness
  • Application of models of health and disability
  • Attention to evidence-based practice (EBP)
14
Q

What were rehabilitation outcomes traditionally related to?

A

*Improvements in pathologies or impairments

15
Q

What do outcomes include now?

A
  • WHO definition of health to include physical, social, and psychological well-being
    • consider patient satisfaction, self-assessment of functional capacity, quality of life (QOL)
    • Now clinicians must document outcomes to substantiate effectiveness of treatment
16
Q

What does outcomes research do?

A
  • How successful are our interventions in clinical practice specifically in terms of disability and survival
  • Studies use large databases including info not only on functional outcomes, but also on utilization of services, insurance coverage etc.
  • Measure the effectiveness of treatment in terms of patient satisfaction and outcomes as well as in terms of revenue/costs; staff productivity
  • Questionnaires are often used to measure outcomes in terms of function and health status
  • Health status scales (ex. instruments such as the Medical Outcomes Study Short-Form 36) reflect physical function, mental function, social function, and other domains
17
Q

What are the models in research and what do they focus on?

A
  • Biomedical Model
    • Focuses on relationship b/w pathology and impairments
    • Physical aspects of health
    • No consideration for how patient is affected by illness
  • Disablement Model
    • Pathology, impairment, functional limitation, disability
18
Q

In the Nagi disablement model, describe pathology, impairment, functional limitation, and disability

A
  • Pathology- interference with normal bodily processes or structures
  • Impairment- anatomical, physiological, or psychological abnormalities
  • Functional limitation- inability to perform an activity in a normal manner
  • Disability- limitation in performance of activities within socially defined roles
19
Q

What does the ICF model do?

A
  • Describes how people live with their health condition
  • Includes references to environmental and personal factors affecting function
    - Contextual factors
  • Has parallels to Nagi model
20
Q

What are the parallels between the ICF and Nagi models?

A
  • Health condition: pathology
  • Body function/structure: impairments
  • Activity: functional limitation
  • Participation: disability
21
Q

Describe ICF outcomes

A
  • Outcomes may be related to (targeted to) the impairment level
    • BUT must also establish functional outcomes that influence performance at the activity or at the participation levels
      - Ex. increasing strength and balance will allow the person to ambulate in the community and socialize with friends (activity level and participation level)
22
Q

What does evidence based practice (EBP) do?

A

*Provision of quality care depends on ability to make choices that have been confirmed by sound scientific data, and that decisions are based on best evidence currently available

23
Q

How does EBP begin?

A

*It begins by asking a relevant clinical question related to patient diagnosis, prognosis, intervention, validity of clinical guidelines, or safety or cost-effectiveness of care

24
Q

PICO frames a good clinical question; what does it stand for?

A
P = patients/population
I = Intervention
C = comparison/control
O = outcome of interest
25
In a patient 2 weeks post hip replacement, is active exercise more effective than passive ROM exercise for improving hip ROM? What represents PICO in this inquiry?
```
P = patient 2 wks post hip replacement
I = active exercise
C = passive ROM exercise
O = improving hip ROM
```
26
Describe what PICO does
  • Question is a precursor to searching for the best evidence to facilitate optimal decision making about a patient's care
  • Terms in PICO can be used as search terms in a literature search for best evidence
  • Clinicians search and access literature
  • Critically appraise studies for validity
  • Determine if research applies to their patient
27
What are the components of EBP for clinical decision making?
  • Clinical expertise
  • Best research evidence
  • Clinical circumstances and setting
  • Patient values and preferences
28
What are some sources of knowledge for clinical decisions and to guide clinical research?
  • Tradition (always done this way)
  • Authority (expert opinion)
  • Trial and error (try something and if it fails try something else)
  • Logical reasoning
  • Scientific method
29
What is logical reasoning?
  • A method of knowing which combines
    - Experience
    - Intellect
    - Thought
  • Systematic process to answer questions and acquire new knowledge
  • 2 types of reasoning
    - Deductive
    - Inductive
30
Describe Deductive reasoning
  • Acceptance of a general proposition and the inferences that can be drawn in specific cases
  • General observation - specific conclusion
  • Ex. poor balance results in falls; exercise improves balance; therefore exercise will reduce risk of falls
31
Describe Inductive reasoning
  • Specific observation - general conclusion
  • Ex. Patients who exercise don't fall; patients who don't exercise fall more often; therefore exercise is associated with improved balance (and fewer falls)
32
What kind of reasoning is used in the introduction section of a research manuscript?
*Deductive logic is used when developing research hypotheses from existing general knowledge
33
What kind of reasoning is used in the discussion section of a research manuscript?
*Inductive reasoning is used when researchers propose generalizations and conclusions from data in a study
34
What is the scientific method?
  • Rigorous process used to acquire new knowledge
  • Based on 2 assumptions
    - Nature is orderly/regular and events are consistent and predictable
    - Events/conditions are not random and have causes that can be discovered
35
What is the scientific approach defined as?
*Systematic, empirical, controlled and critical examination of hypothetical propositions about the associations among natural phenomena
36
What does the systematic nature of research imply?
  • Implies a sense of order to ensure reliability
    - Logical sequence to identify a problem, collect and analyze data, interpret findings
37
In the scientific method, what are the empirical, control, and critical examination?
  • Empirical: refers to direct observation to document data objectively
  • Control: control of extraneous factors
  • Critical examination: scrutiny of your findings by other researchers
38
How do you classify research?
*Can classify research based on a number of schema according to purposes and objectives
39
What are qualitative and quantitative research?
  • Quantitative: measurement under standardized conditions- can conduct statistical analysis
  • Qualitative: understanding through narrative description; less structured- interviews
40
What are basic and applied research?
  • Basic: 'bench research'- not practical immediately, may be useful later in developing treatments
  • Applied: solving immediate practical problems- most clinical research
41
What is translational research?
  • Translational:
    - Scientific findings are applied to clinical issues
    - Also generating scientific questions based on clinical issues
    - 'bedside to bench and back to bedside'
  • Collaboration among basic scientists and clinicians
42
What is experimental research?
  • Experimental: researcher manipulates one or more variables and observes
    - Major purpose is to compare conditions or intervention groups to suggest cause and effect relationships
    - RCT is the gold standard of experimental designs
    - Quasi-experimental: limited control but can get interpretable results
43
What is non-experimental research?
*Non-experimental: investigations that are descriptive or exploratory in nature
44
What is exploratory research?
  • Exploratory: examine a phenomenon of interest including its relationship to other factors
    - In epidemiology researchers examine associations to predict risk for disease by conducting cohort and case-control studies
    - Methodological studies use correlational methods to examine reliability and validity of measuring instruments
    - Historical studies reconstruct the past on the basis of archives and other records to suggest relationships of historical interest to a discipline
45
What is descriptive research?
  • Descriptive: describe individuals to document their characteristics, behaviors and conditions
  • Several designs
46
What are the several designs in descriptive research?
  • Descriptive surveys: use questionnaires, interviews
  • Developmental research: patterns of growth and change over time in a segment of the population, natural history of a disease
  • Normative studies: to establish normal values for diagnosis and treatment
  • Qualitative research: interview and observation to characterize human experiences
  • Case study or case series
47
How do you collect data?
  • Collect data based on subject's performance on a defined protocol
  • Surveys
  • Questionnaires
  • Secondary analysis of large databases: use data collected for another purpose to explore relationships
48
What is the research process and what are the 5 major steps?
  • Logical framework for a study's design
  • 5 major steps:
    - Identify the research question
    - Design the study
    - Methods
    - Data analysis
    - Communication
49
Why were Theories created?
*Created because we need to organize and give meaning to complex facts and observations
50
What do theories entail?
*Interrelated concepts, definitions or propositions that specify relationships among variables and represent a systematic view of specific phenomena
51
What does scientific theory deal with?
*Scientific theory deals with empirical observation and requires constant verification
52
Why do we use theory?
  • Use theory to generalize beyond specific situations and to make predictions about what we expect to happen
    - Provide framework for interpretation of observations
    - Giving meaning to research findings and observations
  • Stimulate development of new knowledge
    - Theoretical premise to generate new hypotheses which can be tested
53
What are the components of theories?
  • Concepts: building blocks of a theory
    - Allow us to classify empirical observations
    - We 'label' behaviors, objects, processes that allow us to identify them and refer to/discuss them
  • Concepts can be non-observable
    - Known as constructs
    - Constructs are abstract variables (Ex. intelligence)
54
What are propositions?
*Once concepts are identified they are formed into a generalization or proposition
55
What do propositions do and what are the kinds?
  • They state relationships btwn variables
  • Hierarchical proposition (Maslow's needs)
  • Temporal proposition (stages of behavioral change)
56
What are models and why do we use them?
  • Models are symbolic representations of the elements in a system
  • Can represent processes
    - Ex. ICF, Nagi models
  • Concepts can be highly complex so use models to simplify them
    - Ex. double helix model in genetics
57
How do you develop theories?
  • By inductive or deductive processes
  • Most formulated using both processes
    - Observations initiate theory and then hypotheses tested
58
What are inductive theories?
  • Data based
  • Begin with empirically verifiable observations
  • Multiple studies and observations (patterns emerge)
  • Patterns develop into a systematic conceptual framework which forms basis for generalizations
59
What are deductive theories?
  • Intuitive approach
  • Hypothetical deductive theory is developed with few or no observations
    - Not developed from existing facts, must be tested constantly
60
Do you test theories?
  • Theories are not testable
  • Test hypotheses that are deduced from theories
    - If hypothesis is supported then theory from which it was deduced is also supported
61
What is theory a foundation for?
*It's a foundation for understanding research findings
62
Describe the importance of authors in research
  • Results of studies must be explained and interpreted by authors within the realm of theory
  • Authors must help readers understand context within which results can be understood
    - Researchers should offer interpretation of findings
    - Contribute to the growth of knowledge
63
What constitutes researcher integrity?
  • Relevant research question
  • Meaningful research
  • Competent investigators
  • Personal bias in measurement
  • Misconduct
    - Falsification of data
    - Manipulation of statistics
  • Publish findings
  • Authorship
64
What are the 3 principles that protect human rights in research?
  • Autonomy
  • Beneficence
  • Justice
65
What is autonomy?
  • Self determination and capacity of individuals to make decisions
    - Authorized decision maker is available to make decisions for subjects such as those unable to understand
66
What is beneficence?
  • Attend to well-being of individuals
    - Risk/benefit
67
What is justice?
  • Fairness in research process
    - Selection of subjects appropriate for a given study
    - Also in randomization process
68
What are some regulations for conduct of research with humans?
  • Nuremberg Code of 1949
  • Declaration of Helsinki 1964
  • National Research Act 1974, 1976
  • Belmont Report 1979
  • HIPAA 1996
69
What does the IRB do?
  • Reviews research proposals
    - Scientific merit
    - Competence of investigators
    - Risk to subjects
    - Feasibility of the study
  • Risk-Benefit ratio
    - Risks to subjects are minimized and outweighed by potential benefits of the study
70
Compare Full vs. Expedited vs. Exempt review
  • Full review: high risk studies
  • Expedited: low risk
  • Exempt: surveys, interviews, studies of existing records provided data are collected in such a way that subjects can't be identified
71
What are the information elements in informed consent?
  • Information elements
    - Subjects fully informed
    - Subject info confidential and anonymous
    - Consent form in lay language
    - Researcher answers questions at any time
72
What are the consent elements in informed consent?
  • Consent elements
    - Consent must be voluntary
    - Special consideration for vulnerable subjects
    - Ex. mental illness (legal guardian consents)
    - Free to withdraw at any time
73
Why should you use measurements in research?
  • Clinical decision making
  • Compare
  • Draw conclusions
  • Process governed by rules
74
What is a numeral measurement?
  • Symbol or label (ex. 1=older adults)
  • Numeral becomes a number when it represents a quantity (ex. 25 kg grip strength)
75
What do variables do in research?
  • They differentiate objects or individuals
    - Take on different values either quantitatively (ex. height in inches) or qualitatively (ex. gender, fall status)
76
What are the different types of variables?
  • Continuous variable
    - Quantitative variable that can take on any value along a continuum
    - Limited by precision (exactness) of the instrument
    - Ex. gait speed (TUG)
  • Discrete variables
    - Described in whole units
    - Ex. HR
    - Qualitative variables represent discrete categories (ex. faller/non-faller)
    - Dichotomous variables are qualitative variables with 2 values
77
What is Precision?
  • How exact is a measure
  • It indicates the number of decimal places
  • Relates to the sensitivity of the measuring instrument
  • Also depends on the variable
    - Ex. HR measured in whole numbers
    - Ex. strength measured to half of a kg
78
What are examples of variables measured directly?
* ROM * Height * Distance walked in the 6MWT
79
What are examples of variables measured indirectly (not directly observable)?
* Temperature * Balance * Strength of a muscle * Health/disability
80
What are the 4 scales of measurement and why do we need them?
  • They're hierarchical, based on relative precision of assigned values (nominal up to ratio)
  • Nominal
  • Ordinal
  • Interval
  • Ratio
81
What is the Nominal scale?
  • Numbers are labels for identification (Ex. 0=male)
  • Categories are mutually exclusive
  • Mathematically can count number in each category (proportions, frequencies, etc.)
82
What is the Ordinal scale?
  • Numbers indicate rank order (ex. MMT scale)
  • Lack of arithmetical properties
    - Ordinal values represent relative position only
    - Ex. MMT of 4 is not 2x greater than 2 in terms of strength
  • Appropriate for descriptive analysis only (Ex. average rank of a group of subjects)
83
What is the Interval Scale?
  • Has rank order characteristics of ordinal scale
    - Equal distances/intervals btwn units (temp, calendar)
    - No true/natural zero
    - So cannot say 20 dg is twice as hot as 10 dg - ex. no ratios "allowed"
    - Can quantify the difference between interval scale values (ex. from 20-25 means something)
84
What is the ratio scale?
  • Interval scale with zero point that has empirical meaning
  • Equal distances/intervals btwn units
  • Zero means absence of what is being measured; start measuring at zero
  • No negative values
  • Ex. height, weight, strength, age
  • Represent actual amounts being measured (can say age of 20 yrs is twice as much as 10 yrs)
85
What is reliability?
  • It's the extent to which a measurement is consistent and free from error
  • Conceptualized as reproducibility, dependability, agreement
  • What must be reliable: the patient, examiner, and instrument
86
What happens without reliability?
  • No confidence in the data
  • Can't draw conclusions from the data
87
Why are measurements rarely 100% reliable?
  • Human element/error
  • Instruments
88
What is the observed score?
*True score +/- error
89
What is systematic error?
  • Predictable errors
  • Constant and biased
  • Unidirectional
  • Consistent, so not a problem for reliability
  • Can correct your readings (Ex. subtract a constant amount from each reading you take)
  • Are a problem for validity as measurements do not truly represent quantity being measured
90
What is random error?
  • Chance
  • Unpredictable
  • Due to mistakes, instrument inaccuracy, fatigue, etc.
  • Assume that if you take a lot of measurements then random errors will eventually cancel each other out and the average of a lot of scores will be a good estimate of the true score
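The observed-score model (observed = true score + systematic error + random error) and the averaging idea can be sketched in a short Python simulation. All numbers here are made up for illustration: a hypothetical true grip strength of 50 kg and a gauge assumed to read a constant 2 kg high.

```python
import random
import statistics

random.seed(1)  # reproducible simulation

TRUE_SCORE = 50.0  # hypothetical true grip strength in kg (assumption)
BIAS = 2.0         # systematic error: gauge assumed to read 2 kg high

# Observed score = true score + systematic error + random error
readings = [TRUE_SCORE + BIAS + random.gauss(0, 1.5) for _ in range(1000)]

mean_reading = statistics.mean(readings)
print(round(mean_reading, 1))  # random error averages out, leaving true score + bias

# A known systematic error can be corrected by subtracting the constant
corrected = mean_reading - BIAS
print(round(corrected, 1))
```

Note that averaging removes only the random component; the systematic 2 kg bias survives any number of repetitions and must be corrected separately, which is why systematic error threatens validity rather than reliability.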
91
What are the sources of measurement error?
  • Individual (rater/tester)
  • Measuring instrument
  • Variability of the characteristic (response variable) being measured
    - Ex. blood pressure, HR, balance confidence
  • Environment (noise and temp), subject's motivation and fatigue
92
How are sources of error minimized?
  • They're minimized by training, equipment inspection, maintenance, etc.
  • We assume these factors are random and their effect cancels out over time
93
What are reliability coefficients?
  • Range from 0.00 (no reliability) to 1.00 (perfect reliability)
  • <0.50 = poor reliability
  • 0.50-0.75 = moderate
  • >0.75 = good
  • (These are guidelines only)
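The guideline bands above can be encoded in a tiny Python helper for quick interpretation; the function name is made up, and the cutoffs follow the flashcard's rules of thumb only, not a strict standard.

```python
def interpret_reliability(coef: float) -> str:
    """Map a reliability coefficient (0.00-1.00) to a guideline category.

    Cutoffs are the flashcard's rules of thumb: <0.50 poor,
    0.50-0.75 moderate, >0.75 good.
    """
    if not 0.0 <= coef <= 1.0:
        raise ValueError("reliability coefficients range from 0.00 to 1.00")
    if coef > 0.75:
        return "good"
    if coef >= 0.50:
        return "moderate"
    return "poor"

print(interpret_reliability(0.82))
```

For example, `interpret_reliability(0.82)` returns "good" and `interpret_reliability(0.42)` returns "poor".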
94
What is the relationship between correlation and agreement?
*Perfect correlation does not always mean agreement; two measures can correlate perfectly yet systematically disagree, which means poor reliability
95
What is test-retest reliability?
  • To establish that an instrument is capable of measuring with consistency
  • Individuals given same test on 2 occasions under identical conditions
  • Raters not involved
  • Ex. self-report surveys of balance confidence (ABC scale); measures of mechanical or digital readouts
96
What is intrarater reliability?
*stability of data recorded by 1 person on 2 or more trials
97
What is interrater reliability?
  • Variation/agreement btwn 2 or more raters who measure same group of subject(s)
    - Preferably at exact same time
    - They don't discuss results with each other until done recording data
98
What is the classification of the ICC?
  • Intraclass correlation coefficient
  • Designated as ICC (1,1), ICC (2,1), or ICC (3,1)
  • 1st number is the model and 2nd number is the form
99
What is alternate forms reliability?
  • Reliability of 2 equivalent forms of a measuring instrument
  • Sometimes referred to as parallel forms
  • Ex. GRE or SAT- given a number of times a year but in different forms, need to establish reliability
  • Correlation coefficients have been used to evaluate them
100
What is alternate forms reliability based on?
  • Based on administering 2 alternate forms of the test to a single group at a single session
    - Correlating paired observations
  • Usually used in educational/psych testing
  • Clinical examples: alternative forms (parallel forms) of strength evaluations and/or gait evaluation
101
What can determination of limits of agreement estimate?
*Can estimate range of error expected when using 2 different forms of an instrument
102
What is internal consistency (or homogeneity)?
  • Items in an instrument (ex. scale) measure various aspects of the same characteristic and nothing else
  • Physical functioning scale should have items only relating to physical functioning
  • No items relating to psychological characteristics
  • If psychological characteristics were included then items are not homogeneous
  • Can be assessed conducting an item-to-total correlation
103
What is Cronbach's coefficient alpha and how is it assessed?
  • Statistic most often used to assess internal consistency
  • Evaluates items in a scale to determine if they measure the same construct
  • Can be used to determine which items could be removed from a scale to improve scale homogeneity
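The usual formula is α = k/(k−1) × (1 − Σ item variances / total-score variance), where k is the number of items. A small Python sketch with hypothetical 4-item questionnaire responses (standard library only; the data are made up):

```python
import statistics

# Hypothetical responses: each row is one respondent's scores on a 4-item scale
scores = [
    [4, 5, 4, 4],
    [2, 2, 3, 2],
    [5, 4, 5, 5],
    [3, 3, 3, 4],
    [1, 2, 2, 1],
]

k = len(scores[0])                     # number of items
items = list(zip(*scores))             # regroup columns: one tuple per item
item_vars = [statistics.variance(col) for col in items]
total_var = statistics.variance([sum(row) for row in scores])

# Cronbach's alpha: k/(k-1) * (1 - sum of item variances / total-score variance)
alpha = (k / (k - 1)) * (1 - sum(item_vars) / total_var)
print(round(alpha, 2))
```

For this made-up data the items track each other closely, so alpha comes out around 0.96; recomputing after dropping an item is one way to see whether that item hurts scale homogeneity.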
104
What is measurement validity?
  • Instrument or test measures what it's intended to measure
    - Ex. hand held dynamometer is a valid instrument for measuring strength because we can assess strength from pounds force
    - MSL is a valid measure of balance
  • Invalid tests may be reliable
105
What is validity?
  • Measurement is relatively free from error
    - Ex. valid tests are reliable
    - Low reliability is automatic evidence of low validity
106
What is specificity of validity?
  • An instrument or test is usually valid for a given purpose, situation, or population
  • Validity is not a universal characteristic of an instrument
    - Ex. assessing disability in pts with PD with instrument A does not mean that instrument A is valid for assessing disability in people with SCI
    - MMT isn't a valid measure of strength in some patient populations
107
What are the types of measurement validity?
  • Face validity
  • Content validity
  • Criterion-related validity
    - Concurrent or predictive
  • Construct validity
108
What is face validity?
  • Instrument appears to test what it's supposed to (a plausible method)
  • Weakest form of validity
    - Difficult to quantify how much face validity an instrument has (no standards to judge it)
    - Subjective assessment
  • Often established through direct observation
    - Ex. instruments that measure ROM, strength, gait
109
What is content validity?
  • Variables have a universe of content
    - Characteristics and info that are observable about that variable
  • Established if an instrument covers all parts of the universe of content
    - Reflects the relative importance of each part
    - Important in questionnaires, exams, interviews
110
What must content validity not include?
  • Test must not include factors irrelevant to the purpose of measurement
    - Ex. test of motor performance should not contain items assessing psychological function
111
What does content validity imply?
  • Implies test contains all the elements that reflect variable being studied
    - Ex. visual analog scale reflects only one element of pain (intensity)
    - McGill Pain Questionnaire has greater content validity as it assesses many elements of pain such as location, intensity, duration, etc.
112
What kind of process is content validity?
  • Subjective process to establish content validity
    - No statistics available
    - Experts determine if the questions cover the content domain
    - When experts agree that content is adequately sampled, content validity is established
113
What is Criterion-related validity?
  • Most practical and objective approach to testing validity
  • Test to be validated is the target test
  • This test is compared to the "gold standard" or criterion measure that is already valid
  • Correlations are calculated
  • High correlations imply the target test is a valid predictor of performance on the criterion test
    - Can be used as a substitute for the established test
114
What is concurrent validity?
  • Target and criterion measure are assessed at approximately the same time
    - Ex. MSL is a valid measure of balance
  • Useful to establish concurrent validity when the target test may be more efficient, easier to administer, more practical, or safer than the gold standard and can be used instead of the gold standard
    - Ex. sensory screening
115
What is predictive validity?
  • A measure will be a valid predictor of some future criterion score
    - Ex. college admission criteria as a predictor of future success
    - Ex. TUG as a measure of future risk for falls
116
What is construct validity?
  • Ability of an instrument to measure an abstract concept (construct)
    - Ex. health, disability, confidence
  • Constructs are not observable but exist as concepts to represent an abstract trait
  • Constructs are usually multi-dimensional
    - Ex. how do you measure "health" or "function"?
117
When is there support of construct validity?
*When a test can discriminate btwn individuals who are known to have a condition and those who do not
118
What is the known groups method in construct validity?
*Construct validation is determined by the degree to which an instrument can demonstrate different scores for groups known to vary (Ex difficulty walking versus no difficulty walking)
119
How can construct validity of a test be evaluated?
  • It can be evaluated in terms of how it relates to other tests of the same and different constructs
  • Determine what a test does and doesn't measure- convergence and discrimination
120
What is convergent validity?
  • 2 measures reflecting same phenomenon will correlate
  • Convergence not enough for construct validity
    - Must show that the construct can be differentiated from other constructs
121
What is discriminant validity?
*Measures that assess different characteristics will have low correlations
122
What do you start with when asking a research question?
  • Start with selecting a topic of interest
    - Pt population (ex. geriatrics, HD, CMT)
    - Intervention
    - Clinical assessments
123
What is the second step when asking a research question?
  • Identify a research problem
    - Broad statement to focus the direction of the study
    - What is known and what is not about the topic
    - Clinical experiences
    - Clinical theory (ex. motor control theory) - effects of practice/repetition
    - Literature review: initially a preliminary review (not extensive; an orientation to the issues)
    - Gaps (areas without info to make clinical decisions)
    - Conflicts/contradictory findings/flawed studies
    - Replication (to correct for design limitations)
124
What should be considered when reviewing the literature?
  • Initial review is preliminary
    - General understanding of the state of knowledge in the area of interest
  • Once research problem is formulated begin a full and extensive review
    - Provides complete understanding of background to assist in formulating the research question
  • Problem provides foundation for a specific research question answerable in our study
125
What are the components in shaping a question?
Question should be:
  • Important = results should be meaningful and useful
  • Answerable = questions involving judgements and philosophy are difficult to study
  • Feasible for study = researchers must have the necessary skill and resources. Are there sufficient subjects?
126
What is a target population?
  • Must be well defined so it is obvious who will be included in the study
    - Ex. community dwelling adults age 60 or more
127
What should be done for the development of a research rationale?
* Begin a full review of the literature once the research problem is delineated * A full review of the literature will establish the background for the research question - this clarifies the rationale for the study
128
What does research rationale present?
* Presents a logical argument that shows why and how a question was developed - Shows why question makes sense - Rationale provides a theoretical framework for the study - If no rationale for a study, difficult to interpret results
129
What are variables?
*Building blocks of research question
130
What is the independent variable?
* Predictor variable - Condition that ppl have - Intervention used - Characteristics of people that will predict or cause an outcome (young vs. old)
131
What is the dependent variable?
*The outcome variable; it varies depending on the independent variable and does not have levels like an independent variable does
132
What is a conceptual definition?
* Dictionary definition; general * Leg flexibility is the degree of motion in the leg * No info on what measure of flexibility/motion is used in a particular research study (ex. ROM or SLR)
133
What is an operational definition?
* Variable is defined according to its meaning within a particular study - Ex. definition of "balance impairment" in a study
134
What is the final step in delineating a research question?
* Clarifying the objective of the study | * Research objective must specifically and concisely delineate what a study is expected to accomplish
135
How might the objectives be presented?
* Hypotheses * Specific aims * Purpose of research * Research objectives - terms vary according to journals, researchers, disciplines
136
What are the 4 general types of research objectives?
* Descriptive- to characterize clinical phenomena or conditions in a population * Measurement properties of instruments - investigation of reliability and validity * Explore relationships- to determine interactions among clinical phenomena (ex. strength and balance study) * Comparisons- outlining cause and effect relationships using experimental model - evaluates group differences regarding effectiveness of treatment
137
What is the purpose of a study?
*To test the hypothesis and provide evidence to either accept or reject it
138
What is a hypothesis?
* In experimental and exploratory studies, the researcher proposes an educated guess about study outcomes - Statement called a hypothesis - statement that predicts the relationship between independent and dependent variables
139
How is a hypothesis made?
* Not proposed on the basis of speculation * Derived from theory * Suggested from previous research, clinical practice/experience, observation
140
What is a research hypothesis?
* States the researcher's true expectations of results * More often than not, research hypotheses propose a relationship in terms of a difference * Some research hypotheses predict no difference btwn variables
141
What is analysis of data based on?
* Based on testing the statistical (null) hypothesis | - always predicts no difference or no relationship
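Testing the statistical (null) hypothesis can be sketched with a simple permutation test. This is an illustrative example only: the treatment and control scores are hypothetical, and the permutation approach is one of several ways to test "no difference".

```python
# Sketch: testing the null hypothesis of "no difference" between groups
# with a permutation test. Group data are hypothetical.
import random

random.seed(0)

treatment = [12, 15, 14, 16, 13, 17]  # hypothetical outcome scores
control   = [10, 11, 12, 9, 13, 10]

def mean_diff(a, b):
    return sum(a) / len(a) - sum(b) / len(b)

observed = mean_diff(treatment, control)

# Under the null hypothesis, group labels are interchangeable: shuffle the
# pooled scores many times and count how often a difference at least as
# large as the observed one arises by chance alone.
pooled = treatment + control
count = 0
n_perm = 10_000
for _ in range(n_perm):
    random.shuffle(pooled)
    if abs(mean_diff(pooled[:6], pooled[6:])) >= abs(observed):
        count += 1

p_value = count / n_perm
print(observed, p_value)  # a small p-value -> reject the null of no difference
```

If the observed difference would rarely occur under random relabeling, the null hypothesis of no difference is rejected in favor of the research hypothesis.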
142
What is a non-directional hypothesis?
*Do not predict a direction of change
143
What is a directional hypothesis?
*Predict a direction
144
What is a simple hypothesis?
*Includes 1 IV and 1 DV
145
What is a complex hypothesis?
*More than 1 IV or DV
146
What is important in reviewing the literature?
* Initial review is preliminary - General understanding of the state of knowledge in the area of interest * Once research problem is formulated begin a full and extensive review - Provides a complete understanding of background to assist in formulating the research question
147
What should be in the scope of the literature review?
* Depends on your familiarity with the topic * How much has been done in the area and how many relevant references there are * review info on: pt population, methods, equipment used, operational definition of variables, statistical techniques used
148
What is the difference between primary and secondary sources?
* Primary: research articles | * Secondary: review articles (systematic review) and textbooks
149
What is validity in experimental design?
*Issues in experimental control that must be addressed so that we have confidence in the validity of the experimental outcomes
150
What is the most rigorous form of investigation to test hypotheses?
* The scientific experiment | - Researcher manipulates and controls variables
151
What is an experiment?
*Looking for cause-and-effect relationship btwn independent variable and dependent variable
152
What are experiments designed for?
* Designed to control for extraneous (nuisance) variables that exert a confounding (contaminating) influence - Ex. 2 types of treatment received simultaneously
153
What are the 3 characteristics of experiments?
* Independent variable is manipulated by the researcher * Random assignment * Control (comparison) group must be incorporated into the study
154
Describe "independent variable is manipulated by the researcher"
* intervention (independent variable) is administered to one group (experimental) and not to another (control) * Active variable: IV with levels that can be manipulated and assigned (ex. treatment) * Attribute variable: IV with levels that cannot be manipulated as they represent subject characteristics (ex. occupation, gender, age group) - not a true experiment
155
Describe Random Assignment
* In assigning subjects to groups * Each subject has an equal chance of being assigned to any group * No systematic bias exists that might differentially affect the dependent variable * If groups are equivalent at the beginning of a study - Differences observed at end of study are not due to differences that existed before the study began * Eliminates bias by creating a balanced distribution of characteristics across groups
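Simple random assignment can be sketched in a few lines. The subject IDs and group sizes below are hypothetical; the key point is that shuffling gives every subject an equal chance of landing in either group.

```python
# Sketch of simple random assignment: shuffle the subject list, then split.
# Subject IDs are hypothetical, invented for this example.
import random

random.seed(42)

subjects = [f"S{i:02d}" for i in range(1, 21)]  # 20 hypothetical subjects
shuffled = subjects[:]
random.shuffle(shuffled)  # each ordering (hence each assignment) equally likely

experimental = shuffled[:10]  # first half -> experimental group
control = shuffled[10:]       # second half -> control group

print(experimental)
print(control)
```

In practice, allocation schedules are usually generated in advance and concealed from the people enrolling subjects, but the underlying idea is this equal-chance split.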
156
Describe "control group must be incorporated into the study design"
* to rule out extraneous effects, use control group to compare to experimental group * Must assume equivalence btwn control and experimental groups at start of study - If we see a change in treatment group (experimental) but not control group, can assume it's the treatment
157
What is the research protocol in experimental control?
* To control for extraneous (confounding) factors in a study - Eliminate them as much as possible (implies the researcher must know what they are) - Standardize research protocol so that each subject has a similar testing experience - Ensure they affect all groups equally (Ex. Testing environment, fatigue, instructions to subjects)
158
Why do you maximize adherence to research protocol?
* To limit loss of data - Data losses compromise effect of random assignment - Decreases power of a study
159
What are the reasons for loss of data?
* Subjects may drop out during the study * subjects may cross over to another treatment * Subjects may refuse assigned treatment after allocation * Subjects may not be compliant with assigned treatments
160
Why do you do blinding in experimental control?
* To avoid observation bias * Participants' knowledge of treatment status or investigators' expectations can influence performance or recording of outcomes - Double blind study = subjects and investigators are unaware of the identity of groups until after the study - Single blind study = only the investigator/measurement team is blinded
161
What are the design strategies for controlling for intergroup variability?
* Random Assignment - Eliminates bias by creating a balanced distribution of characteristics across groups - Not perfect and may not always work (groups may not be balanced on important variables) * Can use other strategies besides randomization * Choose homogeneous subjects (ex. only males or older adults) * Matching (age and gender) across groups * Use subjects as own controls (repeated measures design) * Analysis of covariance (ANCOVA) - not a design strategy but a statistical strategy - covariates
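The matching strategy above can be sketched as matched-pair assignment: pair subjects on age and gender, then randomize within each pair. This is an illustrative sketch only; the subject data are hypothetical, and real matching procedures are more elaborate.

```python
# Sketch of matching across groups: pair subjects on gender and age, then
# flip a coin within each matched pair. All subject data are hypothetical.
import random

random.seed(1)

subjects = [
    ("S01", 65, "F"), ("S02", 66, "F"),
    ("S03", 70, "M"), ("S04", 71, "M"),
    ("S05", 64, "F"), ("S06", 63, "F"),
]

# Sort so that similar subjects sit next to each other, then take
# consecutive pairs as the matched pairs.
subjects.sort(key=lambda s: (s[2], s[1]))  # by gender, then age

experimental, control = [], []
for i in range(0, len(subjects), 2):
    pair = [subjects[i], subjects[i + 1]]
    random.shuffle(pair)             # random assignment within the matched pair
    experimental.append(pair[0][0])
    control.append(pair[1][0])

print(experimental, control)
```

This keeps the groups balanced on the matched variables while preserving random assignment within each pair, addressing the case where plain randomization leaves groups unbalanced.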
162
What are the 4 types of design validity that form a framework for evaluating experiments?
* Statistical conclusion validity * Internal validity * Construct Validity * External Validity