Finals Part 2 - Evaluating Operational Definitions Flashcards
The same variable can have many definitions; how can we know which definition is best?
Concepts of Reliability and Validity
Reliability means…
Consistency and Dependability
If we apply a variable in more than one experiment, it ought to work in a similar way each time.
Reliability
If possible, we select a _________________ for measuring the dependent variable, because it has been shown to be reliable.
Standardized test
Types of Reliability
Inter-rater reliability
Test-retest reliability
Inter-item reliability
The degree of agreement or consistency between 2 or more scorers regarding a measure
Inter-rater reliability
Also known as inter-observer, inter-coder, or inter-scorer reliability
Inter-rater reliability
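To make the inter-rater idea concrete, the simplest index is percent agreement between two raters who coded the same observations. A minimal sketch with hypothetical ratings (the rater labels and data are made up for illustration):

```python
# Inter-rater reliability via percent agreement: the proportion of
# observations on which two raters gave the same code.
def percent_agreement(rater_a, rater_b):
    matches = sum(a == b for a, b in zip(rater_a, rater_b))
    return matches / len(rater_a)

# Hypothetical codings of the same six observations by two raters.
rater_a = ["yes", "no", "yes", "yes", "no", "yes"]
rater_b = ["yes", "no", "no",  "yes", "no", "yes"]
print(round(percent_agreement(rater_a, rater_b), 2))  # agree on 5 of 6
```

Percent agreement ignores chance agreement; chance-corrected indices such as Cohen's kappa are often preferred in practice.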
Reliability of a measure assessed by comparing the scores of people who have been measured twice with the same instrument.
Test-retest reliability
Also known as time sampling reliability
Test-retest reliability
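Test-retest reliability is typically quantified as the Pearson correlation between the two administrations. A minimal sketch with hypothetical scores (the data are invented for illustration):

```python
# Test-retest reliability: correlate the same people's scores from two
# administrations of the same instrument. A high r indicates stability.
def pearson_r(x, y):
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sum((a - mx) ** 2 for a in x) ** 0.5
    sy = sum((b - my) ** 2 for b in y) ** 0.5
    return cov / (sx * sy)

time1 = [10, 14, 18, 22, 26]   # scores at the first administration
time2 = [11, 13, 19, 21, 27]   # the same people, retested later
print(round(pearson_r(time1, time2), 3))
```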
Different parts of a questionnaire, test, or other instrument designed to assess the same variable yield consistent results.
Inter-item reliability
Inter-item reliability can be assessed by:
Split-half method
Cronbach’s Alpha
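The Cronbach's alpha statistic can be computed directly from the item variances and the variance of the total score. A minimal sketch, using made-up Likert-style responses for illustration:

```python
# Cronbach's alpha: inter-item reliability for a k-item scale.
# alpha = (k / (k - 1)) * (1 - sum(item variances) / variance(total scores))
def cronbach_alpha(scores):
    """scores: list of rows; each row is one respondent's item scores."""
    k = len(scores[0])        # number of items
    def var(xs):              # sample variance (n - 1 denominator)
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)
    item_vars = [var([row[i] for row in scores]) for i in range(k)]
    total_var = var([sum(row) for row in scores])
    return (k / (k - 1)) * (1 - sum(item_vars) / total_var)

data = [          # 4 hypothetical respondents x 3 items
    [4, 5, 4],
    [2, 2, 3],
    [5, 4, 5],
    [3, 3, 2],
]
print(round(cronbach_alpha(data), 2))
```

Values near 1 indicate that the items measure the same underlying variable consistently; the split-half method is similar in spirit but correlates two halves of the instrument instead.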
The principle of actually studying the variables one intends to manipulate or measure
Validity
The degree to which a manipulation or measurement technique is self-evident
Face validity
A judgment of how adequately a test samples the universe of behavior the test was designed to sample.
Content Validity
Index of the degree to which test scores predict some criterion measure.
Predictive Validity
Index of the degree to which a test score is related to some criterion measure obtained at the same time.
Concurrent Validity
Answers the question: Does the content of our measure fairly reflect the content of the quality we are measuring?
Content Validity
Answers the question: Do our procedures yield information that enables us to predict future behavior or performance?
Predictive validity
Evaluated by comparing scores on the measuring instrument with another known standard for the variable being studied.
Concurrent Validity
Established through a series of studies in which a researcher simultaneously defines some construct and develops the instrumentation to measure it.
Construct Validity
Deals with the transition from theory to research application
Construct Validity
Two types of Construct Validity
Convergent Validity
Divergent (Discriminant) Validity
A newly developed test will correlate with measures of a related construct
Convergent Validity
A newly developed test will show little or no correlation with measures of an unrelated construct
Divergent or Discriminant Validity
The degree to which we can confidently infer a causal relationship between the variables
Internal Validity
A variable other than the IV and DV; not the focus of an experiment, but one that can produce effects on the DV if not controlled
Extraneous Variable