Test Construction Analysis Flashcards

1
Q

A test is a set of questions, problems, or exercises that have been systematically organized to determine a person’s knowledge, abilities, aptitude, or qualifications. (Jacobs, 1992)

A

Definition and uses of a test

2
Q

Test as a tool to evaluate:

A. To determine students’ entry knowledge and skills.
B. To identify students’ weaknesses and learning difficulties.
C. To certify students’ performance.
D. All of the above

A

D. All of the above

3
Q

Refers to the extent or accuracy with which a test measures what it intends to measure.

A

Validity

4
Q

Threats to test reliability:

A
  1. Unclear directions
  2. Insufficient time allotment
  3. Lengthy examination
  4. Presence of distractions, and disturbances during administration
  5. Ambiguous test items
  6. Lack of objectivity in scoring leading to wide disagreement among raters
4
Q

Threats to test validity:

A
  1. Lack of standardization in test administration
  2. Response bias or evaluation apprehension
  3. Inadequate sampling of content
  4. Complex and subjective scoring
  5. Poorly written test items
5
Q

Refers to the consistency with which a test measures what it is measuring.

A

Reliability

6
Q

VALIDITY

  • Achieved when a test measures accurately and adequately the specific objectives or content area of a subject
  • Test blueprint, table of specifications

A. Construct validity
B. Predictive validity
C. Content validity
D. Concurrent validity

A

C. Content validity

7
Q

VALIDITY

  • Relationship between test scores and an accepted contemporary or current criterion of performance.
  • FEU-CAT, UP-CAT, SATs

A. Construct validity
B. Predictive validity
C. Content validity
D. Concurrent validity

A

D. Concurrent validity

8
Q

VALIDITY

  • Concerned whether a test can substitute for the actual observation of the person performing the skill in real-life situations

A. Construct validity
B. Predictive validity
C. Content validity
D. Concurrent validity

A

A. Construct validity

9
Q
  • Relationship between the test scores and performance using another evaluation tool.
  • “Will the examinees’ scores in an entrance test predict how well they will perform in school once admitted?”

A. Construct validity
B. Predictive validity
C. Content validity
D. Concurrent validity

A

B. Predictive validity

10
Q

RELIABILITY

  • A wide variation in results may point to a highly unreliable evaluation tool.

A. Test-retest method
B. Alternate form method
C. Comparing results from different raters

A

C. Comparing results from different raters

12
Q

RELIABILITY

  • The same test is given twice at different times.
  • A reliable test should show similar scores.

A. Test-retest method
B. Alternate form method
C. Comparing results from different raters

A

A. Test-retest method

13
Q

RELIABILITY

  • Two equivalent or parallel forms of a test are administered at the same time.
  • Exam Set A, Exam Set B

A. Test-retest method
B. Alternate form method
C. Comparing results from different raters

A

B. Alternate form method

14
Q

Refers to the usefulness or applicability of the test procedure.

A

Practicality

15
Q

Preparing the test blueprint

A. List the specific objectives or content area on the left-hand column of the table
B. List the mental process (learning dimension) you expect your students to utilize in order to answer the test items (Remember, Understand, Apply, Analyze, Evaluate, and Create)
C. Assign the weight, percentage, or number of test questions that should be devoted to each objective or content area.
D. Determine the percentage or number of test questions that should relate to the different mental processes to be tested.
E. All of the above

A

E. All of the above
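The blueprint steps above can be sketched as a two-way table: content areas down the rows, mental processes across the columns, with item counts allocated by weight. All content areas, weights, and counts here are hypothetical:

```python
# Building a table of specifications (test blueprint) from weights.
content_weights = {"Anatomy": 0.40, "Physiology": 0.35, "Pathology": 0.25}
process_weights = {"Remember": 0.30, "Understand": 0.40, "Apply": 0.30}
total_items = 40

# Items per cell = total x content weight x process weight, rounded.
blueprint = {
    area: {proc: round(total_items * w_area * w_proc)
           for proc, w_proc in process_weights.items()}
    for area, w_area in content_weights.items()
}

for area, row in blueprint.items():
    print(area, row)
```

In practice the rounded cells are adjusted by hand so each row and column still sums to its intended weight.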

16
Q

Two-way table of test blueprint for certifying examination in colon and rectal surgery

A

Test blueprint sample

17
Q

This is the most appropriate and commonly utilized tool to assess the cognitive domain of learning

A

Objective written examinations

18
Q

Two major types of written examination items

A
  1. Supply-type items
  2. Selection-type items
19
Q

Options are given and the students select the correct answer.

  • True or false
  • MCQ
  • Matching type
  • Rank in order
  • Relation-comparison
A

Selection-type items

20
Q

Students provide answers to the questions.

  • Completion
  • Short answer
  • Essay
  • Problem solving
A

Supply-type items

21
Q

This type of examination is mainly utilized to assess higher levels of mental abilities, such as creative, integrative, and critical thinking; organization; problem solving; self-expression; and written communication skills.

A

Essay examination

22
Q

Three common forms of essay

A
  1. Restricted response
  2. Extended response
  3. Modified essay
23
Q
  • Short answers
  • Specific answers

A. Restricted response
B. Extended response
C. Modified essay

A

A. Restricted response

24
Q

Examinees are given a series of progressive situations with open-ended questions at the end of each situation

A. Restricted response
B. Extended response
C. Modified essay

A

C. Modified essay

25
Q
  • Long answers
  • Gives the students the chance to be creative

A. Restricted response
B. Extended response
C. Modified essay

A

B. Extended response

26
Q
  • A formal examination involving a face-to-face interaction between at least two people: the examiner and the examinee.
  • Most appropriately utilized to assess attributes that require direct observation and interaction.
  • Its most important use is to test the clinical reasoning skills of examinees.
A

Oral examination

27
Q

This examination is useful in evaluating the learner’s achievement of objectives

A

Practical examination

28
Q

What are the four types of objectives assessed in a practical examination?

A
  1. The recognition of a structure
  2. A psychomotor behavior or practical skill
  3. An interpersonal skill
  4. Attitudes
28
Q

Three Major Types of Practical Examination

A
  1. Objective Practical Examination
  2. Process Practical Examination
  3. Product Practical Examination
29
Q

Examinees identify individual structures or interpret individual findings

A. Objective Practical Examination
B. Process Practical Examination
C. Product Practical Examination

A

A. Objective Practical Examination

30
Q

The examiner makes an indirect assessment of a professional task by observing a product that the examinees have produced

A. Objective Practical Examination
B. Process Practical Examination
C. Product Practical Examination

A

C. Product Practical Examination

31
Q

Examinees are directly observed as they perform tasks in a real or simulated setting.

A. Objective Practical Examination
B. Process Practical Examination
C. Product Practical Examination

A

B. Process Practical Examination

32
Q
  • Used to assess competencies through OSCE
  • Consists of 15-to-20-minute encounters between patients and trainees, which are observed by the examiner.
A

Mini-clinical evaluation exercise (Mini-CEX)

33
Q

It is an effective mechanism to support and facilitate personal learning and growth

A

Portfolio assessment

34
Q

Records of events and experiences, completed projects, lists of critical reviews of articles read, journals and diaries, and even video clips or photographs of patient encounters.

A

Student portfolio

35
Q

A collection of works by an artist or architect.

A

Portfolio

36
Q
  1. Raw Score
    – merely a numerical summary of performance
  2. Derived score
    – score scale that has well-defined characteristics and yields normative meaning.
    - Percentile rank
    - Standard Score
  3. Stanines
    – populations of scores are converted to a scale of nine equal units, with each unit being one-half of the standard deviation.

A. Single Marks
B. Narrative or Descriptive system
C. Multiple Reporting Systems

A

C. Multiple Reporting Systems
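The derived scores on this card can be sketched in Python: a z standard score, a percentile rank, and a stanine (nine units of half a standard deviation each, centred on the mean). The raw scores below are hypothetical:

```python
# Raw score -> derived scores (standard score, percentile rank, stanine).
from statistics import mean, pstdev

raw_scores = [45, 52, 58, 60, 63, 65, 70, 74, 80, 88]
mu, sigma = mean(raw_scores), pstdev(raw_scores)

def z_score(x):
    # standard score: distance from the mean in standard-deviation units
    return (x - mu) / sigma

def percentile_rank(x):
    # percent of scores in the group falling below x
    return 100 * sum(s < x for s in raw_scores) / len(raw_scores)

def stanine(x):
    # each stanine unit spans half a standard deviation; stanine 5 is
    # centred on the mean, and values are clipped to the 1..9 scale
    return max(1, min(9, round(2 * z_score(x) + 5)))

for s in (45, 65, 88):
    print(s, round(z_score(s), 2), percentile_rank(s), stanine(s))
```

Unlike a raw score, each of these carries normative meaning: it locates the examinee relative to the group.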

37
Q
  1. Percentage Grades
    – usually on a 100-point scale.
  2. Two-step system
    – pass or fail; satisfactory or unsatisfactory; credit or no credit
  3. Three-or-more step system
    – honors, pass or fail (three steps), P+, P, P- and F (four steps), A, B, C, D, F, or numbers (five steps).

A. Single Marks
B. Narrative or Descriptive system
C. Multiple Reporting Systems

A

A. Single Marks

38
Q
  1. Narrative
    – the student’s performance described in words and phrases, usually in the form of an informal letter to parents.
  2. Checklist and rating scale
    – this is a list of behavioral descriptions or performance objectives that are rated according to specific categories by teachers as they apply to each student
  3. Parent teacher conference.

A. Single Marks
B. Narrative or Descriptive system
C. Multiple Reporting Systems

A

B. Narrative or Descriptive system

39
Q
  1. Classify items into: Relevance (essential, important, acceptable, questionable, or must-know, useful to know, nice to know) and Difficulty (easy, medium, and hard)
  2. Determine the percent probability that a borderline student can answer each group of questions correctly.
  3. Determine the MPL
    - MPL = sum over all groups of (% probability for the group X number of items in the group)

A. Nedelsky’s method (MCQ)
B. Angoff’s method (Objective Type)
C. Ebel’s method (Any Type)

A

C. Ebel’s method (Any Type)
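The Ebel computation above can be sketched in Python. The relevance/difficulty groups, item counts, and borderline-student probabilities are hypothetical:

```python
# Ebel's method: MPL = sum over groups of (probability x items in group).
groups = {
    # (relevance, difficulty): (number of items, probability a borderline
    # student answers an item in this group correctly)
    ("essential", "easy"):   (10, 0.90),
    ("essential", "medium"): (15, 0.70),
    ("important", "medium"): (10, 0.60),
    ("acceptable", "hard"):  (5, 0.40),
}

total_items = sum(n for n, _ in groups.values())
mpl_points = sum(n * p for n, p in groups.values())
mpl_percent = 100 * mpl_points / total_items

print(f"MPL = {mpl_points:.1f} of {total_items} items ({mpl_percent:.2f}%)")
```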

40
Q
  1. For each item, determine which options a borderline student can reject outright as wrong
  2. Determine the acceptability index (Ai) of the item:
    - Ai = 1/number of options not rejected
  3. Determine the MPL
    - %MPL = sum of Ai over all items X 100/total number of questions.

A. Nedelsky’s method (MCQ)
B. Angoff’s method (Objective Type)
C. Ebel’s method (Any Type)

A

A. Nedelsky’s method (MCQ)
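The Nedelsky steps above can be sketched in Python. For each MCQ item we record how many options a borderline student can reject outright; the counts below are hypothetical for a five-item, four-option exam:

```python
# Nedelsky's method for MCQs.
options_per_item = 4
rejected = [2, 3, 1, 2, 3]  # options a borderline student rejects, per item

# Ai = 1 / number of options NOT rejected: the chance of guessing
# correctly among the options that remain plausible.
ai = [1 / (options_per_item - r) for r in rejected]

# %MPL = sum of Ai over all items x 100 / total number of questions
mpl_percent = sum(ai) * 100 / len(rejected)
print(f"%MPL = {mpl_percent:.1f}")
```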

41
Q
  1. Assess each question individually
  2. Determine the percentage of questions of similar difficulty as the item analyzed as answerable by a borderline student, or determine the percentage of 100 borderline students who will be able to answer the item correctly
  3. Determine the MPL
- %MPL = sum of the MPL estimates for all items X 100/number of items

A. Nedelsky’s method (MCQ)
B. Angoff’s method (Objective Type)
C. Ebel’s method (Any Type)

A

B. Angoff’s method (Objective Type)
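The Angoff steps above reduce to averaging the per-item estimates. Each value below is a hypothetical judgment of the probability that a borderline student answers that item correctly:

```python
# Angoff's method: average the per-item borderline probabilities.
item_probabilities = [0.80, 0.65, 0.50, 0.70, 0.55, 0.60]

# %MPL = sum of the per-item estimates x 100 / number of items
mpl_percent = sum(item_probabilities) * 100 / len(item_probabilities)
print(f"%MPL = {mpl_percent:.1f}")
```

In practice several judges rate each item and their estimates are averaged before computing the MPL.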

42
Q

The process of applying statistical techniques to assess the difficulty and discriminating power of each test item in order to improve item quality.

A

Item analysis
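A minimal sketch of the two classical item-analysis statistics: the difficulty index (proportion of examinees answering correctly) and the discrimination index (difference in difficulty between the top- and bottom-scoring groups). The response data are hypothetical; 1 = correct, 0 = wrong:

```python
# Classical item analysis: difficulty and discrimination indices.
def difficulty(responses):
    # proportion of examinees who answered the item correctly
    return sum(responses) / len(responses)

def discrimination(upper, lower):
    # how much better the top-scoring group did than the bottom group
    return difficulty(upper) - difficulty(lower)

upper_group = [1, 1, 1, 0, 1]  # item responses of the highest scorers
lower_group = [0, 1, 0, 0, 1]  # item responses of the lowest scorers

p = difficulty(upper_group + lower_group)
d = discrimination(upper_group, lower_group)
print(f"difficulty index p = {p:.2f}, discrimination index D = {d:.2f}")
```

A positive discrimination index means the item separates strong from weak examinees; items near zero or negative are candidates for revision.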