111 Testing Flashcards

1
Q

111.1 State the purpose of a testing program.

A

To ensure a quality testing process is implemented to effectively assess the trainee’s achievement of learning objectives.

2
Q

111.3 State the primary course source data for creating test items.

A

JDTA, OCCSTDS, CTTL/PPP Table, COI.

3
Q

111.4 List usable course source data to be used when the primary course source data is not available or has not been created.

A

If JDTA data is not available, curriculum developers will bridge the absence of JDTA data using data elements from a combination of: Occupational Standards (OCCSTDs), CTTL, PPP Table, and COI.

4
Q

111.5 Define the following tests:

A. Formal

A

Test is graded and is used in the calculation of the trainee’s final grade.

5
Q

111.5 Define the following tests:

B. Informal

A

May or may not be graded - regardless, the grade will not be used in the calculation of the trainee’s final grade.

6
Q

111.6 For the below items, define the three proficiency levels contained within each:
A. Skill

A

Level 1: Imitation
Level 2: Repetition
Level 3: Habit

7
Q

111.6 For the below items, define the three proficiency levels contained within each:
B. Knowledge

A

Level 1: Knowledge/Comprehension
Level 2: Application/Analysis
Level 3: Synthesis/Evaluation

8
Q

111.7 List the five categories for performance and knowledge tests.

A

Pretest: For validation of material, acceleration, prerequisite check, or as an advance organizer
Progress Test: Tests blocks of instruction
Comprehensive Test: Within-course or final exam
Oral Test: Normally conducted by a board; assesses the trainee's comprehension
Quiz: Short test to assess achievement of recently taught material

9
Q

111.8 Discuss the process of piloting a test.

A

It is a review process to assess test reliability and make corrective adjustments before actually collecting data from the target population. It includes: review by SMEs; piloting by the CCMM, forwarded to the LSO for approval; testing trainees who are in the end stages of the course (test results not to count); surveying trainee test results; and using test item analysis and the survey to improve the test instrument.

10
Q

111.9 Describe the use of each test instrument as they relate to knowledge and performance tests:
A. Job Sheet

A

Job sheets direct the trainees in the step-by-step performance of a practical task they will encounter in their job assignment.

11
Q

111.9 Describe the use of each test instrument as they relate to knowledge and performance tests:
B. Problem sheet

A

Problem sheets present practical problems requiring analysis and decision making similar to those encountered on the job.

12
Q

111.9 Describe the use of each test instrument as they relate to knowledge and performance tests:
C. Assignment sheet

A

Assignment sheets are designed to direct the study or homework efforts of trainees.

13
Q

111.9 Describe the use of each test instrument as they relate to knowledge and performance tests:
D. Multiple-choice

A

The multiple-choice test item is the most versatile of all knowledge test item formats.

14
Q

111.9 Describe the use of each test instrument as they relate to knowledge and performance tests:
E. True or false

A

True-or-false test items provide only two possible answers.

15
Q

111.9 Describe the use of each test instrument as they relate to knowledge and performance tests:
F. Matching

A

Matching test items are defined as two lists of connected words, phrases, pictures, or symbols.

16
Q

111.9 Describe the use of each test instrument as they relate to knowledge and performance tests:
G. Completion

A

Completion test items are free response test items in which the trainees must supply the missing information from memory.

17
Q

111.9 Describe the use of each test instrument as they relate to knowledge and performance tests:
H. Labeling

A

Labeling or identification test items are used to measure the trainee's ability to recall facts and label parts in pictures, schematics, diagrams, or drawings.

18
Q

111.9 Describe the use of each test instrument as they relate to knowledge and performance tests:
I. Essay

A

Essay test items require trainees to answer a question with a written response.

19
Q

111.9 Describe the use of each test instrument as they relate to knowledge and performance tests:
J. Case study

A

Case studies should be used when posing a complex issue or when a comprehensive understanding of the material is required.

20
Q

111.9 Describe the use of each test instrument as they relate to knowledge and performance tests:
K. Validation of Test Instruments

A

After test instruments have been constructed, and before they are actually assembled into a test, the content must be validated.

21
Q

111.10 What are the two types of testing methods?

A

Criterion-Referenced Test: Assesses whether the required level of skill or knowledge is met.
Norm-Referenced Test: Estimates individual skill or knowledge in relation to a group norm (e.g., Navy Advancement Exams).

22
Q

111.11 Discuss test failure policies and associated grading criteria within your learning environment.

A

Test (if failed), re-train, re-test. If passed, the highest score the student can receive is 80%.
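The 80% cap on a passed retest can be sketched as a one-line grading rule (the function name is hypothetical; the cap value comes from the policy stated on this card):

```python
def retest_grade(raw_score: float, cap: float = 80.0) -> float:
    """Grade recorded for a passed retest: the raw score, capped at 80%.
    Illustrative sketch of the stated policy, not an official formula."""
    return min(raw_score, cap)

print(retest_grade(95.0))  # 80.0
print(retest_grade(72.5))  # 72.5
```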

23
Q

111.12 Discuss, during performance test design, how skill learning objective criticality is determined.

A

Performance tests will be developed using job sheets. Problem sheets are normally not used as a means of performance assessment, but may be used to evaluate achievement of less critical learning objectives. Criticality of performance points to the need for selecting tasks for training that are essential to job performance, even though the tasks may not be performed frequently. The following levels of criticality are useful when determining criticality of performance:
High (value of 3): Skill is used during job performance.
Moderate (value of 2): Skill influences job performance.
Low (value of 1): Skill has little influence on job performance.

24
Q

111.13 Discuss, during knowledge test design, how the criticality of the knowledge learning objectives required to perform a task is determined.

A

Knowledge tests will be developed using test items. Knowledge test design begins with determining the criticality of each learning objective. This process determines which learning objectives to assess through formal testing and which should be assessed by informal testing. At the completion of this step, the assessment method for each learning objective is determined. Analysis of task data provides the information for determining learning objective criticality. To determine criticality, refer to the following elements of course source data, at a minimum: criticality of performance and frequency of performance. Additional fields may be considered if deemed necessary by curriculum developers. The factors used to determine the criticality of each learning objective will be listed in the testing plan.

25
Q

111.14 Identify the ten sections of a testing plan.

A
Course Data
Course Roles and Responsibilities
Course Waivers
Test Development
Test Administration
Course Tests and Test Types
Grading Criteria
Remediation
Test and Test Item Analysis
Documentation
26
Q

111.15 State the purpose of test and test item analysis.

A

To determine statistical validity, test and test item analysis techniques are required. The three types of analysis discussed and required for use are: difficulty index, index of discrimination, and effectiveness of alternatives. Test item analysis will be documented in the course’s testing plan.
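The difficulty index and index of discrimination named above have standard psychometric formulas. A minimal sketch, assuming item responses are scored 1 for correct and 0 for incorrect (function names and sample data are illustrative, not from the testing plan):

```python
def difficulty_index(responses):
    """Proportion of trainees answering the item correctly (p-value).
    Values near 0.0 suggest the item is too hard; near 1.0, too easy."""
    return sum(responses) / len(responses)

def discrimination_index(upper_group, lower_group):
    """Difference in proportion correct between the highest- and
    lowest-scoring trainees (commonly the top and bottom scoring groups).
    Positive values mean the item separates strong from weak trainees."""
    return (sum(upper_group) / len(upper_group)
            - sum(lower_group) / len(lower_group))

# Hypothetical responses to one test item: 1 = correct, 0 = incorrect
all_responses = [1, 1, 0, 1, 0, 1, 1, 0, 1, 1]
print(difficulty_index(all_responses))                    # 0.7
print(discrimination_index([1, 1, 1, 0], [0, 1, 0, 0]))   # 0.5
```

Effectiveness of alternatives, the third technique, examines how often each multiple-choice distractor is selected and is not shown here.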

27
Q

111.16 In a remediation program, discuss the primary and secondary goals.

A

A remediation program's primary goal is to motivate and assist trainees in achieving the critical learning objectives of a course by providing additional instructional study time. A second goal of remediation is to remove barriers to learning. Because trainees learn in different ways, it may be necessary to use different methods of remediation to realize the most effective results.

28
Q

111.17 Discuss the three methods of remediation available to instructors:
A. Targeted

A

Targeted remediation: designed to assist the trainee who is having difficulty in accomplishing an objective and/or understanding the material during normal classroom time. Involves limited one-on-one mentorship or SME engagement on each major objective area the trainee is having difficulty with, using text and/or lab material.

29
Q

111.17 Discuss the three methods of remediation available to instructors:
B. Scalable

A

Designed to assist the trainee who is having difficulty in accomplishing objectives or understanding the material for a major portion of a course, during normal classroom time. Involves one-on-one mentorship or SME engagement on each major objective area the trainee is having difficulty with, taking a total-recall approach using one or a combination of: text, lab material, flashcards, and mentor question-and-answer sessions.

30
Q

111.17 Discuss the three methods of remediation available to instructors:
C. Iterative

A

Involves one-on-one mentorship or SME engagement on each major objective area the trainee is having difficulty with, taking a total-recall approach using one or a combination of: text, lab material, flashcards, and mentor question-and-answer sessions. To complete iterative remediation, the trainee must complete a minimum of 20 questions per objective area with a minimum score of 80% and/or successfully complete two practice exercises or scenarios per objective area.
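The iterative-remediation completion criteria above can be sketched as a simple check for one objective area (the function name and the reading of "and/or" as either-condition-suffices are assumptions for illustration):

```python
def iterative_remediation_complete(questions_answered: int,
                                   score_pct: float,
                                   exercises_completed: int) -> bool:
    """Completion check for one objective area: at least 20 questions
    with a minimum score of 80%, and/or two practice exercises or
    scenarios successfully completed."""
    quiz_met = questions_answered >= 20 and score_pct >= 80.0
    exercises_met = exercises_completed >= 2
    return quiz_met or exercises_met

print(iterative_remediation_complete(20, 85.0, 0))  # True
print(iterative_remediation_complete(15, 90.0, 1))  # False
```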

31
Q

111.18 Define the following sections of a remediation program:
A. Retest

A

When the trainee does not achieve a test’s minimum passing grade, the retest may cover the portion of the test the trainee had difficulty with or the entire test. This decision should be based on the degree of difficulty the trainee had with the test.

32
Q

111.18 Define the following sections of a remediation program:
B. Setback

A

When counseling, remediation, and retesting have not enabled the trainee to achieve the objectives, the trainee may be set back to repeat the portion of the course covering those objectives with a class that graduates at a later date.

33
Q

111.18 Define the following sections of a remediation program:
C. Drop from training and attrites

A

Every effort will be made to help trainees succeed. However, there are times when the trainee is clearly unsuited, unable, and/or unwilling to complete the course. If this occurs, the trainee is dropped from training. Trainees dropped from training may be classified as an academic drop, nonacademic drop, or disenrollment. Trainees who are discharged from the Navy will be classified as attrites.

34
Q

111.18 Define the following sections of a remediation program:
D. Counseling

A

Preventive counseling will be instituted in “A” and “C” schools and should include counseling for performance and personal problems.

35
Q

111.18 Define the following sections of a remediation program:
E. Academic Review Boards (ARBs)

A

An ARB will be convened when other means of academic counseling, remediation, and an initial academic setback have failed to improve trainee performance. The initial academic setback may result from an academic counseling session and be directed by the CS. Additional academic setbacks must be recommended by the ARB and approved by the DOT. Examples of when an ARB may be necessary include the following:
Trainee’s course average falls below minimum passing grade.
Trainee is unable to achieve the objectives after counseling, remediation, retesting, and an initial academic setback.
Trainee’s performance is below expected academic progress.
Trainee fails to achieve the objectives after an academic setback on those same objectives.