111 TESTING Flashcards

1
Q

REFERENCES

A

NAVEDTRA 132, Navy School Testing Program Management Manual

2
Q

Purpose of testing program?

A

To ensure a quality testing process is implemented to effectively assess the trainee’s achievement of learning objectives.

3
Q

NETC Role

A

Provides testing policy and guidance.

4
Q

NETC N7

A

Provides oversight of testing policy and guidance.

5
Q

Learning Center Commanding Officer

A
  • Serves as the Curriculum Control Authority (CCA)
  • Incorporates TYCOM (e.g., NAVIFOR) test banks

6
Q

Learning Center Director of Training

A

Ensures testing program(s) are conducted and oversees development of testing plans.

7
Q

Learning Center Learning Standards Officer

A

Provides guidance to curriculum developers on Total Quality Indicators (TQI) and test item analysis.

Approves KTAGs and PTAGs (Knowledge and Performance Test Administration Guides).

8
Q

Course Curriculum Model Manager (CCMM)

A

Approves test design, maintains master test item bank.

9
Q

Curriculum developer

A

Designs and develops the testing plan, admin guides, and the tests.

10
Q

Learning Site Commanding Officer/Officer-in-charge

A

Implements testing plan, designates Testing Officer(s), designates the course supervisor.

11
Q

Learning Site Testing Officer

A

Oversees grading, secures tests, and maintains test bank(s).

12
Q

Course Supervisor

A

Ensures, monitors, and validates test administration, security, and test item analysis.

13
Q

Participating Activities

A

Provide comments, feedback, and new test items, and maintain test and test item analysis data.

14
Q

primary course source data

A

JDTA, OCCSTDS, CTTL/PPP Table, COI.

15
Q

List usable course source data to be used when the primary course source data is not available

A

Occupational Standards (OCCSTDs), CTTL, PPP Table, and a COI.
16
Q

Formal

A

Test is graded and is used in the calculation of the trainee’s final grade.

17
Q

Informal

A

May or may not be graded; regardless, the grade will not be used in the calculation of the trainee’s final grade.

18
Q

skill

A

Level 1: Imitation
Level 2: Repetition
Level 3: Habit

19
Q

Knowledge

A

Level 1: Knowledge/Comprehension
Level 2: Application/Analysis
Level 3: Synthesis/Evaluation

20
Q

five categories for performance and knowledge tests.

A
Pre-test
Progress
Comprehensive Test
Oral Test
Quiz
21
Q

piloting a test.

A

Review by SMEs; piloting by the CCMM and forwarding to the LSO for approval; testing trainees who are in the end stages of the course (test results do not count); surveying trainee test results; using test item analysis and survey data to improve the test instrument.

22
Q

Job sheet

A

Directs trainees in the step-by-step performance of a practical task.

23
Q

Problem sheet

A

Presents practical problems requiring analysis and decision making.

24
Q

Assignment sheet

A

Designed to direct the trainee’s study or homework efforts.

25
Q

Multiple-choice

A

The most versatile of all knowledge test item formats.

26
Q

True or false

A

Provides only two possible answers.

27
Q

Matching

A

Consists of two lists of connected words, phrases, pictures, or symbols.

28
Q

Completion

A

Free-response test items in which the trainee supplies the missing word or phrase.

29
Q

Labeling

A

Measures the trainee’s ability to recall facts and label parts in pictures, schematics, diagrams, or drawings.

30
Q

Essay

A

Requires the trainee to answer a question with a written response.

31
Q

Case study

A

Used when posing a complex issue and a comprehensive understanding of the material is required.

32
Q

Validation of Test Instruments

A

After test instruments have been constructed and before they are actually assembled into a test, the content must be validated.

33
Q

two types of testing methods

A

Criterion-Referenced Test: measures whether a set level of skill or knowledge has been met.

Norm-Referenced Test: estimates an individual’s skill or knowledge relative to a group norm.

34
Q

test failure policies

A

Handled in accordance with the course testing plan.

35
Q

ten sections of a testing plan

A
Course Data
Course Roles and Responsibilities
Course Waivers
Test Development
Test Administration
Course Tests and Test Types
Grading Criteria
Remediation
Test and Test Item Analysis
Documentation
36
Q

test and test item analysis

A

To determine statistical validity, test and test item analysis techniques are required.
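For illustration only (this is not a NAVEDTRA 132 procedure; the function, variable names, and the 27% grouping below are assumptions): a common item-analysis approach computes a difficulty index (the proportion of trainees answering an item correctly) and a discrimination index (proportion correct in the top-scoring group minus the bottom-scoring group). A minimal Python sketch with hypothetical data:

```python
# Illustrative sketch only: classic test item analysis metrics --
# item difficulty (p) and discrimination index (D) using upper/lower score groups.

def item_analysis(scores, responses, item, group_frac=0.27):
    """scores: total test score per trainee; responses: per-trainee dict of item -> 1 (correct) / 0 (incorrect)."""
    n = len(scores)
    p = sum(r[item] for r in responses) / n            # difficulty index: proportion answering correctly
    order = sorted(range(n), key=lambda i: scores[i], reverse=True)
    k = max(1, int(n * group_frac))                    # size of upper/lower groups (commonly ~27%)
    upper = sum(responses[i][item] for i in order[:k]) / k
    lower = sum(responses[i][item] for i in order[-k:]) / k
    return p, upper - lower                            # discrimination index: D = p_upper - p_lower

# Hypothetical data: six trainees, one test item ("q1")
scores = [95, 88, 76, 70, 62, 55]
responses = [{"q1": 1}, {"q1": 1}, {"q1": 1}, {"q1": 0}, {"q1": 0}, {"q1": 0}]
print(item_analysis(scores, responses, "q1"))          # -> (0.5, 1.0): moderately hard, strongly discriminating
```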

37
Q

remediation program

A

The program’s primary goal is to motivate trainees; the second goal is to remove barriers to learning.

38
Q

three methods of remediation

A

Targeted: one section
Scalable: two or more sections
Iterative: all sections

All methods include one-on-one mentorship.

39
Q

Sections of a remediation program

A

Retest
Setback
Drop from training (attrite)
Counseling
Academic Review Boards (ARBs): convened when other means of academic counseling, remediation, and an initial academic setback have failed.