Section 2 C Flashcards

1
Q

What reference governs testing?

A

NAVEDTRA 132

2
Q

State the purpose of a testing program

A

To ensure a quality testing process is implemented to effectively assess the trainee's achievement of learning.

3
Q

Explain the roles and responsibilities for an effective testing program: NMETC/Naval Education and Training Command.

A

Testing policy guidance.

4
Q

Explain the roles and responsibilities for an effective testing program: NMETC Academic Directorate/NETC N7

A

Provides oversight and monitors compliance by learning centers.

5
Q

Explain the roles and responsibilities for an effective testing program: learning centers Commanding Officer

A

Serves as CCA; incorporates TYCOM test banks

6
Q

Explain the roles and responsibilities for an effective testing program: Director of Training

A

Ensures testing programs are conducted.

7
Q

Explain the roles and responsibilities for an effective testing program: Learning Standards Officer (LSO).

A

Provides guidance to curriculum developers on testing.

8
Q

Explain the roles and responsibilities for an effective testing program: Curriculum Control Model Manager (CCMM)

A

Approves test design; maintains test bank

9
Q

Explain the roles and responsibilities for an effective testing program: Curriculum Developer

A

Designs/develops testing plan.

10
Q

Explain the roles and responsibilities for an effective testing program: learning site/Det CO/OIC

A

Implements testing plan

11
Q

Explain the roles and responsibilities for an effective testing program: Testing Officer

A

Administers, grades, and scores tests; maintains the test bank.

12
Q

Explain the roles and responsibilities for an effective testing program: Course Supervisor

A

Ensures, monitors, and validates test administration, test security, and test item analysis.

13
Q

Explain the roles and responsibilities for an effective testing program: Participating Activities

A

Provide comments, feedback, and new test items; maintain test and test item analysis data.

14
Q

State the primary course source data for creating test items.

A

Job Duty Task Analysis (JDTA)

15
Q

List usable course source data to be used when the primary course source data is not available or has not been created.

A

Use a combination of Occupational Standards (OCCSTD), Course Training Task List (CTTL), Personnel Performance Profile (PPP) Table, and a Curriculum Outline of Instruction (COI).

16
Q

Define Formal Testing

A

Graded; the score is used in the calculation of the trainee's final grade.

17
Q

Define Informal Testing

A

May or may not be graded; regardless, the grade will not be used in the calculation of the trainee's final grade.

18
Q

Define the proficiency levels and the three levels contained within each.

A
  • Skill: Level 1, Imitation; Level 2, Repetition; Level 3, Habit.
  • Knowledge: Level 1, Knowledge/Comprehension; Level 2, Application/Analysis; Level 3, Synthesis/Evaluation.
19
Q

List and discuss the five categories for performance and knowledge tests.

A
  • Pre-test: validation of material, prerequisite.
  • Progress test: given after blocks of instruction.
  • Comprehensive test: Final course exam.
  • Oral test: Normally administered by board.
  • Quiz: tests recently taught material.
20
Q

Discuss the process for piloting a test.

A

Reviewed by an SME; piloted by the CCMM and forwarded to the Learning Standards Officer (LSO) for approval; tested on trainees in the end stages of the course; trainee test results surveyed; test item analysis and survey results used to improve the test.

21
Q

Describe the use of Job Sheet as it relates to knowledge and performance tests.

A

Direct trainee step-by-step performance.

22
Q

Describe the use of Problem Sheet as it relates to knowledge and performance tests.

A

Present practical problems that require analysis.

23
Q

Describe the use of Assignment Sheet as it relates to knowledge and performance tests.

A

Direct the study or homework efforts of trainees

24
Q

Describe the use of Multiple-choice as it relates to knowledge and performance tests.

A

Most versatile of all knowledge tests.

25
Q

Describe the use of True or False as it relates to knowledge and performance tests.

A

Test items in which each statement is either true or false.

26
Q

Describe the use of Matching as it relates to knowledge and performance tests.

A

Two lists of connected words/phrases/pictures/symbols.

27
Q

Describe the use of Completion as it relates to knowledge and performance tests.

A

Allows a free response.

28
Q

Describe the use of Labeling as it relates to knowledge and performance tests.

A

Measures the ability to recall facts.

29
Q

Describe the use of Essay as it relates to knowledge and performance tests.

A

Requires a written response.

30
Q

Describe the use of Case Study as it relates to knowledge and performance tests.

A

Used for complex issues to assess comprehension.

31
Q

Describe the use of Validation of Test Instruments as it relates to knowledge and performance tests.

A

Validating the information on the test.

32
Q

What are the two types of testing method adopted by NMETC/NETC?

A

Criterion-referenced test and Norm-referenced test

33
Q

Discuss test failure policies and associated grading criteria within your learning environment.

A

Re-train and re-test; if passed, the highest grade the student can receive is 80%.
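The retest grading rule above can be sketched as a minimal function (the 80% cap comes from the card; the function name and signature are illustrative, not from NAVEDTRA 132):

```python
def retest_grade(raw_score: float, cap: float = 80.0) -> float:
    """Apply the retest grading rule: a passed retest is recorded
    at no higher than the cap (80% per the policy above)."""
    return min(raw_score, cap)

# A trainee scoring 95 on a retest is recorded at 80.
print(retest_grade(95.0))
# A score below the cap is recorded as earned.
print(retest_grade(72.0))
```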

34
Q

Discuss how skill learning objective criticality is determined during performance test design.

A
  • Developed using Job Sheets.
  • Minimum criticality is determined by criticality of performance and frequency of performance.

35
Q

List the ten sections of a testing plan.

A
  • Course Data
  • Course Roles and Responsibilities
  • Course Waivers
  • Test Development
  • Test Administration
  • Course Tests and Test Types
  • Grading Criteria
  • Remediation
  • Test and Test Item Analysis
  • Documentation
36
Q

State the purpose of test and test item analysis.

A

Determines statistical validity.

Three types: difficulty index, index of discrimination, and effectiveness of alternatives.
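The first two analysis types above are commonly computed with standard psychometric formulas; the sketch below uses those textbook definitions (NAVEDTRA's exact computation and group sizes may differ, and the function names are illustrative):

```python
def difficulty_index(num_correct: int, num_tested: int) -> float:
    """Proportion of trainees who answered the item correctly.
    Higher values indicate an easier item."""
    return num_correct / num_tested

def discrimination_index(upper_correct: int, lower_correct: int,
                         group_size: int) -> float:
    """Difference between the upper- and lower-scoring groups'
    correct counts on the item, divided by the group size.
    Positive values mean the item separates strong trainees
    from weak ones; values near zero or negative flag the item
    for review."""
    return (upper_correct - lower_correct) / group_size

# 15 of 20 trainees answered correctly -> difficulty 0.75 (easy item).
print(difficulty_index(15, 20))
# 9 of the top 10 scorers but only 3 of the bottom 10 got it right.
print(discrimination_index(9, 3, 10))
```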

37
Q

List and discuss the primary and secondary goal of remediation program.

A
  • Primary: motivate and assist trainees in achieving the critical learning objectives of a course by providing additional instructional study time.
  • Secondary: remove barriers to learning.
38
Q

Discuss the three methods of remediation available to instructors.

A

Targeted: one-on-one mentorship for a single objective.
Scalable: one-on-one mentorship with an SME for each major objective.
Iterative: total-recall approach.

39
Q

Discuss the following as it relates to the remediation program: Retest

A

Given when a trainee does not achieve the minimum passing grade on a test.

40
Q

Discuss the following as it relates to the remediation program: Setback

A

May cover a certain portion of the course

41
Q

Discuss the following as it relates to the remediation program: Drop from training and attrite

A

Drop can be academic, nonacademic, or disenrollment

42
Q

Discuss the following as it relates to the remediation program: Counseling

A

Preventive counseling in “A”/“C” schools; includes performance and personal issues.

43
Q

Discuss the following as it relates to the remediation program: Academic Review Board (ARB).

A

Convened when other means have not helped; a setback may result from an ARB.