111 TESTING Flashcards
State the purpose of a testing program.
NAVEDTRA 132 pg 1-2
To ensure a quality testing process is implemented to effectively assess the trainee’s achievement of learning objectives
State the roles and responsibilities of the following for an effective testing program: Naval Education and Training Command (NETC)
Testing policy and guidance
State the roles and responsibilities of the following for an effective testing program: NETC N7
Provides testing policy and guidance oversight and monitors compliance by Learning Centers.
State the roles and responsibilities of the following for an effective testing program: Learning Center Commanding Officer
Serves as the Curriculum Control Authority (CCA); manages Sites and Detachments (DETs); resolves differences; incorporates TYCOM test banks, as appropriate
State the roles and responsibilities of the following for an effective testing program: Learning Center Director of Training
Ensures testing program(s) are conducted, oversees development of testing plans
State the roles and responsibilities of the following for an effective testing program: Learning Center Learning Standards Officer
Provides guidance to curriculum developers on testing; monitors Total Quality Indicators (TQI), test item analysis, and remediation programs.
State the roles and responsibilities of the following for an effective testing program: Course Curriculum Model Manager (CCMM)
Approves test design, maintains master test item bank
State the roles and responsibilities of the following for an effective testing program: Curriculum developer
Designs and develops the testing plan, admin guides, and the tests
State the roles and responsibilities of the following for an effective testing program: Learning Site Commanding Officer/Officer-in-Charge
Implements testing plan, designates Testing Officer(s), designates the course supervisor
State the roles and responsibilities of the following for an effective testing program: Learning Site Testing Officer
Administers tests, oversees grading, secures tests, maintains test bank(s), coordinates/manages revisions, and conducts IS training
State the roles and responsibilities of the following for an effective testing program: Course Supervisor
Ensures, monitors, and validates test administration, test security, and test item analysis.
State the roles and responsibilities of the following for an effective testing program: Participating Activities
Provide comments, feedback, and new test items, and maintain test and test item analysis data.
State the primary course source data for creating test items
NAVEDTRA 132 pg 3-4
Job Duty Task Analysis (JDTA) data, Occupational Standards (OCCSTDS), the Course Training Task List (CTTL)/Personnel Performance Profile (PPP) Table, and the Curriculum Outline of Instruction (COI).
List usable course source data to be used when the primary course source data is not available or has not been created
NAVEDTRA 132 pg 3-4
If JDTA data is not available, curriculum developers will bridge the absence of JDTA data using data elements from a combination of Occupational Standards (OCCSTDs), the CTTL, the PPP Table, and a COI.
Define the following tests: Formal
The test is graded, and the grade is used in the calculation of the trainee's final grade
Define the following tests: Informal
May or may not be graded – regardless, the grade will not be used in the calculation of the
trainee’s final grade
For the below items, define the three proficiency levels contained within each: Skill
Level 1: Imitation
Level 2: Repetition
Level 3: Habit
For the below items, define the three proficiency levels contained within each: Knowledge
Level 1: Knowledge/Comprehension
Level 2: Application/Analysis
Level 3: Synthesis/Evaluation
List the five categories for performance and knowledge tests.
Pre-test: for validation of material, acceleration, pre-requisite, or as an advanced organizer
Progress test: covers blocks of instruction
Comprehensive test: within-course or final exam
Oral test: normally conducted by a board (panel of evaluators); assesses the trainee's comprehension
Quiz: a short test to assess achievement of recently taught material
Discuss the process of piloting a test.
NAVEDTRA 132 pg 4-8
It is a review process to assess test reliability and validity and make corrective adjustments before actually collecting data from the target population. It includes: review by SMEs; piloting by the CCMM, with results forwarded to the LSO for approval; testing trainees who are in the end stages of the course (test results not to count); surveying trainee test results; and using test item analysis and the survey to improve the test instrument.
Describe the use of each test instrument as they relate to knowledge and performance tests: Job sheet
Job sheets direct the trainees in the step-by-step performance of a practical task they will
encounter in their job assignment.
Describe the use of each test instrument as they relate to knowledge and performance tests: Problem sheet
Problem sheets present practical problems requiring analysis and decision making similar to
those encountered on the job.
Describe the use of each test instrument as they relate to knowledge and performance tests: Assignment sheet
Assignment sheets are designed to direct the study or homework efforts of trainees.
Describe the use of each test instrument as they relate to knowledge and performance tests: Multiple-choice
The multiple-choice test item is the most versatile of all knowledge test item formats.
Describe the use of each test instrument as they relate to knowledge and performance tests: True or false
True or false test items present only two possible answers: true or false.
Describe the use of each test instrument as they relate to knowledge and performance tests: Matching
Matching test items are defined as two lists of connected words, phrases, pictures, or symbols.
Describe the use of each test instrument as they relate to knowledge and performance tests: Completion
Completion test items are free response test items in which the trainees must supply the
missing information from memory
Describe the use of each test instrument as they relate to knowledge and performance tests: Labeling
Labeling or identification test items are used to measure the trainee’s ability to recall facts and
label parts in pictures, schematics, diagrams, or drawings.
Describe the use of each test instrument as they relate to knowledge and performance tests: Essay
Essay test items require trainees to answer a question with a written response
Describe the use of each test instrument as they relate to knowledge and performance tests: Case study
Case studies should be used when posing a complex issue, when a comprehensive
understanding of material is required.
Describe the use of each test instrument as they relate to knowledge and performance tests: Validation of Test Instruments
After test instruments have been constructed, and before they are actually assembled into a test, the content must be validated
What are the two types of testing methods used in testing?
NAVEDTRA 132 pg 4-9
Criterion-Referenced Test: Assesses whether required level of skill or knowledge is met.
Norm-Referenced Test: Estimates individual skill or knowledge in relation to a group norm (e.g., Navy Advancement Exams).
Discuss test failure policies and associated grading criteria within your learning environment
Test (if failed), re-train, re-test. If the retest is passed, the highest score the trainee can receive is 80%.
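The test/re-train/re-test rule above can be sketched in code. This is an illustrative helper only, not anything prescribed by NAVEDTRA 132: the 80% cap comes from the card, while the minimum passing grade and the function name are assumptions.

```python
from typing import Optional

MIN_PASSING = 80  # assumed minimum passing grade, for illustration only
RETEST_CAP = 80   # highest score of record after a failed first attempt (per the card)

def recorded_score(first_attempt: int, retest: Optional[int] = None) -> int:
    """Grade of record under the test (if failed), re-train, re-test policy."""
    if first_attempt >= MIN_PASSING:
        return first_attempt                # passed outright: score stands
    if retest is not None and retest >= MIN_PASSING:
        return min(retest, RETEST_CAP)      # passed on retest: capped at 80
    return retest if retest is not None else first_attempt  # still failing

print(recorded_score(92))      # 92
print(recorded_score(60, 95))  # 80 (capped)
```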
Discuss how skill learning objective criticality is determined during performance test design
Performance tests will be developed using job sheets. Problem sheets are normally not used as a means of performance assessment, but may be used to evaluate achievement of less critical learning objectives. Criticality of performance points to the need for selecting tasks for training that are essential to job performance, even though the tasks may not be performed frequently. The following levels of criticality will be used when determining criticality of performance:
High (value 3): skill is used during job performance.
Moderate (value 2): skill influences job performance.
Low (value 1): skill has little influence on job performance.
Discuss how knowledge learning objective criticality to perform a task is determined during knowledge test design
Knowledge tests will be developed using test items. Knowledge test design begins with determining the criticality of each learning objective. This process determines which learning objectives to assess through formal testing and which learning objectives should be assessed by informal testing. At the completion of this step, the assessment of each learning objective is determined. Analysis of task data (discussed in Chapter 3) provides the information for determining learning objective criticality. To determine criticality, refer to the following elements of course source data, at a minimum: criticality of performance and frequency of performance. Additional fields may be considered if deemed necessary by curriculum developers. The factors used to determine the criticality of each learning objective will be listed in the testing plan.
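As a rough sketch, the criticality determination described above can be modeled as a score built from the two minimum data elements. NAVEDTRA 132 does not prescribe a formula; the frequency scale, the additive scoring, and the formal/informal threshold below are illustrative assumptions, and only the 3/2/1 performance-criticality values come from the cards.

```python
# Hypothetical sketch of the learning-objective criticality step.
PERFORMANCE_CRITICALITY = {"high": 3, "moderate": 2, "low": 1}  # values from the card
FREQUENCY = {"often": 3, "sometimes": 2, "rarely": 1}           # assumed scale

def assessment_method(criticality: str, frequency: str, threshold: int = 4) -> str:
    """Decide formal vs. informal testing for one learning objective
    (additive score and threshold are invented for illustration)."""
    score = PERFORMANCE_CRITICALITY[criticality] + FREQUENCY[frequency]
    return "formal" if score >= threshold else "informal"

print(assessment_method("high", "often"))   # formal
print(assessment_method("low", "rarely"))   # informal
```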
Identify the ten sections of a testing plan.
1. Course Data
2. Course Roles and Responsibilities
3. Course Waivers
4. Test Development
5. Test Administration
6. Course Tests and Test Types
7. Grading Criteria
8. Remediation
9. Test and Test Item Analysis
10. Documentation
State the purpose of test and test item analysis
To determine statistical validity, test and test item analysis techniques are required. The three types of analysis discussed and required for use are: difficulty index, index of discrimination, and effectiveness of alternatives. Test item analysis will be documented in the course’s testing plan.
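The three required analyses can be expressed with the standard classical-test-theory formulas. Only the three analysis types come from the card; the function names, the 27% grouping convention, and the sample numbers are illustrative.

```python
from typing import Dict

def difficulty_index(correct: int, total: int) -> float:
    """Proportion of trainees answering the item correctly (the item's p-value)."""
    return correct / total

def discrimination_index(upper_correct: int, lower_correct: int,
                         group_size: int) -> float:
    """Upper-group minus lower-group proportion correct; groups are
    conventionally the top and bottom ~27% of scorers."""
    return (upper_correct - lower_correct) / group_size

def alternative_effectiveness(choice_counts: Dict[str, int]) -> Dict[str, float]:
    """Fraction of trainees selecting each alternative; a distractor
    chosen by no one is ineffective and should be revised."""
    total = sum(choice_counts.values())
    return {alt: n / total for alt, n in choice_counts.items()}

print(difficulty_index(18, 25))        # 0.72
print(discrimination_index(9, 3, 10))  # 0.6
print(alternative_effectiveness({"A": 18, "B": 4, "C": 3, "D": 0}))
```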
In a remediation program, discuss what the primary and secondary goals are.
A remediation program’s primary goal is to motivate and assist trainees in achieving the critical learning objectives of a course by providing additional instructional study time. A second goal of remediation is to remove barriers to learning. Because trainees learn in different ways, it may be necessary to use different methods of remediation to realize the most effective results.
Discuss the three methods of remediation available to instructors: Targeted
Targeted remediation is designed to assist the trainee who is having difficulty in
accomplishing an objective(s) and/or understanding the material during normal classroom time. Targeted remediation involves limited one-on-one mentorship or SME engagement of the objective(s) area that the trainee is having difficulty with, using text and/or lab material.
Discuss the three methods of remediation available to instructors: Scalable
Scalable remediation is designed to assist the trainee who is having difficulty in accomplishing objectives or understanding the material for a major portion of a course, during normal classroom time. Scalable remediation involves one-on-one mentorship or SME engagement of each major objective area that the trainee is having difficulty with, using a total recall approach with one or a combination of: text, lab material, flashcards, and mentor question and answer sessions.
Discuss the three methods of remediation available to instructors: Iterative
Iterative remediation involves one-on-one mentorship or SME engagement of each major objective area that the trainee is having difficulty with, using a total recall approach with one or a combination of: text, lab material, flashcards, and mentor question and answer sessions. To complete iterative remediation, the trainee must complete a minimum of 20 questions per each objective area with a minimum score of 80 percent and/or successfully complete two practice exercises or scenarios per each objective area.
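The completion criteria for iterative remediation can be sketched as a check over each objective area. The data shapes and names below are invented for illustration; the 20-question, 80-percent, and two-exercise minimums come from the card, and the card's "and/or" is modeled here as either path sufficing.

```python
from typing import Dict

def iterative_complete(questions: Dict[str, int], scores: Dict[str, int],
                       exercises: Dict[str, int]) -> bool:
    """True when every objective area meets the stated minimums:
    >= 20 questions at >= 80 percent, or >= 2 practice exercises/scenarios."""
    for obj in questions:
        by_questions = questions[obj] >= 20 and scores.get(obj, 0) >= 80
        by_exercises = exercises.get(obj, 0) >= 2
        if not (by_questions or by_exercises):
            return False
    return True

print(iterative_complete(
    {"nav": 20, "comms": 25},   # questions attempted per objective area
    {"nav": 85, "comms": 70},   # percent scores per objective area
    {"comms": 2},               # practice exercises completed
))  # True: nav passes by questions, comms by exercises
```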
Define the following sections of a remediation program: Retest
When the trainee does not achieve a test’s minimum passing grade, the retest may cover the
portion of the test the trainee had difficulty with or the entire test. This decision should be based
on the degree of difficulty the trainee had with the test
Define the following sections of a remediation program: Setback
When counseling, remediation, and retesting have not brought the trainee to the minimum passing grade, the trainee may be set back in the course to repeat the portion of instruction not successfully completed, normally with a later convening class.
Define the following sections of a remediation program: Drop from training and attrites
Every effort will be made to help trainees succeed. However, there are times when the trainee
is clearly unsuited, unable, and/or unwilling to complete the course. If this occurs, the trainee is
dropped from training. Trainees dropped from training may be classified as an academic drop,
nonacademic drop, or disenrollment. Trainees who are discharged from the Navy will be
classified as attrites.
Define the following sections of a remediation program: Counseling
Preventive counseling will be instituted in "A" and "C" schools and should include counseling for
performance and personal problems.
Define the following sections of a remediation program: Academic Review Boards (ARBs)
ARBs will be convened when other means of academic counseling, remediation, and an initial academic setback have failed to improve trainee performance. The initial academic setback may result from an academic counseling session and be directed by the CS. Additional academic setbacks must be recommended by the ARB and approved by the DOT. Examples of when an ARB may be necessary include the following:
• Trainee’s course average falls below minimum passing grade.
• Trainee is unable to achieve the objectives after counseling, remediation, retesting, and an
initial academic setback.
• Trainee’s performance is below expected academic progress.
• Trainee fails to achieve the objectives after an academic setback on those same objectives.