Assessment and Evaluation Flashcards

1
Q

A method used to measure the level of achievement or performance.

A

Measurement

2
Q

The process of gathering, describing, or quantifying information about performance.

A

Assessment

3
Q

Making a judgment based on assessment results, such as whether to revise the lesson or develop a new one.

A

Evaluation

4
Q

An instrument or systematic procedure for measuring a sample of behavior by posing a set of questions in a careful manner.

A

Test

5
Q

A specific stimulus to which a person responds overtly; this response can be scored or evaluated (for example, classified, graded on a scale, or counted).

A

Item

6
Q

Pertains to the form, plan, structure, arrangement, and layout of test items as well as to related considerations such as time limits.

A

Format

7
Q

People involved in creating and developing methods of assessment. The purpose is usually research, publication, or refinement of existing tests.

A

Test Developer

8
Q

Refers to the wide array of professionals who rely on psychological assessments and tools for various purposes.

A

Test User

9
Q

Refers to the person to whom the assessment tools were administered.

A

Test Taker

10
Q

It is established by comparing the scores obtained from two successive measurements of the same individuals and calculating a correlation between the two sets of scores.

A

Test-Retest
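
Since test-retest reliability is simply the correlation between two sets of scores, it can be computed directly. A minimal Python sketch (statistics.correlation requires Python 3.10+; the score lists are hypothetical):

    # Test-retest reliability: Pearson correlation between two
    # administrations of the same test to the same individuals.
    # The score lists are hypothetical.
    from statistics import correlation  # available in Python 3.10+

    first_admin = [78, 85, 62, 90, 71, 88, 67, 80]   # scores, first session
    second_admin = [80, 83, 65, 92, 70, 90, 66, 84]  # same examinees, second session

    r = correlation(first_admin, second_admin)  # Pearson's r
    print(f"Test-retest reliability estimate: r = {r:.2f}")

An r close to 1.0 indicates stable scores across the two administrations; carryover and practice effects (see the next two cards) can distort this estimate.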

11
Q

A type of carryover effect wherein the scores on the second test administration are higher than they were on the first.

A

Practice Effect

12
Q

Occurs when the first testing session influences the results of the second session; this can affect the test-retest reliability of a psychological measure.

A

Carryover Effect

13
Q

It refers to the degree to which the measurement procedure measures the variable that it claims to measure (accuracy).

A

Validity

14
Q

It is established not by statistical analysis but by the inspection of items against the content the test is intended to cover.

A

Content Validity

15
Q

Performed by a group of experts/evaluators.

A

Item Inspection

16
Q

The simplest and least scientific form of validity; it is demonstrated when the face value or superficial appearance of a measurement appears to measure what it is supposed to measure.

A

Face Validity

17
Q

Checks whether the group of experts/evaluators is consistent in their judgments about each item.

A

Inter-Judge Consistency

18
Q

These are partially related and partially independent: a test can be reliable without being valid, but it cannot be valid without being reliable.

A

Reliability and Validity

19
Q

The teacher/test developer must become familiar with the content of the test he/she is planning.

A

Pre-Survey

20
Q

A learning competency (e.g., "Write a descriptive paragraph") that serves as the basis for judging performance.

A

Assessment Criteria

21
Q

A one-on-one conversation with another person, asking questions to gather information, opinions, and stories.

A

Interviews

22
Q

A sustained, open-ended exploration of an unfamiliar situation.

A

Investigations

23
Q

It provides information about the students’ progress in school.

A

Feedback

24
Q

An evaluation administered at the conclusion of a unit of instruction to comprehensively assess student learning and the effectiveness of an instructional method or program.

A

Summative Assessment

25
Q

A process of viewing and studying visual media such as movies, documentaries, and movie clips as they relate to the lesson.

A

Viewing Analysis

26
Q

Learners think of a response to a question individually first. They form pairs to discuss their answers. Then together they agree on the ideas they will share in class.

A

Think-Pair-Share

27
Q

A speaking and acting activity in which learners pretend to be something or someone in order to simulate an event.

A

Role Playing

28
Q

Learners experience the output they produce.

A

Aesthetic Project

29
Q

Learners solve a practical problem.

A

Project

30
Q

A graphic for organizing and representing students' knowledge; a central concept branches out into specific ideas.

A

Concept Map

31
Q

Extended pieces of writing designed to tell a story, present information, or give an opinion.

A

Essays

32
Q

A continuous series of assessments done during the instructional process for the purpose of improving teaching or learning.

A

Formative Assessment

33
Q

Used to test the validity and reliability of the test items that have been written.

A

Second Try-Out

34
Q
PURPOSES OF TESTING (4)
A
  1. To identify what students have learned
  2. To identify student strengths and weaknesses
  3. To provide a method for awards and recognition
  4. To provide a way to measure a teacher’s effectiveness
35
Q

TYPES OF TEST FORMAT (2)

A
  1. Essay Tests
  2. Objective Tests

36
Q

TYPES OF RELIABILITY (2)

A
  1. Test-Retest
  2. Equivalent Forms

37
Q

LIMITATIONS OF TEST-RETEST RELIABILITY (2)

A
  1. Carryover effect
  2. Practice effect

38
Q

TYPES OF VALIDITY (2)

A
  1. Content Validity
  2. Face Validity

39
Q
FACTORS AFFECTING VALIDITY (11)
A
  1. Overloading a test with items concerning too many facts.
  2. Selecting appropriate numerical problems for a test but using vocabulary in problems and directions that only better readers could understand.
  3. Unclear directions
  4. Reading vocabulary and sentence structure too difficult
  5. Inappropriate level of difficulty of the test items
  6. Poorly structured test items
  7. Ambiguity
  8. Inadequate time limits
  9. Test too short
  10. Improper arrangement of items
  11. Identifiable pattern of answers
40
Q
STEPS IN TEST CONSTRUCTION (9)
A
  1. Pre-Survey
  2. Making the Table of Specifications
  3. Consultation with Experts
  4. Item Writing
  5. Consultation with Experts
  6. First Try-Out
  7. Item Analysis (see the sketch after this list)
  8. Second Try-Out
  9. Assemble and Finalize the Test
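
Step 7, item analysis, is the most computational step: from try-out data, each item gets a difficulty index (proportion answering correctly) and a discrimination index (how well the item separates high from low scorers). A minimal Python sketch, assuming 0/1-scored items and an illustrative 27% upper/lower grouping; all names and data are hypothetical:

    # Classical item analysis (step 7): difficulty and discrimination
    # indices for one 0/1-scored item. All data here are hypothetical.

    def difficulty_index(item_scores):
        # Proportion of examinees who answered the item correctly.
        return sum(item_scores) / len(item_scores)

    def discrimination_index(item_scores, total_scores, group_frac=0.27):
        # Difference in difficulty between the upper and lower groups,
        # where examinees are ranked by total test score.
        n = len(total_scores)
        k = max(1, int(n * group_frac))
        ranked = sorted(range(n), key=lambda i: total_scores[i], reverse=True)
        upper = [item_scores[i] for i in ranked[:k]]   # top scorers
        lower = [item_scores[i] for i in ranked[-k:]]  # bottom scorers
        return difficulty_index(upper) - difficulty_index(lower)

    # Hypothetical: one item's 0/1 scores and each examinee's total score.
    item = [1, 1, 0, 1, 0, 1, 0, 0, 1, 1]
    totals = [45, 40, 22, 38, 25, 42, 20, 18, 36, 44]
    print(f"Difficulty p = {difficulty_index(item):.2f}")                  # 0.60
    print(f"Discrimination D = {discrimination_index(item, totals):.2f}")  # 1.00

Items with extreme difficulty or low discrimination would then be revised or dropped before the second try-out (step 8).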
41
Q
GUIDING PRINCIPLES FOR EFFECTIVE GRADING (12)
A
  1. Discuss the grading procedure with students
  2. Base grades purely on achievement
  3. Explain how other factors (personal-social behaviors) will be reported
  4. Relate grading procedures to the learning outcomes or objectives
  5. Get hold of valid evidence such as test results, reports, presentations, projects, and other assessments
  6. Take precautions to prevent cheating
  7. Return all tests and other assessment results
  8. Properly weight the various types of achievement (see the sketch after this list)
  9. Tardiness, weak effort, and misbehavior should not affect grades
  10. Be judicious/fair and avoid bias
  11. Do not change grades
  12. Keep students informed of their class standing or performance
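
Principle 8 reduces to a weighted average of the achievement types. A minimal Python sketch, with hypothetical components and weights rather than any prescribed grading scheme:

    # Principle 8 as arithmetic: a weighted average of achievement types.
    # The components and weights below are hypothetical, not a
    # prescribed grading scheme.
    weights = {"quizzes": 0.25, "projects": 0.25, "exams": 0.40, "recitation": 0.10}
    scores = {"quizzes": 88.0, "projects": 92.0, "exams": 81.0, "recitation": 95.0}

    assert abs(sum(weights.values()) - 1.0) < 1e-9  # weights must total 100%

    final_grade = sum(weights[c] * scores[c] for c in weights)
    print(f"Weighted final grade: {final_grade:.1f}")  # 86.9
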
42
Q
PURPOSES OF GRADING (9)
A
  1. Administrative Purposes
  2. Promotion and Retention
  3. Placement of students and awards
  4. Program Evaluation and Improvement
  5. Admission and Selection
  6. Discovering Exceptionalities
  7. Diagnosing Exceptionalities
  8. Counseling Purposes
  9. Motivation