ASSESSMENT AND EVALUATION Flashcards

1
Q

defined as a
process of appraising
something or someone.

A

Assessment

2
Q

made to
identify the level of
performance of an
individual.

A

Assessment

3
Q

Assessment:________________ Evaluation:______________

A

Quantitative; Qualitative

4
Q

Making judgment about the
value or worth of objects or
events based on some
measurement to arrive at a
decision.

A

Evaluation

5
Q

Performed to determine the
degree to which goals are
attained.

A

Evaluation

6
Q

a process of collecting, reviewing, and using data for the purpose of improving current performance.

A

Assessment

7
Q

described as an act of passing judgment on the basis of a set of standards

A

Evaluation

8
Q

Assessment by nature is…

A

Diagnostic

9
Q

Evaluation by nature is…

A

Judgemental

10
Q

Provides feedback on performance and areas of improvement

A

Assessment

11
Q

Determines the extent to which objectives are achieved

A

Evaluation

12
Q

Purpose of Assessment is…

A

Formative

13
Q

Purpose of Evaluation is…

A

Summative

14
Q

Orientation of Assessment…

A

Process Oriented

15
Q

Orientation of Evaluation…

A

Product Oriented

16
Q

a chart that spells out the content and level of
knowledge to be tested; it can be general or specific, based on the teacher's
preference.

A

Examination blueprint

17
Q

Factors that determine the number of items in the
examination:

A

a. amount of material taught
b. type of test question used
c. amount of time available for testing

18
Q

learner is compared with a reference
group of learners

A

Relative terms (norm reference)

19
Q

learner is compared to well-defined
performance criteria

A

Absolute terms (criterion reference)

20
Q

CRITERIA FOR SELECTION OF EVALUATIVE
DEVICES/TOOL:

A

Validity
Reliability
Objectivity
Relevance
Practicality

21
Q

the degree of accuracy with which a test measures what it
intends to measure.

A

Validity

22
Q

adequacy with which the test items measure the
content areas.

A

Content validity

23
Q

the extent to which a relationship exists
between the test scores and later success.

A

Predictive validity

24
Q

extent to which a relationship exists between
test scores and an accepted contemporary criterion of performance on
the variable the test is supposed to assess.

A

Concurrent validity

25
Q

the consistency with which the test measures what it
intends to measure.

A

Reliability

26
Q

Dimensions of reliability:

A

a. Examinee/students
b. Examiner/scorer
c. Test content
d. Time
e. Situation

27
Q

a test is administered twice; the correlation between the scores is
an estimate of temporal reliability.

A

Test-retest method

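The test-retest estimate above can be sketched as a short calculation. The scores and function name below are illustrative assumptions, not from the source:

```python
# A minimal sketch of the test-retest method: the same test is given
# twice, and the correlation between the two score sets estimates
# temporal reliability. The data here are hypothetical.
def pearson_r(x, y):
    """Pearson correlation between paired score lists."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sum((a - mx) ** 2 for a in x) ** 0.5
    sy = sum((b - my) ** 2 for b in y) ** 0.5
    return cov / (sx * sy)

first_sitting = [70, 82, 65, 90, 75]
second_sitting = [72, 80, 68, 88, 77]
reliability = pearson_r(first_sitting, second_sitting)
```

A coefficient near 1.0 indicates stable scores across the two administrations.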
28
Q

two alternate or equivalent forms of a test are constructed and
administered at a 1-3 week interval.

A

Alternate or equivalent forms method

29
Q

odd numbered and even numbered items are scored as separate
tests.

A

Split-half method

30
Q

the degree of agreement between the judgments made by independent and competent observers as to whether or not a learner's test performance meets the criteria stated in a learning objective.

A

Objectivity

31
Q

the degree to which the criteria established for selection of questions conform with the aims of the measuring instrument.

A

Relevance

32
Q
  • the convenience in using the test instrument;
  • refers to the development of an evaluative device capable of being administered and scored with reasonable ease within the limits of time and of the resources imposed by circumstances.
A

Practicality

33
Q

Consider practicality as to:

A

a. Test construction
b. Administration
c. Scoring
d. No. of examinees

34
Q

KNOWLEDGE - Formal Evaluation Instruments:

A

Objective Examination
- Selection Type, Supply Type, Interpretative Exercises
Subjective Examination
- Free response, restricted response, project assignment

35
Q

KNOWLEDGE - Non-Formal Evaluation Instruments:

A

Practical Exam
Oral Exam
Observational Reports

36
Q

ATTITUDES - Direct Methods:

A

Questionnaires
Semantic Differential
Attitude Goals
Observation Rating Scale

37
Q

ATTITUDES - Indirect Methods:

A

Test of judgment
Test of memory and perception
Information test

38
Q

contains the stem, the key and a distractor

A

Multiple choice

39
Q

contains premises/hypotheses, and the
responses/alternatives which contain the jokers.

A

Matching type

40
Q

responses which do not match any premise

A

Jokers

41
Q

there are only two possible answers,
that is true or false, correct or incorrect, right or wrong.

A

Alternate response

42
Q

examinee is presented with a direct question
or an incomplete statement

A

Supply Type

43
Q

Examples of Supply Type:

A
  • Fill in the blanks
  • Definition of terms
44
Q

Interpretation of graphs, table, pictures, situational analysis

A

Interpretative Exercises

45
Q

the learners perform the skills to be
evaluated; simulated or actual

A

Practical Exam

46
Q

assigning a learner a task or project
to complete and then evaluating it on the basis of the product of the completed performance.

A

Project assignment

47
Q

usually a six step bipolar adjective
scale indicating direction and intensity

A

Semantic Differential

48
Q

Likert Scale

A

Attitude Goals

49
Q

observation of cognitive and affective behavior at work,
using a scale of observable behavior.

A

Observational Rating Scale

50
Q

Free answer testing

A

Test of judgment

51
Q

based on the assumption that what is perceived and remembered is influenced by one’s attitude.

A

Test of memory and perception

52
Q

based on the assumption that in cases of uncertainty, people tend
to guess in the direction of their attitude.

A

Information test

53
Q

the assignment of numbers to objects or events according to logically accepted rules

A

Measurement

54
Q

Five Basic components of Evaluation:

A

Audience
Purpose
Questions
Scope
Resources

55
Q

is the persons or groups for whom the evaluation is being conducted

A

Audience

56
Q

to decide whether to continue a particular education program or to determine the effectiveness of the teaching process

A

Purpose

57
Q

directly related to the purpose, are specific, and measurable

A

Questions

58
Q

determined in part by the purpose for conducting the evaluation and in part by available resources

A

Scope

59
Q

include time, expertise, personnel, materials, equipment, and facilities

A

Resources

60
Q

Evaluation Models:
to make adjustment in an educational activity, as soon as they are needed, whether those adjustments be in personnel, materials, facilities, learning objectives or even the health professional educator’s attitude.

A

Process (Formative) evaluation

61
Q

Evaluation Models:
to determine whether learners have acquired the knowledge or skills taught during the learning experience

A

Content evaluation

62
Q

Evaluation Models:
to determine the effects or outcomes of teaching efforts; its intent is to summarize what happened as a result of education

A

Outcome (Summative) evaluation

63
Q

Evaluation Models:
to determine the relative effects of education on the institution and the community; the purpose is to obtain information that will help decide whether continuing an educational activity is worth its cost.

A

Impact evaluation

64
Q

Evaluation Models:
designed and conducted to assist an audience in judging and improving the worth of some object or educational program

A

Program evaluation

65
Q

Components of Evaluation Design:

A

Evaluation Structure
Evaluation Method
Evaluation Instruments

66
Q

all evaluations should be systematic and carefully and thoroughly planned before they are conducted

A

Evaluation Structure

67
Q

include those actions that are undertaken to carry out the evaluation according to the design structure; all evaluation methods deal with data and data collection

A

Evaluation Methods

68
Q

prefer existing instruments, because instrument development requires expertise, time, and expenditure of resources, and a new instrument requires rigorous testing for reliability and validity.

A

Evaluation Instruments

69
Q

Three methods to minimize the effects of unexpected events:

A
  1. Conduct a pilot test first
  2. Include extra time
  3. Keep a sense of humor
70
Q

basis for interpreting the results obtained from a test

A

Reference system

71
Q

may measure the acquisition of skills and knowledge from multiple sources such as notes, texts and syllabi.

A

Norm-reference system/relative standard

72
Q

measure performance on specific concepts and are often used in a pre-test/post-test format

A

Criterion reference system/absolute standard

73
Q

refers to the process of adjusting student grades in order to ensure that a test or assignment has the proper distribution throughout the class

A

Grading on a curve (SD)
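A minimal sketch of SD-based curving: each score is placed by its distance from the class mean in standard deviations. The letter cut-offs below are illustrative assumptions, not taken from the source:

```python
from statistics import mean, stdev

def curve_grades(scores):
    """Grade on a curve: convert each score to a z-score (distance from
    the class mean in SD units) and map z to a letter grade.
    The cut-offs are hypothetical, chosen only for illustration."""
    m, sd = mean(scores), stdev(scores)
    def letter(score):
        z = (score - m) / sd
        if z >= 1.5:
            return "A"
        if z >= 0.5:
            return "B"
        if z >= -0.5:
            return "C"
        if z >= -1.5:
            return "D"
        return "F"
    return [letter(s) for s in scores]

grades = curve_grades([55, 65, 70, 75, 85])
```

However the raw scores shift, the grade distribution stays anchored to the class mean and spread.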

74
Q

the scores are added up and divided by the number of scores. The mean is sensitive to extreme scores when population samples are small.

A

combining scores to obtain mean score
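The arithmetic on this card, including the mean's sensitivity to an extreme score in a small sample, can be shown directly (the numbers are illustrative):

```python
def mean_score(scores):
    # Add up the scores and divide by the number of scores.
    return sum(scores) / len(scores)

small_class = [80, 82, 84, 86]
mean_score(small_class)           # 83.0
mean_score(small_class + [20])    # 70.4 -- one extreme score pulls the mean down
```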

75
Q

calculated based on the number of questions correct out of the MPL required, not the total number of questions

A

calculating minimum pass level (MPL)

76
Q

an important tool to increase test effectiveness. It provides statistics on overall performance, test quality, and individual questions. Each item's contribution is analyzed and assessed. To write effective items, it is necessary to examine whether they are measuring the fact, idea, or concept for which they were intended.

A

item analysis
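Two statistics commonly computed in item analysis are a difficulty index (proportion answering correctly) and a discrimination index (upper-group minus lower-group success). The upper/lower grouping and the data below are assumed conventions for illustration; the card itself does not specify a formula:

```python
def item_analysis(responses):
    """responses: one list of 0/1 item scores per examinee.
    Returns (difficulty, discrimination) per item. The upper/lower
    grouping by total score is a common convention assumed here."""
    ranked = sorted(responses, key=sum, reverse=True)
    k = max(1, round(len(ranked) * 0.27))   # ~27% group size, a convention
    upper, lower = ranked[:k], ranked[-k:]
    results = []
    for i in range(len(responses[0])):
        # Difficulty: proportion of all examinees answering item i correctly.
        difficulty = sum(r[i] for r in responses) / len(responses)
        # Discrimination: upper-group minus lower-group success rate.
        discrimination = (sum(r[i] for r in upper) - sum(r[i] for r in lower)) / k
        results.append((difficulty, discrimination))
    return results

answers = [
    [1, 1, 1],   # strongest examinee
    [1, 1, 0],
    [1, 0, 0],
    [0, 0, 0],   # weakest examinee
]
stats = item_analysis(answers)
```

Items with near-zero or negative discrimination are the ones flagged for rewriting.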