Week 9: Multiple-Choice Testing Flashcards

1
Q

pros of MCQ

A

objective and easy to score; no need for inter-rater reliability

can remove easy or odd questions easily

2
Q

cons of MCQ

A
  • doesn’t measure partial knowledge (knowing part of the answer without being confident in it)
3
Q

how can you measure partial knowledge

A

confidence marking - assign a confidence rating to the chosen answer

elimination testing - eliminate all options believed to be incorrect

complete ordering - rank all options from most to least preferred

partial ordering - eliminate the options you are confident are wrong and rank the remaining ones

probability testing - distribute 100 points across the options

4
Q

probability testing

A

final score is the sum of the probabilities assigned to the correct answers

measures partial knowledge with precision

examinees tend to like it

in the standard format you get 0% if you select the wrong answer
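
The scoring rule above can be sketched in a few lines of Python (the function name and data shapes are my own illustration, not from any standard tool):

```python
# Sketch of probability-testing scoring: the examinee distributes
# probabilities (summing to 1, i.e. 100 points) across the options of
# each question; the final score is the sum of the probabilities
# assigned to the correct answers. Names here are illustrative only.

def probability_score(responses, answer_key):
    """responses: one dict per question mapping option -> probability.
    answer_key: the correct option for each question."""
    total = 0.0
    for probs, correct in zip(responses, answer_key):
        assert abs(sum(probs.values()) - 1.0) < 1e-9, "must sum to 1"
        total += probs.get(correct, 0.0)
    return total

# Q1: 70% on a wrong option, 30% on the right one -> still earns 0.3,
# whereas standard scoring would award 0 for that question.
print(probability_score([{"A": 0.7, "B": 0.3}, {"A": 1.0, "B": 0.0}],
                        ["B", "A"]))  # 1.3
```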

5
Q

cons with multiple correct answers

A
  • not much research
  • concern that risk-taking students score differently from risk-averse students, which has nothing to do with actual knowledge
6
Q

major problem in MCQ

A

correct guessing

7
Q

how to counter effect of correct guessing

A

formula scoring: deduct 1/(n - 1) points for each wrong answer, where n = number of options (score = right - wrong/(n - 1))

students can omit answers to avoid the penalty
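
As a minimal sketch (the function name is my own), formula scoring deducts 1/(n - 1) of a point per wrong answer while omissions cost nothing:

```python
# Sketch of formula scoring (correction for guessing): each wrong answer
# is penalized by 1/(n - 1) points, where n is the number of options;
# omitted questions score 0 with no penalty. Illustrative names only.

def formula_score(num_right, num_wrong, n_options):
    return num_right - num_wrong / (n_options - 1)

# A pure guesser on 4-option items expects 1 right and 3 wrong per 4
# guesses, so the expected score is 1 - 3/3 = 0: guessing gains nothing.
print(formula_score(1, 3, 4))   # 0.0
print(formula_score(10, 0, 4))  # 10.0 -- omitting the rest costs nothing
```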

8
Q

problems with formula scoring

A

students don’t like being penalized

test score is influenced by metacognition - e.g. omitting an answer that would have been correct had they attempted it

test score is influenced by risk-taking tendencies

9
Q

McDaniel et al 2007 method

A
  • web-based university course on brain and behavior
  • assigned weekly readings, then practiced with MCQs, short-answer quizzing (fill in the blank), or rereading (control)
  • corrective feedback was provided
  • some facts were not practiced at all
10
Q

McDaniel et al 2007 results

A

short-answer (cued-recall) practice was best

11
Q

negative testing effect

A

choosing the wrong answer during retrieval practice leads you to choose that same wrong answer again on the final test, because it is the most familiar

12
Q

Marsh et al. 2009 method

A
  • participants answered SAT MCQs on biology, chemistry, world history, and US history
  • formula-scoring format, so they could omit answers
  • filler task
  • then answered a short-answer test with 40 earlier and 40 new items
13
Q

Marsh et al. 2009 results

A

negative testing effect was present

high-school juniors (vs. undergrads) suffered more and omitted less, preferring to choose a wrong answer rather than pass

CORRECTIVE FEEDBACK NEEDED!!

14
Q

How do related questions lead to correct choosing

A

competitive lures across related questions lead to elaborative retrieval

deeper retrieval of facts in order to eliminate the least likely options

15
Q

Little et al. 2019 method

A

participants first completed online MCQ practice with general-knowledge questions
- elimination testing was used to encourage processing of all lures
- after a distractor task, they completed a cued-recall test
- previously tested items were repeated
- plus related questions and new questions (the control)

16
Q

whether practice with related items facilitates later test performance depends on what?

A

whether the final test is in cued-recall or MC format

18
Q

what did the Alamri and Higham (in press) study try to find

A

Does false recognition of practice questions also lead to impaired multiple-choice performance in a genuine educational context?

Can the sequencing of related pairs be used to boost student performance?

19
Q

Alamri and Higham (in press) method

A
  • 164 first-year psychology students
  • wrote a 55-item MC test in preparation for the final exam
  • IV = sequencing of related pairs
  • 11 related-separated pairs
  • 11 related back-to-back pairs
  • 11 new items
  • no repeated items

reason for separating some pairs: if put back to back, students would notice the questions aren’t the same

20
Q

what did Alamri and Higham (in press) predict would happen

A
  • lots of false recognition and performance impairment for related-separated pairs
  • very little false recognition for related back-to-back pairs; comparing the questions might boost performance
21
Q

Alamri and Higham (in press) results

A

facilitation was observed even though participants answered all questions in MC format; the boost was bigger in the back-to-back condition

22
Q

compare the negative testing effect (NTE) and the related questions effect (RQE)

A
  • in the NTE, an MC test at time 1 is followed by a cued-recall test at time 2; in the RQE, an MC test at time 1 is followed by another MC test at time 2
  • in the NTE, the question is repeated with the same correct answer; in the RQE, the questions are related but have different correct answers
  • in the NTE, lures selected at T1 are falsely recalled at T2; in the RQE, the feedback from T1 is erroneously selected again at T2
  • the NTE is reduced by feedback, whereas the RQE is caused by the feedback on earlier questions
  • NTE errors come from repeating an earlier error; RQE errors come from responding with earlier negative feedback
23
Q

what did Kelley et al 2019 find

A

PeerWise involves two effective learning techniques:

generation - generating information leads to better memory than just reading it

retrieval practice - testing leads to better long-term retention

24
Q

Kelley et al 2019 method

A
  • 40 students enrolled in a cognitive psychology course
  • generated and answered questions on PeerWise
  • for each of 8 textbook chapters, required to generate at least one question (more were allowed), with an explanation and a reference
  • evenly spaced throughout the semester
  • also required to answer and evaluate other students’ questions

at the end of the course, the researchers examined the questions generated and answered by each student and determined whether any exam questions overlapped with the questions that student had generated or answered

exam performance on these overlapping questions was compared to control exam questions on topics with no overlap with the student’s PeerWise activity

25
Q

Kelley et al 2019 finding

A
  • exam questions overlapping with generated questions scored about 90%, compared to about 75% for control questions
  • shows the benefit of retrieval practice
26
Q

The final score with probability testing for MC exams is:

A

The sum of the probabilities assigned to the correct answers

27
Q

On an MC test in probability testing format, John assigns 70% to a wrong answer but 30% to the right answer. This shows how:

A

Probability testing measures partial knowledge

28
Q

Answering a difficult MC question can facilitate performance on related cued-recall questions answered later on. This demonstrates:

A

Participants were retrieving information about the lures

29
Q

Compared to related MC questions that are separated by several intervening items on a test, presenting related MC questions back-to-back in an educational context has been shown to:

A

Benefit performance on the second version of the questions