Competence I: Ch 3, 4, & 21 Flashcards

1
Q

Research Questions

A

Research questions are key elements in developing a topic area; in essence, they are questions that explore the relations among or between constructs. Ex: Does the client’s level of dysfunction affect the working alliance formed in counseling?

2
Q

Research Hypotheses

A

More specific than a research question in that it states the expected relationship between the constructs. Ex: More-dysfunctional clients will form poorer alliances in counseling than less-dysfunctional clients.

3
Q

Descriptive Questions

A

Ask what some phenomena or events are like. Answered by collecting data through inventories, surveys, or interviews that describe the events.

4
Q

Difference Questions

A

Ask whether there are differences between groups of people, or even within individual participants. A comparison must take place in some form. Ex: between-groups and within-groups designs.

5
Q

Relationship Questions

A

Explore the degree to which two or more constructs are related or vary together. Typically answered with correlations or regression analyses.
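Relationship questions like this are typically answered with a correlation coefficient. As a minimal sketch, a Pearson correlation can be computed in plain Python (the scores below are hypothetical, for illustration only):

```python
# Pearson correlation between two sets of scores: hypothetical client
# dysfunction ratings and working-alliance ratings.
def pearson_r(x, y):
    n = len(x)
    mean_x, mean_y = sum(x) / n, sum(y) / n
    cov = sum((a - mean_x) * (b - mean_y) for a, b in zip(x, y))
    var_x = sum((a - mean_x) ** 2 for a in x)
    var_y = sum((b - mean_y) ** 2 for b in y)
    return cov / (var_x * var_y) ** 0.5

dysfunction = [2, 4, 5, 7, 8]  # hypothetical scores on construct A
alliance = [9, 8, 6, 4, 3]     # hypothetical scores on construct B

# An r close to -1 would suggest a strong negative relationship,
# i.e. higher dysfunction going with poorer alliance.
r = pearson_r(dysfunction, alliance)
```

Note that a correlation only describes how the constructs vary together; it does not by itself justify a causal inference.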

6
Q

Function of a testable research question/ hypothesis

A
A testable research question (1) asks a question about (2) the relationship between two or more constructs that can be (3) measured in some way.
    • The question should be worded clearly and unambiguously in question form.
    • The question should inquire into a relationship between two or more constructs, asking whether construct A is related to construct B (this applies mostly to difference or relationship questions; descriptive questions collect or categorize information).
    • The relationship must be both examined and measured.
7
Q

Operational Definition

A

Specifying the activities or operations necessary to measure a construct in a particular experiment. Provides a working definition of a phenomenon.

8
Q

Uniformity Myth

A

The oversimplified assumption that psychotherapeutic treatments are a standard (uniform) set of techniques, applied in a consistent (uniform) manner, by a standard (uniform) therapist, to a homogeneous (uniform) group of clients. This myth has greatly hampered progress in understanding counseling research, so we should instead encourage questions about the best types of treatments for particular types of clients across various settings. It also translates into a research design myth, that one type of research is better than all others, when we should be asking: what is the best research design for this particular problem at this time?

9
Q

MAXMINCON Principle

A

The researcher first tries to MAXimize the variance of the variable(s) pertaining to the research question, second to MINimize the error variance of random variables, and third to CONtrol the variance of extraneous/unwanted variables. Applies most directly to experimental research (between- or within-groups designs) but can be applied to all research designs.

10
Q

Internal Validity

A

The internal validity of a study is the degree of experimental control that allows researchers to make inferences about causal relationships between variables. Studies high in control use random selection of participants, random assignment to treatments, and manipulation of the IV, allowing the researcher to make inferences about causality.

11
Q

External Validity

A

The generalizability of the results to applied (real-life) settings.

12
Q

Experimental Field Studies

A

Moderately high in both external and internal validity (listed as high/high in the table). Characterized by investigations that manipulate IVs and are conducted in real-life settings. Examine causality through random assignment to treatments and control of IVs. Moderately high in external validity; because researchers can never exercise the same control in the field as in the laboratory, internal validity is only moderately high as well.

13
Q

Experimental Laboratory Studies

A

Low in external validity and high in internal validity. Characterized by manipulation of IVs and conducted in laboratory settings. Low in external validity because, instead of using participants sampled directly from a population of interest, the experimenter sets up a situation to resemble a naturally occurring one. High in internal validity because the experimenter randomly assigns participants to treatments and manipulates one or more IVs, allowing inferences about causality.

14
Q

Descriptive Field Studies

A

High external validity, low internal validity. Investigations that do not exercise experimental controls and are conducted in a real-life setting. High in external validity because the sample can be taken directly from a population of interest. Low in internal validity because variables are studied as they occur naturally rather than being manipulated.

15
Q

Descriptive Laboratory Studies

A

Low in both internal and external validity. Investigations that do not exercise experimental controls (randomization or manipulation of the IV) and are conducted in laboratory settings. Low in external validity because the setting can only simulate real life. Low in internal validity because of the lack of experimental control through manipulation of an IV or randomization of participants. These studies involve describing, identifying, and categorizing data and obtaining descriptive statistics. Two reasons to conduct such a study: 1. The laboratory setting allows the researcher some control over extraneous variables. 2. It is impossible to study some phenomena in a field (real-life) setting.

16
Q

Experimental Controls

A

Randomization of subjects, or manipulation of the independent variable. Per the MAXMINCON principle: maximize the variance of variables related to the research question, minimize the error variance of random variables, and control the variance of extraneous variables.

17
Q

Existing Body of Knowledge

A

Information that previous research suggests about a topic and the kinds of questions that remain unanswered. New research should add to the existing literature. The design is also important: if a topic has been researched in a laboratory setting, new research in a field setting is helpful.

18
Q

Independent Variable

A

The variable(s) the researcher manipulates.

19
Q

Dependent Variable

A

The variable(s) the researcher measures.

20
Q

Bubble Hypothesis

A

Suggests that doing research is similar to trying to apply a sticker to a car windshield: as one air bubble is eliminated, another pops up. Every research design is flawed. Each has different limitations and strengths, but no single design can entirely eliminate the bubble.

21
Q

Evaluation Program Stakeholders

A

Program managers, funders, and others who have a special interest in the program; NOT the people responsible for carrying out the program. Get their input about the program evaluation, ask for their participation, and incorporate their opinions and suggestions. What questions would the stakeholders like the evaluation to answer?

22
Q

Structured Observations

A

Require the observer to be detached from the activity; typically the observations sample the program’s activities in a structured fashion. Multiple observers may code human behaviors and tally the frequencies of behaviors that occur.
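The coding-and-tallying step can be sketched with a simple counter (the behavior codes below are hypothetical):

```python
# Tally the frequency of coded behaviors recorded by an observer.
from collections import Counter

# Hypothetical behavior codes logged during one observation session.
observer_codes = ["question", "reflection", "question", "silence",
                  "reflection", "question"]

frequencies = Counter(observer_codes)
# frequencies["question"] gives how many times "question" was coded.
```

With multiple observers, comparing each observer's tallies is one simple check on coding consistency.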

23
Q

Evaluating Programs via surveys, focus groups, journals and content testing

A

Evaluation must not only examine how the program was implemented, but also ascertain whether the anticipated effects of the program are evident and whether the program is on track to achieve the stated program goals.

  • Surveys: a low-cost way to collect information from program participants. Can be used to gather pre- and post-test data; quick and easy to administer.
  • Focus groups: interaction of people as they discuss a common experience or viewpoint. Provide self-reports of how the program affected participants; program staff can also share their concerns and impressions.
  • Content testing: course exams evaluate program participants’ knowledge about a topic using a paper-and-pencil activity. The test must be highly correlated with the material presented in the workshop, and individual items must be specific and difficult enough that only workshop attendees can answer them correctly.
24
Q

Calculating Per Unit Cost

A
  1. Determine fixed costs (those over which you have no control).
  2. Calculate per-unit costs: printing costs as cost per page; overhead costs calculated per month; per-person costs should include all expenses associated with individuals, such as benefits, income tax, and social security.
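The steps above can be sketched as a small calculation (all dollar figures and counts are hypothetical examples, not taken from the text):

```python
# Per-unit cost calculation with hypothetical figures.

# 1. Fixed costs: expenses over which you have no control.
monthly_overhead = 1200.00  # e.g. rent and utilities, per month

# 2. Per-unit costs.
printing_total = 450.00     # total printing bill
pages_printed = 9000
cost_per_page = printing_total / pages_printed  # printing as cost per page

# Per-person costs include ALL expenses associated with the individual,
# such as benefits, income tax, and social security.
salary = 4000.00
benefits_and_taxes = 600.00
cost_per_person = salary + benefits_and_taxes

participants = 40
program_cost = monthly_overhead + printing_total + cost_per_person
per_participant_cost = program_cost / participants
```

Dividing total program cost by the number of participants served gives the per-participant cost often reported to stakeholders.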
25
Q

Program evaluator responsibility for data integrity and validity and timely collection of data

A

Steps must be taken to ensure that the data are clean, as error-free as possible, unbiased, and collected on time and within budget.

1. The data collection procedure should undergo pilot testing.
2. Establish a checks-and-balances system: verify that everything is filled out correctly and completed in its entirety.
26
Q

Pilot Testing

A

What all data collection procedures should undergo. Requires thoroughly training any data collectors or site personnel in how the data are to be collected. Face-to-face training with several opportunities for role playing and questions can eliminate confusion.

27
Q

Primary Data Analysis

A

Identical to the analyses presented in the results section of empirical articles. Data from each component of the evaluation are analyzed in isolation from other data. These analyses are also excellent ways to present data to stakeholders in a preliminary fashion, and they prevent data from becoming backlogged.

28
Q

Secondary Data Analysis

A

Ties together the individual primary analyses to describe a component of the program. The secondary analysis moves back and forth from the program’s objectives to the evidence, weaving the individual pieces of data into a holistic picture.

29
Q

Triangulating the data collection process

A

Use multiple methods, multiple data sources, and more than one data collector/observer, with data collected over more than one period of time. This strategy reduces reliance on any particular source.

30
Q

Executive Summary

A

No more than 3 pages. Gives the reader an overview of the goals of the program and indicates the services provided, the outcomes anticipated, and the extent to which these objectives were met. Should stand on its own, containing enough detail that a person can grasp the program, its purpose, and its impact by reading it alone.

31
Q

Accountability & Action Research

A

Action research focuses on generating local rather than generalized knowledge (the latter comes from outcome research). It covers all data collection activities that lead to findings useful for evaluating local programs. This responsibility falls on counseling practitioners. Counselor educators need to address the causes of resistance when training entry-level counselors.

32
Q

Linking Evidence Based Practice with Accountability

A

Evidence-based practice is the product of outcome research findings, while accountability activities employ action research methods. We should view evidence-based practice more broadly: currently it is treated as the product of outcome research, but it should also be considered a product of action research.

33
Q

Outcome Research

A

Evidence is the product of rigorous scientific empirical studies, the domain and responsibility of trained researchers employed in university settings. Counselors are responsible for locating and using evidence-based interventions.

34
Q

Action Research

A

Focuses on generating local knowledge; covers all data collection activities that lead to findings useful for evaluating local programs.

35
Q

Baker, 2012

A

Explains that outcome research produces evidence-based practice and action research produces accountability activities. It is the responsibility of counselor educators both to produce outcome research and to educate counselor practitioners to consume this research and produce their own action research. It is the responsibility of practitioners to consume and use outcome research as well as to produce action research.