Exam 1 Flashcards
What is a claim
An assertion made by an arguer that they want others to accept
What is evidence?
Data or reasons offered in support of a claim; it is connected to the claim by a warrant.
What is a warrant?
The reasoning that connects the evidence to the claim
What is backing?
Additional evidence to support the warrant when a counterargument can be made.
What are the 5 Ways of Knowing?
- Personal Experience
- Intuition
- Authority
- Appeals to tradition, custom, and faith
- Magic, superstition, and mysticism
Personal Experience…
Tends to be seen as the most trustworthy but is biased
Intuition…
Perceptions and hunches, like seeing figures in clouds ⇒ biased
Authority…
Trusting what experts and authority figures say
Appeals to tradition, custom, and faith…
Can lead to stereotypes because "it has always been that way"
Magic, superstition, and mysticism…
Mysteries are used to explain the unexplainable
What are the 6 Characteristics of Research?
- Research is based on curiosity and asking questions
- Research is a systematic process
- Research is potentially replicable
- Research is reflexive and self-critical => knows its limitations
- Research is cumulative and self-correcting => others can add
- Research is cyclical => continuous
What is a systematic process?
Research that follows 5 step-by-step phases
What are the 5 step-by-step phases?
- Conceptualization of what needs to be studied
- Planning and designing
- Methodologies
- Data analysis
- Reconceptualization of what was studied and learned
What is proprietary research?
For a specific audience (e.g., a teacher for her own reflection)
What is scholarly research?
For public access
What are the 3 Academic Cultures of Research?
- Physical science
- Humanities
- Social or human science
What is physical science?
biology, chemistry, physics
humanities?
art, music, literature
social or human science?
human behavior
Communication overlaps with which 3 Academic Cultures of Research?
communication overlaps with all 3
Positivist paradigm vs. naturalistic paradigms
how these two worldviews approach the "ologies" (ontology, epistemology, axiology, methodology)
What is a paradigm
a worldview
positivist paradigm?
emphasizes the word science in social science by trying to use physical-science methods to study human behavior
positivist paradigm - ontological?
singular reality and objective
positivist paradigm - epistemological?
there is an independent relationship between researcher and participants
positivist paradigm - axiological?
the researcher’s values and biases have no effect
positivist paradigm - methodological?
the preferred methods are deduction (from general to specific), cause-and-effect relationships, researcher-controlled settings, and quantitative methods
positivist paradigm - rhetorical assumption?
formal and impersonal
naturalistic paradigm?
emphasizes the word social in social science by trying to develop new methods to capture social behavior
naturalistic paradigm - ontological?
multiple realities
naturalistic paradigm - epistemological?
there is an interdependent relationship between researcher and participants
naturalistic paradigm - axiological
the researcher’s values and biases have an effect
naturalistic paradigm - methodological?
the preferred methods are induction (from specific to general), holistic understanding, natural settings, and qualitative methods
naturalistic paradigm - rhetorical assumption?
informal and personal
Definition of communication
the process by which verbal and nonverbal messages are used to create and share meaning. Making things common ⇒ information exchange perspective
Communication research
focus on messages and message creating behaviors
Definition of technical communication
the process of making technical messages accessible to a lay audience
Basic Research
-Nature of problem
-theory
-commonsense theories
-Goals
-methodology
methodology of basic research
hypothesis testing
Goals of basic research
to increase knowledge of communication phenomena, because theories are ongoing and can always benefit from fine-tuning
Theory
a generalization made to explain why something happens
Nature of problem of basic research
research done to test a theory and make generalizations about communication
Applied Research
-Nature of problem
Focus on a specific event or challenge rather than making broad generalizations
-goals of action research
-social justice communication research
-methodology
Nature of problem for action research
research done to solve a problem
goals of action research
engaged in not only finding a solution but implementing it
social justice communication research
focus on the underrepresented
methodology of applied research
observe and test out solution
Reasons for reviewing previous research
-To get an understanding of what you are studying by learning what others have said before
-To find gaps in the research
-To refine a research question
-To design own study
What are scholarly research articles?
-Primary research reports
-Published in journals that are run by professionals in each discipline
-They have gone through a peer-review process
Primary research reports
the first reporting of a study by the people who conducted the study
How is research presented?
-Reading scholarly journal articles
-They represent the most up-to-date research in the field
-Meant to be read as a report of the findings from the study
Typical Quantitative Scholarly Journal Article
-Title
-Abstract
-Introduction
-Literature review
-Methodology
-Results
-Discussion
-References
Title
present the topic and variables studied
Abstract
summary of the purpose of the study, methods, key findings, and contributions
Introduction
establishes the purpose and significance of the study
Literature review
an establishment of the previous work done by others
Research question/ hypothesis
concludes the literature review
Methodology in an article
an explanation of how the study took place
Participants.
people or texts studied
Procedures
the step-by-step process the study followed
Data treatment
how data was analyzed
Results
summary of what data was collected
Discussion
interpretations of the results, problems and limitations are shared
References
list of sources
SPSS
a software used to help a researcher identify patterns in data
data page
the page where the data is entered
variable page
where the labels for the data are added (e.g., age, major)
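As a rough illustration (plain Python standing in for SPSS, which the course actually uses), the data page corresponds to rows of entered values and the variable page to the labels attached to each column; all names and values below are hypothetical:

```python
# "Variable page": labels for each column of data (hypothetical)
variables = {"age": "numeric", "major": "string"}

# "Data page": one row of entered values per participant (hypothetical)
rows = [
    {"age": 19, "major": "COMM"},
    {"age": 21, "major": "BIO"},
    {"age": 20, "major": "COMM"},
]

# The kind of pattern-finding SPSS helps with: frequencies and averages
majors = {}
for row in rows:
    majors[row["major"]] = majors.get(row["major"], 0) + 1
mean_age = sum(row["age"] for row in rows) / len(rows)
print(majors)    # {'COMM': 2, 'BIO': 1}
print(mean_age)  # 20.0
```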
Conceptual definitions
-Dictionary-like definitions that describe a concept with other terms (e.g., argumentative = inclined to debate)
-abstract
Operational definitions
-meaning is constructed by defining what activities are needed to measure it (e.g., love = doing these nice things)
-concrete
Measurement theory
-Determining how the variables will be observed
-Is the process of determining changes within a variable in terms of size, characteristic, or quantity
Quantitative
use numerical values to determine the amount of something (e.g., 250 pounds)
Qualitative
use symbols to assign meaning (e.g., heavy)
triangulation
studying something in multiple ways within a single study
Methodological - triangulation
using multiple methods to study same phenomena
Data - triangulation
different sources for data collection were used
researcher - triangulation
multiple researchers collected and analyzed the data
Theoretical - triangulation
use multiple perspectives to interpret same data
Levels of Measurement
NOIR
Nominal ⇒ classification
-mutually exclusive (can’t belong to multiple groups), equivalent, exhaustive
-Examples ⇒ yes/no question, select from a checklist, open-ended questions that are then categorized
-Pro ⇒ can lead to important findings
-Con ⇒ can be limiting
Ordinal ⇒ rank order
-(fixed measurements for greater than to less than like sibling ranking)
-Pro ⇒ turn discrete classifications into ordered classifications
-Con ⇒ can't tell a researcher how much of a variable was measured (e.g., how much age difference there is between the siblings)
Interval (types) ⇒ Likert Scale
(ratings by perception, e.g., -1, 0, 1, where 0 is not absence but a point on the scale)
Ratio ⇒ counts
0 means absence; no negative numbers
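The four NOIR levels can be contrasted with small hypothetical datasets, each level supporting one more operation than the last:

```python
from collections import Counter

# Hypothetical data at each NOIR level of measurement
nominal = ["yes", "no", "yes"]  # classification only
ordinal = [1, 2, 3]             # rank order (e.g., sibling ranking)
interval = [-1, 0, 1]           # Likert-style; 0 is a point, not absence
ratio = [0, 2, 5]               # counts; 0 means absence

print(Counter(nominal).most_common(1))  # nominal supports the mode: [('yes', 2)]
print(sorted(ordinal))                  # ordinal adds ordering: [1, 2, 3]
print(sum(interval) / len(interval))    # interval adds meaningful distances: 0.0
print(ratio[2] / ratio[1])              # ratio adds meaningful ratios: 2.5
```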
Unidimensional
indicators that can be added together toward a single, overall score
Multidimensional
concept is made up of independent factors
Measurement Methods
-Self-Reports
-Other’s reports
-Behavioral Acts
Self-Reports
-asking people to report on themselves.
-Pro ⇒ This is a good way to learn people’s beliefs, attitudes, and values.
-Con ⇒ people can provide inaccurate information if they can’t remember or may be biased
Other’s reports
-asking people to observe other people
-Pro ⇒ may remove some biases (e.g., a professor will have more bias about the clarity of their own teaching than a student would)
-Con ⇒ the observer may not have enough knowledge to make an accurate observation; doesn't remove 100% of biases
Behavioral Acts
-the researcher observes a person’s behavior
-Pro ⇒ can reveal if what they say matches what they do
-Con ⇒ can't show how people feel, what they think, or what interests them
Measurement Techniques
-Questionnaires
-Interviews
-Observations
Questionnaires
written questions that yield written responses
Questionnaires: What are they used for?
to measure variables
Interviews
verbal questions that yield verbal responses
Closed questions
provide participants with preselected answers
Open questions
participants use their own words to respond to questions
Directive questionnaires & interviews
predetermined set of questions
Nondirective questionnaires & interviews
respondent’s initial responses determine what they will be asked next
Observations
inspection and interpretation of behavior
Direct observation
researchers watch people engage with communication directly
Indirect observation
researchers observe communication artifacts
What is validity?
The degree to which a researcher is accurately measuring what they claim to be measuring
Internal validity
deals with the accuracy of conclusions made
External validity
deals with the generalizability of the findings from a study
The most valid studies are those that are …
high on both internal and external validity
What is reliability?
The degree to which measurements are consistent and stable
Reliable
A reliability coefficient of .70 or higher (1.0 = perfect)
What is measurement validity?
How well a researcher’s methods measure what they intend to measure
What is measurement reliability?
When what is measured yields consistent, stable results
True score component
the score if everything in the measurement were perfect
Error score component
deviation to take into consideration that people’s behavior changes and fluctuates
Techniques to assess reliability
*Multiple-administration techniques
*Single-administration techniques
Multiple-administration techniques
test-retest method
Test retest method
administers the same procedures to the same people at different times
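Test-retest reliability is commonly expressed as the correlation between the two administrations; a minimal sketch with hypothetical scores (the Pearson formula itself is standard):

```python
# Hypothetical scores from the same 5 people at two different times
time1 = [10, 14, 9, 16, 12]
time2 = [11, 13, 9, 17, 12]

def pearson_r(x, y):
    """Pearson correlation coefficient between two lists of scores."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sum((a - mx) ** 2 for a in x) ** 0.5
    sy = sum((b - my) ** 2 for b in y) ** 0.5
    return cov / (sx * sy)

r = pearson_r(time1, time2)
print(round(r, 2))  # ~0.96: scores are stable across administrations
```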
Single-administration techniques
-split half reliability
-Cronbach’s alpha
-intercoder reliability
split half reliability
dividing a scale's items into two halves and checking that responses to the 1st half are consistent with the 2nd half
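Operationally, split-half reliability correlates respondents' scores on the two halves of a scale; a sketch with a hypothetical 4-item scale:

```python
# Hypothetical responses: each row = one person's answers to 4 items
scores = [
    [4, 5, 4, 5],
    [2, 3, 2, 2],
    [5, 4, 5, 5],
    [3, 3, 2, 3],
]

# Sum each person's 1st-half and 2nd-half items
first = [row[0] + row[1] for row in scores]
second = [row[2] + row[3] for row in scores]

def pearson_r(x, y):
    """Pearson correlation between two lists of half-scores."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sum((a - mx) ** 2 for a in x) ** 0.5
    sy = sum((b - my) ** 2 for b in y) ** 0.5
    return cov / (sx * sy)

# A high correlation means the two halves behave alike (a reliable scale)
print(round(pearson_r(first, second), 2))  # ~0.99
```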
Cronbach’s alpha
every item on the scale is compared with every other item
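Cronbach's alpha operationalizes that all-items comparison through item variances versus total-score variance; a sketch with a hypothetical 3-item scale:

```python
# Hypothetical responses: each row = one person's answers to 3 items
scores = [
    [4, 5, 4],
    [2, 3, 2],
    [5, 5, 4],
    [3, 3, 3],
]

def variance(xs):
    """Sample variance of a list of numbers."""
    m = sum(xs) / len(xs)
    return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)

k = len(scores[0])                     # number of items
items = list(zip(*scores))             # scores grouped by item
totals = [sum(row) for row in scores]  # each person's total score

# alpha = k/(k-1) * (1 - sum of item variances / variance of totals)
alpha = (k / (k - 1)) * (1 - sum(variance(i) for i in items) / variance(totals))
print(round(alpha, 2))  # ~0.96, above the common .70 benchmark
```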
intercoder reliability
coding is stable when the same data, coded by multiple people, yields the same codes
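The simplest intercoder statistic is percent agreement (stricter, chance-corrected indices such as Cohen's kappa also exist); a sketch with hypothetical codes:

```python
# Two coders' hypothetical codes for the same 10 messages
coder_a = ["pos", "neg", "pos", "pos", "neu", "neg", "pos", "neu", "pos", "neg"]
coder_b = ["pos", "neg", "pos", "neu", "neu", "neg", "pos", "neu", "pos", "pos"]

# Percent agreement: share of items both coders coded identically
matches = sum(a == b for a, b in zip(coder_a, coder_b))
agreement = matches / len(coder_a)
print(agreement)  # 0.8 -> the coders agreed on 8 of 10 messages
```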
Validity threats due to
-How the research is conducted
-the research participants
-The researcher effects
History
external factors that influence people’s behavior in the study
Sleeper effect
effects that take time to manifest
Sensitization
initial measurements influence later ones
How does Data analysis impact validity?
improper statistical procedures can lead to inaccurate conclusions
How the research is conducted
-History
-Sleeper effect
-Sensitization
-Data analysis
the research participants
-The Hawthorne effect
-Selection
-Mortality
-Maturation
-Interparticipant bias
The Hawthorne effect
if people are aware of a researcher’s intent, it will influence their behavior
Selection
the people selected should belong to the group for which they were selected (e.g., a 15-year-old should actually be 15)
Mortality
the loss of participants during the study
Maturation
changes within a participant that affect their behavior
Interparticipant bias
participants influence others in a study
Researcher personal attribute effect
when the researcher influences people’s behavior
The researcher effects
-Researcher personal attribute effect
-Researcher unintentional expectancy effect
-Researcher observational biases
Researcher unintentional expectancy effect
influence through indirectly informing people of the desired behavior
Researcher observational biases
when a researcher's knowledge influences their observations by focusing on the desired outcome