Post-Midterm Flashcards

1
Q

what is the comparative method

A

a method of comparing cases in order to identify their similarities and differences

2
Q

types of comparative method

A

single case study- using one case to build or test a theory

small-n case study- comparing a small number of cases that fit the phenomenon

large-n case study- the unit of analysis allows for a large sample, so many variables can be evaluated at the same time.

3
Q

How to use the comparative method: small n case studies (things to watch out for)

A

Use it when there is a specific case or phenomenon that you want to investigate.

  • things to watch out for:
  • number of variables (you don't want too few cases and too many variables)
  • selection bias: choosing only cases that prove your theory
  • omitted variables: an unaccounted-for variable can lead to a spurious relationship, where the apparent relationship between x and y is actually caused by z
4
Q

How to use comparative method: small n case studies (strategies)

A

Most similar systems design

most different systems design

5
Q

how to use comparative method: small n case studies (best used for..)

A
  • theory development/building and deep investigation
6
Q

Politics of Third Wave Feminism by Evans

A

a small-n case study examining how federal vs. unitary systems affect the participation of women in government

7
Q

How to use comparative method: Large N studies

A
  • quantitative analysis: data analysis, case selection, data collection
  • variables are most important
8
Q

How to use comparative method: large N studies (things to look out for)

A
  • be careful of equivalence of meaning and conceptual stretching: does my definition of the variables mean the same thing across borders?
9
Q

how to use comparative method: (time wise)

A

cross-sectional vs. longitudinal.

Historical events research- cross-sectional with a single case

historical process research- longitudinal with a single case

cross-sectional comparative research- cross-sectional with many cases

comparative historical research- longitudinal with many cases

10
Q

Why use the comparative method

A

Use it to test theory:
- test theory on new cases
- develop new theories and cases
- it helps guard against false uniqueness ("too narrow an explanation for a large phenomenon") and false universalism ("if it happens in one place, it happens everywhere")

11
Q

What is the Ethnographic approach

A

an approach to studying people and collecting data about them; more broadly, it includes a range of data collection and data analysis methods.

12
Q

participant observation

A

field research; a method of data collection most common in ethnography

13
Q

why do ethnography

A
  • studying people in their natural habitat is important because there is a difference between what people say and what people do.
  • good for exploratory research
14
Q

informants, field

A

informants- people being studied

field- the research setting

15
Q

what are the key concerns for ethnographic research

A
  • case selection (generalizability is not a priority in this case)
  • access to research information
  • trust, rapport and objectivity
  • replicability
16
Q

Richard Fenno, Home Style- observing members of the US congress in their home districts

A

example of participant observation: Fenno took mental notes, asked a lot of questions, and did a great deal of participant observation

17
Q

The Politics of Third Wave Feminism

A

example of ethnographic research: she used semi-structured interviews and attendance at feminist gatherings

18
Q

Focus group

A

observation of the group dynamic, seeing how people interact with each other

19
Q

types of questions during an interview

A

closed vs. open questions

20
Q

types of interviews

A

structured- closed questions, same order, survey style (many people)

semi-structured- mix of short and long questions, allows follow-ups/modifications

unstructured- long, complex, no set questions/general topics

21
Q

When to use the interview method

A
  • is the info you need only available through talking to people?
  • when doing qualitative analysis
22
Q

things to consider when choosing the type of interview

A
  • exploratory (qualitative) vs. explanatory (quantitative)
  • is the topic straightforward or complex?
  • are costs, time, and available facilities an issue?
  • is the reliability or validity of answers threatened?
23
Q

two types of interviews in political science

A

expert interview

elite interview

24
Q

3 steps in data analysis

A

data reduction- reduce it down to common themes

data coding- what are the commonalities and what will you call these commonalities

analysis- what does the data mean, do my conclusions make sense in terms of internal validity

25
quantitative analysis happens in what type of interview
close-ended/structured interview in which there are many responses
26
Dr. Roberta Rice's research project
why are indigenous parties successful in some Latin American countries but not in others? Cases: Bolivia, Ecuador, Peru, Chile
27
Rice's methods
100+ semi-structured interviews, participant observations, organizational documents
28
what is an interview
a directed conversation that elicits the inner views of the respondent.
29
advantages of interviews
- rich data that you can't find in textbooks or online - it's in the respondents' own words - you can learn new things from interviewees and gain understanding - the interview and human interaction are themselves a source: body language, what gets them excited, what makes them uncomfortable
30
Disadvantages of interviews
- interviews may not be truthful (you need to confirm or validate the info respondents give) - time consuming - difficult to analyze and generalize from - interview effects: the Hawthorne effect (being observed changes people's actions) and the Rosenthal effect (others' expectations of the target affect the target's performance) - lack of standardization: your question may not mean the same thing across the board, so sometimes you have to rework your question.
31
7 steps in interview research
1. identify the requirements of the research and choose an interview type that fits those requirements 2. identify participants (secondary research on people) 3. design an interview guide- intro: explain who you are and the purpose of the interview; questions; leave it open for the interviewee to add anything 4. background research on the interviewee so you can ask the right questions 5. conduct the interview (ethics approval must be signed beforehand) 6. analysis of notes afterward.
32
Experimental method
a method that allows you to control variables in order to remove externalities and identify causal variables
33
experimental method- intervention
treatments or settings are manipulated or controlled by a researcher i.e lab setting
34
Experimental design
pre-test- measure the outcome of the control and experimental groups before the intervention; intervention- no treatment for the control group, treatment for the experimental group; post-test- measure the outcome of both groups, then compare and analyze
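
A minimal sketch of this pre-test / intervention / post-test logic in Python (the outcome scores are made up for illustration, not from any study mentioned here):

```python
from statistics import mean

# Hypothetical pre-test and post-test outcome scores for each group
control_pre = [5.1, 4.8, 5.0, 5.2]
control_post = [5.0, 4.9, 5.1, 5.2]       # control group: no treatment
treatment_pre = [5.0, 4.9, 5.1, 5.0]
treatment_post = [6.2, 6.0, 6.3, 6.1]     # experimental group: treatment applied between tests

# A larger pre/post change in the treatment group than in the control
# group is evidence that the intervention had an effect
control_change = mean(control_post) - mean(control_pre)
treatment_change = mean(treatment_post) - mean(treatment_pre)
print(f"control change:   {control_change:+.2f}")
print(f"treatment change: {treatment_change:+.2f}")
```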
35
when do we use the experimental method
when you have control over the environment and want to identify causal variables
36
relationship between control and validity in experimental method
the more control over the situation, the less external validity, and vice versa
37
Dr. Tuxhorn's study
explaining the support for a Canada-China trade agreement
38
Dr. Tuxhorn's research question and research design
Research question: is Canadian support for trade with China driven by fears of the US? Research design: randomized sample split into control and experimental groups - control group left alone - group 1: given info about US protectionism - group 2: given info about US gains relative to Canada - looked at which group would be more supportive of trade with China - the group told about US gains showed more support for trade with China
39
why one can't assume correlation = causation
there are many factors that come into play and could be causing the phenomenon
40
relative gains
how information about relative gains (one actor's gains compared to another's) changes the way people act/react
41
relationship between random sampling and external validity
a positive relationship: the more random the sample, the greater the external validity
42
externalities
variables that one cannot control
43
Ethical problems that come with talking to people about political issues
talking to people about how they should vote affects voter turnout and vote choice, so it impacts actual politics and may favour a particular party
44
ideal research design
comparing fictional or what-if scenarios and their effects | e.g., what if NAFTA negotiations had never happened vs. if they did?
45
what is the survey method
standard questionnaire, usually involves many participants.
46
why use the survey method
- it is used for describing and explaining attitudinal and behavioural phenomena; it is a common form of data collection
47
validity, reliability measurement error of measures in survey method
surveys are good at showing what a section of the population generally thinks, but can at times misrepresent them. measurement error- questions attempt to measure political attitudes and behaviours, but because these are abstract they are hard to measure (e.g., respondents don't remember, lie, or are unwilling to reveal personal views). validity- the ability of the questions to fully capture the concept they are trying to measure. reliability- is the question answered in the same way across the board.
48
two types of survey methods
self-administered vs. supervised
49
pros and cons of supervised surveys
con: interviewer effect; pro: increased comprehension of the questions
50
when to use the survey method
- depends on research question: is the research question sensitive to the interviewer effect, the social desirability effect, recall problems (would it be easier to get accurate information with a survey rather than face to face?)
51
errors in representation of sample
A correct sample allows for an accurate description of the data. coverage error- not everyone has an equal chance of being selected; sampling error- the degree to which the sample differs from the actual population; non-response error- the sample is not reflective because some participants refused to participate
52
response rate
(number of completed interviews) / (number of attempted interviews) - low response rates are common with cellphones and robocalling
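
A worked example of the formula, with made-up counts:

```python
completed = 320      # completed interviews (hypothetical)
attempted = 1600     # attempted interviews (hypothetical)
response_rate = completed / attempted
print(f"response rate: {response_rate:.0%}")   # -> response rate: 20%
```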
53
What helps increase response rate
- incentives | - an actual person calling instead of a robocall
54
Importance of random sampling
- avoids coverage error (everyone has an equal chance of being selected) - more representative of the population, which increases validity - avoids selection bias
55
types of random sampling
simple random sampling, systematic sampling, stratified sampling, cluster sampling
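
A minimal sketch of simple random sampling with Python's standard library (the population is just a hypothetical frame of ID numbers):

```python
import random

population = list(range(1, 1001))        # hypothetical sampling frame of 1,000 IDs
sample = random.sample(population, 50)   # each member has an equal chance of selection
print(sorted(sample))
```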
56
non-random sampling
when you intentionally pick your sample
57
why use non-random sampling
there is no sampling frame (no natural pool to pick from, e.g., terrorists), limited time and resources, a case study, or a small-sample study
58
Examples of non-random sampling and definitions
quota- taking people in as they come (e.g., the first thirty people who walk into the store); purposive- picking a specific sample because of the way it relates to your case study or answers your research question; snowball- asking interviewees if there is anyone else you can talk to
59
systematic sampling
randomly selecting where you start and then picking at a fixed interval. Example: skip interval- divide the population size by the sample size (say the result is 4), then pick every fourth person from the pool
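
A sketch of the skip-interval procedure, assuming a hypothetical frame of 1,000 people and a target sample of 250 (so the interval is 4):

```python
import random

population = list(range(1, 1001))          # hypothetical sampling frame
sample_size = 250
interval = len(population) // sample_size  # skip interval: 1000 / 250 = 4
start = random.randrange(interval)         # random starting point within the first interval
sample = population[start::interval]       # every 4th person from the starting point
print(len(sample), sample[:5])
```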
60
stratified sampling
randomizing at different levels (e.g., dividing the population into strata and sampling randomly within each stratum)
61
cluster sampling
the population is divided into groups (clusters) and from those groups you pick individual members as your sample. Example: church attendance- look at groups of different churches and pick attendees from those churches to sample
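
A sketch of the church-attendance example: randomly pick a few clusters, then sample attendees within each chosen cluster (all names and sizes are hypothetical):

```python
import random

# Hypothetical clusters: church -> list of attendees
churches = {
    "St. Mary's": [f"SM-{i}" for i in range(200)],
    "First Baptist": [f"FB-{i}" for i in range(150)],
    "Grace United": [f"GU-{i}" for i in range(120)],
    "Holy Trinity": [f"HT-{i}" for i in range(90)],
}

chosen = random.sample(list(churches), 2)          # stage 1: randomly pick clusters
sample = []
for church in chosen:
    sample += random.sample(churches[church], 10)  # stage 2: sample within each cluster
print(chosen, len(sample))
```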
62
Rules of a good questionnaire
1. convinces people to participate 2. includes all the info you need to collect (valid measures of the factors of interest) 3. elicits acceptably accurate information
63
parts of a questionnaire (intro)
intro- states the purpose (this has to be vague so as not to skew answers), expresses gratitude (builds rapport), states ethics (anonymous and voluntary)
64
order of questions in a questionnaire
questions should be: - easy to answer, interesting, close-ended - general questions should come before specific ones (general perceptions of crime before personal experience, to avoid skewed responses) - questions directly related to the survey question should be at the beginning of the questionnaire - group common questions together - include instructions to avoid measurement error (e.g., 0 means not at all)
65
what to consider when writing questions
- the goal: using the questions to create valid and reliable measurements of the key concepts - the 4 types of response processes - is the question simple, clear, and easy to understand? - do respondents need certain info to answer the question, and do they have it? - can my question be interpreted the same way across the board (across cultures/languages)?
66
Rules for close-ended questions
- there should be only one category for every possible response - avoid a "don't know" response in order to ensure a response for each question (an opinion isn't simply there or not; it is created as you ask the questions) - avoid leading questions
67
response set
similarity of responses across a range of questions, e.g., agreeing or disagreeing with a whole set of questions
68
acquiescence bias
the tendency of respondents to agree with statements presented to them; to avoid this, use a scale from strongly agree to strongly disagree
69
social desirability and how to avoid it
the tendency to answer in whatever way is most socially accepted; this can be mitigated with an introduction to the question that softens the pressure.
70
Franceschet research
Directorio Legislativo- looking at how the gender of politicians affected their political careers and demographics. Process of research: 1. collecting data 2. figuring out research questions that best take into account all the variables at play (e.g., does gender affect the degree level of a politician?) 3. making the data usable- coding 4. consulting with experts to make sure the questions make sense.
71
How is quantitative data used?
to describe and explain data
72
what are the 3 different levels of measurement in quantitative analysis?
nominal- mutually exclusive categories that cannot be numerically ranked; numbers assigned to them have no meaning. The lowest level of measurement. ordinal- the categories have an order and can be ranked, but the numbers assigned have no intrinsic meaning and the distance between categories is not equal. A higher level of measurement. interval- has the same qualities as ordinal and nominal, but the numbers assigned have intrinsic meaning and value (e.g., level of income or number of children in a home). The highest level of measurement.
73
discrete. continuous measures
discrete- there is a clear break between each value (1, 2, 3) | continuous- there are no break points between values; a measure can fall anywhere within a range
74
what is univariate analysis
statistical measures used to summarize the characteristics of a single variable
75
2 common statistics in univariate analysis
proportion- the share of cases relative to the whole sample on a 0-to-1 scale (50 women in a sample of 125 = a proportion of 0.4); percentage- the proportion times 100 (40% women)
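
The card's own numbers worked through in code:

```python
women = 50
sample_size = 125
proportion = women / sample_size          # share on a 0-to-1 scale
percentage = proportion * 100             # proportion times 100
print(proportion, f"{percentage:.0f}%")   # -> 0.4 40%
```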
76
what are measures of central tendency
a way of measuring the central value in a frequency distribution using mean (gives the most information), median, mode (gives the least information)
77
frequency distribution
describes the entire distribution of responses and summarizes the number of cases for each different response
78
comparison of utility: mean, medium, mode and what level of measurement does each use
mode- least utility; tells the most common response and not much else, and can change dramatically with the addition of a few new cases, so it is not very stable (nominal). median- uses ordinal data, because to determine the median you have to order the values from lowest to highest; looks at the middle value of the distribution. mean- most useful because it says where a case sits relative to others (above or below average), but the median says more about the typical case if there are outliers (interval).
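
A small illustration with made-up data containing one outlier, which is exactly the situation where the median is more informative than the mean:

```python
from statistics import mean, median, mode

incomes = [30, 32, 35, 35, 38, 40, 250]    # hypothetical interval data with one outlier
print("mode:  ", mode(incomes))            # 35 - most common value
print("median:", median(incomes))          # 35 - middle value, unaffected by the outlier
print("mean:  ", round(mean(incomes), 2))  # 65.71 - pulled up by the 250
```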
79
what is measure of dispersion and how do you do it
definition: measuring how typical a particular case is relative to the rest of the distribution. How we do it depends on the level of measurement: nominal- there is no real measure of dispersion; it says nothing about one case relative to another, only how many cases there are. ordinal- tells us about the spread of the data by giving us the range (the highest value minus the lowest value). interval- the standard deviation: calculating how far a case deviates from the mean and in what direction.
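
A sketch of the two dispersion measures named above (range and standard deviation), using hypothetical interval-level scores:

```python
from statistics import mean, stdev

scores = [4, 7, 6, 5, 9, 3, 8]             # hypothetical interval-level data
value_range = max(scores) - min(scores)    # range: highest minus lowest value
sd = stdev(scores)                         # sample standard deviation around the mean
print("range:", value_range)
print(f"mean: {mean(scores):.2f}, standard deviation: {sd:.2f}")
```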
80
rules for tables and charts
1. straightforward, clear, informative 2. description in the text below 3. clear title, clear labels 4. note the source of the data
81
Frequency tables
a way of representing the distribution of a single variable; the table provides information on the distribution of responses across all categories
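
A minimal frequency table built from made-up survey responses with the standard library:

```python
from collections import Counter

responses = ["agree", "disagree", "agree", "neutral", "agree", "disagree"]  # hypothetical data
freq = Counter(responses)
total = len(responses)
for category, count in freq.most_common():
    print(f"{category:<10} {count:>3}  {count / total:>6.1%}")
```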
82
utility of bar chart, pie chart, and line graph
bar chart- good for showing differences across categories/variation in percentages in the data; pie chart- good for showing each category's share of the whole; line graph- good for showing a trend over time (continuous, longitudinal data)
83
two types of textual analysis
content and discourse analysis
84
content analysis
1. analysis of the content of a text to find the meaning behind it 2. capturing and coding the characteristics of the text 3. the act of studying has no effect on those being studied
85
manifest vs. latent content analysis
manifest- coding the most easily identifiable, surface characteristics of a text; latent- interpreting the meanings, motives, and purposes in a text
86
steps of content analysis
1. select the text to be analyzed 2. determine the categories or topics of interest you are looking for in the text 3. determine the unit of analysis: every sentence, phrase, paragraph, etc. 4. assign a different code to each of the categories you determined in step 2 5. code 6. arrange the codes to provide a description of the variables or to show a relationship between variables 7. draw conclusions from these relationships that link back to, and help answer, the research question
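
A toy sketch of steps 3-6, taking the sentence as the unit of analysis and using hypothetical keyword lists as the coding scheme (a real scheme would be richer and hand-validated):

```python
# Hypothetical coding scheme: category code -> keywords that signal it
codes = {
    "ECON": ["economy", "jobs", "trade"],
    "SECURITY": ["defence", "terrorism", "border"],
}

text = ("The economy is growing and trade is up. "
        "Border security remains a concern. "
        "New jobs were announced yesterday.")

counts = {code: 0 for code in codes}
for sentence in text.split(". "):                        # step 3: unit of analysis = sentence
    for code, keywords in codes.items():                 # steps 4-5: assign codes to each unit
        if any(word in sentence.lower() for word in keywords):
            counts[code] += 1
print(counts)                                            # step 6: distribution of categories
```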
87
what is analysis
process of attaching meaning to data and connecting this meaning back to the research question
88
Discourse analysis
how discourse plays a role in creating, reproducing, and resisting social power abuse, dominance, and inequality. The analysis consists of seeing how the discourse establishes a particular context within the text and identifying the impact of that discourse on power relations
89
advice on writing
punctuation: - don't use a comma to connect two independent clauses - parenthetic expressions are enclosed with commas - a semicolon should not be used to join incomplete sentences
90
parts of a paper
Introduction: presents the topic, the research question, how you will argue it, and how the paper will proceed. Body: develops your arguments, addresses counterarguments, and discusses findings and data. Conclusion: recaps the arguments and connects them back to the thesis; addresses the limitations of the research and what needs to be done moving forward.
91
what is critical thinking
the systematic evaluation of beliefs by rational standards
92
arguments that work
1. argument supported by evidence 2. argument that builds on existing theory 3. clear statement of claim being made 4. argument is logical
93
argument that doesn't work
1. argument with an irrelevant premise (based on emotion) 2. argument with a false/unacceptable premise 3.
94
strategies for successful writing
mind map to brainstorm ideas, create an outline prior to writing
95
bivariate vs. multivariate analysis
bivariate- analysis of the relationship between two variables; multivariate- analysis of the relationships among three or more variables.
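
A minimal bivariate example: Pearson's correlation between two hypothetical variables using the standard library (statistics.correlation requires Python 3.10+):

```python
from statistics import correlation

# Hypothetical paired observations: years of education and a political-interest score
education = [10, 12, 12, 14, 16, 16, 18, 20]
interest = [3, 4, 5, 5, 6, 7, 7, 9]

r = correlation(education, interest)   # Pearson's r for the two variables
print(f"Pearson's r = {r:.2f}")
```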