Research Methods in Political Science - Coggle Diagram
Research Methods in Political Science
Philosophy of Social Science
interpretivism
social and natural world are fundamentally different
understanding human behaviour by interpretation of social behaviour
some similarities with positivism
collecting evidence
establishing causal relations
social world is subjectively created
examples
Hermeneutics
Critical Theory
Constructivism
Post-colonialism
Feminism
objectivity & values
researchers have values -> source of bias?
critical theory: can't be separated
positivism: let's distinguish
normative theory (what ought to be)
empirical theory (what is)
Max Weber: distinguish, yes, but values cannot be ignored
Robert Cox: all theory is normative, interpretations always reflect values
scientific realism
key differences
reality can consist of unobservable elements as well
assessment by observable consequences
causal mechanisms instead of law-like generalizations
'best' theory is the one that explains phenomena the 'best'
examples
mechanisms with effect (Charles Tilly)
environmental: external influences on conditions affecting social life
cognitive: operate through alterations of individual and collective perceptions
relational: alter connections among people, groups, and interpersonal networks
individualism vs. holism
Coleman's Bathtub
similarities positivism
social and natural worlds are similar
realism: 'objective' reality exists
"normal science" and "scientific revolutions"
the structure of scientific revolutions (1962)
science is a social institution
scientific community subscribes to a common view, paradigm, or conceptual scheme (= 'normal science')
defines objects, norms, and methods of investigation
'Truth' is based on consensus in scientific community
Thomas Kuhn (1922-1996)
paradigm shifts: revolutionary change of paradigms
positivism
Auguste Comte (1798-1857)
search for the truth through systematic collection of observable facts
scientific study of the social world
different positions
classical positivism
empiricism: knowledge is limited to what can be observed/measured
laws (are explanatory and predictive): social world is subject to regular and systematic processes
induction (observation -> theory)
cause-and-effect relationships
naturalism: social sciences = natural sciences
science is value-free
logical positivism
empiricism + logical reasoning
retroduction (observation <-> theory)
verification (establishing truth claims)
deduction (theory -> observation)
falsification
rejection of induction
a single counter-observation falsifies a law
particular experience -X-> general knowledge
rejection of verifiability
verification of theory is pointless
goal must be falsification (and replacement)
Karl Popper (1902-1994)
scientific research process
Imre Lakatos (1922-1974)
falsification and the methodology of scientific research programs
hard core with a protective belt of auxiliary hypotheses
novel facts: progressive (problem shifts) or degenerating research programs
scientific research programs = incremental, cumulative, progressive articulation of programs lead to the growth of scientific knowledge
key terms
epistemology
what can we know about social phenomena?
methodology
how do we gain/obtain knowledge?
ontology
what is the nature of the social world? is there an objective or subjective reality?
Research Design
causality & overview designs
research ethics & threats to validity
research questions & theories
literature search vs. review
search
sources
databases
reviews/state of the art articles
core books in libraries
create annotated bibliography of relevant sources
tools
reference management software
review: not an annotated bibliography
evaluate: Identify the contributions (strengths) and limitations (weaknesses) of existing research
conceptualize: Use it to define key concepts
summarize: outline the relevant existing research / knowledge / theories / methods / evidence (topical / thematic review)
types of theories
theory = simplified model of reality
"a set of interrelated constructs (concepts), definitions, and propositions that present a systematic view of phenomena by specifying relations among variables, with the purpose of explaining and predicting the phenomena"
identifies key concepts/factors and their relationships
scope/level
grand theory
middle-range theory
process
inductive
deductive
nature of question
empirical
normative
types of research questions
predictive
prescriptive
explanatory
normative
descriptive
target audiences
academic: development and testing
explanatory
normative
descriptive
applied research & consulting: practical outcomes
predictive
prescriptive
hypotheses = proposed explanation for phenomenon, usually by stating some kind of testable cause-and-effect relationship
finding good research questions
should consider...
puzzle: unexpected contradictions
replication (not fully accepted)
existing literature
gaps and controversies
caution: blind acceptance of normal science
real world events and problems
historic/recent events are good choices
ongoing events are risky
typical steps
literature review
What do we already know?
What do we not know?
theory / theoretical framework
relevant concepts and factors
expectations and hypotheses
general research question / working hypothesis
research design (data and sources)
relationships
relationship = specify relationship between factors
cause: explanatory factor / IV
outcome: dependent variable / DV
null relationship
covariance relationship
causal relationship
reciprocal causation
unidirectional causation
research questions: relevant & useful?
problem/topic/puzzle: event or policy as starting point
specific
research question, by reviewing existing literature
scientific enquiry
general
research question, starting point
relevance?
scientific relevance/importance
social relevance/importance
usefulness?
should be researchable; can be obstructed by phrasing and data scarcity
research question should be new
research question should guide and structure the whole process
relevance and usefulness might work against each other/require compromise
data & measurement
Data Collection
interviews & focus groups
experiments
surveys & sampling
ethnography & participant observation
comparative & historical research & case selection
challenges of case selection
two approaches to small-n case selection
contrast of case selection and case sampling
two crucial steps / elements
defining the full set of data units: universe of cases / population
selecting a subset / sample of data from that universe or population
key difference on second step / element
case selection: deliberate / strategic / purposive
sampling: probabilistic / non-probabilistic
common purpose / goal
selection of a subset of cases / sampling from a population
useful variation on the relevant dimensions / factors
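The contrast between the two second steps can be sketched in a toy example (the country labels, regime-type coding, and seed are all invented for illustration; they come from no real study):

```python
import random

# Hypothetical universe of cases (invented labels for illustration).
population = [f"Country{i}" for i in range(1, 21)]

# Probabilistic sampling: every case has a known, equal chance of selection.
random.seed(42)  # fixed seed so the draw is reproducible
probability_sample = random.sample(population, k=5)

# Purposive case selection: cases chosen deliberately for useful variation
# on a relevant dimension (here, a made-up regime-type coding).
regime_type = {c: ("democracy" if i % 2 == 0 else "autocracy")
               for i, c in enumerate(population)}
purposive_selection = ["Country1", "Country2"]  # one case per regime type

print(probability_sample)
print([(c, regime_type[c]) for c in purposive_selection])
```

The deliberate pick guarantees variation on the chosen dimension; the random draw instead makes the sample statistically representative of the population.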
single and small-n case selection techniques
historical research
historical methods
timing: historical institutionalism
critical junctures
positive feedback & path dependence
typical methods
narrative case studies
event structure analysis (ESA)
analytic procedure to ‘unpack’ an event into intermediary causal steps or constituent parts
step 1: construct narrative of what happened
step 2: break narrative into a series of short statements
step 3: order statements into a diagram that reflects causal sequence or relations
process tracing
"domino theory" (George & Bennett)
theory must predict all intervening steps
identification of causal chain (events & mechanisms)
theory-generating and/or theory testing
historical data
secondary sources
interpretation, commentary, and analysis by observers/social scientists
typical academic literature review
key task: establish authenticity, reliability & accuracy of information
primary sources
original and historical documents, writings, and artefacts created by political actors/participants
stored in archives and libraries
comparative historical analysis (CHA)
contrast of contexts: set limits to the scope or claims of an overly general theory
parallel demonstration of theory: show applicability across cases
macro-causal analysis: make causal inferences about macro-historical process and structures
comparative research
comparative method = rules, standards, and procedures for identifying and explaining differences and similarities between cases, using concepts that are applicable in more than one case or country
comparison must serve a theoretically justified purpose
typical uses
test theory
develop new theory or hypotheses
apply existing theory to new cases (micro replication, Rokkan 1966)
types by number of cases
small-n case study / comparison = analysis of a limited number of cases
disadvantages
high risk of selection bias (misleading inferences)
causality tends to be deterministic, not probabilistic
crucial first step: case selection
based on similarities and/or differences
knowledge of universe / population of cases important
advantages
detailed analysis of cases still possible
better ability to contextualize
large-n case study / comparison
quantitative analysis (large-n comparison)
case selection, data collection, and (statistical) data analysis
advantages
large number of cases lowers risk of selection bias
can account for/test many explanatory factors simultaneously
qualitative comparative analysis (intermediate n)
formalized systematic comparison (Boolean algebra and fuzzy-set logic)
qualitative comparative analysis (QCA)
analysis: process of paired comparison (all possible combinations of factors/conditions) to generate or test summaries/typologies/theories
truth table: list of cases with relevant conditions & outcomes, coded as...
disadvantages
'thin' concepts and theories (simplistic indicators)
equivalence of meaning across cases (danger of concept-stretching)
limited ability to capture causal processes
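A crisp-set truth table of the kind QCA builds can be sketched minimally as follows (the cases, the two conditions A and B, and their 1/0 codings are all invented for illustration, not drawn from any real QCA study):

```python
from itertools import product

# Hypothetical cases, each coded 1/0 on two conditions (A, B) and an outcome Y.
cases = {
    "Case1": {"A": 1, "B": 1, "Y": 1},
    "Case2": {"A": 1, "B": 0, "Y": 1},
    "Case3": {"A": 0, "B": 1, "Y": 0},
    "Case4": {"A": 0, "B": 0, "Y": 0},
}

# Build the truth table: group cases by their configuration of conditions.
truth_table = {}
for name, coding in cases.items():
    config = (coding["A"], coding["B"])
    truth_table.setdefault(config, []).append((name, coding["Y"]))

# List every logically possible configuration with its observed outcomes,
# flagging rows whose cases disagree on the outcome as contradictory.
for config in product([1, 0], repeat=2):
    rows = truth_table.get(config, [])
    outcomes = {y for _, y in rows}
    print(config, rows, "contradictory" if len(outcomes) > 1 else "consistent")
```

In this toy table the two positive rows share A=1 while differing on B, so Boolean minimization would reduce the explanation of Y to the single condition A; real QCA software additionally handles fuzzy-set membership and unobserved configurations.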
case selection = sampling
representative sample of population
(single) case study = focus on single case, but theories and arguments (can and should) have wider implications
(single) case selection
revelatory: reveal relationships which cannot be studied by other means
unusual: throw light on deviant or extreme cases / outliers
critical: a case critical for testing a theory
crucial: confirm or disconfirm a theory
advantages
rich / thick description
good match of theory and evidence
high internal validity
purpose
apply existing theory to new contexts
examine exceptions to the rule (outliers / deviant cases)
provide descriptive contextualization
generate new theory
disadvantages
low external validity
lack of comparative context: uncertainty about conclusions
data collection
interviews, historical & policy documents, speeches, ethnography, official statistics, surveys
process tracing (historical methods)
two components and their purpose
detailed / thick description -> internal validity
engage wider academic discussion / literature -> external validity
multiple-n vs. single case study (Rose 1991)
helps to avoid false uniqueness
helps to avoid false universalism
textual / content analysis & big data
Data Analysis