Mixed methods to conduct evaluations
Constraints
political
institutional
timing
budget
qualitative methods
participant observation
key informant interview
focus groups
document review (a variety of sources: reports, funding proposals, newsletters, meeting minutes, program logs and marketing materials)
pros
cost-effective
unobtrusive
less time consuming
availability
stability
broad coverage
cons
low retrievability
biased selectivity
insufficient detail
irrelevant or out of date
disorganized
quantitative methods
propensity score matching
should not be dismissed
careful structural modelling
instrumental variables
RCT (randomized control trials)
advocated as the central method
by some development economists such as Banerjee, Duflo and Kremer
rarely possible in developing countries
Economic analysis of survey-based data
not questioned by quantitatively oriented development practitioners
core method
Due to concerns
outputs being measured rather than outcomes
little reliable information presented
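The propensity score matching approach listed above can be sketched in a few steps: estimate each unit's probability of treatment from observed covariates, then compare each treated unit with the control unit whose score is closest. The sketch below is a minimal, self-contained illustration on synthetic data (the confounder `x`, the true effect of 2.0, and the tiny hand-rolled logistic fit are all illustrative assumptions, not part of the source).

```python
import math
import random

random.seed(0)

# --- Synthetic data (illustrative assumption): one confounder x drives
# both treatment take-up and the outcome, with a true effect of 2.0. ---
n = 200
x = [random.gauss(0, 1) for _ in range(n)]
treated = [1 if random.random() < 1 / (1 + math.exp(-xi)) else 0 for xi in x]
y = [3 * xi + 2.0 * t + random.gauss(0, 1) for xi, t in zip(x, treated)]

# --- Step 1: estimate propensity scores with a tiny logistic regression ---
def fit_logistic(xs, ts, lr=0.1, steps=2000):
    a, b = 0.0, 0.0  # intercept, slope
    for _ in range(steps):
        ga = gb = 0.0
        for xi, t in zip(xs, ts):
            p = 1 / (1 + math.exp(-(a + b * xi)))
            ga += p - t          # gradient w.r.t. intercept
            gb += (p - t) * xi   # gradient w.r.t. slope
        a -= lr * ga / len(xs)
        b -= lr * gb / len(xs)
    return a, b

a, b = fit_logistic(x, treated)
scores = [1 / (1 + math.exp(-(a + b * xi))) for xi in x]

# --- Step 2: match each treated unit to the nearest-score control ---
controls = [i for i in range(n) if treated[i] == 0]
pairs = []
for i in range(n):
    if treated[i] == 1:
        j = min(controls, key=lambda c: abs(scores[c] - scores[i]))
        pairs.append((i, j))

# --- Step 3: average treated-minus-matched-control outcome (the ATT) ---
att = sum(y[i] - y[j] for i, j in pairs) / len(pairs)
print(f"estimated ATT: {att:.2f}")
```

The estimate should land near the simulated effect of 2.0 because matching on the score balances the confounder across the comparison groups; a naive difference in means would not, since treated units have systematically higher `x`.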
Issues regarding the use of mixed methods
the application of strong statistical designs in development contexts
the principle that the questions being posed should determine the research methods used
2 counter arguments
the great diversity of projects requires the use of a range of correspondingly different evaluation methodologies
specifically, strong statistical project-level designs are often inappropriate for more complex, multi-component programs
evaluations are conducted to address a wide range of operational and policy questions
different methodologies depending on the specific information needs of particular clients
Techniques
Participatory Rural Appraisal (PRA) (Chambers, 1994)
beneficiary perceptions of causality
Concept Mapping (Kane and Trochim, 2007)
different types of data collection and analysis
Theory of Change (Morra and Rist, 2009)
different types of data collection and analysis