Mary Rider
Week 1 Concept Map
Chapter 1 – Scientific Research and Ethics
What Is Scientific Research?
Systematic process to gain knowledge through observation, logic, and evidence.
Example – A city evaluates a recycling program using data instead of opinions.
Aims to describe, explain, predict, and sometimes control phenomena.
Key Features of Good Research
Objectivity – Conclusions are based on facts rather than personal bias.
Example – Reporting both positive and negative findings in a water-quality study.
Falsifiability – Claims can be tested and proven wrong if the evidence contradicts them.
Reproducibility – Others can repeat the study and get similar results.
Peer Review – Experts evaluate quality and accuracy before publication.
Reasoning Methods
Inductive Reasoning – Builds theory from observation and patterns.
Example – Observing citizen feedback and developing a theory about public trust.
Deductive Reasoning – Tests existing theory through hypotheses and data.
Example – Testing whether transparent budgeting increases trust.
Abductive Reasoning – Infers the most plausible explanation for an observation, often by iterating between induction and deduction.
Example – Revising a theory as new citizen-survey data are collected.
The Scientific Method
Step 1 – Observe and Ask Questions.
Step 2 – Conduct Background Research to frame what is known.
Step 3 – Form a Hypothesis (expected relationship).
Step 4 – Design Study with variables, sampling, and controls.
Step 5 – Collect and Analyze Data.
Step 6 – Draw Conclusions and Share Results.
Example – Publishing findings in a public-administration journal.
Variables and Measurement
Independent Variable – The factor manipulated or expected to cause change.
Dependent Variable – The outcome that changes in response.
Control Variable – Kept constant to isolate true relationships.
Operational Definition – How each variable is measured.
Example – Define “public engagement” as attendance at town meetings.
Reliability – Consistency of measurement across trials.
Validity – Accuracy in measuring what was intended.
Internal Validity – Observed effects are attributable to the variables studied rather than to confounds.
External Validity – Findings generalize to other populations and settings.
Sampling and Power
Sampling Strategy – How participants or data sources are chosen.
Example – Randomly selecting municipal employees for a satisfaction survey.
Power – The probability of detecting a real effect; increases with sample size and effect size.
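The link between power and sample size can be sketched with a small Monte Carlo simulation (standard library only; the effect size, group sizes, and 0.05 significance level are illustrative assumptions, not values from the notes):

```python
import random
import statistics

def estimated_power(effect=0.5, n=30, sims=2000, seed=42):
    """Monte Carlo estimate of power for a two-group comparison.

    Simulates `sims` studies: control ~ N(0, 1), treatment ~ N(effect, 1),
    then counts how often a simple z-test on the difference in means
    rejects the null at the two-sided 0.05 level (critical value 1.96).
    """
    rng = random.Random(seed)
    z_crit = 1.96  # two-sided critical value for alpha = 0.05
    rejections = 0
    for _ in range(sims):
        control = [rng.gauss(0.0, 1.0) for _ in range(n)]
        treated = [rng.gauss(effect, 1.0) for _ in range(n)]
        diff = statistics.mean(treated) - statistics.mean(control)
        se = (1.0 / n + 1.0 / n) ** 0.5  # standard error, known sd = 1
        if abs(diff / se) > z_crit:
            rejections += 1
    return rejections / sims

# For a fixed effect, power rises as the sample grows:
for n in (10, 30, 100):
    print(n, round(estimated_power(n=n), 2))
```

Running this shows why underpowered studies miss real effects: with only 10 participants per group, most simulated studies fail to reject the null even though the effect is genuinely there.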
Data Collection and Analysis
Quantitative – Uses numbers and statistics.
Qualitative – Uses interviews, themes, or observations.
Mixed Methods – Combines both for fuller understanding.
Triangulation – Checking consistency across multiple sources or methods.
Ethics in Research
Informed Consent – Participants understand purpose and risks before agreeing.
Privacy and Confidentiality – Protect personal information.
IRB or Ethics Review – Independent board ensures protection of participants.
Risk–Benefit Balance – Minimize harm and justify any potential risk.
Honest Reporting – Avoid selective or misleading presentation of data.
Example – Disclosing data limitations in a grant report.
Transparency – Make methods and data available for verification.
Integrity – No fabrication, plagiarism, or data manipulation.
Conflict of Interest – Declare funding or affiliations that could bias results.
Replication and Accountability
Replication – Repeating studies to confirm results.
Accountability – Researchers are responsible for accuracy and ethical behavior.
Chapter 2 – Thinking Like a Researcher
The Nature of Scientific Thinking
Critical Thinking – Question assumptions and evaluate evidence logically.
Curiosity – Seek explanations rather than accept appearances.
Skepticism – Require proof before accepting claims.
Open-Mindedness – Willing to revise ideas when evidence changes.
Scientific Attitude vs. Common Sense
Common Sense – Based on personal experience; may be biased.
Scientific Thinking – Structured and evidence-based; aims for generalizable truth.
Example – Testing rather than assuming why a new policy failed.
Components of a Research Design
Research Problem – The issue to be studied.
Research Questions – Specific queries that guide investigation.
Hypotheses – Predicted relationships among variables.
Theoretical Framework – Underlying concepts that explain why relationships exist.
Research Design Choice – Decide whether to explore, describe, or test relationships.
Example – Exploratory study on citizen attitudes toward flood-risk programs.
Constructs and Variables
Construct – Abstract idea that represents a phenomenon.
Example – Trust in government or job satisfaction.
Variable – Measurable form of the construct.
Operationalization – Turning a construct into measurable indicators.
Measurement Quality
Reliability – Produces the same result each time under unchanged conditions.
Validity – Truly measures the intended construct.
Causality and Correlation
Causality – One event produces another through a direct link.
Correlation – Two things move together but may not cause each other.
Example – Ice-cream sales and drownings rise together because both are driven by hot weather; neither causes the other.
Bias and Confounding – Outside factors distort the apparent relationship between variables.
Example – Income differences confound relationship between education and civic participation.
Reasoning and Logic in Research
Pragmatic Logic – Chooses methods that best answer the research question.
Alternate Induction and Deduction – Theory refinement through iterative testing.
Mixed Methods – Uses both numerical and descriptive data for completeness.
Controls and Counterfactuals
Control Group – Comparison group not exposed to treatment.
Counterfactual – Imagined scenario showing what would have happened without the intervention.
Example – Comparing two similar counties, one using new policy, one not.
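The two-county comparison can be worked through as a simple difference-in-differences calculation (the outcome numbers below are hypothetical, invented only to show the arithmetic):

```python
# Hypothetical mean outcomes for two similar counties, before and
# after one of them adopts the new policy.
treated = {"before": 52.0, "after": 61.0}   # adopted the policy
control = {"before": 50.0, "after": 54.0}   # did not (counterfactual trend)

naive = treated["after"] - treated["before"]   # 9.0: mixes policy effect with trend
trend = control["after"] - control["before"]   # 4.0: change absent the policy
did = naive - trend                            # 5.0: estimated policy effect
print(naive, trend, did)
```

The control county stands in for the counterfactual: subtracting its change strips out whatever both counties would have experienced anyway, leaving an estimate of the policy's own effect.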
Peer Review and Knowledge Building
Peer Review – Other experts verify quality and credibility.
Replication – Confirms that findings are reliable.
Transparency – Share data and code to allow verification.
Cumulative Knowledge – New studies build on prior work.
Example – Later research expands on earlier findings in coastal-resilience policy.
Critical Reading and Evaluation
Evaluate logic, evidence, and source credibility before citing.
Check if conclusions follow from data and if alternative explanations exist.
Integrity and Professional Responsibility
Ethical use of data and respect for participants.
Reporting limitations and potential biases clearly.
Connections Between Chapters 1 and 2
Scientific Method and Research Thinking
Chapter 1’s process (observe → hypothesize → test) supports Chapter 2’s design logic.
Ethics and Research Planning
Researchers must integrate ethics (from Chapter 1) into every design step (Chapter 2).
Reasoning Methods and Theory Building
Inductive and deductive reasoning (Chapter 1) combine in iterative research logic (Chapter 2).
Reliability and Validity Across Both
Measures of consistency and accuracy appear in both chapters as signs of good science.
Application to Public Administration
Example – Local governments apply these principles when evaluating policy outcomes or service performance.