Social Work Research

Epistemology: how we know what we know, way of understanding

Authority: understanding through role models, or someone who has more experience or education.

Experience: understanding the ways of life through experience

Media: worldwide, accessible, and offering different perspectives on information

Tradition: way a family or community practices or engages in knowledge, it's consistent, and part of one's identity and cultural heritage

TROUT: Tentative, Replicable, Observable, Unbiased, Transparent

Phases of Research Process

Placebo effect: improvement that follows a "fake" treatment; anything that seems to be a real medical treatment but isn't.

Recognizing flaws in unscientific sources: inaccurate observation, overgeneralization, selective observation, ex post facto hypothesizing

Evidence-based practice: considering the values and expectations of clients and involving them as informed participants in the decision-making process.

CIAO: Client characteristics, Intervention being considered, Alternative interventions, Outcome

Positivism: the philosophy underlying quantitative research; a philosophical system that holds that every rationally justifiable assertion is capable of logical or mathematical proof and that rejects metaphysics and theism.

Quantitative research: objective observation, controlled manipulation, and reliability through numbers; emphasizes the production of precise and generalizable statistical findings. (Qualitative research, by contrast, pursues the deeper meanings of particular human experiences and generates theoretically richer observations.)

Constructivism: the philosophy that says all knowledge exists because humans say so

Subjective nature of knowledge, knowledge is built on agreement

Mixed methods research: a stand-alone research design in which a single study collects both quantitative and qualitative data. Integrates both sources of data at one or more stages of the research process to improve the understanding of the phenomenon being investigated.

  1. formulate the problem
  1. design the study
  1. collect data
  1. process the data
  1. analyze the data
  1. interpret the findings
  1. write up and share results

Kinds of studies

  1. Exploratory study: an early study that looks at a phenomenon to figure out what it is
  1. Descriptive study: looks at something more specifically and tries to describe it
  1. Explanatory study: asks why something happens
  1. Evaluation: tests what the outcome of an intervention will be
  1. Constructing measures: asking specific questions based on symptoms

NASW Code of Ethics in Research

voluntary participation & informed consent, no harm to participants, anonymity & confidentiality, deceiving participants, analysis & reporting, weighing benefits and costs

Institutional Review Board (IRB)

Group of professionals who review research proposals to ensure the protection of human subjects. Consider their influence early on; reviews are guided by respect for persons, beneficence, and justice

Cultural competence: gain access to the population, learn as much as you can, practice cultural humility, hire people from the community, and address potential barriers

Independent variable: intervention - "causer"

Dependent variable: symptoms/problem you're trying to address "causee"

mediating variable: the mechanism through which the independent variable affects the dependent variable

Moderating variable: a variable that strengthens or weakens the relationship between the independent and dependent variables

control variable: a variable held constant or statistically controlled so it cannot distort the relationship being studied

Conceptualization: how you conceptually define your variable

Operationalization: how the variable is actually measured

NOIR: Nominal: named categories, Ordinal: ranked, Interval: equal intervals without a true zero, Ratio: equal intervals with a true zero

Lit Reviews: describe the problem, what has happened in the past, and why you're doing the study; give context for the rest of the paper

Systematic error: consistently gives you poor data or information

Random error: unpredictable error, often introduced by cumbersome, complex, or boring procedures

Reliability: inter-observer, test-retest, parallel forms, internal consistency
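Internal consistency, one of the reliability checks above, is often summarized with Cronbach's alpha. A minimal hand-rolled version on made-up survey data (all numbers hypothetical):

```python
# A minimal sketch of internal consistency via Cronbach's alpha:
# alpha = (k / (k - 1)) * (1 - sum(item variances) / variance(totals)).

def variance(xs):
    """Population variance of a list of numbers."""
    m = sum(xs) / len(xs)
    return sum((x - m) ** 2 for x in xs) / len(xs)

def cronbach_alpha(item_scores):
    """item_scores: one list per item, each holding respondents' scores."""
    k = len(item_scores)
    item_vars = sum(variance(item) for item in item_scores)
    totals = [sum(resp) for resp in zip(*item_scores)]  # per-respondent totals
    return (k / (k - 1)) * (1 - item_vars / variance(totals))

# Three survey items answered by four respondents (hypothetical data)
items = [[3, 4, 3, 5],
         [2, 4, 3, 5],
         [3, 5, 4, 5]]
print(round(cronbach_alpha(items), 2))  # 0.96
```

Values closer to 1 mean the items hang together as one scale; very low values suggest the items measure different things.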

Validity: criterion-related, predictive, known groups, construct, convergent, discriminant

Sampling: selecting cases so that findings are generalizable

Sampling error: the difference between a sample statistic and the true population value
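Sampling error can be demonstrated with a quick sketch: draw a sample, compare its mean to the population mean. The population and sizes here are made up.

```python
# A minimal sketch of sampling error: the gap between a sample statistic
# and the true population parameter. The population here is hypothetical.
import random

population = [random.gauss(50, 10) for _ in range(10_000)]
parameter = sum(population) / len(population)   # true population mean

sample = random.sample(population, 100)
statistic = sum(sample) / len(sample)           # sample mean (estimate)

sampling_error = statistic - parameter          # usually small, rarely zero
print(round(sampling_error, 2))
```

Larger samples tend to shrink the sampling error, but only probability sampling makes that error estimable.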

Population: group of potential units that could be sampled

elements: the individual parts

parameter: a summary description of a given variable in the population

sampling frame: list from which elements can be selected

probability sampling: randomly selecting

Systematic random sampling: start at a randomly chosen element, then select every kth element after it

stratified sampling: group populations in homogenous groups before you sample and then randomly sample

Cluster sampling: randomly sampling from the cluster
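The probability sampling schemes above can be sketched with Python's random module; the population of 100 IDs and the even/odd strata are made up for illustration.

```python
# Minimal sketches of probability sampling on a hypothetical frame of 100 IDs.
import random

population = list(range(100))           # hypothetical sampling frame

# Simple random sampling: every element has an equal chance
simple = random.sample(population, 10)

# Systematic random sampling: random start, then every k-th element
k = len(population) // 10
start = random.randrange(k)
systematic = population[start::k]

# Stratified sampling: split into homogeneous groups, then sample each
strata = {"even": [x for x in population if x % 2 == 0],
          "odd":  [x for x in population if x % 2 == 1]}
stratified = [x for group in strata.values()
              for x in random.sample(group, 5)]

# Cluster sampling: randomly pick whole clusters, keep all their elements
clusters = [population[i:i + 10] for i in range(0, 100, 10)]
cluster_sample = [x for c in random.sample(clusters, 2) for x in c]
```

Each scheme still gives every element a knowable chance of selection, which is what separates these from the non-probability methods below.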

Non-probability sampling: not as generalizable, but convenient

deviant case sampling: deliberately selecting cases at the extremes of the bell curve

quota sampling: have a predetermined quota of what the sample consists of and pick from there

snowball sampling: finding one person that brings in more people

intensity sampling: selecting information-rich cases just outside the norm, without being extreme

theoretical sampling: keep sampling from the population until you no longer get new information (saturation)

Survey: satisfaction with services, teacher evaluations, marketing hook, describing the population, needs of groups.

correlation: 2 variables that are related in terms of frequency and intensity. Correlation does not imply causation: to establish causation, one event must also precede the other and competing variables must be eliminated.
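Pearson's r is one common way to quantify a correlation. A minimal hand computation on hypothetical data (the variable names and numbers are made up):

```python
# A minimal sketch of Pearson's r between two variables.
# r near +1 or -1 means a strong relationship; r alone never
# establishes causation.

def pearson_r(xs, ys):
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

hours_of_service = [1, 2, 3, 4, 5]   # hypothetical independent variable
symptom_score    = [9, 7, 6, 4, 2]   # hypothetical dependent variable
print(round(pearson_r(hours_of_service, symptom_score), 2))  # -0.99
```

The strong negative r here only shows the two variables move together; ruling out competing variables still requires study design, not arithmetic.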

external validity: the ability to generalize findings

Internal validity: the level of confidence that you can say the independent variable changed the dependent variable

Threats to internal validity

History: external events are going to happen that affect the study

statistical regression: extreme scores tend to drift back toward the norm when measured again

maturation: changes in body and age

ambiguity of timing: not sure what causes what due to timing

testing: using the same test can alter findings

Instrument changes: changing instruments during the study.

measurement bias: garbage in garbage out

selection bias: researcher has role of who goes into each group

Researcher reactivity: the researcher's presence or expectations influence how participants respond or how the data are recorded

random sample & random selection

placebo effect

Attrition: drop out rate

Single subject design: tests an intervention with the subject serving as their own control. Requires a well-defined dependent variable and an independent variable that can be taken away. Measure baseline -- intervention -- baseline (ABA)
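A minimal sketch of an ABA single-subject design (baseline, intervention, baseline), comparing phase means on hypothetical scores:

```python
# ABA single-subject sketch: measure a baseline, apply the intervention,
# withdraw it, and compare phase means. All scores are hypothetical.

baseline_1   = [8, 7, 8, 9]   # A: dependent variable before intervention
intervention = [5, 4, 4, 3]   # B: scores while the intervention is in place
baseline_2   = [6, 7, 7, 8]   # A: scores after the intervention is withdrawn

def mean(xs):
    return sum(xs) / len(xs)

for phase, scores in [("baseline 1", baseline_1),
                      ("intervention", intervention),
                      ("baseline 2", baseline_2)]:
    print(f"{phase}: mean = {mean(scores):.2f}")
```

If scores improve during B and slide back toward baseline when the intervention is withdrawn, that pattern supports the intervention as the cause.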

triangulation: reinforcing data with more data

Coding: data entry using NOIR-level variables

Procedure for selecting statistics

  1. questions to address
  1. find the scales and variables that give you your answer
  1. identify nature of variables
  1. draw diagrams for the questions
  1. determine parametric or non-parametric statistics
  1. choose your test/make choice
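Step 5 above can be sketched as a rough rule of thumb mapping the dependent variable's NOIR level to a statistic family. This is a simplification for illustration, not a complete decision procedure (sample size and distribution also matter).

```python
# Rough sketch: NOIR measurement level -> parametric vs. non-parametric.
# The mapping is a common rule of thumb, encoded here as an assumption.

def statistic_family(level_of_measurement):
    level = level_of_measurement.lower()
    if level in ("interval", "ratio"):
        return "parametric (e.g., t-test, Pearson's r)"
    if level in ("nominal", "ordinal"):
        return "non-parametric (e.g., chi-square, Spearman's rho)"
    raise ValueError("expected a NOIR level")

print(statistic_family("ratio"))
print(statistic_family("nominal"))
```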

Qualitative Research: knowledge is inherently a subjective experience, filtered through humans, basis of communication, never objective, verified through consensus, understood through paradigms

constructivism: all knowledge is related to experience

Ethnography: knowledge is built through accurate and detailed descriptions of the way people live

Feminist: knowledge is best framed from experiences of those who biologically differ from the majority

Moral Activism: knowledge is contextually political

Critical Theory: knowledge comes from the framework of culture, society, and political structure

Cultural Studies: knowledge encompasses all aspects of culture; highly self-reflective

Queer Theory: Knowledge is built from framework of sex, gender, race, and culture to the end of de-simplifying words used to describe experiences

Participatory Action: knowledge comes from those studied; they have the ultimate say in how knowledge is used

Grounded Theory: knowledge comes from identifying patterns as best possible independent of previous knowledge or context

Phenomenology: knowledge is based on experience as understood through states of consciousness

Secondary data: data someone else collected, can be anywhere or anything

Manifest: summarizing what is said

Latent: summarizing what is meant

The Four Walls

Wall 1: antecedent

Wall 2: physical setting

Wall 3: social structure

Wall 4: intention for audience

Logic Model

Inputs: everything needed to make the agency function

Process: services provided

Outputs: counts or statistics of services delivered and people served

Outcomes: if there are any improvements in symptoms

Approach to Program Evaluation

Needs Assessment: surveys, rates, social indicators

Process evaluation: staff and client surveys

Goal attainment: quasi or experimental designs

Cost efficiency: cost benefit analysis