Statistical Analysis
Descriptive Statistics
Inferential Statistics
Frequency Distribution: systematic arrangement of numeric values from low to high
Shape
Central Tendency: a single value that describes the typical or central score of the distribution
Variability
Symmetry
Modality
Positive skew
Negative skew
Unimodal
Bimodal
Multimodal
Peakedness
Leptokurtic
Mesokurtic
Platykurtic
Normal distribution (bell-shaped curve): basis for inferential statistics
Shape
symmetric, unimodal, mesokurtic
Standard Deviation
1 SD: 68%
2 SD: 95%
3 SD: 99.7%
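A quick numeric check of the 68/95/99.7 percentages, as a minimal sketch using a simulated normal sample (the mean of 50, SD of 10, sample size, and seed are all arbitrary assumptions):

    import numpy as np

    rng = np.random.default_rng(0)                       # arbitrary seed for reproducibility
    scores = rng.normal(loc=50, scale=10, size=100_000)  # hypothetical normally distributed scores

    mean, sd = scores.mean(), scores.std()
    for k in (1, 2, 3):
        within = np.mean(np.abs(scores - mean) <= k * sd)
        print(f"within {k} SD: {within:.1%}")            # roughly 68%, 95%, 99.7%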
Mode: most frequent value; best for nominal measures
Median: middle point; best for skewed distributions
Mean: average; best for normal distributions
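A minimal sketch of the three central tendency measures using Python's standard statistics module (the score list is made up):

    from statistics import mean, median, mode

    scores = [2, 3, 3, 4, 5, 5, 5, 7, 9]  # hypothetical scores
    print(mean(scores))    # average
    print(median(scores))  # middle point
    print(mode(scores))    # most frequent value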
Indexes of Variability
Standard Deviation: average deviation from the mean
Degree of Variability
Homogeneity: scores are alike, little variability; leptokurtic distribution
Heterogeneity: scores differ, great variability; platykurtic distribution
Range: highest value minus lowest value
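Sketch of the two variability indexes above, computed on the same kind of made-up scores (stdev gives the sample standard deviation):

    from statistics import stdev

    scores = [2, 3, 3, 4, 5, 5, 5, 7, 9]  # hypothetical scores
    print(stdev(scores))               # standard deviation: spread around the mean
    print(max(scores) - min(scores))   # range: highest value minus lowest value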
Bivariate: describes the relationship between 2 variables
Contingency Tables (cross tabs): 2 variables cross-tabulated; data nominal or ordinal
Correlation Coefficients: indicate the direction and magnitude of the relationship between 2 variables
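For the contingency table (cross tab) above, one way to build it is pandas.crosstab; the two nominal variables and their values here are hypothetical:

    import pandas as pd

    sex = pd.Series(["M", "F", "F", "M", "F", "M"], name="sex")                  # hypothetical nominal variable
    smoker = pd.Series(["yes", "no", "no", "yes", "yes", "no"], name="smoker")   # hypothetical nominal variable
    print(pd.crosstab(sex, smoker))   # counts for each combination of categories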
Pearson's r (ranges from -1 to +1)
Used for interval- or ratio-level measures
Positive Correlation: same direction
Negative Correlation: opposite direction
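A small sketch of Pearson's r using scipy.stats.pearsonr; the two interval-level variables are invented, and here they move in the same direction, so r comes out close to +1:

    from scipy.stats import pearsonr

    hours_studied = [1, 2, 3, 4, 5, 6]       # hypothetical interval-level data
    exam_score = [55, 60, 64, 70, 73, 80]
    r, p = pearsonr(hours_studied, exam_score)
    print(r, p)   # r near +1 = strong positive correlation; a negative r would mean opposite directions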
Parametric: involves estimation of a parameter; interval/ratio measurements
Nonparametric: nominal or ordinal measurements, or data not normally distributed
Independent Variable = nominal
Dependent Variable = ratio/interval
2 group means: t test (independent or paired)
3 or more group means: ANOVA
Independent Groups:
Compares 2 sample means from different populations (e.g., men and women) on the same variable to determine whether the difference between the 2 means is statistically significant or due to chance alone
Dependent Groups:
Compares the means of 2 related groups to determine whether there is a statistically significant difference between these means
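Both t tests are available in scipy.stats; the group values below are invented purely for illustration:

    from scipy.stats import ttest_ind, ttest_rel

    men = [70, 72, 68, 75, 71]       # hypothetical independent groups
    women = [65, 67, 66, 70, 64]
    print(ttest_ind(men, women))     # independent groups t test

    before = [120, 118, 130, 125]    # hypothetical paired (dependent) measurements
    after = [115, 117, 124, 120]
    print(ttest_rel(before, after))  # dependent (paired) t test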
One-Way ANOVA: tests differences among 3 or more group means
Multifactor ANOVA: tests 2 or more independent variables with regard to an outcome variable, to examine the effect each IV has on that outcome
Repeated Measures ANOVA: tests the same subjects at baseline and at different points in time
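A one-way ANOVA comparing 3 or more group means can be sketched with scipy.stats.f_oneway (the group scores are made up; multifactor and repeated-measures designs need other tools, e.g. statsmodels):

    from scipy.stats import f_oneway

    group_a = [4, 5, 6, 5]     # hypothetical outcome scores for three groups
    group_b = [7, 8, 6, 7]
    group_c = [9, 10, 9, 11]
    print(f_oneway(group_a, group_b, group_c))  # F statistic and p value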
Pearson's r
Variables are interval/ratio
Calculates the probability that the correlation between two variables is not zero
Correlation Matrix: displays the correlations for all pairs among multiple variables
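A correlation matrix across several variables can be produced with pandas' DataFrame.corr; the variables and values are hypothetical:

    import pandas as pd

    df = pd.DataFrame({
        "age": [25, 32, 47, 51, 62],           # hypothetical interval/ratio variables
        "income": [30, 42, 58, 60, 75],
        "systolic_bp": [118, 121, 130, 135, 140],
    })
    print(df.corr(method="pearson"))           # Pearson's r for every pair of variables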
Hypothesis testing
Chi-squared: tests differences in proportions across categories using contingency tables
Data measured at nominal level
Requires expected cell counts of at least 5; degrees of freedom = (rows − 1) × (columns − 1)
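A chi-squared test of independence on a contingency table, sketched with scipy.stats.chi2_contingency (the counts are made up):

    from scipy.stats import chi2_contingency

    table = [[20, 30],   # hypothetical counts: rows = groups, columns = categories
             [35, 15]]
    chi2, p, dof, expected = chi2_contingency(table)
    print(chi2, p, dof)  # dof = (rows - 1) * (columns - 1) = 1 here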