QT - 2

Chi Square test

The Chi-Square test is a statistical procedure for determining whether there is a significant difference between observed and expected data. It can also be used to test whether two categorical variables in our data are associated, helping us find out whether a difference between them is due to chance or to a genuine relationship between them.

A chi-square test is a statistical test used to compare observed and expected results. The goal of the test is to identify whether a disparity between actual and predicted data is due to chance or to a link between the variables under consideration. As a result, the chi-square test is well suited to understanding and interpreting the connection between two categorical variables.

A chi-square test or a comparable nonparametric test is required to test a hypothesis regarding the distribution of a categorical variable. Categorical variables, which indicate categories such as animals or countries, can be nominal or ordinal. They cannot have a normal distribution since they can take only a limited set of particular values.
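
As one possible illustration, the sketch below runs a chi-square test of independence on a small contingency table. The table values, group labels, and significance level are hypothetical assumptions made here for demonstration; they are not part of the notes above.

```python
# Minimal sketch of a chi-square test of independence (hypothetical data).
from scipy.stats import chi2_contingency

# Rows: two groups; columns: two categories of a second categorical variable
observed = [[30, 20],
            [25, 35]]

chi2, p_value, dof, expected = chi2_contingency(observed)

print("Chi-square statistic:", round(chi2, 3))
print("Degrees of freedom:", dof)
print("p-value:", round(p_value, 4))

# If p_value < 0.05, reject the null hypothesis that the two
# categorical variables are independent.
if p_value < 0.05:
    print("Reject H0: the variables appear to be related.")
else:
    print("Fail to reject H0: any difference may be due to chance.")
```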

F - Test

An F-test is a statistical test whose test statistic follows an F-distribution under the null hypothesis. It is used to compare statistical models fitted to the available data set. George W. Snedecor named the formula the F-test in honour of Sir Ronald A. Fisher.

The F-test formula is used to compare the variances of two different sets of values. To apply the F-distribution under the null hypothesis, we first find the mean of each set of observations and then calculate their variances.

F Value = Variance of Set 1 / Variance of Set 2
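
The ratio above can be computed directly; the sketch below uses two hypothetical samples and SciPy's F-distribution to attach a p-value. The sample values and the one-tailed comparison are illustrative assumptions.

```python
# Minimal sketch of the variance-ratio F-test described above (hypothetical data).
import numpy as np
from scipy.stats import f

set_1 = np.array([23, 25, 28, 31, 32, 35, 38])
set_2 = np.array([18, 20, 21, 24, 26, 27, 29])

# Sample variances (ddof=1 gives the unbiased sample estimate)
var_1 = np.var(set_1, ddof=1)
var_2 = np.var(set_2, ddof=1)

# F value = Variance of Set 1 / Variance of Set 2
# (conventionally the larger variance is placed in the numerator)
f_value = var_1 / var_2
df_1, df_2 = len(set_1) - 1, len(set_2) - 1

# One-tailed p-value from the F-distribution under the null hypothesis
p_value = 1 - f.cdf(f_value, df_1, df_2)

print("F value:", round(f_value, 3))
print("p-value:", round(p_value, 4))
```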

ANOVA

Definition

Techniques

Correlation

Definition

Types

Coefficient

Regression

Difference between correlation and regression

Multivariate analysis

Definition

Two types

Factor analysis

Cluster analysis

Discriminant analysis

Linear programming

Operations Research

Game theory

Saddle point

Assumptions

Limitation


The chi-square test, for starters, is extremely sensitive to sample size. Even insignificant relationships can appear statistically significant when a large enough sample is used. Keep in mind that "statistically significant" does not always imply "meaningful" when using the chi-square test.


Be mindful that the chi-square test can only determine whether two variables are related. It does not necessarily follow that one variable has a causal relationship with the other; establishing causality would require a more detailed analysis.

Properties

Difference between cluster and factor analysis

Features

Importance

Types of Variance analysis

SPSS

Features

Waiting line theory

Components

Techniques/Models

Types

Linear and non-linear correlation

Purpose of factor analysis

Limitations

Single line channel

T Distribution

Importance

Advantages

Limitations

Application

Limitation

Uses

One-way and Two-way

Characteristics

Method

Simple and Multiple correlation

Software used for analysis

Assignment problem

Methods

Feasibility region

Pure strategy

Mixed strategy

Maximin principle

Transportation problem

Steps

Conditions for application


  • You want to test a hypothesis about the relationship between two categorical variables
  • The sample was randomly selected from the population
  • There is a minimum of five expected observations in each combined group

Uses

Graphical method

Assumptions

ANOVA stands for Analysis of Variance. It is a statistical method used to analyze the differences between the means of two or more groups or treatments. It is often used to determine whether there are any statistically significant differences between the means of different groups.


ANOVA compares the variation between group means to the variation within the groups. If the variation between group means is significantly larger than the variation within groups, it suggests a significant difference between the means of the groups.

The statistic that measures whether the means of different samples are significantly different is called the F-ratio. The lower the F-ratio, the more similar the sample means are; in that case, we cannot reject the null hypothesis.
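
As one possible illustration, the sketch below runs a one-way ANOVA on three hypothetical groups using scipy.stats.f_oneway and interprets the resulting F-ratio. The group values and the 0.05 significance level are assumptions made for the example.

```python
# Minimal sketch of a one-way ANOVA on three hypothetical treatment groups.
from scipy.stats import f_oneway

group_a = [82, 85, 88, 75, 79]
group_b = [91, 89, 94, 87, 90]
group_c = [78, 72, 81, 74, 76]

# f_oneway compares variation between group means to variation within groups
f_ratio, p_value = f_oneway(group_a, group_b, group_c)

print("F-ratio:", round(f_ratio, 3))
print("p-value:", round(p_value, 4))

# A large F-ratio (small p-value) suggests at least one group mean differs.
if p_value < 0.05:
    print("Reject H0: at least one group mean is significantly different.")
else:
    print("Fail to reject H0: the group means are similar.")
```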