Chi-Square (Χ²) Distributions: Definition & Examples
A chi-square (Χ²) distribution is a continuous probability distribution that is used in many hypothesis tests.
The shape of a chi-square distribution is determined by the parameter k. The graph below shows examples of chi-square distributions with different values of k.
What is a chi-square distribution?
Chi-square (Χ²) distributions are a family of continuous probability distributions. They’re widely used in hypothesis tests, including the chi-square goodness of fit test and the chi-square test of independence.
The shape of a chi-square distribution is determined by the parameter k, which represents the degrees of freedom.
Very few real-world observations follow a chi-square distribution. The main purpose of chi-square distributions is hypothesis testing, not describing real-world distributions.
In contrast, most other widely used distributions, such as normal distributions or Poisson distributions, can describe useful things such as newborns’ birth weights or disease cases per year, respectively.
Relationship to the standard normal distribution
Chi-square distributions are useful for hypothesis testing because of their close relationship to the standard normal distribution. The standard normal distribution, which is a normal distribution with a mean of zero and a variance of one, is central to many important statistical tests and theories.
Imagine taking a random sample from a standard normal distribution (Z). If you squared all the values in the sample, you would have the chi-square distribution with k = 1.
Χ²₁ = (Z)²
Now imagine taking samples from two standard normal distributions (Z₁ and Z₂). If each time you sampled a pair of values, you squared them and added them together, you would have the chi-square distribution with k = 2.
Χ²₂ = (Z₁)² + (Z₂)²
More generally, if you sample from k independent standard normal distributions and then square and sum the values, you produce a chi-square distribution with k degrees of freedom.
Χ²ₖ = (Z₁)² + (Z₂)² + … + (Zₖ)²
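This relationship is easy to check by simulation. The sketch below (the seed, sample size, and k = 3 are arbitrary choices for illustration) squares and sums draws from k standard normal distributions and confirms that the resulting samples have the mean and variance of a chi-square distribution with k degrees of freedom.

```python
import numpy as np

rng = np.random.default_rng(seed=42)
k = 3  # degrees of freedom

# Draw from k independent standard normal distributions,
# square each value, and sum across the k draws.
z = rng.standard_normal(size=(100_000, k))
chi2_samples = (z ** 2).sum(axis=1)

# A chi-square distribution with k degrees of freedom
# has mean k and variance 2k.
print(chi2_samples.mean())  # close to k = 3
print(chi2_samples.var())   # close to 2k = 6
```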
Chi-square test statistics (formula)
Chi-square tests are hypothesis tests with test statistics that follow a chi-square distribution under the null hypothesis. Pearson’s chi-square test was the first chi-square test to be discovered and is the most widely used.
Pearson’s chi-square test statistic is:

Χ² = Σ (O − E)² / E

Where:
Χ² is the chi-square test statistic
Σ is the summation operator (it means “take the sum of”)
O is the observed frequency
E is the expected frequency
If you sample a population many times and calculate Pearson’s chi-square test statistic for each sample, the test statistic will follow a chi-square distribution if the null hypothesis is true.
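The formula can be computed directly, and scipy provides the same calculation. Here is a minimal sketch with hypothetical observed and expected counts (the numbers are made up for illustration):

```python
from scipy import stats

observed = [25, 55, 20]   # hypothetical observed counts
expected = [30, 50, 20]   # counts expected under the null hypothesis

# Pearson's statistic: sum over categories of (O - E)^2 / E
chi2_stat = sum((o - e) ** 2 / e for o, e in zip(observed, expected))

# scipy computes the same statistic, plus a p-value from the
# chi-square distribution with (number of categories - 1) degrees
# of freedom.
result = stats.chisquare(f_obs=observed, f_exp=expected)
print(chi2_stat)          # 1.333...
print(result.statistic)   # same value
```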
The shape of chi-square distributions
We can see how the shape of a chi-square distribution changes as the degrees of freedom (k) increase by looking at graphs of the chi-square probability density function. A probability density function is a function that describes a continuous probability distribution.
When k is one or two
When k is one or two, the chi-square distribution is a curve shaped like a backwards “J.” The curve starts out high and then drops off, meaning that there is a high probability that Χ² is close to zero.
When k is greater than two
When k is greater than two, the chi-square distribution is hump-shaped. The curve starts out low, increases, and then decreases again. There is low probability that Χ² is very close to or very far from zero. The most probable value of Χ² (the mode) is k − 2.
When k is only a bit greater than two, the distribution is much longer on the right side of its peak than on its left (i.e., it is strongly right-skewed).
As k increases, the distribution looks more and more similar to a normal distribution. In fact, when k is 90 or greater, a normal distribution is a good approximation of the chi-square distribution.
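Both observations can be verified numerically with scipy’s chi-square density and quantile functions (the values of k and the grid below are arbitrary choices for illustration):

```python
import numpy as np
from scipy import stats

# For k > 2 the chi-square density peaks at its mode, k - 2.
k = 6
x = np.linspace(0.01, 30, 3000)
pdf = stats.chi2.pdf(x, df=k)
mode = x[pdf.argmax()]
print(mode)  # close to k - 2 = 4

# For large k, a normal distribution with mean k and variance 2k
# is a good approximation: compare the 97.5% quantiles.
k_big = 100
q_chi2 = stats.chi2.ppf(0.975, df=k_big)
q_norm = stats.norm.ppf(0.975, loc=k_big, scale=np.sqrt(2 * k_big))
print(q_chi2, q_norm)  # similar values
```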
Properties of chi-square distributions
Chi-square distributions start at zero and continue to infinity. The chi-square distribution starts at zero because it describes the sum of squared random variables, and a squared number can’t be negative.
The mean (μ) of the chi-square distribution is its degrees of freedom, k. Because the chi-square distribution is right-skewed, the mean is greater than the median and mode. The variance of the chi-square distribution is 2k.
Property | Value
Continuous or discrete | Continuous
Mean | k
Mode | k − 2 (when k > 2)
Variance | 2k
Standard deviation | √(2k)
Range | 0 to ∞
Symmetry | Asymmetrical (right-skewed), but increasingly symmetrical as k increases
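These properties can be confirmed with scipy’s chi2 distribution object (k = 5 is an arbitrary example value):

```python
from scipy import stats

k = 5
mean, var = stats.chi2.stats(df=k, moments="mv")
std = stats.chi2.std(df=k)
median = stats.chi2.median(df=k)

print(mean)   # k = 5
print(var)    # 2k = 10
print(std)    # sqrt(2k) ≈ 3.16
# Right skew: the mean exceeds the median.
print(median < mean)  # True
```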
Example applications of chi-square distributions
The chi-square distribution makes an appearance in many statistical tests and theories. The following are a few of the most common applications of the chi-square distribution.
Pearson’s chi-square test
One of the most common applications of chi-square distributions is Pearson’s chi-square tests. Pearson’s chi-square tests are statistical tests for categorical data. They’re used to determine whether your data are significantly different from what you expected. There are two types of Pearson’s chi-square tests: the chi-square goodness of fit test and the chi-square test of independence.
For example, imagine that a clothing company wants to know whether its customers buy its seven shirt colors in equal amounts. The company counts one week of shirt sales by color:
Color | Frequency
Red | 30
Gray | 29
Yellow | 26
Pink | 33
Black | 56
White | 90
Blue | 86
Since there were 350 shirt sales in total, exactly equal sales would mean 50 sales per color (350 / 7). It’s obvious that there weren’t exactly 50 sales per color. However, this is just a one-week sample, so we should expect the numbers to be a little unequal due to chance alone.
Does the sample give enough evidence to conclude that the frequency of shirt sales truly differs between shirt colors?
A chi-square goodness of fit test can test whether the observed frequencies are significantly different from equal frequencies. By comparing Pearson’s chi-square test statistic to the appropriate chi-square distribution, the company can calculate the probability of these shirt sale values (or more extreme values) happening due to chance.
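A sketch of how this test could be run with scipy, using the shirt-sale counts from the table above (`stats.chisquare` assumes equal expected frequencies by default):

```python
from scipy import stats

# Observed shirt sales by color, from the table above (total = 350).
observed = [30, 29, 26, 33, 56, 90, 86]

# Under the null hypothesis, each of the 7 colors sells equally
# often: 350 / 7 = 50 expected sales per color.
result = stats.chisquare(f_obs=observed)

print(result.statistic)  # Pearson's chi-square statistic
print(result.pvalue)     # very small: strong evidence of unequal sales
```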
Population variance inferences
The chi-square distribution can also be used to make inferences about a population’s variance (σ²) or standard deviation (σ). Using the chi-square distribution, you can test the hypothesis that a population variance is equal to a certain value using the test of a single variance, or calculate confidence intervals for a population’s variance.
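As a sketch of the confidence-interval case: if the data are normally distributed, (n − 1)s²/σ² follows a chi-square distribution with n − 1 degrees of freedom, which gives an interval for σ². The sample values below are hypothetical.

```python
import numpy as np
from scipy import stats

# Hypothetical sample from a normal population.
sample = np.array([12.1, 11.8, 12.5, 12.0, 11.6, 12.3, 12.2, 11.9])
n = len(sample)
s2 = sample.var(ddof=1)  # sample variance

# (n - 1) * s^2 / sigma^2 ~ chi-square with n - 1 degrees of
# freedom, so a 95% confidence interval for sigma^2 is:
alpha = 0.05
lower = (n - 1) * s2 / stats.chi2.ppf(1 - alpha / 2, df=n - 1)
upper = (n - 1) * s2 / stats.chi2.ppf(alpha / 2, df=n - 1)

print(lower, upper)  # interval containing the sample variance
```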
F distribution definition
Chi-square distributions are important in defining the F distribution, which is used in ANOVAs.
Imagine you take a random value from a chi-square distribution and divide it by the distribution’s degrees of freedom, k. Then you repeat the process with a second, independent chi-square distribution. The ratio of the two resulting values follows an F distribution.
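This construction can be checked by simulation (the degrees of freedom, seed, and sample size below are arbitrary choices): the ratio of two independent chi-square samples, each divided by its degrees of freedom, behaves like an F distribution.

```python
import numpy as np

rng = np.random.default_rng(seed=0)
k1, k2 = 4, 10
n = 200_000

# Each chi-square sample is divided by its own degrees of freedom;
# the ratio follows an F distribution with (k1, k2) degrees of freedom.
ratio = (rng.chisquare(k1, n) / k1) / (rng.chisquare(k2, n) / k2)

# The F(k1, k2) distribution has mean k2 / (k2 - 2) for k2 > 2.
print(ratio.mean())  # close to 10 / 8 = 1.25
```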
The noncentral chi-square distribution
The noncentral chi-square distribution is a more general version of the chi-square distribution. It’s used in some types of power analyses.
The noncentral chi-square distribution has an extra parameter called λ (lambda), the noncentrality parameter. This parameter changes the shape of the distribution, shifting the peak to the right and increasing the variance as λ increases.
The λ parameter is determined by the means of the normal distributions that underlie the chi-square distribution: λ is the sum of the squared means. For example, you can produce a noncentral chi-square distribution with k = 3 and λ = 2² + 2² + 2² = 12 by squaring and summing values sampled from three normal distributions, each with a mean of two and a variance of one.
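A simulation sketch of that construction, using scipy’s `ncx2` distribution and the common convention λ = Σμᵢ² (which is what scipy’s `nc` parameter expects; the seed and sample size are arbitrary):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(seed=1)
k, mu = 3, 2.0
n = 100_000

# Square and sum draws from k normal distributions with mean mu and
# variance one: the result is noncentral chi-square with
# noncentrality lambda = k * mu^2.
x = rng.normal(loc=mu, scale=1.0, size=(n, k))
samples = (x ** 2).sum(axis=1)

nc = k * mu ** 2  # lambda = 12
# The noncentral chi-square mean is k + lambda.
print(samples.mean())                  # close to 3 + 12 = 15
print(stats.ncx2.mean(df=k, nc=nc))    # exactly 15.0
```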
Frequently asked questions about chi-square distributions
What happens to the shape of the chi-square distribution as the degrees of freedom (k) increase?
As the degrees of freedom (k) increase, the chi-square distribution goes from a downward curve to a hump shape. As the degrees of freedom increase further, the hump goes from being strongly right-skewed to being approximately normal.
What properties does the chi-square distribution have?
A chi-square distribution is a continuous probability distribution. The shape of a chi-square distribution depends on its degrees of freedom, k. The mean of a chi-square distribution is equal to its degrees of freedom (k) and the variance is 2k. The range is 0 to ∞.
What are the two main types of chi-square tests?
The two main chi-square tests are the chi-square goodness of fit test and the chi-square test of independence.