Maximum Likelihood Estimation

Interactive exploration of Maximum Likelihood Estimation for the normal distribution

Maximum Likelihood Estimation (MLE) is the standard method for estimating the parameters of GARCH and other volatility models. Before applying it to variance dynamics, it helps to build intuition with a simple example: estimating the mean of a normal distribution (see Hull 2023, sec. 8.9; Christoffersen 2012, sec. 4).

1. The likelihood principle

Consider a sample of \(n\) observations \(x_1, x_2, \ldots, x_n\) drawn from a Normal\((\mu, \sigma^2)\) distribution with known \(\sigma = 1\). The probability density of a single observation is:

\[ f(x_i \mid \mu) = \frac{1}{\sqrt{2\pi}} \exp\!\left[-\frac{(x_i - \mu)^2}{2}\right] \]

If the observations are independent, the joint density (the likelihood) is the product of individual densities:

\[ L(\mu \mid x) = \prod_{i=1}^{n} f(x_i \mid \mu) \]

In estimation we reverse the perspective: the sample is fixed, and we search for the parameter \(\mu\) under which the observed data are most probable. The value that maximizes \(L\) is the Maximum Likelihood Estimate (MLE).

Since the logarithm is monotonically increasing, maximizing \(L\) is equivalent to maximizing the log-likelihood:

\[ \text{LL}(\mu \mid x) = \ln L(\mu \mid x) = \sum_{i=1}^{n} \ln f(x_i \mid \mu) = -\frac{n}{2}\ln(2\pi) - \frac{1}{2}\sum_{i=1}^{n}(x_i - \mu)^2 \]

The log-likelihood turns products into sums, making both computation and differentiation easier.
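The log-likelihood above is easy to evaluate directly. The short sketch below (function and variable names are illustrative, not from the original) computes it for a Normal\((\mu, 1)\) sample over a grid of candidate \(\mu\) values; the grid maximizer lands on the sample mean, anticipating the analytic result discussed later.

```python
import numpy as np

rng = np.random.default_rng(42)
x = rng.normal(loc=1.5, scale=1.0, size=100)  # sample with true mu = 1.5, sigma = 1

def log_likelihood(mu, x):
    """Log-likelihood of Normal(mu, 1): -n/2 ln(2*pi) - 1/2 sum (x_i - mu)^2."""
    n = len(x)
    return -0.5 * n * np.log(2 * np.pi) - 0.5 * np.sum((x - mu) ** 2)

# Evaluate the log-likelihood over a grid of candidate mu values
grid = np.linspace(0.0, 3.0, 601)
ll = np.array([log_likelihood(m, x) for m in grid])
mu_hat = grid[ll.argmax()]

# The grid maximizer sits at the grid point nearest the sample mean
print(mu_hat, x.mean())
```

Because the log-likelihood is quadratic in \(\mu\) and symmetric about \(\bar{x}\), the peak of the curve coincides with the sample mean up to grid resolution.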

Note

Why does this matter for GARCH?

In GARCH estimation the idea is identical, but the variance \(\sigma^2_t\) changes each day. The log-likelihood becomes \(\text{LL} = -\frac{1}{2}\sum_{t=1}^{T}\left[\ln(\sigma^2_t) + R^2_t / \sigma^2_t\right]\), where \(\sigma^2_t\) depends on the parameters \(\omega\), \(\alpha\), \(\beta\) being estimated. We search numerically for the parameters that maximize this expression.

2. Interactive MLE explorer

The sample below is drawn from a Normal distribution with true mean \(\mu = 1.5\) and \(\sigma = 1\). Your task: find the value of \(\mu\) that maximizes the log-likelihood. Move the slider and watch how the density curve shifts to better “cover” the data points and how the log-likelihood value responds.

Tip

How to experiment

Move the \(\mu\) slider to see the density shift over the data. The log-likelihood is highest when the density best explains the observations. Toggle “Show true density” to reveal the actual distribution that generated the sample. Generate new samples to see how the MLE changes.

3. Key takeaways

The interactive example illustrates several important MLE properties:

  1. The MLE of \(\mu\) is the sample mean \(\hat{\mu} = \bar{x}\). This can be shown analytically by differentiating the log-likelihood and setting it to zero.

  2. The log-likelihood is a smooth function of the parameters, enabling numerical optimization. For GARCH, where no closed-form solution exists, we rely on iterative algorithms (e.g., BFGS) to find the peak.

  3. Larger samples produce sharper peaks in the log-likelihood, meaning more precise estimates. This is why at least 1,000 daily observations are recommended for GARCH estimation.

  4. The curvature at the peak relates to the precision of the estimate. A steeper, narrower peak means a more precisely estimated parameter.
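The first takeaway can be verified in one line of calculus: differentiating the log-likelihood from Section 1 with respect to \(\mu\) and setting the derivative to zero gives

\[ \frac{\partial \text{LL}}{\partial \mu} = \sum_{i=1}^{n}(x_i - \mu) = 0 \quad\Longrightarrow\quad \hat{\mu} = \frac{1}{n}\sum_{i=1}^{n} x_i = \bar{x} \]

The second derivative is \(-n < 0\), confirming that \(\bar{x}\) is a maximum; its magnitude grows with \(n\), which is the curvature-precision link in the fourth takeaway.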

Note

From this example to GARCH estimation

In GARCH, the same principle applies but with time-varying variance. The per-observation log-likelihood becomes:

\[ \ln(l_t) = -\frac{1}{2}\ln(2\pi) - \frac{1}{2}\ln(\sigma^2_t) - \frac{1}{2}\frac{R^2_t}{\sigma^2_t} \]

The key difference is that \(\sigma^2_t\) itself depends on the parameters (\(\omega\), \(\alpha\), \(\beta\)) through the recursive GARCH equation. This creates a complex, non-linear optimization problem that must be solved numerically, but the underlying logic is exactly the same: find the parameters that make the observed data most likely.
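The numerical search described here can be sketched end to end: simulate a GARCH(1,1) series, then minimize the negative log-likelihood with a general-purpose optimizer. The "true" parameter values, seeds, and function names below are hypothetical choices for illustration; scipy's Nelder-Mead is used here because it tolerates the infinite penalty used to enforce the parameter constraints, while gradient-based methods such as BFGS are also common in practice.

```python
import numpy as np
from scipy.optimize import minimize

# Simulate T returns from GARCH(1,1) with hypothetical "true" parameters
rng = np.random.default_rng(7)
omega_t, alpha_t, beta_t = 0.02, 0.08, 0.90
T = 2000
r = np.empty(T)
s2 = omega_t / (1.0 - alpha_t - beta_t)  # start at the unconditional variance
for t in range(T):
    r[t] = np.sqrt(s2) * rng.standard_normal()
    s2 = omega_t + alpha_t * r[t] ** 2 + beta_t * s2

def neg_loglik(params):
    """Negative log-likelihood (constants dropped); optimizers minimize."""
    omega, alpha, beta = params
    if omega <= 0 or alpha < 0 or beta < 0 or alpha + beta >= 1:
        return np.inf  # enforce positivity and covariance stationarity
    sigma2 = np.empty(T)
    sigma2[0] = r.var()  # initialize the recursion at the sample variance
    for t in range(1, T):
        sigma2[t] = omega + alpha * r[t - 1] ** 2 + beta * sigma2[t - 1]
    return 0.5 * np.sum(np.log(sigma2) + r**2 / sigma2)

res = minimize(neg_loglik, x0=[0.05, 0.10, 0.80], method="Nelder-Mead")
omega_hat, alpha_hat, beta_hat = res.x
```

With a few thousand observations the estimates typically land near the simulating values, echoing the sample-size takeaway above.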

References

Christoffersen, Peter F. 2012. Elements of Financial Risk Management. 2nd ed. Academic Press.
Hull, John. 2023. Risk Management and Financial Institutions. 6th ed. John Wiley & Sons.