A fitting problem

In practical data analysis, we usually do not know which function should be used to fit the observed data. We tend to use a familiar function, but how do we know it is better than other possible choices?

Consider N data points $(x_i, y_i)$ for $i = 1, 2, \ldots, N$, and a model with K parameters, $y = f(x\vert\theta_K)$. An example is a polynomial function $y = \sum_{k=0}^{K-1} A_k x^k$.

The parameters are usually estimated by minimizing the sum of squared residuals (SSR), defined by
\begin{displaymath}
SSR_K = \sum_{i = 1}^N (y_i - f(x_i\vert\theta_K))^2\end{displaymath} (15)
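The following is a minimal sketch (not part of the original text) of how $SSR_K$ can be computed for the polynomial model above, using ordinary least squares on hypothetical example data; the function name and data are illustrative assumptions.

\begin{verbatim}
# Minimal sketch (illustrative only): least-squares fit of a polynomial
# with K coefficients and computation of SSR_K.
import numpy as np

def fit_polynomial_ssr(x, y, K):
    """Fit y = sum_{k=0}^{K-1} A_k x^k by least squares; return (coeffs, SSR_K)."""
    # Design matrix with columns x^0, x^1, ..., x^{K-1}
    X = np.vander(x, N=K, increasing=True)
    coeffs, _, _, _ = np.linalg.lstsq(X, y, rcond=None)
    residuals = y - X @ coeffs
    return coeffs, np.sum(residuals ** 2)

# Hypothetical example data (for illustration only)
rng = np.random.default_rng(0)
x = np.linspace(0.0, 1.0, 30)
y = 1.0 + 2.0 * x - 3.0 * x**2 + rng.normal(scale=0.1, size=x.size)

coeffs, ssr3 = fit_polynomial_ssr(x, y, K=3)
print("SSR_3 =", ssr3)
\end{verbatim}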

An important problem is to select the optimal number of parameters, $K^*$.

A merit function like $SSR_K$ in linear least-squares fitting satisfies the relationship
\begin{displaymath}
SSR_0 \geq SSR_1 \geq \cdots \geq SSR_K \geq SSR_{K+1} \geq \cdots\end{displaymath} (16)
which by itself cannot tell us $K^*$.
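The monotone decrease of $SSR_K$ can be seen numerically with a sketch like the following (illustrative assumption, not from the original text): fitting polynomials with an increasing number of coefficients $K$ to the same hypothetical data always reduces or preserves the SSR, so minimizing SSR alone would always favor the largest model.

\begin{verbatim}
# Illustrative sketch: SSR_K is non-increasing in K, so it cannot select K*.
import numpy as np

rng = np.random.default_rng(1)
x = np.linspace(0.0, 1.0, 30)
y = 1.0 + 2.0 * x - 3.0 * x**2 + rng.normal(scale=0.1, size=x.size)

for K in range(1, 9):
    X = np.vander(x, N=K, increasing=True)       # columns x^0 ... x^{K-1}
    coeffs, _, _, _ = np.linalg.lstsq(X, y, rcond=None)
    ssr = np.sum((y - X @ coeffs) ** 2)
    print(f"K = {K}: SSR_K = {ssr:.6f}")         # SSR never increases with K
\end{verbatim}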