%!TEX root = ceres-solver.tex
\chapter{Non-linear Least Squares}
\label{chapter:tutorial:nonlinsq}
Let $x \in \reals^n$ be an $n$-dimensional vector of variables, and let
$F(x) = \left[f_1(x); \hdots ; f_k(x)\right]$ be a vector of residuals $f_i(x)$.
Each residual $f_i(x)$ can be a scalar or a vector-valued
function. Then,
\begin{equation}
\arg \min_x \frac{1}{2} \sum_{i=1}^k \|f_i(x)\|^2
\end{equation}
is a non-linear least squares problem\footnote{Ceres can solve a more general version of this problem, but for pedagogical reasons we will restrict ourselves to this class of problems for now. See Chapter~\ref{chapter:overview} for a full description of the problems that Ceres can solve.}. Here, $\|\cdot\|$ denotes the Euclidean norm of a vector.
Such optimization problems arise in almost every area of science and engineering. Whenever there is data to be analyzed or curves to be fitted, there is usually a linear or non-linear least squares problem lurking somewhere.
Perhaps the simplest example of such a problem is Ordinary Linear Regression: given observations $(x_1,y_1),\hdots, (x_k,y_k)$, we wish to find the line $y = mx + c$ that best explains $y$ as a function of $x$. One way to do this is to solve the following optimization problem:
\begin{equation}
\arg\min_{m,c} \sum_{i=1}^k (y_i - m x_i - c)^2.
\end{equation}
With a little bit of calculus, this problem can be solved easily by hand. But what if, instead of a line, we were interested in a more complicated relationship between $x$ and $y$, say $y = e^{mx + c}$? Then the optimization problem becomes
\begin{equation}
\arg\min_{m,c} \sum_{i=1}^k \left(y_i - e^{m x_i + c}\right)^2.
\end{equation}
This is a non-linear regression problem, and solving it by hand is much more tedious: there is no closed-form solution, so the parameters must be found iteratively. Ceres is designed to help you model and solve problems like this easily and efficiently.