Minor corrections to the documentation.

Thanks to Satya Mallick for reporting these.

Change-Id: Ia52e08a7e21d5247dc475cfbf10bf57265aa118f
diff --git a/docs/source/solving.rst b/docs/source/solving.rst
index 8202d16..03c070a 100644
--- a/docs/source/solving.rst
+++ b/docs/source/solving.rst
@@ -291,9 +291,10 @@
 and then use it as the starting point to further optimize just `a_1`
 and `a_2`. For the linear case, this amounts to doing a single linear
 least squares solve. For non-linear problems, any method for solving
-the `a_1` and `a_2` optimization problems will do. The only constraint
-on `a_1` and `a_2` (if they are two different parameter block) is that
-they do not co-occur in a residual block.
+the :math:`a_1` and :math:`a_2` optimization problems will do. The
+only constraint on :math:`a_1` and :math:`a_2` (if they are two
+different parameter blocks) is that they do not co-occur in a
+residual block.
 
 This idea can be further generalized, by not just optimizing
 :math:`(a_1, a_2)`, but decomposing the graph corresponding to the
@@ -315,9 +316,9 @@
 -------------------
 
 Note that the basic trust-region algorithm described in
-Algorithm~\ref{alg:trust-region} is a descent algorithm in that they
-only accepts a point if it strictly reduces the value of the objective
-function.
+:ref:`section-trust-region-methods` is a descent algorithm in that
+it only accepts a point if it strictly reduces the value of the
+objective function.
 
 Relaxing this requirement allows the algorithm to be more efficient
 in the long term at the cost of some local increase in the value of the
@@ -362,7 +363,7 @@
 Here :math:`H(x)` is some approximation to the Hessian of the
 objective function, and :math:`g(x)` is the gradient at
 :math:`x`. Depending on the choice of :math:`H(x)` we get a variety of
-different search directions -`\Delta x`.
+different search directions :math:`\Delta x`.
 
 Step 4, which is a one dimensional optimization or `Line Search` along
 :math:`\Delta x` is what gives this class of methods its name.
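For readers applying this patch, the behaviors these hunks document surface directly as solver options in the Ceres API. The sketch below is illustrative only, not part of the patch; it assumes the `Solver::Options` fields `use_nonmonotonic_steps`, `use_inner_iterations`, `minimizer_type`, and `line_search_direction_type`, whose exact availability may vary across Ceres versions.

.. code-block:: c++

   #include "ceres/ceres.h"

   // Sketch, not part of the patch: option names are assumed from the
   // Ceres Solver API and may differ across versions.

   // Trust region: relax strict descent and exploit the a_1/a_2 style
   // decomposition discussed in the first hunk.
   void ConfigureTrustRegion(ceres::Solver::Options* options) {
     // Accept steps that temporarily increase the objective, trading
     // local increases for better long-term progress.
     options->use_nonmonotonic_steps = true;
     // After each trust-region step, separately optimize parameter
     // block subsets that do not co-occur in a residual block.
     options->use_inner_iterations = true;
   }

   // Line search: the choice of Hessian approximation H(x) in
   // Delta x = -H(x)^{-1} g(x) yields different search directions.
   void ConfigureLineSearch(ceres::Solver::Options* options) {
     options->minimizer_type = ceres::LINE_SEARCH;
     // LBFGS builds a low-rank approximation of H(x) from the
     // gradient history.
     options->line_search_direction_type = ceres::LBFGS;
   }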
diff --git a/docs/source/tutorial.rst b/docs/source/tutorial.rst
index 1e5756a..f1c2cbe 100644
--- a/docs/source/tutorial.rst
+++ b/docs/source/tutorial.rst
@@ -30,7 +30,7 @@
 more familiar `non-linear least squares problem
 <http://en.wikipedia.org/wiki/Non-linear_least_squares>`_.
 
-.. math:: \frac{1}{2}\sum_{i=1} \left\|f_i\left(x_{i_1}, ... ,x_{i_k}\right)\right\|^2.
+.. math:: \frac{1}{2}\sum_{i} \left\|f_i\left(x_{i_1}, ... ,x_{i_k}\right)\right\|^2.
    :label: ceresproblem2
 
 In this chapter we will learn how to solve :eq:`ceresproblem` using
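As a concrete instance of the corrected objective :math:`\frac{1}{2}\sum_{i} \left\|f_i\right\|^2`, the sketch below sets up a single residual :math:`f(x) = 10 - x` over one one-dimensional parameter block. It follows the shape of the Ceres hello-world example; the functor name and initial value are illustrative.

.. code-block:: c++

   #include "ceres/ceres.h"

   // One term f_i of (1/2) sum_i ||f_i(...)||^2, here f(x) = 10 - x.
   struct CostFunctor {
     template <typename T>
     bool operator()(const T* const x, T* residual) const {
       residual[0] = T(10.0) - x[0];
       return true;
     }
   };

   int main() {
     double x = 0.5;  // Initial guess; illustrative.
     ceres::Problem problem;
     // Residual and parameter block are both 1-dimensional.
     problem.AddResidualBlock(
         new ceres::AutoDiffCostFunction<CostFunctor, 1, 1>(
             new CostFunctor),
         NULL /* plain squared loss */, &x);

     ceres::Solver::Options options;
     ceres::Solver::Summary summary;
     ceres::Solve(options, &problem, &summary);
     return 0;
   }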