Add script for building documentation.

Update make_release
Minor documentation fixes.

Change-Id: I1248ec3f58be66b5929aee6f2aa392c15d53ed83
diff --git a/docs/source/bibliography.rst b/docs/source/bibliography.rst
index e80e483..188ed77 100644
--- a/docs/source/bibliography.rst
+++ b/docs/source/bibliography.rst
@@ -19,7 +19,7 @@
    **Representations of Quasi-Newton Matrices and their use in Limited
    Memory Methods**, *Mathematical Programming* 63(4):129–-156, 1994.

-.. [ByrdSchanbel] R.H. Byrd, R.B. Schnabel, and G.A. Shultz, **Approximate
+.. [ByrdSchnabel] R.H. Byrd, R.B. Schnabel, and G.A. Shultz, **Approximate
    solution of the trust region problem by minimization over two
    dimensional subspaces**, *Mathematical programming*, 40(1):247–263, 1988.
diff --git a/docs/source/solving.rst b/docs/source/solving.rst
index fb48bb3..7f58ec8 100644
--- a/docs/source/solving.rst
+++ b/docs/source/solving.rst
@@ -231,8 +231,8 @@

 ``SUBSPACE_DOGLEG`` is a more sophisticated method that considers the
 entire two dimensional subspace spanned by these two vectors and finds
-the point that minimizes the trust region problem in this
-subspace [ByrdSchanbel]_.
+the point that minimizes the trust region problem in this subspace
+[ByrdSchnabel]_.

 The key advantage of the Dogleg over Levenberg Marquardt is that if
 the step computation for a particular choice of :math:`\mu` does not
@@ -792,7 +792,7 @@

    ``ARMIJO`` is the only choice right now.

-.. member:: NonlinearConjugateGradientType Solver::Options::nonlinear conjugate_gradient_type
+.. member:: NonlinearConjugateGradientType Solver::Options::nonlinear_conjugate_gradient_type

    Default: ``FLETCHER_REEVES``

@@ -837,8 +837,8 @@

    Ceres supports two different dogleg strategies.
    ``TRADITIONAL_DOGLEG`` method by Powell and the ``SUBSPACE_DOGLEG``
-   method described by [ByrdSchnabel]_. See :ref:`section-dogleg` for more
-   details.
+   method described by [ByrdSchnabel]_ . See :ref:`section-dogleg`
+   for more details.

 .. member:: bool Solver::Options::use_nonmonotonic_steps
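For context, the options touched by these doc fixes are set on ``ceres::Solver::Options`` from C++. Below is a minimal sketch showing where ``dogleg_type``, ``use_nonmonotonic_steps`` and ``nonlinear_conjugate_gradient_type`` plug in; the cost functor and values are hypothetical and not part of this change::

    #include <iostream>
    #include <ceres/ceres.h>

    // Hypothetical one-dimensional residual, only here to have a problem to solve.
    struct ExampleCost {
      template <typename T>
      bool operator()(const T* const x, T* residual) const {
        residual[0] = T(10.0) - x[0];
        return true;
      }
    };

    int main() {
      double x = 0.5;
      ceres::Problem problem;
      problem.AddResidualBlock(
          new ceres::AutoDiffCostFunction<ExampleCost, 1, 1>(new ExampleCost),
          nullptr, &x);

      ceres::Solver::Options options;

      // Trust region path: dogleg strategy, using the two dimensional
      // subspace variant described by [ByrdSchnabel]_.
      options.trust_region_strategy_type = ceres::DOGLEG;
      options.dogleg_type = ceres::SUBSPACE_DOGLEG;  // or ceres::TRADITIONAL_DOGLEG
      options.use_nonmonotonic_steps = true;

      // Alternative line search path; nonlinear_conjugate_gradient_type
      // defaults to FLETCHER_REEVES as documented above.
      // options.minimizer_type = ceres::LINE_SEARCH;
      // options.line_search_direction_type = ceres::NONLINEAR_CONJUGATE_GRADIENT;
      // options.nonlinear_conjugate_gradient_type = ceres::FLETCHER_REEVES;

      ceres::Solver::Summary summary;
      ceres::Solve(options, &problem, &summary);
      std::cout << summary.BriefReport() << "\n";
      return 0;
    }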