Add Problem::EvaluateResidualBlockAssumingParametersUnchanged

Simplify the semantics for Problem::EvaluateResidualBlock to
not ignore the presence of EvaluationCallback and add another method
EvaluateResidualBlockAssumingParametersUnchanged to handle the case
where the user has an EvaluationCallback but knows that the parameter
blocks do not change between calls.

Updated the documentation for the methods and EvaluationCallback to
reflect these semantics.

Also added tests for the Evaluate related methods calling the
EvaluationCallback when it is present.
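
As an illustrative sketch of the intended usage of the two methods
(MyCostFunction, my_callback and the setup below are hypothetical and
not part of this change; this fragment will not compile standalone):

```cpp
// An EvaluationCallback is registered with the problem, so Ceres is
// responsible for calling PrepareForEvaluation before each evaluation.
ceres::Problem::Options options;
options.evaluation_callback = &my_callback;  // Problem does NOT take ownership.
ceres::Problem problem(options);

double x[2] = {1.0, 2.0};
ceres::ResidualBlockId id =
    problem.AddResidualBlock(MyCostFunction::Create(), nullptr, x);

double cost;
// Conservative path: the parameters may have changed since the last
// evaluate/solve, so PrepareForEvaluation is called with new_point = true.
problem.EvaluateResidualBlock(
    id, /* apply_loss_function = */ true, &cost, nullptr, nullptr);

// Fast path: the caller guarantees x is unchanged since the last
// evaluation, so PrepareForEvaluation is called with new_point = false.
problem.EvaluateResidualBlockAssumingParametersUnchanged(
    id, /* apply_loss_function = */ true, &cost, nullptr, nullptr);
```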

https://github.com/ceres-solver/ceres-solver/issues/483

Change-Id: If0a0c95c2f1f92e9183a90df240104a69a71c46d
diff --git a/docs/source/nnls_modeling.rst b/docs/source/nnls_modeling.rst
index 593fcab..b6bec4a 100644
--- a/docs/source/nnls_modeling.rst
+++ b/docs/source/nnls_modeling.rst
@@ -1725,14 +1725,17 @@
 
     Default: `nullptr`
 
-    Using this callback interface, Ceres can notify you when it is
-    about to evaluate the residuals or Jacobians. With the callback,
-    you can share computation between residual blocks by doing the
-    shared computation in
+    Using this callback interface, Ceres will notify you when it is
+    about to evaluate the residuals or Jacobians.
+
+    If an ``evaluation_callback`` is present, Ceres will update the
+    user's parameter blocks to the values that will be used when
+    calling :func:`CostFunction::Evaluate` before calling
+    :func:`EvaluationCallback::PrepareForEvaluation`. One can then use
+    this callback to share (or cache) computation between cost
+    functions by doing the shared computation in
     :func:`EvaluationCallback::PrepareForEvaluation` before Ceres
-    calls :func:`CostFunction::Evaluate`. It also enables caching
-    results between a pure residual evaluation and a residual &
-    Jacobian evaluation.
+    calls :func:`CostFunction::Evaluate`.
 
     Problem does NOT take ownership of the callback.
 
@@ -1752,8 +1755,8 @@
    parameter blocks it expects. The function checks that these match
    the sizes of the parameter blocks listed in parameter_blocks. The
    program aborts if a mismatch is detected. loss_function can be
-   nullptr, in which case the cost of the term is just the squared norm
-   of the residuals.
+   `nullptr`, in which case the cost of the term is just the squared
+   norm of the residuals.
 
    The parameter blocks may be passed together as a
    ``vector<double*>``, or as up to ten separate ``double*`` pointers.
@@ -1997,22 +2000,35 @@
    ``apply_loss_function`` as the name implies allows the user to
    switch the application of the loss function on and off.
 
-   .. NOTE::
+   .. NOTE:: If an :class:`EvaluationCallback` is associated with the
+      problem, then its
+      :func:`EvaluationCallback::PrepareForEvaluation` method will be
+      called every time this method is called with `new_point =
+      true`. This conservatively assumes that the user may have
+      changed the parameter values since the previous call to evaluate
+      / solve.  For improved efficiency, and only if you know that the
+      parameter values have not changed between calls, see
+      :func:`Problem::EvaluateResidualBlockAssumingParametersUnchanged`.
 
-      If an :class:`EvaluationCallback` is associated with the problem
-      then it is the user's responsibility to call
-      :func:`EvaluationCalback::PrepareForEvaluation` it before
-      calling this method.
 
-      This is because, if the user calls this method multiple times,
-      we cannot tell if the underlying parameter blocks have changed
-      between calls or not. So if ``EvaluateResidualBlock`` was
-      responsible for calling the
-      :func:`EvaluationCalback::PrepareForEvaluation`, it will have to
-      do it everytime it is called. Which makes the common case where
-      the parameter blocks do not change, inefficient. So we leave it
-      to the user to call the
-      :func:`EvaluationCalback::PrepareForEvaluation` as needed.
+.. function:: bool Problem::EvaluateResidualBlockAssumingParametersUnchanged(ResidualBlockId residual_block_id, bool apply_loss_function, double* cost, double* residuals, double** jacobians) const
+
+    Same as :func:`Problem::EvaluateResidualBlock` except that if an
+    :class:`EvaluationCallback` is associated with the problem, then
+    its :func:`EvaluationCallback::PrepareForEvaluation` method will
+    be called with `new_point = false` every time this method is called.
+
+    This means, if an :class:`EvaluationCallback` is associated with
+    the problem then it is the user's responsibility to call
+    :func:`EvaluationCallback::PrepareForEvaluation` before calling
+    this method if necessary, i.e. iff the parameter values have been
+    changed since the last call to evaluate / solve.
+
+    This is because, as the name implies, we assume that the parameter
+    blocks did not change since the last time
+    :func:`EvaluationCallback::PrepareForEvaluation` was called (via
+    :func:`Solve`, :func:`Problem::Evaluate` or
+    :func:`Problem::EvaluateResidualBlock`).
 
 
 .. function:: bool Problem::Evaluate(const Problem::EvaluateOptions& options, double* cost, vector<double>* residuals, vector<double>* gradient, CRSMatrix* jacobian)
@@ -2126,11 +2142,15 @@
                                             bool new_evaluation_point) = 0;
       };
 
-   Ceres will call ``PrepareForEvaluation()`` every time, and once
-   before it computes the residuals and/or the Jacobians.
+.. function:: void EvaluationCallback::PrepareForEvaluation(bool evaluate_jacobians, bool new_evaluation_point)
 
-   User parameters (the double* values provided by the us)
-   are fixed until the next call to ``PrepareForEvaluation()``. If
+   Ceres will call :func:`EvaluationCallback::PrepareForEvaluation`
+   every time, and once before it computes the residuals and/or the
+   Jacobians.
+
+   User parameters (the double* values provided by the user) are fixed
+   until the next call to
+   :func:`EvaluationCallback::PrepareForEvaluation`. If
    ``new_evaluation_point == true``, then this is a new point that is
    different from the last evaluated point. Otherwise, it is the same
    point that was evaluated previously (either Jacobian or residual)
@@ -2138,33 +2158,37 @@
    ``evaluate_jacobians`` is true, then Ceres will request Jacobians
    in the upcoming cost evaluation.
 
-   Using this callback interface, Ceres can notify you when it is about
-   to evaluate the residuals or Jacobians. With the callback, you can
-   share computation between residual blocks by doing the shared
-   computation in PrepareForEvaluation() before Ceres calls
-   CostFunction::Evaluate() on all the residuals. It also enables
-   caching results between a pure residual evaluation and a residual &
-   Jacobian evaluation, via the new_evaluation_point argument.
+   Using this callback interface, Ceres can notify you when it is
+   about to evaluate the residuals or Jacobians. With the callback,
+   you can share computation between residual blocks by doing the
+   shared computation in
+   :func:`EvaluationCallback::PrepareForEvaluation` before Ceres calls
+   :func:`CostFunction::Evaluate` on all the residuals. It also
+   enables caching results between a pure residual evaluation and a
+   residual & Jacobian evaluation, via the ``new_evaluation_point``
+   argument.
 
    One use case for this callback is if the cost function compute is
-   moved to the GPU. In that case, the prepare call does the actual cost
-   function evaluation, and subsequent calls from Ceres to the actual
-   cost functions merely copy the results from the GPU onto the
+   moved to the GPU. In that case, the prepare call does the actual
+   cost function evaluation, and subsequent calls from Ceres to the
+   actual cost functions merely copy the results from the GPU onto the
    corresponding blocks for Ceres to plug into the solver.
 
    **Note**: Ceres provides no mechanism to share data other than the
    notification from the callback. Users must provide access to
    pre-computed shared data to their cost functions behind the scenes;
    this all happens without Ceres knowing. One approach is to put a
-   pointer to the shared data in each cost function (recommended) or to
-   use a global shared variable (discouraged; bug-prone).  As far as
-   Ceres is concerned, it is evaluating cost functions like any other;
-   it just so happens that behind the scenes the cost functions reuse
-   pre-computed data to execute faster.
+   pointer to the shared data in each cost function (recommended) or
+   to use a global shared variable (discouraged; bug-prone).  As far
+   as Ceres is concerned, it is evaluating cost functions like any
+   other; it just so happens that behind the scenes the cost functions
+   reuse pre-computed data to execute faster.
 
-   See ``evaluation_callback_test.cc`` for code that explicitly verifies
-   the preconditions between ``PrepareForEvaluation()`` and
-   ``CostFunction::Evaluate()``.
+   See ``evaluation_callback_test.cc`` for code that explicitly
+   verifies the preconditions between
+   :func:`EvaluationCallback::PrepareForEvaluation` and
+   :func:`CostFunction::Evaluate`.
+
 
 ``rotation.h``
 ==============
diff --git a/include/ceres/problem.h b/include/ceres/problem.h
index 88f9966..76d115b 100644
--- a/include/ceres/problem.h
+++ b/include/ceres/problem.h
@@ -475,7 +475,7 @@
   // at the end of an iteration during a solve.
   //
   // Note 4: If an EvaluationCallback is associated with the problem,
-  // then its PrepareForEvaluation method will be called everytime
+  // then its PrepareForEvaluation method will be called every time
   // this method is called with new_point = true.
   bool Evaluate(const EvaluateOptions& options,
                 double* cost,
@@ -509,23 +509,41 @@
   // apply_loss_function as the name implies allows the user to switch
   // the application of the loss function on and off.
   //
-  // WARNING: If an EvaluationCallback is associated with the problem
-  // then it is the user's responsibility to call it before calling
-  // this method.
-  //
-  // This is because, if the user calls this method multiple times, we
-  // cannot tell if the underlying parameter blocks have changed
-  // between calls or not. So if EvaluateResidualBlock was responsible
-  // for calling the EvaluationCallback, it will have to do it
-  // everytime it is called. Which makes the common case where the
-  // parameter blocks do not change, inefficient. So we leave it to
-  // the user to call the EvaluationCallback as needed.
+  // If an EvaluationCallback is associated with the problem, then its
+  // PrepareForEvaluation method will be called every time this method
+  // is called with new_point = true. This conservatively assumes that
+  // the user may have changed the parameter values since the previous
+  // call to evaluate / solve.  For improved efficiency, and only if
+  // you know that the parameter values have not changed between
+  // calls, see EvaluateResidualBlockAssumingParametersUnchanged().
   bool EvaluateResidualBlock(ResidualBlockId residual_block_id,
                              bool apply_loss_function,
                              double* cost,
                              double* residuals,
                              double** jacobians) const;
 
+  // Same as EvaluateResidualBlock except that if an
+  // EvaluationCallback is associated with the problem, then its
+  // PrepareForEvaluation method will be called every time this method
+  // is called with new_point = false.
+  //
+  // This means, if an EvaluationCallback is associated with the
+  // problem then it is the user's responsibility to call
+  // PrepareForEvaluation before calling this method if necessary,
+  // i.e. iff the parameter values have been changed since the last
+  // call to evaluate / solve.
+  //
+  // This is because, as the name implies, we assume that the
+  // parameter blocks did not change since the last time
+  // PrepareForEvaluation was called (via Solve, Evaluate or
+  // EvaluateResidualBlock).
+  bool EvaluateResidualBlockAssumingParametersUnchanged(
+      ResidualBlockId residual_block_id,
+      bool apply_loss_function,
+      double* cost,
+      double* residuals,
+      double** jacobians) const;
+
  private:
   friend class Solver;
   friend class Covariance;
diff --git a/internal/ceres/problem.cc b/internal/ceres/problem.cc
index 767fe97..f3ffd54 100644
--- a/internal/ceres/problem.cc
+++ b/internal/ceres/problem.cc
@@ -139,8 +139,26 @@
                                     double* cost,
                                     double* residuals,
                                     double** jacobians) const {
-  return impl_->EvaluateResidualBlock(
-      residual_block_id, apply_loss_function, cost, residuals, jacobians);
+  return impl_->EvaluateResidualBlock(residual_block_id,
+                                      apply_loss_function,
+                                      /* new_point = */ true,
+                                      cost,
+                                      residuals,
+                                      jacobians);
+}
+
+bool Problem::EvaluateResidualBlockAssumingParametersUnchanged(
+    ResidualBlockId residual_block_id,
+    bool apply_loss_function,
+    double* cost,
+    double* residuals,
+    double** jacobians) const {
+  return impl_->EvaluateResidualBlock(residual_block_id,
+                                      apply_loss_function,
+                                      /* new_point = */ false,
+                                      cost,
+                                      residuals,
+                                      jacobians);
 }
 
 int Problem::NumParameterBlocks() const { return impl_->NumParameterBlocks(); }
diff --git a/internal/ceres/problem_impl.cc b/internal/ceres/problem_impl.cc
index 6cc4d33..20da652 100644
--- a/internal/ceres/problem_impl.cc
+++ b/internal/ceres/problem_impl.cc
@@ -601,7 +601,6 @@
                            CRSMatrix* jacobian) {
   if (cost == nullptr && residuals == nullptr && gradient == nullptr &&
       jacobian == nullptr) {
-    LOG(INFO) << "Nothing to do.";
     return true;
   }
 
@@ -768,9 +767,15 @@
 
 bool ProblemImpl::EvaluateResidualBlock(ResidualBlock* residual_block,
                                         bool apply_loss_function,
+                                        bool new_point,
                                         double* cost,
                                         double* residuals,
                                         double** jacobians) const {
+  auto evaluation_callback = program_->mutable_evaluation_callback();
+  if (evaluation_callback) {
+    evaluation_callback->PrepareForEvaluation(jacobians != nullptr, new_point);
+  }
+
   ParameterBlock* const* parameter_blocks = residual_block->parameter_blocks();
   const int num_parameter_blocks = residual_block->NumParameterBlocks();
   for (int i = 0; i < num_parameter_blocks; ++i) {
@@ -789,7 +794,8 @@
   }
 
   double dummy_cost = 0.0;
-  FixedArray<double> scratch(residual_block->NumScratchDoublesForEvaluate());
+  FixedArray<double, 32> scratch(
+      residual_block->NumScratchDoublesForEvaluate());
   return residual_block->Evaluate(apply_loss_function,
                                   cost ? cost : &dummy_cost,
                                   residuals,
diff --git a/internal/ceres/problem_impl.h b/internal/ceres/problem_impl.h
index 8bbe723..004918a 100644
--- a/internal/ceres/problem_impl.h
+++ b/internal/ceres/problem_impl.h
@@ -124,6 +124,7 @@
 
   bool EvaluateResidualBlock(ResidualBlock* residual_block,
                              bool apply_loss_function,
+                             bool new_point,
                              double* cost,
                              double* residuals,
                              double** jacobians) const;
diff --git a/internal/ceres/problem_test.cc b/internal/ceres/problem_test.cc
index 9093b7a..e3c77a0 100644
--- a/internal/ceres/problem_test.cc
+++ b/internal/ceres/problem_test.cc
@@ -49,6 +49,7 @@
 #include "ceres/sparse_matrix.h"
 #include "ceres/types.h"
 #include "gtest/gtest.h"
+#include "gmock/gmock.h"
 
 namespace ceres {
 namespace internal {
@@ -1570,6 +1571,8 @@
  public:
   static constexpr bool kApplyLossFunction = true;
   static constexpr bool kDoNotApplyLossFunction = false;
+  static constexpr bool kNewPoint = true;
+  static constexpr bool kNotNewPoint = false;
   static double loss_function_scale_;
 
  protected:
@@ -1599,6 +1602,7 @@
   double* jacobians[2] = {actual_dfdx.data(), actual_dfdy.data()};
   EXPECT_TRUE(problem_.EvaluateResidualBlock(residual_block_id,
                                              kApplyLossFunction,
+                                             kNewPoint,
                                              &actual_cost,
                                              actual_f.data(),
                                              jacobians));
@@ -1625,8 +1629,12 @@
        OneResidualBlockNoLossFunctionNullEval) {
   ResidualBlockId residual_block_id =
       problem_.AddResidualBlock(IdentityFunctor::Create(), nullptr, x_, y_);
-  EXPECT_TRUE(problem_.EvaluateResidualBlock(
-      residual_block_id, kApplyLossFunction, nullptr, nullptr, nullptr));
+  EXPECT_TRUE(problem_.EvaluateResidualBlock(residual_block_id,
+                                             kApplyLossFunction,
+                                             kNewPoint,
+                                             nullptr,
+                                             nullptr,
+                                             nullptr));
 }
 
 TEST_F(ProblemEvaluateResidualBlockTest, OneResidualBlockNoLossFunctionCost) {
@@ -1637,8 +1645,12 @@
   double expected_cost = expected_f.squaredNorm() / 2.0;
 
   double actual_cost;
-  EXPECT_TRUE(problem_.EvaluateResidualBlock(
-      residual_block_id, kApplyLossFunction, &actual_cost, nullptr, nullptr));
+  EXPECT_TRUE(problem_.EvaluateResidualBlock(residual_block_id,
+                                             kApplyLossFunction,
+                                             kNewPoint,
+                                             &actual_cost,
+                                             nullptr,
+                                             nullptr));
 
   EXPECT_NEAR(std::abs(expected_cost - actual_cost) / actual_cost,
               0,
@@ -1658,6 +1670,7 @@
   Vector actual_f(5);
   EXPECT_TRUE(problem_.EvaluateResidualBlock(residual_block_id,
                                              kApplyLossFunction,
+                                             kNewPoint,
                                              &actual_cost,
                                              actual_f.data(),
                                              nullptr));
@@ -1688,6 +1701,7 @@
   double* jacobians[2] = {actual_dfdx.data(), nullptr};
   EXPECT_TRUE(problem_.EvaluateResidualBlock(residual_block_id,
                                              kApplyLossFunction,
+                                             kNewPoint,
                                              &actual_cost,
                                              actual_f.data(),
                                              jacobians));
@@ -1715,6 +1729,7 @@
   Vector actual_f(5);
   EXPECT_TRUE(problem_.EvaluateResidualBlock(residual_block_id,
                                              kApplyLossFunction,
+                                             kNewPoint,
                                              nullptr,
                                              actual_f.data(),
                                              nullptr));
@@ -1749,6 +1764,7 @@
   double* jacobians[2] = {actual_dfdx.data(), actual_dfdy.data()};
   EXPECT_TRUE(problem_.EvaluateResidualBlock(residual_block_id,
                                              kApplyLossFunction,
+                                             kNewPoint,
                                              &actual_cost,
                                              actual_f.data(),
                                              jacobians));
@@ -1793,6 +1809,7 @@
   double* jacobians[2] = {actual_dfdx.data(), actual_dfdy.data()};
   EXPECT_TRUE(problem_.EvaluateResidualBlock(residual_block_id,
                                              kDoNotApplyLossFunction,
+                                             kNewPoint,
                                              &actual_cost,
                                              actual_f.data(),
                                              jacobians));
@@ -1836,6 +1853,7 @@
   double* jacobians[2] = {actual_dfdx.data(), actual_dfdy.data()};
   EXPECT_TRUE(problem_.EvaluateResidualBlock(residual_block_id,
                                              kApplyLossFunction,
+                                             kNewPoint,
                                              &actual_cost,
                                              actual_f.data(),
                                              jacobians));
@@ -1880,6 +1898,7 @@
   double* jacobians[2] = {actual_dfdx.data(), actual_dfdy.data()};
   EXPECT_TRUE(problem_.EvaluateResidualBlock(residual_block_id,
                                              kApplyLossFunction,
+                                             kNewPoint,
                                              &actual_cost,
                                              actual_f.data(),
                                              jacobians));
@@ -1923,6 +1942,7 @@
   double* jacobians[2] = {actual_dfdx.data(), actual_dfdy.data()};
   EXPECT_FALSE(problem_.EvaluateResidualBlock(residual_block_id,
                                               kApplyLossFunction,
+                                              kNewPoint,
                                               &actual_cost,
                                               actual_f.data(),
                                               jacobians));
@@ -1930,6 +1950,7 @@
   jacobians[0] = nullptr;
   EXPECT_TRUE(problem_.EvaluateResidualBlock(residual_block_id,
                                              kApplyLossFunction,
+                                             kNewPoint,
                                              &actual_cost,
                                              actual_f.data(),
                                              jacobians));
@@ -1968,6 +1989,7 @@
   double* jacobians[2] = {actual_dfdx.data(), actual_dfdy.data()};
   EXPECT_FALSE(problem_.EvaluateResidualBlock(residual_block_id,
                                               kApplyLossFunction,
+                                              kNewPoint,
                                               &actual_cost,
                                               actual_f.data(),
                                               jacobians));
@@ -1975,12 +1997,14 @@
   jacobians[0] = nullptr;
   EXPECT_FALSE(problem_.EvaluateResidualBlock(residual_block_id,
                                               kApplyLossFunction,
+                                              kNewPoint,
                                               &actual_cost,
                                               actual_f.data(),
                                               jacobians));
   jacobians[1] = nullptr;
   EXPECT_TRUE(problem_.EvaluateResidualBlock(residual_block_id,
                                              kApplyLossFunction,
+                                             kNewPoint,
                                              &actual_cost,
                                              actual_f.data(),
                                              jacobians));
@@ -2018,6 +2042,7 @@
   double* jacobians[2] = {actual_dfdx.data(), actual_dfdy.data()};
   EXPECT_FALSE(problem_.EvaluateResidualBlock(residual_block_id,
                                               kApplyLossFunction,
+                                              kNewPoint,
                                               &actual_cost,
                                               actual_f.data(),
                                               jacobians));
@@ -2025,6 +2050,7 @@
   jacobians[0] = nullptr;
   EXPECT_TRUE(problem_.EvaluateResidualBlock(residual_block_id,
                                              kApplyLossFunction,
+                                             kNewPoint,
                                              &actual_cost,
                                              actual_f.data(),
                                              jacobians));
@@ -2132,5 +2158,119 @@
   EXPECT_TRUE(problem.IsParameterBlockConstant(&y));
 }
 
+class MockEvaluationCallback : public EvaluationCallback {
+ public:
+  MOCK_METHOD2(PrepareForEvaluation, void(bool, bool));
+};
+
+TEST(ProblemEvaluate, CallsEvaluationCallbackWithoutJacobian) {
+  constexpr bool kDoNotComputeJacobians = false;
+  constexpr bool kNewPoint = true;
+
+  MockEvaluationCallback evaluation_callback;
+  EXPECT_CALL(evaluation_callback,
+              PrepareForEvaluation(kDoNotComputeJacobians, kNewPoint))
+      .Times(1);
+
+  Problem::Options options;
+  options.evaluation_callback = &evaluation_callback;
+  ProblemImpl problem(options);
+  double x_[2] = {1, 2};
+  double y_[3] = {1, 2, 3};
+  ResidualBlockId residual_block_id =
+      problem.AddResidualBlock(IdentityFunctor::Create(), nullptr, x_, y_);
+
+  double actual_cost;
+  Vector actual_f(5);
+  Matrix actual_dfdx(5, 2);
+  Matrix actual_dfdy(5, 3);
+  double* jacobians[2] = {actual_dfdx.data(), actual_dfdy.data()};
+  EXPECT_TRUE(problem.Evaluate(
+      Problem::EvaluateOptions(), &actual_cost, nullptr, nullptr, nullptr));
+}
+
+TEST(ProblemEvaluate, CallsEvaluationCallbackWithJacobian) {
+  constexpr bool kComputeJacobians = true;
+  constexpr bool kNewPoint = true;
+
+  MockEvaluationCallback evaluation_callback;
+  EXPECT_CALL(evaluation_callback,
+              PrepareForEvaluation(kComputeJacobians, kNewPoint))
+      .Times(1);
+
+  Problem::Options options;
+  options.evaluation_callback = &evaluation_callback;
+  ProblemImpl problem(options);
+  double x_[2] = {1, 2};
+  double y_[3] = {1, 2, 3};
+  ResidualBlockId residual_block_id =
+      problem.AddResidualBlock(IdentityFunctor::Create(), nullptr, x_, y_);
+
+  double actual_cost;
+  Vector actual_f(5);
+  Matrix actual_dfdx(5, 2);
+  Matrix actual_dfdy(5, 3);
+  double* jacobians[2] = {actual_dfdx.data(), actual_dfdy.data()};
+  ceres::CRSMatrix jacobian;
+  EXPECT_TRUE(problem.Evaluate(
+      Problem::EvaluateOptions(), &actual_cost, nullptr, nullptr, &jacobian));
+}
+
+TEST(ProblemEvaluateResidualBlock, NewPointCallsEvaluationCallback) {
+  constexpr bool kComputeJacobians = true;
+  constexpr bool kNewPoint = true;
+
+  MockEvaluationCallback evaluation_callback;
+  EXPECT_CALL(evaluation_callback,
+              PrepareForEvaluation(kComputeJacobians, kNewPoint))
+      .Times(1);
+
+  Problem::Options options;
+  options.evaluation_callback = &evaluation_callback;
+  ProblemImpl problem(options);
+  double x_[2] = {1, 2};
+  double y_[3] = {1, 2, 3};
+  ResidualBlockId residual_block_id =
+      problem.AddResidualBlock(IdentityFunctor::Create(), nullptr, x_, y_);
+
+  double actual_cost;
+  Vector actual_f(5);
+  Matrix actual_dfdx(5, 2);
+  Matrix actual_dfdy(5, 3);
+  double* jacobians[2] = {actual_dfdx.data(), actual_dfdy.data()};
+  EXPECT_TRUE(problem.EvaluateResidualBlock(
+      residual_block_id, true, true, &actual_cost, actual_f.data(), jacobians));
+}
+
+TEST(ProblemEvaluateResidualBlock, OldPointCallsEvaluationCallback) {
+  constexpr bool kComputeJacobians = true;
+  constexpr bool kOldPoint = false;
+
+  MockEvaluationCallback evaluation_callback;
+  EXPECT_CALL(evaluation_callback,
+              PrepareForEvaluation(kComputeJacobians, kOldPoint))
+      .Times(1);
+
+  Problem::Options options;
+  options.evaluation_callback = &evaluation_callback;
+  ProblemImpl problem(options);
+  double x_[2] = {1, 2};
+  double y_[3] = {1, 2, 3};
+  ResidualBlockId residual_block_id =
+      problem.AddResidualBlock(IdentityFunctor::Create(), nullptr, x_, y_);
+
+  double actual_cost;
+  Vector actual_f(5);
+  Matrix actual_dfdx(5, 2);
+  Matrix actual_dfdy(5, 3);
+  double* jacobians[2] = {actual_dfdx.data(), actual_dfdy.data()};
+  EXPECT_TRUE(problem.EvaluateResidualBlock(residual_block_id,
+                                            true,
+                                            kOldPoint,
+                                            &actual_cost,
+                                            actual_f.data(),
+                                            jacobians));
+}
+
 }  // namespace internal
 }  // namespace ceres