FeatureBase: covariance, information, square root information upper matrices

Joan Vallvé Navarro requested to merge compute_upper_sqrt into master

This MR proposes several changes to the FeatureBase class concerning the measurement noise (covariance, information and square root upper information):

  1. API coherence: use Information throughout the class (before, Info and Information were mixed).
  2. A new getter: getMeasurementInformation(), which is not strictly a getter: it returns the square of squareRootInformationUpper (see the sketch after this list).
  3. Check & ensure symmetry: A preliminary assert with numeric tolerance Constants::EPS to avoid wrong user input. Afterwards, when setting the input matrix it uses Eigen::selfAdjointView which takes only the upper triangle to build an exactly symmetric matrix.
  4. Avoid singular covariance: the previous implementation did not ensure SDP (it was a simple if condition that regularized by adding a small constant diagonal matrix). The new function avoidSingularCovariance() uses a while loop that regularizes the matrix by adding a constant diagonal matrix of increasing value until the matrix is SDP.
  5. computeSqrtUpper() computes the upper square root: the previous implementation computed the upper square root of the inverse of its input; now it does what its name says, so its input is the information matrix covariance.inverse().
  6. New implementation of computeSqrtUpper(): it now also works for (close to) SDP information matrices, for which Eigen::LLT produces surprisingly bad factorizations.
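
A minimal sketch of how the pieces above could fit together is given below, assuming Eigen. The struct name FeatureSketch, the tolerance values and the plain LLT factorization are illustrative assumptions, not the actual WOLF implementation; in particular, the improved factorization of point 6 for near-singular information matrices is not reproduced here.

```cpp
#include <Eigen/Dense>
#include <cassert>

namespace sketch {

constexpr double EPS       = 1e-8;   // assumed symmetry tolerance (stand-in for Constants::EPS)
constexpr double EPS_SMALL = 1e-16;  // assumed singularity tolerance (stand-in for Constants::EPS_SMALL)

// 4. Regularize by adding an increasingly large constant diagonal until the
//    matrix is no longer numerically singular (placeholder SDP test, see open issue).
inline void avoidSingularCovariance(Eigen::MatrixXd& _cov)
{
    double eps = EPS_SMALL;
    while (_cov.determinant() < EPS_SMALL)
    {
        _cov += eps * Eigen::MatrixXd::Identity(_cov.rows(), _cov.cols());
        eps *= 10;
    }
}

// 5. Upper square root of the *information* matrix (its input is covariance.inverse()).
inline Eigen::MatrixXd computeSqrtUpper(const Eigen::MatrixXd& _info)
{
    Eigen::LLT<Eigen::MatrixXd> llt(_info);
    return Eigen::MatrixXd(llt.matrixU());  // U such that U^T * U = info
}

struct FeatureSketch
{
    Eigen::MatrixXd measurement_covariance_;
    Eigen::MatrixXd measurement_sqrt_information_upper_;

    // 3. Check symmetry up to a tolerance, then force exact symmetry from the upper triangle.
    void setMeasurementCovariance(const Eigen::MatrixXd& _cov)
    {
        assert((_cov - _cov.transpose()).cwiseAbs().maxCoeff() < EPS && "covariance is not symmetric");
        measurement_covariance_ = _cov.selfadjointView<Eigen::Upper>();
        avoidSingularCovariance(measurement_covariance_);
        measurement_sqrt_information_upper_ = computeSqrtUpper(measurement_covariance_.inverse());
    }

    // 2. Not really a getter: squares the stored upper square root information.
    Eigen::MatrixXd getMeasurementInformation() const
    {
        return measurement_sqrt_information_upper_.transpose() * measurement_sqrt_information_upper_;
    }
};

} // namespace sketch
```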

Open issue: what does SDP mean?

  • Which test do we perform to decide it: determinant or eigen decomposition?
  • Which tolerances should we use? For information and covariance matrices the answers can be different.

@artivis proposed using the wolf.h functions isSymmetric(), isPositiveSemiDefinite() and isCovariance(). I will do so for sure; however, I would like to discuss the two questions above and modify their implementation if needed. A sketch of the two candidate tests follows.
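
The sketch below illustrates the two options with an explicit tolerance parameter; the function names isSDPByDeterminant and isSDPByEigenvalues are illustrative and not the wolf.h API.

```cpp
#include <Eigen/Dense>

// Cheap test: determinant above a tolerance. Can be misleading for large or badly
// scaled matrices, since the product of eigenvalues may hide a single tiny one.
inline bool isSDPByDeterminant(const Eigen::MatrixXd& M, double tol)
{
    return M.determinant() > tol;
}

// Eigen-decomposition test: every eigenvalue above a tolerance. More expensive, but
// the tolerance has a direct meaning (smallest admissible eigenvalue), which may
// justify different values for covariance and information matrices.
inline bool isSDPByEigenvalues(const Eigen::MatrixXd& M, double tol)
{
    Eigen::SelfAdjointEigenSolver<Eigen::MatrixXd> eig(M);
    return eig.eigenvalues().minCoeff() > tol;
}
```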

WIP NOTE: regularization of non-SDP matrices is not working properly. Actually, the previous implementation was not working either: after adding assert(measurement_covariance_.determinant() > Constants::EPS_SMALL) once the matrix has been regularized, some tests do not pass. Maybe we should move from adding a diagonal matrix to altering the eigendecomposition (sketched below).
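
A minimal sketch of what altering the eigendecomposition could look like, clamping small or negative eigenvalues of M = V diag(d) V^T to a floor; the threshold min_eig is an assumed parameter, not a value fixed in this MR.

```cpp
#include <Eigen/Dense>

// Regularize a symmetric matrix by raising its eigenvalues to at least min_eig,
// instead of adding a constant diagonal matrix.
inline Eigen::MatrixXd regularizeByEigendecomposition(const Eigen::MatrixXd& M, double min_eig)
{
    Eigen::SelfAdjointEigenSolver<Eigen::MatrixXd> eig(M);
    Eigen::VectorXd d = eig.eigenvalues().cwiseMax(min_eig); // clamp small/negative eigenvalues
    return eig.eigenvectors() * d.asDiagonal() * eig.eigenvectors().transpose();
}
```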

Edited by Joan Solà Ortega
