FeatureBase: covariance, information, square root information upper matrices
This MR proposes several changes in the `FeatureBase` class concerning the measurement noise (covariance, information and square root upper information):

- API coherence: use `Information` throughout the whole class (before there were both `mixedInfo` and `Information`).
- A new getter, `getMeasurementInformation()`, which is not really a getter: it returns the square of `squareRootInformationUpper` (see the first sketch after this list).
- Check & ensure symmetry: a preliminary assert with numeric tolerance `Constants::EPS` rejects wrong user input. Afterwards, when setting the input matrix, `Eigen`'s `selfAdjointView` is used, which takes only the upper triangle to build an exactly symmetric matrix (see the sketch after this list).
- Avoid singular covariance: the previous implementation did not ensure SDP (it was a simple `if` condition that regularized by adding a small constant diagonal matrix). The new function `avoidSingularCovariance()` uses a `while` loop that regularizes the matrix by adding a constant diagonal matrix with an increasing value until the matrix is SDP (see the sketch after this list).
- `computeSqrtUpper()` computes the sqrt upper: the previous implementation computed the sqrt upper of the inverse of its input; now it does what its name says, so its input is the information matrix `covariance.inverse()`.
- New implementation of `computeSqrtUpper()`: it now also works for (close to) SDP information matrices, for which `Eigen::LLT` produces surprisingly bad factorizations (see the sketch after this list).
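As a minimal sketch of what the new getter does (function and parameter names here are illustrative, not the actual WOLF code), the information matrix is simply the square of the stored upper square root:

```cpp
#include <Eigen/Dense>

// If R is the upper square root of the information matrix, the information
// matrix itself is R^T * R, so this "getter" recomputes it on the fly
// instead of returning a stored member.
Eigen::MatrixXd getMeasurementInformation(const Eigen::MatrixXd& sqrt_information_upper)
{
    return sqrt_information_upper.transpose() * sqrt_information_upper;
}
```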
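A sketch of the symmetry check and the `selfAdjointView` step, assuming a setter-like signature and a tolerance in the role of `Constants::EPS` (both are illustrative):

```cpp
#include <Eigen/Dense>
#include <cassert>

void setMeasurementCovariance(Eigen::MatrixXd& measurement_covariance, // stored member
                              const Eigen::MatrixXd& cov,              // user input
                              double eps = 1e-8)                       // numeric tolerance
{
    // Reject clearly asymmetric user input, up to the numeric tolerance.
    assert((cov - cov.transpose()).cwiseAbs().maxCoeff() < eps && "Covariance is not symmetric!");

    // Keep only the upper triangle and mirror it, so the stored matrix is exactly symmetric.
    measurement_covariance = cov.selfAdjointView<Eigen::Upper>();
}
```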
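A sketch of the `while`-based regularization; the determinant-based singularity test, the starting value and the growth factor are assumptions, not necessarily what the MR implements:

```cpp
#include <Eigen/Dense>

void avoidSingularCovariance(Eigen::MatrixXd& cov, double eps_small = 1e-16)
{
    double diag = eps_small;
    // Keep adding a (growing) constant diagonal until the matrix stops being singular.
    while (cov.determinant() <= eps_small)
    {
        cov += diag * Eigen::MatrixXd::Identity(cov.rows(), cov.cols());
        diag *= 10.0; // increase the regularization at each iteration
    }
}
```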
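A sketch of a `computeSqrtUpper()` that takes the information matrix directly; the acceptance test on the Cholesky factor and the eigendecomposition fallback are one possible way to handle the close-to-singular case, not necessarily the one in this MR:

```cpp
#include <Eigen/Dense>

Eigen::MatrixXd computeSqrtUpper(const Eigen::MatrixXd& info, double tol = 1e-8)
{
    // Standard case: Cholesky gives an upper triangular U with U^T * U = info.
    Eigen::LLT<Eigen::MatrixXd> llt(info);
    Eigen::MatrixXd U = llt.matrixU();
    if (llt.info() == Eigen::Success && (U.transpose() * U - info).cwiseAbs().maxCoeff() < tol)
        return U;

    // Close-to-singular case: R = sqrt(Lambda) * V^T also satisfies R^T * R = info
    // (R is not triangular; a QR decomposition of it would recover an upper factor).
    Eigen::SelfAdjointEigenSolver<Eigen::MatrixXd> es(info);
    Eigen::VectorXd sqrt_eigs = es.eigenvalues().cwiseMax(0.0).cwiseSqrt();
    return sqrt_eigs.asDiagonal() * es.eigenvectors().transpose();
}
```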
Open issue: what does SDP mean?

- Which test do we perform to decide it: determinant or eigen decomposition? (See the sketch after this list.)
- Which tolerances should we use? For information and covariance the answers can be different.
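For the first question, the two candidate checks could look roughly like this; the `tol` parameters are placeholders, since choosing them is exactly the open point:

```cpp
#include <Eigen/Dense>

// Determinant test: cheap, but the determinant scales with matrix size and units,
// so a single tolerance is hard to choose.
bool isPositiveByDeterminant(const Eigen::MatrixXd& M, double tol)
{
    return M.determinant() > tol;
}

// Eigenvalue test: more expensive, but checks every mode of the matrix independently.
bool isPositiveByEigenvalues(const Eigen::MatrixXd& M, double tol)
{
    Eigen::SelfAdjointEigenSolver<Eigen::MatrixXd> es(M, Eigen::EigenvaluesOnly);
    return es.eigenvalues().minCoeff() > tol;
}
```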
@artivis proposed using the `wolf.h` functions `isSymmetric()`, `isPositiveSemiDefinite()` and `isCovariance()`. I will do that for sure; however, I would like to discuss the two questions above and modify their implementation if needed.
WIP NOTE: Regularization of non-SDP matrices is not working properly (actually, the previous implementation was not working either). After adding `assert(measurement_covariance_.determinant() > Constants::EPS_SMALL)` right after regularizing the matrix, some tests did not pass. Maybe we should move from adding a diagonal matrix to altering the eigen decomposition (see the sketch below).
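A sketch of the eigendecomposition-based alternative: clamp the small eigenvalues and rebuild the matrix, instead of adding a diagonal to the whole matrix (the threshold and function name are assumptions):

```cpp
#include <Eigen/Dense>

void regularizeByEigenDecomposition(Eigen::MatrixXd& cov, double min_eig = 1e-8)
{
    Eigen::SelfAdjointEigenSolver<Eigen::MatrixXd> es(cov);
    // Raise every eigenvalue below the threshold up to the threshold.
    Eigen::VectorXd eigs = es.eigenvalues().cwiseMax(min_eig);
    // Rebuild an exactly symmetric, strictly positive definite matrix.
    cov = es.eigenvectors() * eigs.asDiagonal() * es.eigenvectors().transpose();
}
```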