This MR proposes several changes in the `FeatureBase` class concerning the measurement noise (covariance, information, and upper square root of the information):
- Unified naming: `Information` is now used in the whole class (before, `Info` and `Information` were mixed).
- `getMeasurementInformation()` is not really a getter: it returns the square of `squareRootInformationUpper`.
- The input matrix is checked against `Constants::EPS` to avoid wrong user input. Afterwards, when setting it, `Eigen::selfAdjointView` takes only the upper triangle to build an exactly symmetric matrix.
- Before, an `if` condition regularized the matrix by adding a small constant diagonal matrix. Now the new function `avoidSingularCovariance()` uses a `while` loop that adds a constant diagonal matrix with an increasing value until the matrix is SDP.
- `computeSqrtUpper()` now does what its name says: the previous implementation computed the upper square root of the inverse of its input; now its input is the information matrix `covariance.inverse()` directly.
- `computeSqrtUpper()` also works now for (close to) SDP information matrices, for which `Eigen::LLT` produces surprisingly bad factorizations.

Open issue: what exactly does SDP mean here?
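The pipeline described above (symmetrize from the upper triangle, regularize until the factorization succeeds, take the upper Cholesky factor) can be sketched as follows. This is a minimal, Eigen-free illustration, not the actual WOLF code: the function names only mirror the MR, all signatures are assumptions, and plain `std::vector` matrices stand in for Eigen types.

```cpp
#include <cassert>
#include <cmath>
#include <vector>

// Eigen-free sketch (not the actual WOLF code) of the pipeline above:
//   1) symmetrize from the upper triangle (what Eigen::selfAdjointView does),
//   2) add an increasing diagonal until a Cholesky factorization exists
//      (the avoidSingularCovariance() idea),
//   3) compute the upper Cholesky factor U with U^T * U = M (computeSqrtUpper()).
using Mat = std::vector<double>; // n x n, row-major

// Mirror the upper triangle onto the lower one -> exactly symmetric matrix.
void symmetrizeFromUpper(Mat& m, int n) {
    for (int i = 0; i < n; ++i)
        for (int j = 0; j < i; ++j)
            m[i * n + j] = m[j * n + i];
}

// Try the Cholesky factorization M = U^T U; return false if a pivot is not
// strictly positive (i.e., the matrix is not positive definite).
bool choleskyUpper(const Mat& m, int n, Mat& u) {
    u.assign(n * n, 0.0);
    for (int i = 0; i < n; ++i) {
        for (int j = i; j < n; ++j) {
            double s = m[i * n + j];
            for (int k = 0; k < i; ++k) s -= u[k * n + i] * u[k * n + j];
            if (i == j) {
                if (s <= 0.0) return false; // not positive definite
                u[i * n + i] = std::sqrt(s);
            } else {
                u[i * n + j] = s / u[i * n + i];
            }
        }
    }
    return true;
}

// Add lambda * I with increasing lambda until the factorization succeeds.
void avoidSingular(Mat& m, int n, double eps = 1e-10) {
    Mat u;
    double lambda = eps;
    while (!choleskyUpper(m, n, u)) {
        for (int i = 0; i < n; ++i) m[i * n + i] += lambda;
        lambda *= 10.0;
    }
}
```

In the real class, `symmetrizeFromUpper` would correspond to `Eigen::selfAdjointView` over the upper triangle, and `choleskyUpper` to `Eigen::LLT` with `matrixU()`.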
@artivis proposed using the `wolf.h` functions `isSymmetric()`, `isPositiveSemiDefinite()` and `isCovariance()`. I will do it for sure; however, I would first like to discuss the previous two questions and modify their implementation if needed.
WIP NOTE: Regularization of non-SDP matrices is not working properly. Actually, the current implementation was not working either: after adding `assert(measurement_covariance_.determinant() > Constants::EPS_SMALL)` right after the regularization, some tests did not pass. Maybe we should move from adding a diagonal matrix to altering the eigendecomposition.
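The eigendecomposition alternative could look like this: clamp the eigenvalues to a small positive floor and rebuild the matrix, instead of repeatedly adding a diagonal. This is only a sketch under assumptions; with Eigen one would use `Eigen::SelfAdjointEigenSolver`, while the closed-form 2x2 case below just keeps the example self-contained.

```cpp
#include <algorithm>
#include <array>
#include <cassert>
#include <cmath>

// Sketch of the alternative mentioned in the WIP note (not the WOLF code):
// fix the eigendecomposition directly by clamping negative/tiny eigenvalues
// to a floor and rebuilding M' = l1 * v1 v1^T + l2 * v2 v2^T, which is
// symmetric positive definite by construction.
using Mat2 = std::array<double, 4>; // row-major, assumed symmetric (m[1] == m[2])

Mat2 clampEigenvalues(const Mat2& m, double floor_val) {
    const double a = m[0], b = m[1], d = m[3];
    // closed-form eigenvalues of a symmetric 2x2 matrix
    const double mean = 0.5 * (a + d);
    const double rad  = std::sqrt(0.25 * (a - d) * (a - d) + b * b);
    double l1 = mean + rad, l2 = mean - rad; // l1 >= l2
    // unit eigenvector v1 for l1 (diagonal case handled separately)
    double vx = (b != 0.0) ? l1 - d : 1.0;
    double vy = (b != 0.0) ? b      : 0.0;
    const double norm = std::sqrt(vx * vx + vy * vy);
    vx /= norm; vy /= norm;                  // v2 = (-vy, vx)
    l1 = std::max(l1, floor_val);            // clamp both eigenvalues
    l2 = std::max(l2, floor_val);
    return { l1 * vx * vx + l2 * vy * vy,
             l1 * vx * vy - l2 * vy * vx,
             l1 * vx * vy - l2 * vy * vx,
             l1 * vy * vy + l2 * vx * vx };
}
```

Unlike the diagonal-increment loop, this guarantees in one pass that the smallest eigenvalue is at least `floor_val`, while leaving the healthy eigenvalues untouched.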