bug #1538: update manual pages regarding BDCSVD.

This commit is contained in:
Gael Guennebaud 2018-04-11 10:46:11 +02:00
parent c91906b065
commit e798466871
4 changed files with 39 additions and 7 deletions


@ -16,7 +16,7 @@ equations is the fastest but least accurate, and the QR decomposition is in betw
\section LeastSquaresSVD Using the SVD decomposition
The \link JacobiSVD::solve() solve() \endlink method in the JacobiSVD class can be directly used to
The \link BDCSVD::solve() solve() \endlink method in the BDCSVD class can be directly used to
solve linear least squares systems. It is not enough to compute only the singular values (the default for
this class); you also need the singular vectors but the thin SVD decomposition suffices for
computing least squares solutions:


@ -4,7 +4,7 @@ namespace Eigen {
This page presents a catalogue of the dense matrix decompositions offered by Eigen.
For an introduction on linear solvers and decompositions, check this \link TutorialLinearAlgebra page \endlink.
To get an overview of the true relative speed of the different decomposition, check this \link DenseDecompositionBenchmark benchmark \endlink.
To get an overview of the true relative speed of the different decompositions, check this \link DenseDecompositionBenchmark benchmark \endlink.
\section TopicLinAlgBigTable Catalogue of decompositions offered by Eigen
@ -113,6 +113,18 @@ To get an overview of the true relative speed of the different decomposition, ch
<tr><th class="inter" colspan="9">\n Singular values and eigenvalues decompositions</th></tr>
<tr>
<td>BDCSVD (divide \& conquer)</td>
<td>-</td>
<td>One of the fastest SVD algorithms</td>
<td>Excellent</td>
<td>Yes</td>
<td>Singular values/vectors, least squares</td>
<td>Yes (and does least squares)</td>
<td>Excellent</td>
<td>Blocked bidiagonalization</td>
</tr>
<tr>
<td>JacobiSVD (two-sided)</td>
<td>-</td>


@ -73,7 +73,7 @@ depending on your matrix and the trade-off you want to make:
<td>ColPivHouseholderQR</td>
<td>colPivHouseholderQr()</td>
<td>None</td>
<td>++</td>
<td>+</td>
<td>-</td>
<td>+++</td>
</tr>
@ -85,6 +85,14 @@ depending on your matrix and the trade-off you want to make:
<td>- -</td>
<td>+++</td>
</tr>
<tr class="alt">
<td>CompleteOrthogonalDecomposition</td>
<td>completeOrthogonalDecomposition()</td>
<td>None</td>
<td>+</td>
<td>-</td>
<td>+++</td>
</tr>
<tr class="alt">
<td>LLT</td>
<td>llt()</td>
@ -101,15 +109,24 @@ depending on your matrix and the trade-off you want to make:
<td>+</td>
<td>++</td>
</tr>
<tr class="alt">
<td>BDCSVD</td>
<td>bdcSvd()</td>
<td>None</td>
<td>-</td>
<td>-</td>
<td>+++</td>
</tr>
<tr class="alt">
<td>JacobiSVD</td>
<td>jacobiSvd()</td>
<td>None</td>
<td>- -</td>
<td>-</td>
<td>- - -</td>
<td>+++</td>
</tr>
</table>
To get an overview of the true relative speed of the different decompositions, check this \link DenseDecompositionBenchmark benchmark \endlink.
All of these decompositions offer a solve() method that works as in the above example.
@ -183,8 +200,11 @@ Here is an example:
\section TutorialLinAlgLeastsquares Least squares solving
The most accurate method to do least squares solving is with a SVD decomposition. Eigen provides one
as the JacobiSVD class, and its solve() is doing least-squares solving.
The most accurate method to do least squares solving is with an SVD decomposition.
Eigen provides two implementations.
The recommended one is the BDCSVD class, which scales well for large problems
and automatically falls back to the JacobiSVD class for smaller problems.
For both classes, their solve() method does least-squares solving.
Here is an example:
<table class="example">


@ -11,5 +11,5 @@ int main()
VectorXf b = VectorXf::Random(3);
cout << "Here is the right hand side b:\n" << b << endl;
cout << "The least-squares solution is:\n"
<< A.jacobiSvd(ComputeThinU | ComputeThinV).solve(b) << endl;
<< A.bdcSvd(ComputeThinU | ComputeThinV).solve(b) << endl;
}