Commit 25b81842
Authored 3 years ago by Bryan Cazabonne

    Improve JavaDoc of SequentialBatchLSEstimator.

Parent: ba5f6152

Showing 1 changed file with 63 additions and 10 deletions:

src/main/java/org/orekit/estimation/leastsquares/SequentialBatchLSEstimator.java (+63 −10)
@@ -16,6 +16,8 @@
  */
 package org.orekit.estimation.leastsquares;
 
+import org.hipparchus.linear.MatrixDecomposer;
+import org.hipparchus.linear.QRDecomposer;
 import org.hipparchus.optim.nonlinear.vector.leastsquares.LeastSquaresProblem.Evaluation;
 import org.hipparchus.optim.nonlinear.vector.leastsquares.SequentialGaussNewtonOptimizer;
 import org.orekit.propagation.conversion.OrbitDeterminationPropagatorBuilder;
@@ -24,18 +26,68 @@ import org.orekit.propagation.conversion.PropagatorBuilder;
 /**
  * Sequential least squares estimator for orbit determination.
  * <p>
- * This class extends {@link BatchLSEstimator}. It uses the result of a previous
- * optimization with its {@link Evaluation} and re-estimate the orbit with new measures
- * and the previous Evaluation.
- * </p>
- * <p>
- * When an orbit has already been estimated and measures are given, it is not efficient
+ * When an orbit has already been estimated and new measurements are given, it is not efficient
  * to re-optimize the whole problem. Only considering the new measures while optimizing
- * will neither give good results as the old measures will not be taken into account.
+ * will not give good results either, as the old measurements will not be taken into account.
  * Thus, a sequential estimator is used to estimate the orbit, which uses the old results
- * of the estimation (with the old evaluation) and the new measures.
+ * of the estimation and the new measurements.
  * </p>
  * <p>
- *
+ * In order to perform a sequential optimization, the user must configure a
+ * {@link SequentialGaussNewtonOptimizer}. Depending on whether its input data are
+ * an empty {@link Evaluation}, a complete <code>Evaluation</code>, or an a priori
+ * state and covariance, different configurations are possible.
+ * <p>
+ * <b>1. No input data from a previous estimation</b>
+ * <p>
+ * In this case, the {@link SequentialBatchLSEstimator} can be used like a {@link BatchLSEstimator}
+ * to perform the estimation. The user can initialize the <code>SequentialGaussNewtonOptimizer</code>
+ * using the default constructor.
+ * <p>
+ * <code>final SequentialGaussNewtonOptimizer optimizer = new SequentialGaussNewtonOptimizer();</code>
+ * <p>
+ * By default, a {@link QRDecomposer} is used as decomposition algorithm. In addition, normal
+ * equations are not formed. It is possible to update these two default configurations by using:
+ * <ul>
+ * <li>the {@link SequentialGaussNewtonOptimizer#withDecomposer(MatrixDecomposer) withDecomposer} method:
+ *     <code>optimizer.withDecomposer(newDecomposer);</code>
+ * </li>
+ * <li>the {@link SequentialGaussNewtonOptimizer#withFormNormalEquations(boolean) withFormNormalEquations} method:
+ *     <code>optimizer.withFormNormalEquations(newFormNormalEquations);</code>
+ * </li>
+ * </ul>
+ * <p>
+ * <b>2. Initialization using a previous <code>Evaluation</code></b>
+ * <p>
+ * In this situation, it is recommended to use the second constructor of the optimizer class.
+ * <p>
+ * <code>final SequentialGaussNewtonOptimizer optimizer = new SequentialGaussNewtonOptimizer(decomposer,
+ *                                                                                           formNormalEquations,
+ *                                                                                           evaluation);</code>
+ * <p>
+ * Using this constructor, the user can directly configure the MatrixDecomposer and set the flag
+ * for normal equations without calling the two previously presented methods.
+ * <p>
+ * <i>Note:</i> This constructor can also be used to perform the initialization of <b>1.</b>
+ * In this case, the <code>Evaluation evaluation</code> is <code>null</code>.
+ * <p>
+ * <b>3. Initialization using an a priori estimated state and covariance</b>
+ * <p>
+ * This situation is a classical need in satellite operations: the results of a previous orbit
+ * determination (estimated state and covariance), performed the day before, are used to improve
+ * the initialization and the results of an orbit determination performed the current day.
+ * In this situation, the user can initialize the <code>SequentialGaussNewtonOptimizer</code>
+ * using the default constructor.
+ * <p>
+ * <code>final SequentialGaussNewtonOptimizer optimizer = new SequentialGaussNewtonOptimizer();</code>
+ * <p>
+ * The MatrixDecomposer and the normal equations flag can again be updated using the two previously
+ * presented methods. The a priori state and covariance matrix can be set using:
+ * <ul>
+ * <li>the {@link SequentialGaussNewtonOptimizer#withAPrioriData(org.hipparchus.linear.RealVector, org.hipparchus.linear.RealMatrix) withAPrioriData} method:
+ *     <code>optimizer.withAPrioriData(aPrioriState, aPrioriCovariance);</code>
+ * </li>
+ * </ul>
  * @author Julie Bayard
  * @since 11.0
  */
@@ -60,6 +112,7 @@ public class SequentialBatchLSEstimator extends BatchLSEstimator {
  * <p>
  * The solver used for sequential least squares problem is a
  * {@link SequentialGaussNewtonOptimizer sequential Gauss Newton optimizer}.
+ * Details about how to initialize it are given in the class JavaDoc.
  * </p>
  *
  * @param sequentialOptimizer solver for sequential least squares problem
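
To make the three configurations described in the class JavaDoc above concrete, here is a minimal sketch (not part of the commit) of how the optimizer might be built in each case. It uses only the Hipparchus constructors and methods the JavaDoc references (the default constructor, the (decomposer, formNormalEquations, evaluation) constructor, and withAPrioriData), and it assumes the fluent with* methods return the configured optimizer instance; previousEvaluation, aPrioriState and aPrioriCovariance are hypothetical inputs, and the QRDecomposer threshold is an arbitrary example value.

import org.hipparchus.linear.QRDecomposer;
import org.hipparchus.linear.RealMatrix;
import org.hipparchus.linear.RealVector;
import org.hipparchus.optim.nonlinear.vector.leastsquares.LeastSquaresProblem.Evaluation;
import org.hipparchus.optim.nonlinear.vector.leastsquares.SequentialGaussNewtonOptimizer;

public class OptimizerConfigurationSketch {

    // Case 1: no data from a previous estimation; the defaults
    // (QR decomposition, normal equations not formed) are kept.
    static SequentialGaussNewtonOptimizer noPreviousData() {
        return new SequentialGaussNewtonOptimizer();
    }

    // Case 2: restart from a previous Evaluation, setting the decomposer and the
    // normal-equations flag directly in the constructor (threshold value is illustrative).
    static SequentialGaussNewtonOptimizer fromPreviousEvaluation(final Evaluation previousEvaluation) {
        return new SequentialGaussNewtonOptimizer(new QRDecomposer(1.0e-11), false, previousEvaluation);
    }

    // Case 3: restart from an a priori state and covariance, e.g. the result of a
    // previous day's orbit determination (assumes withAPrioriData returns the
    // configured optimizer, as in the Hipparchus fluent style).
    static SequentialGaussNewtonOptimizer fromAPrioriData(final RealVector aPrioriState,
                                                          final RealMatrix aPrioriCovariance) {
        return new SequentialGaussNewtonOptimizer().withAPrioriData(aPrioriState, aPrioriCovariance);
    }
}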
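
The constructor documented in the last hunk takes the configured optimizer together with the propagator builder(s). The sketch below shows how the estimator might then be driven once one of the optimizers above has been built; it assumes the constructor mirrors BatchLSEstimator (optimizer followed by one or more OrbitDeterminationPropagatorBuilder instances), that the usual BatchLSEstimator methods (setParametersConvergenceThreshold, setMaxIterations, setMaxEvaluations, addMeasurement, estimate) are inherited, and that estimate() returns the estimated propagators; propagatorBuilder, newMeasurements and the threshold values are hypothetical.

import java.util.List;

import org.hipparchus.optim.nonlinear.vector.leastsquares.SequentialGaussNewtonOptimizer;
import org.orekit.estimation.leastsquares.SequentialBatchLSEstimator;
import org.orekit.estimation.measurements.ObservedMeasurement;
import org.orekit.propagation.Propagator;
import org.orekit.propagation.conversion.OrbitDeterminationPropagatorBuilder;

public class SequentialEstimationSketch {

    // propagatorBuilder and newMeasurements are assumed to be built elsewhere
    // (e.g. a numerical propagator builder and the measurements collected since
    // the previous orbit determination).
    static Propagator[] reEstimate(final SequentialGaussNewtonOptimizer optimizer,
                                   final OrbitDeterminationPropagatorBuilder propagatorBuilder,
                                   final List<ObservedMeasurement<?>> newMeasurements) {

        // the sequential estimator is used like a regular BatchLSEstimator
        final SequentialBatchLSEstimator estimator =
                        new SequentialBatchLSEstimator(optimizer, propagatorBuilder);
        estimator.setParametersConvergenceThreshold(1.0e-3);
        estimator.setMaxIterations(20);
        estimator.setMaxEvaluations(25);

        // only the new measurements are added; the information from the previous
        // estimation comes through the optimizer configuration
        for (final ObservedMeasurement<?> measurement : newMeasurements) {
            estimator.addMeasurement(measurement);
        }

        // run the sequential batch least squares estimation
        return estimator.estimate();
    }
}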