Unnecessary "for" loops in "org.orekit.propagation.analytical.tle"
The package "org.orekit.propagation.analytical.tle" is a Java implementation of the algorithms described in the paper *Revisiting Spacetrack Report 3 (AIAA 2006-6753)*. I am using the Orekit implementation as a partial basis for a re-coding of those algorithms in Swift.
That required me to look closely at conversions involving language constructs that are available in Java but not in Swift. One of these is the C-style "for ( ; ; )" control statement, and I've tripped over a usage in Orekit that seems odd: it is computationally correct, but unnecessarily complicated. In the file DeepSDP4.java, a constant is defined:
~
private static final int SECULAR_INTEGRATION_ORDER = 2;
~
and later in that same file, two loops of the following form:
~
for (int j = 0; j < SECULAR_INTEGRATION_ORDER; j += 2) {
    derivs[j]     = term1a + term2a;
    derivs[j + 1] = term1b + term2b;
    if ((j + 2) < SECULAR_INTEGRATION_ORDER) {
        term1a  = -term1a;
        term2a *= -4.0;
        term1b  = -term1b;
        term2b *= -4.0;
    }
}
~
As I read this, the "for" loop will execute only once (with j = 0), and the contained "if" condition will never be true. That reduces the above code to:
~
derivs[0] = term1a + term2a;
derivs[1] = term1b + term2b;
~
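The equivalence is easy to confirm outside Orekit. Here is a minimal, self-contained sketch (the term values are arbitrary placeholders, not values taken from DeepSDP4) showing that, with SECULAR_INTEGRATION_ORDER fixed at 2, the original loop and the two straight-line assignments produce identical results:

```java
public class LoopReductionCheck {
    private static final int SECULAR_INTEGRATION_ORDER = 2;

    public static void main(String[] args) {
        // Arbitrary, hypothetical term values just to exercise the loop.
        double term1a = 1.5, term2a = -0.25, term1b = 3.0, term2b = 0.75;
        double[] derivs = new double[SECULAR_INTEGRATION_ORDER];

        // Original Orekit-style loop.
        for (int j = 0; j < SECULAR_INTEGRATION_ORDER; j += 2) {
            derivs[j]     = term1a + term2a;
            derivs[j + 1] = term1b + term2b;
            if ((j + 2) < SECULAR_INTEGRATION_ORDER) {
                // Never reached while SECULAR_INTEGRATION_ORDER == 2.
                term1a  = -term1a;
                term2a *= -4.0;
                term1b  = -term1b;
                term2b *= -4.0;
            }
        }

        // Proposed straight-line replacement.
        double[] reduced = { term1a + term2a, term1b + term2b };

        System.out.println(derivs[0] == reduced[0] && derivs[1] == reduced[1]);
    }
}
```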
To check this, I made that change (and one other, similar change nearby) and re-ran the unit tests with no failures.
There is one more "for" loop that will execute only once (with j = 2):
~
for (int j = 2; j <= SECULAR_INTEGRATION_ORDER; ++j) {
    xlpow       *= xldot;
    derivs[j - 1] *= xlpow;
    delt_factor *= delt / (double) j;
    xli         += delt_factor * derivs[j - 2];
    xni         += delt_factor * derivs[j - 1];
}
~
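By the same reasoning, with SECULAR_INTEGRATION_ORDER == 2 this loop could also be flattened into five straight-line statements. A minimal sketch comparing the two forms (all starting values are hypothetical stand-ins for the state in DeepSDP4, not values from the real propagator):

```java
public class SecondLoopReduction {
    public static void main(String[] args) {
        // Hypothetical stand-ins for the state in DeepSDP4.
        final double xldot = 0.01, delt = 720.0;
        double[] derivs = { 2.0e-3, -5.0e-4 };
        double xli = 1.0, xni = 0.005, xlpow = 1.0, delt_factor = delt;

        // Independent copies for the straight-line version.
        double[] derivs2 = derivs.clone();
        double xli2 = xli, xni2 = xni, xlpow2 = xlpow, deltFactor2 = delt_factor;

        // Original loop: a single pass with j = 2.
        for (int j = 2; j <= 2; ++j) {
            xlpow         *= xldot;
            derivs[j - 1] *= xlpow;
            delt_factor   *= delt / (double) j;
            xli           += delt_factor * derivs[j - 2];
            xni           += delt_factor * derivs[j - 1];
        }

        // Equivalent straight-line form with j fixed at 2.
        xlpow2      *= xldot;
        derivs2[1]  *= xlpow2;
        deltFactor2 *= delt / 2.0;
        xli2        += deltFactor2 * derivs2[0];
        xni2        += deltFactor2 * derivs2[1];

        System.out.println(xli == xli2 && xni == xni2 && derivs[1] == derivs2[1]);
    }
}
```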
I enclose a patch file to make the necessary changes.
(from redmine: issue id 342, created on 2017-06-01, closed on 2017-07-27)
- Uploads:
- patch.diff patch file to make the changes