computeLimitVisibilityPoint misinterprets Azimuth
The following Python code, using Orekit as of 2013-08-22:
ITRF2005 = FramesFactory.getITRF2005()
earth = OneAxisEllipsoid(Constants.WGS84_EARTH_EQUATORIAL_RADIUS,
                         Constants.WGS84_EARTH_FLATTENING,
                         ITRF2005)
staFrame = TopocentricFrame(earth, GeodeticPoint(0.0, 0.0, 0.0), 'test')
print staFrame.computeLimitVisibilityPoint(
    Constants.WGS84_EARTH_EQUATORIAL_RADIUS + 600000, 0.0, 5.0)
{lat: 0 deg, lon: 148.4942265118 deg, alt: 599999.9999999991}
staFrame.getNorth()
Out[65]: <Vector3D: {-0; -0; 1}>
With azimuth 0.0 (assumed to point north), the returned point should lie due north of the station, i.e. only its latitude should change; instead, the longitude changes and the latitude stays at 0.
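For reference, the expected geometry can be sketched in pure Python under a spherical-Earth approximation (the function name `expected_limit_point` and the formula are illustrative assumptions, not Orekit's actual implementation): the Earth central angle to the visibility limit follows from the station elevation and target radius, and moving that angular distance along the azimuth bearing from a station at (0, 0) with azimuth 0 must change only the latitude.

```python
import math

def expected_limit_point(station_lat, station_lon, azimuth, elevation,
                         altitude, re=6378137.0):
    """Illustrative spherical-Earth sketch (not Orekit code) of where a
    limit-visibility point should end up. All input angles in radians;
    returns (lat, lon) in degrees."""
    r = re + altitude
    # Earth central angle between station and sub-satellite point for a
    # target at radius r seen at the given elevation (spherical Earth).
    gamma = math.acos((re / r) * math.cos(elevation)) - elevation
    # Great-circle destination formula: travel `gamma` along bearing
    # `azimuth`, with azimuth 0 meaning north.
    lat = math.asin(math.sin(station_lat) * math.cos(gamma)
                    + math.cos(station_lat) * math.sin(gamma)
                    * math.cos(azimuth))
    lon = station_lon + math.atan2(
        math.sin(azimuth) * math.sin(gamma) * math.cos(station_lat),
        math.cos(gamma) - math.sin(station_lat) * math.sin(lat))
    return math.degrees(lat), math.degrees(lon)

# Station on the equator, azimuth 0 (north), elevation 5 deg, altitude 600 km:
lat, lon = expected_limit_point(0.0, 0.0, 0.0, math.radians(5.0), 600000.0)
print(lat, lon)  # latitude moves north of the station, longitude stays 0
```

This confirms the expectation stated above: for azimuth 0 the longitude must remain exactly 0, which the actual Orekit output violates.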
Luc's comment from the mailing list: a very quick look makes me think there is an inconsistency between the pointAtDistance method and the other ones (like getAzimuth/getElevation) about how the frame is oriented.
(from redmine: issue id 145, created on 2013-08-22, closed on 2013-08-22)