1 Introduction
In recent years, various methods utilizing image processing techniques have been proposed for planetary exploration spacecraft. For example, HAYABUSA attempted autonomous navigation and guidance during its proximity phase using optical cameras and a laser altimeter (LIDAR), and demonstrated some of its autonomous functions[1][2].
For spacecraft such as landers, one of the important issues in applying camera-image-based guidance and navigation is the error caused by the depth information of the scene in the images, which cannot be precisely estimated by cameras alone.
A Laser Range Finder (LRF) is one solution for reducing such errors. In previous work, to measure accurate shapes of archaeological buildings with a balloon-mounted 2D scanning LRF, a method utilizing a CCD camera combined with the LRF was proposed[3]. Although that method can estimate the motion of the sensor (the balloon) as well as the targets' shapes, the computational load can be serious because of the nonlinear optimization over a function of a large number of variables.
Instead of using conventional imaging devices such as a CCD camera, this study proposes an approach to estimating the lander's motion from a range image sequence acquired by a 2D scanning LRF, aimed at autonomous guidance and navigation during the final descent phase. Since a range image can be regarded as CCD-camera pixel data that contains range information instead of the original luminance, Computer Vision theories can be applied. The proposed method estimates the lander's 6-degree-of-freedom (6-DOF) motion with an Extended Kalman Filter algorithm, using image processing techniques applied to the range images. Focusing on the simulation study, this paper treats a 2D scanning LRF with desirable specifications, that is, resolution and scanning rate as high as those of a common CCD camera. Finally, the effectiveness of the proposed method is demonstrated through a series of numerical simulations, and some remaining issues are discussed.
Fig. 1: Relationship between a scene point's motion in images and the lander's motion.
2 Relationship between a Scene Point's Motion in Range Images and the Lander's Motion
In this study, to describe scene points' locations and their motion in range images, the perspective projection model shown in Fig. 1 is introduced. When a scene point at (x_img, y_img) in image coordinates moves at a velocity of (ẋ_img, ẏ_img) between two sequential images, the relationship between the scene point's velocity (well known as 'optical flow' in Computer Vision theory) and the motion of the lander (or the lander-mounted LRF) can be described as follows:
$$
\dot{x}_{img} = f\left(-\frac{u}{z_c} - q + r\,y_{img}\right) + f\,x_{img}\left(\frac{w}{z_c} + p\,y_{img} - q\,x_{img}\right) \quad (1)
$$
$$
\dot{y}_{img} = f\left(-\frac{v}{z_c} + p - r\,x_{img}\right) + f\,y_{img}\left(\frac{w}{z_c} + p\,y_{img} - q\,x_{img}\right) \quad (2)
$$
where (u, v, w, p, q, r) are the lander's velocities in 6-DOF motion, and z_c denotes the scene point's position along the z-axis of the body-fixed coordinates. f denotes the focal length of an equivalent camera, which can be calculated from the field of view and the resolution of the 2D scanning LRF. To simplify the model, the LRF's observing point is assumed to coincide with the center of gravity of the lander, as shown in Fig. 1.
The above differential equations are called the 'Image Jacobian'[4]. If the external and internal parameters of the imaging device, z_c and f, are given in advance, and optical flows for a series of scene points in the image sequence can be obtained, the 6-DOF motion of the lander can be estimated by methods such as least squares, the Kalman Filter, etc.
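As a concrete illustration, the following minimal sketch (in Python with NumPy; the function names and the synthetic test data are our own assumptions, not code from this study) stacks the two Image-Jacobian rows of Eqs. (1) and (2) for each tracked point and recovers the 6-DOF motion by least squares:

```python
# Minimal sketch: least-squares 6-DOF motion estimation from optical flows
# via the Image Jacobian of Eqs. (1)-(2). Illustrative names and data.
import numpy as np

def jacobian_rows(x, y, zc, f=1.0):
    """Two Image-Jacobian rows for one scene point at normalized image
    coordinates (x, y) with depth zc, per Eqs. (1)-(2)."""
    row_x = [-f / zc, 0.0, f * x / zc, f * x * y, -f * (1.0 + x * x), f * y]
    row_y = [0.0, -f / zc, f * y / zc, f * (1.0 + y * y), -f * x * y, -f * x]
    return np.array([row_x, row_y])

def estimate_motion(points, flows, depths, f=1.0):
    """Stack two rows per tracked point and solve for (u, v, w, p, q, r)."""
    J = np.vstack([jacobian_rows(x, y, zc, f)
                   for (x, y), zc in zip(points, depths)])
    b = np.asarray(flows).reshape(-1)            # (x_dot, y_dot) per point
    sol, *_ = np.linalg.lstsq(J, b, rcond=None)
    return sol

# Self-check with synthetic data: flows generated from a known motion.
rng = np.random.default_rng(0)
pts = rng.uniform(-0.17, 0.17, size=(50, 2))     # within a 0.35 m plane, f = 1 m
zc = rng.uniform(80.0, 120.0, size=50)           # assumed depths [m]
true = np.array([0.5, -0.2, -1.0, 0.01, -0.02, 0.03])
flows = np.vstack([jacobian_rows(x, y, z) @ true for (x, y), z in zip(pts, zc)])
print(estimate_motion(pts, flows, zc))           # ~ recovers `true`
```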
However, when using a CCD camera, deriving the depth information (z_c) from camera images alone is usually a hard task, and uncertainty remains even if effective techniques such as Stereo Vision are applied.
On the other hand, a range image acquired by an LRF obviously includes the depth data for the scene points, and if it can be obtained with sufficiently high resolution and scanning rate, conventional image processing techniques can be applied directly.
3 Lander's Motion Parameter Estimation from Range Images
Considering the optical flow of the ith scene point (i = 1, 2, ...) in the kth frame (k = 1, 2, ...) of the range image, and noting that z_c and w are directly measured by the LRF, Eqs. (1) and (2) can be rearranged into the following matrix form:
$$
\begin{pmatrix}
\dot{x}^{(i)}_{img}(k) - f\,\dfrac{x^{(i)}_{img}}{z_c(k)}\,w(k) \\
\dot{y}^{(i)}_{img}(k) - f\,\dfrac{y^{(i)}_{img}}{z_c(k)}\,w(k) \\
w(k)
\end{pmatrix}
=
\begin{pmatrix}
-\dfrac{f}{z_c(k)} & 0 & 0 & c^{(i)}_{14} & c^{(i)}_{15} & c^{(i)}_{16} \\
0 & -\dfrac{f}{z_c(k)} & 0 & c^{(i)}_{24} & c^{(i)}_{25} & c^{(i)}_{26} \\
0 & 0 & 1 & 0 & 0 & 0
\end{pmatrix}
\left(u(k), v(k), w(k), p(k), q(k), r(k)\right)^T, \quad (3)
$$
where
$$
\begin{aligned}
c^{(i)}_{14} &= f\,x^{(i)}_{img}(k)\,y^{(i)}_{img}(k), &
c^{(i)}_{15} &= -f\left(1 + x^{(i)}_{img}(k)^2\right), &
c^{(i)}_{16} &= f\,y^{(i)}_{img}(k), \\
c^{(i)}_{24} &= f\left(1 + y^{(i)}_{img}(k)^2\right), &
c^{(i)}_{25} &= -f\,x^{(i)}_{img}(k)\,y^{(i)}_{img}(k), &
c^{(i)}_{26} &= -f\,x^{(i)}_{img}(k).
\end{aligned}
$$
In the framework of Computer Vision theory, the camera's motion (u, v, w, p, q, r) is estimated from the measurement data (ẋ_img, ẏ_img) in Eqs. (1) and (2) for a series of scene points. These optical flows can be obtained through a tracking method[5] applied to the image sequence, while the remaining unknown parameters z_c must be provided by something other than the single camera image. In the case of an LRF, however, since the depth and its variation (z_c and w) can be directly measured from the range images, the LRF's motion can be estimated more effectively from the measurement data on the left-hand side of Eq. (3); this differs from image processing methods based on a camera's pixel data.
If the measured variables for all d scene points in an image are denoted as
$$
z(k) = \left(z^{(1)}_1(k), z^{(1)}_2(k), \cdots, z^{(d)}_1(k), z^{(d)}_2(k), w(k)\right)^T, \quad (4)
$$
where $z^{(i)}_1(k) = \dot{x}^{(i)}_{img}(k) - f\,\frac{x^{(i)}_{img}}{z_c(k)}\,w(k)$ and $z^{(i)}_2(k) = \dot{y}^{(i)}_{img}(k) - f\,\frac{y^{(i)}_{img}}{z_c(k)}\,w(k)$ $(i = 1, 2, \cdots, d)$, and the motion parameters are taken as state variables such that $x(k) = (u(k), v(k), w(k), p(k), q(k), r(k))^T$, the concerned system can be expressed by state and measurement equations as follows:
$$
\dot{x}(k) = A(x(k)) + Bu(k) + v(k), \quad (5)
$$
$$
z(k) = Cx(k) + w(k), \quad (6)
$$
$$
k = 1, 2, \cdots
$$
where $u \in R^{6\times1}$ is the control input for the lander's motion, and $v \in R^{6\times1}$ and $w(k) \in R^{d\times1}$ denote the system and measurement noises, which satisfy the expectation conditions $E[vv^T] = S \in R^{6\times6}$ and $E[ww^T] = T \in R^{d\times d}$.
Also, Eq. (5) describes the lander's dynamics of motion such that
$$
A = \begin{pmatrix}
-qw + rv \\ -ru + pw \\ -pv + qu \\
J_{p1}\,qr + J_{p2}\,pq \\
J_{q1}(p^2 - r^2) + J_{q2}\,rp \\
J_{r1}\,qr + J_{r2}\,pq
\end{pmatrix}, \quad (7)
$$
$$
B = \begin{pmatrix}
1/m & 0 & 0 & 0 & 0 & 0 \\
0 & 1/m & 0 & 0 & 0 & 0 \\
0 & 0 & 1/m & 0 & 0 & 0 \\
0 & 0 & 0 & J_{p3} & 0 & J_{r3} \\
0 & 0 & 0 & 0 & J_{q3} & 0 \\
0 & 0 & 0 & J_{p4} & 0 & J_{r4}
\end{pmatrix}, \quad (8)
$$
where $J_{p1}, J_{p2}, \cdots$ denote inertia parameters; for the inertia matrix of the lander's body,
$$
J = \begin{pmatrix}
J_{xx} & J_{xy} & J_{xz} \\
J_{xy} & J_{yy} & J_{yz} \\
J_{xz} & J_{yz} & J_{zz}
\end{pmatrix}, \quad (9)
$$
they are expressed as
$$
J_{p1} = -\frac{J_{zz}^2 + J_{xz}^2 - J_{yy}J_{zz}}{J_{xx}J_{zz} - J_{xz}^2}, \quad
J_{p2} = -\frac{J_{xz}\left(J_{yy} - J_{xx} - J_{zz}\right)}{J_{xx}J_{zz} - J_{xz}^2},
$$
$$
J_{p3} = \frac{J_{zz}}{J_{xx}J_{zz} - J_{xz}^2}, \quad
J_{p4} = \frac{J_{xz}}{J_{zz}J_{xx} - J_{xz}^2},
$$
$$
J_{q1} = -\frac{J_{xz}}{J_{yy}}, \quad
J_{q2} = -\frac{J_{xx} - J_{zz}}{J_{yy}}, \quad
J_{q3} = \frac{1}{J_{yy}},
$$
$$
J_{r1} = \frac{J_{xz}\left(J_{yy} - J_{zz} - J_{xx}\right)}{J_{zz}J_{xx} - J_{xz}^2}, \quad
J_{r2} = \frac{J_{xz}^2 + J_{xx}^2 - J_{xx}J_{yy}}{J_{zz}J_{xx} - J_{xz}^2},
$$
$$
J_{r3} = \frac{J_{xz}}{J_{xx}J_{zz} - J_{xz}^2}, \quad
J_{r4} = \frac{J_{xx}}{J_{zz}J_{xx} - J_{xz}^2}.
$$
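For reference, the following small helper (an illustrative sketch, not code from this study) evaluates $J_{p1}$ through $J_{r4}$ directly from the inertia components, assuming $J_{xy} = J_{yz} = 0$ since the expressions above involve only $J_{xx}$, $J_{yy}$, $J_{zz}$ and $J_{xz}$; the numerical values in the example call are arbitrary:

```python
# Inertia coefficients J_p1..J_r4 from the expressions above.
def inertia_coeffs(Jxx, Jyy, Jzz, Jxz):
    D = Jxx * Jzz - Jxz ** 2                      # common denominator
    return {
        "Jp1": -(Jzz ** 2 + Jxz ** 2 - Jyy * Jzz) / D,
        "Jp2": -Jxz * (Jyy - Jxx - Jzz) / D,
        "Jp3": Jzz / D,
        "Jp4": Jxz / D,
        "Jq1": -Jxz / Jyy,
        "Jq2": -(Jxx - Jzz) / Jyy,
        "Jq3": 1.0 / Jyy,
        "Jr1": Jxz * (Jyy - Jzz - Jxx) / D,
        "Jr2": (Jxz ** 2 + Jxx ** 2 - Jxx * Jyy) / D,
        "Jr3": Jxz / D,
        "Jr4": Jxx / D,
    }

print(inertia_coeffs(Jxx=2.0, Jyy=2.0, Jzz=1.0, Jxz=0.1))  # example values
```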
Considering Eq. (3), the coefficient matrix C in Eq. (6) can be expressed as
$$
C = \begin{pmatrix}
-\dfrac{f}{z_c(k)} & 0 & 0 & c^{(1)}_{14} & c^{(1)}_{15} & c^{(1)}_{16} \\
0 & -\dfrac{f}{z_c(k)} & 0 & c^{(1)}_{24} & c^{(1)}_{25} & c^{(1)}_{26} \\
\vdots & \vdots & \vdots & \vdots & \vdots & \vdots \\
-\dfrac{f}{z_c(k)} & 0 & 0 & c^{(d)}_{14} & c^{(d)}_{15} & c^{(d)}_{16} \\
0 & -\dfrac{f}{z_c(k)} & 0 & c^{(d)}_{24} & c^{(d)}_{25} & c^{(d)}_{26} \\
0 & 0 & 1 & 0 & 0 & 0
\end{pmatrix}. \quad (10)
$$
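A minimal sketch of assembling C per Eq. (10) is given below; the interface (normalized image coordinates plus per-point depths) is our own assumption, since Eq. (10) writes a single $z_c(k)$ per frame:

```python
# Sketch: build the (2d+1) x 6 measurement matrix C of Eq. (10),
# matching the measurement vector z(k) of Eq. (4).
import numpy as np

def build_C(points, depths, f=1.0):
    """Two rows per scene point plus the final w(k) row.
    Depth is taken per point here, although Eq. (10) writes zc(k)."""
    d = len(points)
    C = np.zeros((2 * d + 1, 6))
    for i, ((x, y), zc) in enumerate(zip(points, depths)):
        C[2 * i]     = [-f / zc, 0, 0, f * x * y, -f * (1 + x * x), f * y]
        C[2 * i + 1] = [0, -f / zc, 0, f * (1 + y * y), -f * x * y, -f * x]
    C[-1] = [0, 0, 1, 0, 0, 0]                    # the w(k) row
    return C
```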
Therefore, given the range images and the optical flows for each image frame obtained through conventional image processing techniques, the state x can be calculated by online estimation methods such as the Extended Kalman Filter (EKF)[6]. The algorithm is summarized as follows.
EKF algorithm:
1. Set the initial parameters:
   $\hat{x}(0) = x_0 \in R^{6\times1}$, $\bar{P}(0) = \Sigma_0 \in R^{6\times6}$, $S = S_0 \in R^{6\times6}$, $T = T_0 \in R^{d\times d}$.
2. Calculate the Kalman gain $(k = 0, 1, 2, \cdots)$:
   $K(k) = \bar{P}(k)C^T(k)\left[C(k)\bar{P}(k)C^T(k) + T_0\right]^{-1}$.
3. Update the state and covariance estimates $(k = 0, 1, 2, \cdots)$:
   $\hat{x}(k) = \bar{x}(k) + K(k)\left[z(k) - C\bar{x}(k)\right]$,
   $P(k) = \bar{P}(k) - K(k)C\bar{P}(k)$.
4. Predict the state and covariance estimates $(k = 0, 1, 2, \cdots)$:
   $\bar{x}(k+1) = \bar{A}(\hat{x}(k)) + Bu(k)$,
   $\bar{P}(k+1) = a(k)P(k)a^T(k) + S_0$,
   where $\bar{A}(x(k)) = x(k) + A(x(k))\Delta t$ ($\Delta t$: a sufficiently small sampling time) and $a(k) = \partial\bar{A}/\partial x\,|_{x=\hat{x}(k)}$.
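The following sketch implements one cycle of steps 2-4 under stated assumptions: A and B are taken to realize Eqs. (7) and (8), C is the matrix of Eq. (10), and the Jacobian a(k) is evaluated by finite differences for brevity rather than analytically as above:

```python
# Sketch of one EKF cycle (steps 2-4). A is a callable returning the
# 6-vector of Eq. (7); B is the 6x6 matrix of Eq. (8); C per Eq. (10).
import numpy as np

def ekf_step(x_bar, P_bar, z, C, A, B, u, S0, T0, dt):
    # 2. Kalman gain
    K = P_bar @ C.T @ np.linalg.inv(C @ P_bar @ C.T + T0)
    # 3. Update with the innovation z(k) - C x_bar(k)
    x_hat = x_bar + K @ (z - C @ x_bar)
    P = P_bar - K @ C @ P_bar
    # 4. Predict with the Euler-discretized dynamics
    x_next = x_hat + (A(x_hat) + B @ u) * dt
    # a(k) = d/dx [x + A(x) dt] at x_hat, by finite differences
    eps = 1e-6
    a = np.eye(6)
    for j in range(6):
        dx = np.zeros(6)
        dx[j] = eps
        a[:, j] += (A(x_hat + dx) - A(x_hat)) / eps * dt
    P_next = a @ P @ a.T + S0
    return x_hat, P, x_next, P_next
```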
4 Numerical Simulation
Fig. 2: Digital elevation map of the synthetic terrain
for numerical simulation.
!"
!"
(a) CASE1
!"
!"
(b) CASE2
Fig. 3: Simulated landerfs motion.
In order to demonstrate the effectiveness of the proposed method, a series of numerical simulations was conducted using synthetic terrain data. Fig. 2 shows the digital elevation map (DEM) of the synthetic terrain, which was generated by a function combining a 2D sinusoid with 2D fractional Brownian motion[7]; a minimal generation sketch follows.
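The sketch below combines a 2D sinusoid with spectrally synthesized fractional Brownian motion; the synthesis method, Hurst exponent, grid size and amplitudes are illustrative guesses on our part, not the exact procedure of [7]:

```python
# Sketch: synthetic DEM from a 2D sinusoid plus spectral-synthesis fBm.
import numpy as np

def fbm_surface(n=128, hurst=0.8, seed=0):
    """Fractional Brownian surface via power-law filtering of white noise."""
    rng = np.random.default_rng(seed)
    kx = np.fft.fftfreq(n)[:, None]
    ky = np.fft.fftfreq(n)[None, :]
    k = np.sqrt(kx ** 2 + ky ** 2)
    k[0, 0] = 1.0                                 # avoid division by zero
    spectrum = rng.normal(size=(n, n)) + 1j * rng.normal(size=(n, n))
    spectrum /= k ** (hurst + 1.0)                # power-law roll-off
    surf = np.fft.ifft2(spectrum).real
    return surf / np.abs(surf).max()              # normalized heights

n = 128
xx, yy = np.meshgrid(np.linspace(0, 2 * np.pi, n), np.linspace(0, 2 * np.pi, n))
dem = 5.0 * np.sin(xx) * np.sin(yy) + 2.0 * fbm_surface(n)  # heights [m]
```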
In this simulation study, the lander has a rectangular-solid shape (2.1 [m] × 2.1 [m] × 4.8 [m]) and a mass of 3.0 [kg]. To examine results for different types of lander motion, two cases are assumed: one considers a constant translational velocity along a horizontal axis, the other a constant angular velocity about a vertical axis, and each includes a vertical descent at constant velocity, as shown in Fig. 3. In both cases the lander's motion is uncontrolled (u = 0).
The specifications assumed for the 2D scanning LRF are shown in Table 1, and 32 frames (31 seconds) of range images are fed to the proposed method.
As explained in the previous section, conventional image processing techniques (edge detection based on zero-crossing points and a correlation matching method[5]) were applied to extract and track the characteristic scene points (d = 200) in the range images from the LRF.
Table 1: Specifications of the LRF.
  Field of view (FOV) [deg × deg]    10.0 × 10.0
  Frame rate [frames/sec]            1.0
  Resolution [dots]                  128 × 128
  Focal length* [m]                  1.0
  Projection plane size* [m × m]     0.35 × 0.35
  *Parameters derived from an equivalent CCD camera.
Fig. 4: Range images depicted as grayscale pictures.
Also, considering that scene points extracted in the first image may move outside the frame during the tracking operations, the points to be tracked were chosen near the center of the first image.
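The correlation-matching step of the tracker can be sketched as follows; the template and search-window sizes are illustrative assumptions, and the code presumes points lie well inside the frame, consistent with choosing points near the image center:

```python
# Sketch: track one scene point between frames by maximizing normalized
# cross-correlation of a small template over a local search window.
import numpy as np

def track_point(img0, img1, pt, tmpl=5, search=7):
    r, c = pt
    t = img0[r - tmpl:r + tmpl + 1, c - tmpl:c + tmpl + 1]
    t = (t - t.mean()) / (t.std() + 1e-12)        # normalize template
    best, best_rc = -np.inf, pt
    for dr in range(-search, search + 1):
        for dc in range(-search, search + 1):
            w = img1[r + dr - tmpl:r + dr + tmpl + 1,
                     c + dc - tmpl:c + dc + tmpl + 1]
            if w.shape != t.shape:                # skip windows off the frame
                continue
            w = (w - w.mean()) / (w.std() + 1e-12)
            score = (t * w).sum()                 # correlation score
            if score > best:
                best, best_rc = score, (r + dr, c + dc)
    return best_rc        # new pixel location; flow = best_rc - pt
```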
When the EKF algorithm was applied to the range image data, the initial parameters were set as
$$
\Sigma_0 = 100.0\,I_6\ \text{(CASE 1)} \quad \text{or} \quad 10.0\,I_6\ \text{(CASE 2)},
$$
$$
S_0 = \begin{pmatrix} 1.0\times10^{-8}\,I_3 & 0 \\ 0 & 1.0\times10^{-8}\,I_3 \end{pmatrix}, \quad
T_0 = 0.1\,I_d\ \text{(Noise 1)} \quad \text{or} \quad 0.2\,I_d\ \text{(Noise 2)},
$$
where $I_j$ $(j = 3, 6, d)$ denotes the $j \times j$ identity matrix. The initial value of $\hat{x}$ $(x_0)$ was calculated by applying a least-squares method to Eq. (6) for the first range image.
To generate the range images, the state and measurement noises in Eqs. (5) and (6) are at first omitted. Note, however, that measurement errors arise during the image processing used to estimate the optical flows, which means that w in Eq. (6) is not exactly zero during motion estimation with the EKF algorithm. The resulting range images are shown in Fig. 4. They are expressed as grayscale pictures in which darker points depict shorter ranges from the lander's observing point, and vice versa.
Fig. 5: Estimated optical flows in an image frame (at k = 20 [frames]). (a) CASE 1; (b) CASE 2.
Fig. 6: Estimated parameters of the lander's motion. (CASE 1)
Fig. 5 shows the estimated optical flows for an image frame. As shown in the figure, the optical flows are estimated well enough to distinguish the motion of the scene points according to the lander's motion.
Figs. 6 and 7 show the lander's translational and rotational velocities estimated with the EKF algorithm. In each figure, the time histories of the estimated translational parameters $(\hat{u}, \hat{v}, \hat{w})$ are shown on the upper side, and those of the rotational parameters $(\hat{p}, \hat{q}, \hat{r})$ on the lower side. In both cases, most of the parameters are estimated successfully, the exceptions being $\hat{u}$ and $\hat{q}$ in CASE 1. Although the final estimates of $\hat{u}$ and $\hat{q}$ unfortunately deviate slightly from their exact values in the cases presented here, the performance of the online estimation algorithm can be improved by adjusting the number of scene points used to measure the optical flows and the EKF parameters such as $\Sigma_0$, $S_0$ and $T_0$.
Next, results for the case in which measurement noise is added to the range data are shown in Figs. 8 and 9. Gaussian noise (mean: 0 [m], variance: 0.1 [m²]) is added to all the range data in each image frame. As shown in these figures, almost all the estimated parameters are affected by the measurement noise, with the exception of $\hat{w}$.
Fig. 7: Estimated parameters of the lander's motion. (CASE 2)
Fig. 8: Estimated parameters of the lander's motion. (CASE 1, with measurement noise on the range data)
In particular, in the results for CASE 2, $\hat{r}$ no longer converges to its exact value. The likely reason is that the optical flows in CASE 2 (which show rotating motion around the center of the image frame) are not as simple as those in CASE 1, so errors made while estimating such flows in each image frame are more harmful to the final motion estimation.
For each case of the lander's motion, the errors of the estimated parameters are evaluated through the error functions defined as
$$
e_t = \frac{1}{3N_f}\sum_{k=1}^{N_f}\sqrt{\left(u(k)-\hat{u}(k)\right)^2 + \left(v(k)-\hat{v}(k)\right)^2 + \left(w(k)-\hat{w}(k)\right)^2}, \quad (11)
$$
$$
e_r = \frac{1}{3N_f}\sum_{k=1}^{N_f}\sqrt{\left(p(k)-\hat{p}(k)\right)^2 + \left(q(k)-\hat{q}(k)\right)^2 + \left(r(k)-\hat{r}(k)\right)^2}, \quad (12)
$$
where $\hat{\ }$ denotes the estimated value of a parameter and $N_f$ is the total number of range images.
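These error functions translate directly into code; the sketch below assumes the true and estimated time histories are stored as $(N_f \times 6)$ arrays ordered as (u, v, w, p, q, r):

```python
# Direct transcription of the error functions (11)-(12).
import numpy as np

def error_metrics(x_true, x_est):
    """x_true, x_est: arrays of shape (Nf, 6) ordered (u, v, w, p, q, r)."""
    Nf = x_true.shape[0]
    dt_ = x_true[:, :3] - x_est[:, :3]            # translational residuals
    dr_ = x_true[:, 3:] - x_est[:, 3:]            # rotational residuals
    e_t = np.sqrt((dt_ ** 2).sum(axis=1)).sum() / (3 * Nf)
    e_r = np.sqrt((dr_ ** 2).sum(axis=1)).sum() / (3 * Nf)
    return e_t, e_r
```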
Fig. 9: Estimated parameters of the lander's motion. (CASE 2, with measurement noise on the range data)
The results of the error analysis are shown in Table 2. In the table, 'Noise 1' denotes the case in which no state or measurement noise is added, as shown in Figs. 6 and 7, and 'Noise 2' denotes the case in which measurement noise is added to the range data, as shown in Figs. 8 and 9. As seen in the table, the measurement noise clearly affects the estimation errors in CASE 2, while its effect in CASE 1 is not serious.
Since measurement error in the range image data inevitably degrades optical flow estimation as well as motion estimation, a more sophisticated image processing method should be applied. There also remains room for discussion of more effective initial parameter settings ($\Sigma_0$, $S_0$, $T_0$, etc.) for the EKF algorithm.
Table 2: Error analysis of the lander's motion estimation.
                           Noise 1       Noise 2
  CASE 1  e_t [m/sec]      1.2 × 10^-3   1.0 × 10^-3
          e_r [rad/sec]    1.4 × 10^-5   1.5 × 10^-5
  CASE 2  e_t [m/sec]      2.9 × 10^-4   4.8 × 10^-4
          e_r [rad/sec]    0.4 × 10^-5   4.3 × 10^-5
5 Concluding remarks
This study proposed a new approach to estimating a lander's motion from the range images of a 2D scanning LRF, aimed at the lander's autonomous landing. If the range image data can be acquired with sufficiently high resolution and scanning rate, the proposed method is expected to be effective. On the other hand, a 2D scanning LRF with the specifications assumed in this study has not yet been realized. Considering the practical specifications of LRFs, the feasibility study on the proposed method and the reduction of its motion estimation error should be continued.
References
[1] T. Kominato et al., "Hayabusa Optical Hybrid Navigation and Guidance", Proc. of the 16th Workshop on JAXA Astrodynamics and Flight Mechanics, pp. 167-172, 2006.
[2] H. Morita et al., "Detailed Guidance and Navigation Operations during Descents and Touchdowns, Hayabusa", Proc. of the 16th Workshop on JAXA Astrodynamics and Flight Mechanics, pp. 173-178, 2006.
[3] A. Banno, K. Ikeuchi, "Shape Recovery of 3D Data Obtained from a Moving Range Sensor by Using Image Sequences", Proc. of the 10th IEEE International Conference on Computer Vision (ICCV 2005), pp. 792-799, 2005.
[4] B. Espiau, F. Chaumette, P. Rives, "A New Approach to Visual Servoing in Robotics", IEEE Trans. Robotics and Automation, Vol. 8, No. 3, pp. 313-326, 1992.
[5] D. A. Forsyth, J. Ponce, "Computer Vision: A Modern Approach", Prentice Hall, New Jersey, 2003.
[6] F. L. Lewis, "Optimal Estimation", John Wiley & Sons, Inc., New York, 1986.
[7] S. Yoshikawa and K. Nishiguchi, "Generation of Digital Elevation Map of the Moon from Observation Data Interpolated by Fractal Modeling", Transactions of the Society of Instrument and Control Engineers, Vol. 37, No. 3, pp. 262-267, 2001.