
TriDAR: Test Results From
Three Space Shuttle Missions

Mike Kearns
President, Space Exploration
Neptec Design Group
T: (613) 599-7602 x 271
F: (613) 599-7604
mkearns@neptec.com



© 2012 Neptec Design Group Ltd.

Neptec Overview
Neptec has been supplying vision systems for NASA and the
Canadian Space Agency since 1989, refining the sensors
and software needed to perform on-orbit damage inspection
and automated rendezvous and docking. More recently, Neptec has become a
system integrator for lunar and Mars rover systems.

The technology successfully deployed and refined in space
has now been adapted for use in military and industrial
applications. Neptec's 3D OPAL sensor sees through dust and other obscurants to greatly improve
helicopter pilots' vision in degraded visual environments, while Neptec's real-time change
detection identifies threats in a scene by assessing 3D data. In addition, Neptec has developed
its 3D laser scanning technology into a high-resolution metrology system that rapidly performs
the dimensional verification required in the manufacturing industry.

TriDAR Provides Critical Guidance Information Without
Requiring Cooperative Targets
Overview

Neptec's TriDAR (triangulation + LIDAR) system is the result of over five years of research and
development funded by the Canadian Space Agency (CSA) and NASA. TriDAR was developed
in response to the need for a compact 3D vision system that can operate at both short and long
range, from full darkness to direct sunlight.

TriDAR is a relative navigation vision system. It provides the critical guidance information
needed to steer an unmanned vehicle during rendezvous and docking operations in space.
Unlike current technologies, TriDAR does not rely on reference markers, such as reflectors,
positioned on the target spacecraft. Instead, it combines a laser-based 3D sensor with a
thermal imager. TriDAR's proprietary software matches the geometric information contained
in successive 3D images against the known shape of the target object and calculates the
target's position and orientation.

This technology has been demonstrated successfully on three missions. TriDAR flew
on board Space Shuttle Discovery on STS-128 and STS-131, and on board Space
Shuttle Atlantis on the historic STS-135 mission. On all three missions, TriDAR provided
astronauts with real-time guidance information during rendezvous and docking with the
International Space Station (ISS). It automatically acquired and tracked the ISS using only
knowledge of the station's shape. This marked the first time a targetless, 3D-sensor-based
tracking vision system was used in space. Overall, its performance proved TriDAR to be
one of the most advanced vision systems to fly in space.

Before TriDAR, most operational on-orbit solutions for pose estimation and tracking relied on
cooperative markers placed on the target spacecraft. For example, Neptec's Space Vision
System (SVS) used black-on-white or white-on-black dot targets, which were imaged with
Space Shuttle or International Space Station (ISS) video cameras to compute the relative
pose of the ISS modules being assembled.

NASA used the Trajectory Control System (TCS) on board the Space Shuttle to provide
guidance information during rendezvous and docking with the ISS. This system uses a laser
to track retro-reflectors located on the ISS, providing bearing, range, and closing-rate
information. While reliable, target-based systems have operational limitations because targets
must be installed on the target payloads, which is not always practical or even possible.
Servicing existing satellites that lack reflectors, for example, requires a targetless tracking
capability.

TriDAR System Capabilities

TriDAR builds on recent developments in 3D sensing and computer vision achieved at
Neptec and brings a new, lighting-immune capability to space vision systems. This
technology provides the ability to rendezvous and dock automatically with vehicles that
were not designed for such operations.

TriDAR comprises an active 3D sensor, a thermal imager, and Neptec's model-based tracking
software. Using only knowledge of the target spacecraft's geometry and 3D data acquired
from the sensor, the system computes the six-degree-of-freedom (6DOF) relative pose directly.
The computer vision algorithms developed by Neptec allow this to occur in real time on a
flight computer while achieving the robustness and reliability expected of mission-critical
operations. Fast data acquisition is achieved through a smart scanning strategy called More
Information Less Data (MILD), in which the sensor acquires only the data necessary to
perform the pose estimation. This strategy minimizes acquisition time, data bandwidth,
memory, and processing power requirements.
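
The model-based pose computation described above can be sketched as an iterative-closest-point (ICP) style loop: match scan points to the nearest points on the known model, solve for the rigid transform, and repeat until convergence. This is a minimal illustration under our own assumptions, not Neptec's flight algorithm; the function names and the brute-force matching are ours.

```python
import numpy as np

def best_rigid_transform(src, dst):
    """Kabsch/SVD solution: rigid (R, t) that best maps src onto dst (Nx3 each)."""
    cs, cd = src.mean(axis=0), dst.mean(axis=0)
    H = (src - cs).T @ (dst - cd)
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))      # guard against reflections
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = cd - R @ cs
    return R, t

def icp_pose(scan, model, iters=30):
    """Minimal ICP: repeatedly match each scan point to its nearest model
    point and accumulate the rigid transform (the 6DOF relative pose)."""
    R_tot, t_tot = np.eye(3), np.zeros(3)
    pts = scan.copy()
    for _ in range(iters):
        # Brute-force nearest neighbours; a flight system would use a
        # precomputed spatial index over the model instead.
        d = np.linalg.norm(pts[:, None, :] - model[None, :, :], axis=2)
        nearest = model[d.argmin(axis=1)]
        R, t = best_rigid_transform(pts, nearest)
        pts = pts @ R.T + t
        R_tot, t_tot = R @ R_tot, R @ t_tot + t
    return R_tot, t_tot
```

In the spirit of MILD, keeping `scan` sparse keeps the matching step cheap while still constraining all six degrees of freedom.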

TriDAR Sensor Hardware

The TriDAR sensor is a hybrid 3D camera that combines auto-synchronous laser triangulation
with laser radar (LIDAR) in a single optical package. This configuration exploits the
complementary nature of the two imaging technologies to provide 3D data at both short and
long range without compromising performance. The laser triangulation subsystem is largely
based on the Laser Camera System (LCS) used to inspect the Space Shuttle's thermal
protection system after each launch. By multiplexing the two active subsystems' optical paths,
TriDAR provides the functionality of two 3D scanners in a compact package. The subsystems
also share the same control and processing electronics, providing further savings over two
separate 3D sensors. A thermal imager is also included to extend the system's range well
beyond the LIDAR operating range.
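
The hand-off between the three subsystems can be illustrated with a simple range-based selector. The crossover thresholds below are hypothetical placeholders for illustration only; the document does not state TriDAR's actual operating ranges for each subsystem.

```python
# Hypothetical crossover ranges (metres) -- illustrative values, not TriDAR's.
TRIANGULATION_MAX_M = 20.0    # short-range auto-synchronous triangulation
LIDAR_MAX_M = 3000.0          # time-of-flight LIDAR

def select_subsystem(range_m: float) -> str:
    """Pick the imaging subsystem appropriate for the current target range."""
    if range_m <= TRIANGULATION_MAX_M:
        return "triangulation"
    if range_m <= LIDAR_MAX_M:
        return "lidar"
    return "thermal"          # beyond LIDAR range, fall back to the IR imager
```

Because both active subsystems share one optical path and one set of electronics, such a hand-off can happen without switching hardware.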

TriDAR Detailed Test Objective (DTO) Missions Overview
TriDAR was tested three times in space: on board Space Shuttle Discovery during STS-128
and STS-131, and on board Space Shuttle Atlantis during STS-135. The objective of the tests
was to demonstrate TriDAR's ability to track an object in space without using target markers
such as retro-reflectors. For these missions, TriDAR was located in the payload bay on the
Orbiter Docking System (ODS), next to the Shuttle's Trajectory Control System (TCS)
(Figure 1).



Figure 1 TriDAR mounted on Shuttle, as viewed from ISS
The system was activated during rendezvous, when the Shuttle was tens of kilometers away
from the ISS. At this range, TriDAR used its thermal imager to determine bearing and range
to the ISS. The thermal imagery acquired from the IR camera during STS-128 was
instrumental in improving the data acquisition and processing algorithms for later missions.
On STS-131 and STS-135, the IR camera imaged the ISS at a range of approximately 39 km,
immediately following successful activation. Once within range of the 3D sensor, TriDAR
automatically determined bearing and range to the ISS. On STS-128 and STS-131, this
occurred at approximately 400 m. Benefiting from a hardware upgrade for STS-135, TriDAR
drastically improved its long-range 3D capability and determined bearing and range to the
ISS at 2 km. On all three missions, TriDAR then entered shape-based tracking, which
provided full six-degree-of-freedom guidance and closing rate all the way to final docking,
using the same laser power and settings throughout the mission, illustrating the sensor's
high dynamic range (Figure 2).
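
A bearing-and-range fix can be derived from the 3D returns alone. The sketch below is our own simplification, not the flight code: it takes the centroid of the returns in an assumed sensor frame (+x along boresight, +z up) and converts it to azimuth, elevation, and range.

```python
import math

def bearing_and_range(points):
    """Azimuth and elevation (radians) and range (same units as the input)
    to the centroid of a list of (x, y, z) sensor returns, assuming the
    sensor boresight is +x and +z is up."""
    n = len(points)
    cx = sum(p[0] for p in points) / n
    cy = sum(p[1] for p in points) / n
    cz = sum(p[2] for p in points) / n
    rng = math.sqrt(cx * cx + cy * cy + cz * cz)
    azimuth = math.atan2(cy, cx)
    elevation = math.asin(cz / rng)
    return azimuth, elevation, rng
```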







Figure 2 Short- and long-range 3D laser imaging scans, color-coded by range.




Figure 3 Tracking Display Output for Crew
To reproduce the conditions of an unmanned docking, the system was designed to perform the
entire mission autonomously. It self-monitored its tracking solution and automatically
re-acquired the ISS whenever tracking was lost. Key system information was provided in
real time to the crew via enhanced docking displays on a laptop computer (Figure 3). This
included range and closing rates, 6DOF pose, past pose indicators, future pose predictors,
laser data, and a 3D bird's-eye virtual camera view of the Shuttle's position and orientation
with respect to the ISS. All displays were designed to closely resemble the certified navigation
displays used by the crew, for easy cross-checking. The TriDAR team iterated on the displays
based on crew feedback. One unique feature, shown on the bottom left of the tracking display,
is a virtual camera view implemented at the request of a crew member: a Shuttle-position-invariant,
orientation-only view of the ISS docking target that removes parallax effects and gives an
accurate measure of Shuttle orientation relative to the ISS.
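
The self-monitoring and re-acquisition behavior can be sketched as a small two-state machine that falls back to acquisition whenever the model-fit residual exceeds a threshold. The mode names and the error threshold here are illustrative assumptions, not TriDAR's actual design.

```python
from enum import Enum, auto

class Mode(Enum):
    ACQUIRE = auto()   # searching for the target, no valid pose yet
    TRACK = auto()     # pose solution valid, being refined each cycle

def step(mode: Mode, fit_error: float, max_error: float = 0.1) -> Mode:
    """One cycle of a self-monitoring tracker: stay in TRACK while the
    model-fit residual is acceptable, otherwise fall back to acquisition."""
    if mode is Mode.TRACK and fit_error > max_error:
        return Mode.ACQUIRE      # tracking lost: re-acquire the target
    if mode is Mode.ACQUIRE and fit_error <= max_error:
        return Mode.TRACK        # solution converged: resume tracking
    return mode
```

Running such a loop every sensor cycle is what lets the system recover on its own, with no operator in the loop.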
TriDAR was also tested during undocking and fly-around operations. On STS-128, TriDAR
operated in imaging mode, taking high-resolution 3D scans of the ISS while the Shuttle flew
around it. The objective was to gather data for future geometry-tracking algorithm
development. The data was fed into the tracking algorithm on the ground to demonstrate
6DOF tracking offline. Algorithm improvements then enabled tracking the ISS in real time
during the STS-131 undocking and fly-around. From TriDAR's perspective, the ISS appeared
to undergo a full 360-degree pitch rotation. TriDAR successfully tracked the ISS for the
entire undocking and fly-around, marking the first demonstration of real-time, embedded,
6DOF targetless tracking of a tumbling target in space (Figure 4).



Figure 4 TriDAR tracked Space Shuttle undock and fly-around trajectory in blue, overlaid on nominal
flight path from rendezvous handbook
For STS-135, the last Shuttle mission, 6DOF fly-around tracking had already been
demonstrated, so TriDAR instead acquired final high-resolution 3D imaging scans of the ISS.
This decision preserved, for posterity, high-quality 3D data of the last Shuttle fly-around of a
complete International Space Station (Figure 5).

Figure 5 3D laser range colored image on left, IR thermal imagery on right, with Earth in background,
during STS-135 final fly-around


TriDAR Applications
Because of its wide operating range, the TriDAR sensor can serve numerous applications,
even within a single mission. For example, TriDAR could be used for:
- Rendezvous and docking
- Planetary landing
- Rover navigation
- Site and vehicle inspection
- Geological material classification
TriDAR's capabilities for planetary exploration were recently demonstrated during field
trials in Hawaii held by NASA and the Canadian Space Agency (CSA). For these tests, TriDAR
was mounted on Carnegie Mellon University's SCARAB lunar rover and enabled it to
navigate automatically to its destination. Once the rover arrived, TriDAR was used to acquire
high-resolution 3D images of the surrounding area, searching for ideal drill sites for
obtaining lunar samples (Figure 6).
TriDAR technology is not limited to space applications. It is at the heart of Neptec's OPAL
product, which provides vision to helicopter crews when their view is obscured by brownouts
or whiteouts. The technology can also be applied to numerous terrestrial applications such
as automated vehicles, hazard detection, radiotherapy patient positioning, and assembly of
large structures.

Figure 6 TriDAR mounted on CMU SCARAB rover


