Conference paper, January 2003. DOI: 10.1109/ICARCV.2002.1234782 (IEEE Xplore)

Proc. 2002 Australasian Conference on Robotics and Automation
Auckland, 27-29 November 2002

Terrain Imaging and Perception using Millimetre Wave Radar

S. Scheding, G. Brooker, R. Hennessy, M. Bishop and A. Maclean

Australian Centre for Field Robotics
Department of Mechanical Engineering
The University of Sydney, NSW. 2006, Australia
e-mail: scheding/brooker/m.bishop/r.hennessy/a.maclean @acfr.usyd.edu.au

Abstract

This paper presents an overview of the work being conducted by the Australian Centre for Field Robotics (ACFR) and the CRC for Mining Technology and Equipment (CMTE) into sensors and systems for real-time environment imaging.

Radar is perhaps the only sensor capable of performing imaging tasks in environments where high levels of dust, water vapour, and rain must be contended with. Methods of visualising the radar data for various applications are presented.

1 Introduction

Visualisation of environments is an important part of many processes. Usually the visualisation takes the form of a human being manually gathering information (whether through surveying, or more conventionally, simply by looking at a scene). The information gathered is generally used for planning operations or human-in-the-loop feedback control.

In many cases, however, it is desired either to augment the capacity of humans, by providing visualisation of areas that they cannot directly see, or to automate equipment, in which case the machine must have some perception capabilities. It is important to note that in both cases conventional visualisation systems based on lasers or cameras will often fail due to poor visibility caused by dust, fog, rain or other environmental factors [1, 2].

This paper presents the visualisation work conducted by the Australian Centre for Field Robotics (ACFR), in conjunction with the CRC (Cooperative Research Centre) for Mining Technology and Equipment (CMTE), into visualisation systems that are largely impervious to the environmental conditions found in many hostile environments.

2 Radar Front End

The radar front end is a custom-developed 77GHz Frequency Modulated Continuous Wave (FMCW) [3] unit with a transmit power of 10mW and a swept bandwidth of 600MHz. The transmitted chirp is closed-loop linearised to achieve a linearity of better than 0.1%, which in conjunction with the wide bandwidth results in a range resolution of 50cm at a range of 500m.

The millimetre wave hardware consists of a 250mm diameter antenna (giving a beam divergence of around 0.8°), two reconfigurable 77GHz modules custom developed for this project, a voltage controlled oscillator (VCO), a local oscillator (LO), a circulator and an isolator [4]. Interconnections between these modules are by means of custom-manufactured waveguide sections. These are used to position the modules physically so that the front-end fits within the restricted space within the radar housing while ensuring that the losses between modules are kept to a minimum.

3 Signal Processing

The radar's back-end software architecture is modular in design, allowing changes in supporting hardware to be easily accommodated by the software. The four functional modules are the dual-axis motor control for the scanner, radar signal capture/processing, and command interpretation. The system block diagram can be seen in Figure 2, whilst Figure 1 gives a block diagram for the signal processing.

The FMCW radar operates by transmitting a linear frequency chirp Δf of duration Td [5]. At any instant in time, the echo signal received is shifted in frequency from the transmitted signal by the product of the round-trip time Tp to the target and the rate of change of frequency df/dt [2]. If the received signal is mixed with a portion of the transmitted signal and filtered, the resulting output will be a constant beat frequency fb. To process range values from the radar, the analog signal must first be captured digitally by the analog to digital card. The capture rate of this card

Copyright ARAA 2002

Figure 2: Radar Scanner Block Diagram

must be at least twice the maximum frequency of the signal it is capturing. The radar has been designed to have a maximum range of 500m with a range resolution of around 0.5m. Two factors limit the range resolution of FMCW systems: the first is a function of the chirp bandwidth Δf, and the second is the actual linearity that can be achieved for the transmitted chirp. The theoretical range resolution that can be achieved with a chirp bandwidth of Δf = 600MHz and a sweep time of 1ms is:
ΔRchirp = c / (2Δf) = 0.25m    (1)
where c is the speed of light.
Closed loop linearisation techniques can achieve linearities of about 0.1% over most of their sweep. The best resolution that can be achieved in this case is the product of the linearity and the range to the target. For a target at a range of 500m and a linearity of 0.1% (0.001), the resolution is:

ΔRlin = Range × Linearity = 0.5m    (2)

The two results are generally combined in quadrature to determine the ultimate system range resolution:

ΔR = √(ΔRchirp² + ΔRlin²) = 0.56m    (3)
Figure 1: Radar Signal Processing Block Diagram

The actual beat frequency for a radar with the characteristics defined above, and a maximum target range (R) of 500m, is just the product of the round trip time Tp to the target and the chirp slope Δf/Td:

fb = Tp · (Δf/Td) = (2R/c) · (Δf/Td) = 2.0MHz    (4)

This results in a capture frequency requirement of 5MHz, allowing for filter roll-off.
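The design numbers in equations (1) to (4) can be reproduced with a short calculation. The following is a sketch in Python; the variable names are ours, not taken from the radar software:

```python
# Reproduce the FMCW design calculations (Eqs 1-4) from the paper's
# parameters: 600 MHz sweep, 1 ms sweep time, 500 m max range, 0.1% linearity.
import math

c = 3.0e8            # speed of light (m/s)
delta_f = 600e6      # chirp bandwidth (Hz)
t_d = 1e-3           # chirp (sweep) duration (s)
r_max = 500.0        # maximum design range (m)
linearity = 0.001    # achieved chirp linearity (0.1%)

# Eq (1): resolution limited by the chirp bandwidth.
dr_chirp = c / (2 * delta_f)             # 0.25 m

# Eq (2): resolution limited by chirp linearity at maximum range.
dr_lin = r_max * linearity               # 0.5 m

# Eq (3): the two limits combined in quadrature.
dr = math.sqrt(dr_chirp**2 + dr_lin**2)  # ~0.56 m

# Eq (4): beat frequency for a target at maximum range.
t_p = 2 * r_max / c                      # round-trip time (s)
f_beat = t_p * delta_f / t_d             # 2.0 MHz

print(f"range resolution: {dr:.2f} m, beat frequency: {f_beat/1e6:.1f} MHz")
```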
In general many targets will be present in the beam,
so some form of spectrum analyser must be used to
extract all of the frequencies present in the received

signal. This function is implemented in software using the Fast Fourier Transform (FFT) process, preceded by a weighting function (a Hamming window) to reduce range sidelobes. If the signal is observed for a time Td = 1ms then the width of an FFT frequency bin is W = 1/Td = 1000Hz (25cm) and the main lobe width is twice that. The 3dB bandwidth of the filter produced by the FFT process is 0.89 bins (22.25cm) for no windowing, increasing to 1.3 bins (32.5cm) for a Hamming window [6]. To produce a spectrum spanning the whole range from 0m to 500m with a bin size of 25cm requires 2000 bins. These can be produced by a complex 2048 point FFT or a real 4096 point FFT. For this application, which uses only a single channel, the real FFT is appropriate, so a 4096 point FFT is used.
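As a rough illustration of this processing chain (a sketch, not the radar's actual code), the following windows a simulated beat tone, takes the 4096-point real FFT, and maps the strongest bin back to range. The 100m target and the simulation details are assumptions for illustration only:

```python
# Sketch of the range-spectrum computation: Hamming-window the sampled
# beat signal, take a 4096-point real FFT, and convert the peak bin to range.
import numpy as np

fs = 5e6                  # capture rate (Hz), as designed
n = 4096                  # real FFT length
m_per_hz = 250.0 / 1e6    # range scale factor from Eq (6): 250 m per MHz

# Simulate the beat tone from a single target at 100 m (assumed).
target_range = 100.0
f_b = target_range / m_per_hz                  # 400 kHz beat frequency
t = np.arange(n) / fs
signal = np.cos(2 * np.pi * f_b * t)

# Hamming window to reduce range sidelobes, then the real FFT.
spectrum = np.abs(np.fft.rfft(signal * np.hamming(n)))

# The strongest bin maps back to range via the scale factor.
peak_bin = int(np.argmax(spectrum))
est_range = peak_bin * (fs / n) * m_per_hz
print(f"estimated range: {est_range:.2f} m")   # close to 100 m, within one range bin
```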
Figure 3: Complete Radar Scanner

Point targets can be extracted from the FFT spectrum by applying a threshold, T, to the data and searching for peaks in the spectrum which exceed this threshold. The threshold is computed from the spectrum by the equation

T = 2σ + x̄    (5)

where σ and x̄ are the standard deviation and mean of the sampled spectrum respectively.
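A minimal sketch of this peak-extraction step (the Eq 5 threshold plus the parabolic refinement described next), assuming a generic magnitude-spectrum array rather than the radar's actual data structures:

```python
# Threshold the spectrum at T = 2*sigma + mean (Eq 5), find local maxima
# above T, and refine each peak position by parabolic interpolation.
import numpy as np

def extract_peaks(spectrum):
    t = 2.0 * spectrum.std() + spectrum.mean()   # Eq (5)
    peaks = []
    for i in range(1, len(spectrum) - 1):
        if spectrum[i] > t and spectrum[i] >= spectrum[i-1] and spectrum[i] > spectrum[i+1]:
            # A parabola through the peak bin and its two neighbours gives
            # a sub-bin estimate of the true peak position.
            y0, y1, y2 = spectrum[i-1], spectrum[i], spectrum[i+1]
            offset = 0.5 * (y0 - y2) / (y0 - 2*y1 + y2)
            peaks.append((i + offset, y1))       # (fractional bin, intensity)
    return peaks

# Synthetic spectrum: a low noise floor plus one strong return near bin 50.
rng = np.random.default_rng(0)
spec = rng.random(200) * 0.1
spec[49:52] = [0.5, 1.0, 0.7]
print(extract_peaks(spec))
```

The returned intensity is what the visualisation client later uses to colour each 3D point.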
Parabolic interpolation can be applied to each peak to give a higher accuracy for point targets. The relationship between the measured beat frequency (fb) and the range is:

R / fb = (Td · c) / (2Δf) = 250m/MHz    (6)

This can be used to obtain range values for each peak. An intensity for each peak can also be taken from the FFT spectrum, and is useful in giving additional information about what kind of surface/target generated the peak, by discriminating between objects with different radar reflectivities.

4 Scanner Design

For most robotic applications, a real beam is scanned by physically rotating the antenna using a pan/tilt mechanism or by utilizing the reflective properties of a rotating mirror. Scanning systems using a rotating antenna have the advantage that they are generally more compact, not requiring a separation between the antenna and the scanning mirror as seen in Figure 3. Rotating antenna systems also do not suffer from angular resolution diminution due to unintended relative movement between the antenna and mirror. However, a scanning mechanism required to rotate the radar antenna and electronics will require heavier and more powerful drive components, slowing scan times and making the spatial resolution demands of the radar system more difficult to meet, as well as significantly increasing the system's power requirements. Communication of electrical signals between the rotating and non-rotating parts of a rotating antenna radar increases the complexity and reduces the reliability of such a system.

A scanning mirror radar allows the inertia of the system's rotating parts to be minimised. This makes quicker scan times in both axes possible while minimising scanner drive power requirements and making spatial resolution requirements easier to achieve. With appropriate design, the need to pass any electrical signals between rotating and non-rotating parts can be eliminated, reducing complexity and removing any associated noise. Suitable mirrors are generally constructed from aluminium, with its good reflectance properties across millimetre wavelengths and its weight advantages; for larger applications, composite aluminium mirrors provide low mirror mass while maintaining the required mirror stiffness.

Due to the antenna aperture of this radar being 250mm, it was decided that a scanning arrangement using a rotating mirror would be used, for the advantages of such a system as stated above. A base scanning unit was built which could be reused in different configurations to achieve the required beam elevation range by simply changing the dimensions of the mirror and the mirror-radar separation. The scanning unit was designed to achieve azimuthal scan rates of 5Hz and elevation scan rates of 0.05Hz. In the current configuration, as seen in Figure 3, the elevation range is from 10° above horizontal to 20° below horizontal. The scanning system needs to have an angular resolution better than that of the radar system (0.8° in this case) so that the radar system's design capabilities are not compromised by the scanning system. This scanning unit was designed with minimum backlash to achieve an angular resolution of less than 0.1° on both the azimuthal and elevation axes.
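Each radar return, combined with the scanner's azimuth and elevation angles, defines a 3D point; the polar-to-Cartesian conversion used by the visualisation client (Section 5) can be sketched as follows. The axis convention here is our assumption, as the paper does not specify one:

```python
# Convert a (range, azimuth, elevation) radar return into a 3D point.
import math

def polar_to_cartesian(rng_m, azimuth_rad, elevation_rad):
    """One radar return -> (x, y, z), assuming x east, y north, z up."""
    horizontal = rng_m * math.cos(elevation_rad)   # ground-plane projection
    x = horizontal * math.sin(azimuth_rad)
    y = horizontal * math.cos(azimuth_rad)
    z = rng_m * math.sin(elevation_rad)
    return x, y, z

# A return at 100 m, dead ahead, 20 degrees below horizontal:
print(polar_to_cartesian(100.0, 0.0, math.radians(-20.0)))
```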

5 Visualisation Software
In this work, a client-server architecture is used, where the radar acts as the server. The client (in this case the visualisation software) requests data from the radar in order to visualise it. The advantage of the client-server architecture is that the client can be physically remote from the radar, and more than one client can be simultaneously connected to the radar. Given that Ethernet is the main transport used in this radar (an RS232 interface is the other), visualisation clients may even be located in other countries. Clients only need to know the IP address of the radar and satisfy security criteria in order to communicate with the radar.
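A minimal visualisation client in this scheme might look like the following sketch; the IP address, port number, and `GET_SCAN` command are hypothetical, since the radar's wire protocol is not described here:

```python
# Minimal sketch of a visualisation client requesting data from the radar
# server over TCP. The message format and port are hypothetical.
import socket

RADAR_HOST = "192.168.1.50"   # hypothetical radar IP address
RADAR_PORT = 5000             # hypothetical service port

def request_scan(host=RADAR_HOST, port=RADAR_PORT):
    """Connect to the radar server, request one scan, return the raw bytes."""
    with socket.create_connection((host, port), timeout=5.0) as sock:
        sock.sendall(b"GET_SCAN\n")      # hypothetical command
        chunks = []
        while True:
            data = sock.recv(4096)
            if not data:                 # server closes after sending the scan
                break
            chunks.append(data)
    return b"".join(chunks)
```

Because the server simply answers requests, several such clients can run concurrently on different machines, as described above.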
By isolating the client from the server in this man-
ner, we achieve:
- Efficient utilisation of resources, because:
  - The radar processor is optimised for performing the signal processing and controlling the radar scanner.
  - The visualisation client is optimised for visualising data and can be (and will usually be) a different platform and processor to that used by the radar.

- Multiple visualisation clients can access the radar simultaneously from a variety of different physical locations remote to the radar.

- Visualisation of the data can be tailored to the user.

Visualisation of the radar data is achieved by converting the peaks in the radar data to a set of 3D points coloured by the intensity of the peak. This is a simple polar to Cartesian transform, since the range, elevation and azimuth are known from the radar data. Thus a scene is built up as a set of points whose colours range from blue (minimum intensity) to red (maximum intensity). The scene is then rendered by the client.

In the current implementation of the software, options are provided for visualising the data as:

- Points representing a radar return, coloured by intensity.

- Delaunay 2D triangulation [7] of the points. Useful if the data is primarily 2D, for example a surface.

- Delaunay 3D triangulation of the points. Useful if the data being represented is 3D.

- Volume rendering of the data. Useful when the data has no inherent structure or the dimensionality of the data is high relative to what 2D or 3D visualisation techniques can represent.

By using these options, various parameters may be adjusted in order to obtain the best visualisation for a particular application. Other options, like contouring of surfaces, can be easily added.

Figure 4: The Target Region

6 Results

The following results show data taken from a region outside the offices of the ACFR. An aerial image of the region is shown in Figure 4.

The data from the radar is returned as range, bearing, azimuth triplets which are then plotted by the visualisation client in real-time. The ability to fly around the data on a graphics workstation greatly aids the ability to interpret the data set; nevertheless, the following figures give some indication of the quality and resolution of the data generated.

Figure 5 shows the point cloud generated by the system in around 2 to 3 minutes. The overhead viewpoint closely matches that shown in Figure 4. The sensor is located at the centre of the blank circle. In Figure 6, the same data is shown from a southerly viewpoint.

In the images there are several artifacts due to the sensing process. The dark annulus in the centre of the images represents many high intensity returns from the grassy region immediately surrounding the sensor. The regions where no data points are visible are due to shadowing, where a medium to high density object blocks the view of the sensor into the region behind it.

To demonstrate different visualisation techniques, a small section of data has been chosen for evaluation. This portion of data was chosen to be a tree with some flat ground surrounding it, as seen in Figure 7. This data set was chosen to highlight how the different visualisation techniques suit the contrasting structures.

As seen in Figure 8, Delaunay triangulation is a naturally intuitive representation of the ground. The surface is coloured according to the intensity of the radar returns, which gives some information about the radar reflective property of the surfaces observed.

To assist in the association between the rendered

Figure 5: Radar Data Top Down View

Figure 6: Radar Data Southern View

Figure 7: Camera image of tree data set

Figure 8: 3D Delaunay surface visualisation of the tree data set, from two view points.

Figure 9: 3D Delaunay surface visualisation of the tree data set, with texture mapping, from two view points.

Figure 10: Combined volumetric and surface visualisation of the tree data set, from two view points.

terrain image and the environment as observed by a human being, a more meaningful choice of colour can be used than the intensity of the radar's observation. A logical choice would be to sense the illuminated light from each part of the terrain and use this value to give colour to the 3D map. The results of this process can be seen in Figure 9, where a basic fusion between the radar map and a camera image is performed.

Although the three-dimensional Delaunay surface is a good representation of the ground, the tree's internal structure is obscured by the surface. This suggests that an alternate visualisation technique would be more suitable for examining low to medium density structures. The ability to set the opacity of points when volume rendering is beneficial in such structures, by allowing the density of the object to be directly observed. A combined representation of volume rendering for the tree, and surface rendering for the ground, can be seen in Figure 10. Here a simple clustering technique has been used to segment the two structures. The points which composed the tree structure, along with their uncertainty, were approximated by Gaussian distributions, where the variance is computed from the range and angular resolution. Using this technique, data between sampled points can be interpolated by the sum of all the Gaussians present. Further explanation of this technique can be found in [8].

As a final note, experimental results have shown that the images presented here do not differ to any noticeable degree from those taken during heavy rain, through fog and through dust (except for the camera derived data, which degrades noticeably).

7 Conclusion

If one of the primary purposes of a visualisation system is to improve productivity by facilitating the operation or automation of a machine during periods of poor visibility, then a radar unit is the obvious choice as:

- It will function under all weather conditions.

- It is as safe as, if not safer than, any laser scanners now on the market.

- It can produce true 3D images by measuring range, bearing and azimuth independently of any machine movement.

The preliminary results of this research are extremely promising, and will only improve as the system is upgraded.

Acknowledgements

This research is supported by the CMTE (The CRC for Mining Technology and Equipment), an Australian Commonwealth funded cooperative research centre (CRC).

References

[1] E. Brookner, Radar Technology, chapter 15, pp. 213-226, Artech House, 1986.

[2] P. Bhartia and I. J. Bahl, Millimeter Wave Engineering and Applications, John Wiley and Sons, 1984.

[3] F. Nathanson, Radar Design Principles, McGraw Hill, 1969.

[4] K. Button and J. Wiltse (eds), Infrared and Millimeter Waves: Millimeter Systems, vol. 4, Academic Press, 1981.

[5] G. Brooker, Active Sensors for Local Planning in Mobile Robotics, chapter 8, pp. 137-164, World Scientific, 2001.

[6] H. Kirsten, Applying the DFT (FFT), Lecture notes, University of Stellenbosch, 1979.

[7] H. Hoppe, T. DeRose, T. Duchamp, J. McDonald, and W. Stuetzle, Surface reconstruction from unorganized points, Computer Graphics (SIGGRAPH), vol. 26, no. 2, pp. 71-78, 1992.

[8] S. Majumder, Sensor Fusion and Feature Based Navigation for Subsea Robots, Ph.D. thesis, The University of Sydney, 2001.

