http://jcupt.xsw.bupt.cn
Department of Computer Science and Technology, Tsinghua University, Beijing 100084, China
Abstract
Traffic light detection and recognition is essential for autonomous driving in urban environments. A camera-based algorithm for real-time, robust traffic light detection and recognition, designed specifically for autonomous vehicles, was proposed. Although current traffic light recognition algorithms operate reliably, most of them are designed for detection from a fixed position, and their performance on autonomous vehicles under real-world conditions is still limited. Some methods achieve high accuracy on autonomous vehicles, but they cannot work without the aid of a high-precision prior map. The authors presented a camera-based algorithm for this problem. The image processing flow can be divided into three steps: pre-processing, detection and recognition. First, the red-green-blue (RGB) color space is converted to hue-saturation-value (HSV) as the main part of pre-processing. In the detection step, an a priori color threshold method is used for initial filtering, and prior knowledge is applied to scan the scene in order to quickly establish candidate regions. For recognition, histogram of oriented gradients (HOG) features and a support vector machine (SVM) are used to recognize the state of the traffic light. The proposed system was evaluated on our autonomous vehicle. With voting schemes, the proposed method provides sufficient accuracy for autonomous vehicles in urban environments.
Keywords autonomous vehicle, traffic light detection and recognition, histogram of oriented gradients
1 Introduction
Over the past few decades, many attempts have been made toward autonomous vehicles. Nowadays, driving on highways with autonomous vehicles has become more and more reliable [1], while fully autonomous driving in real urban environments is still a tough and challenging task [2]. Robust detection and recognition of traffic lights is essential for an autonomous vehicle to take appropriate actions at intersections in urban environments. However, robust detection of traffic lights is not easy to carry out, for an image may contain many objects whose colors are similar to those of traffic lights, and the shape of a traffic light is so simple that it is hard to extract sufficient features [3]. Worse still, traffic lights come in a variety of types: some are arranged horizontally while others are vertical (Fig. 1).
Guo Mu, et al. / Traffic light detection and recognition for autonomous vehicles
2.1 Hardware implementation
Fig. 2
Sensor type        Mounting position   Function               Coverage
Camera1            Front               Traffic lights/signs   30 m~90 m, 24°
Camera2            Front               Lane-marks             4 m~30 m, 72°
Camera3            Front               Obstacles              4 m~120 m, 72°
SICK lidar1        Front               Front obstacles        0 m~50 m, 180°
SICK lidar2        Top                 Front obstacles        0 m~50 m, 180°
IBEO Lux4          Top                 Road boundary          0 m~200 m, 72°
Millimeter radar   Rear                Rear obstacles         0 m~200 m, 72°
2.2 Software architecture
Fig. 3
3 Proposed algorithm
The authors used an off-the-shelf camera (AVT Pike F-100C) as the vision sensor for detecting traffic lights. This camera is mounted behind the windshield of our vehicle.
3.1 Pre-processing
3.2 Detection
pixel(m) = { green, if H(m) ∈ [173.1, 216.3], S ≥ 100, V ≥ 50
           { other, otherwise                                        (3)

where S is the saturation component and V is the value component of pixel m.
After color segmentation, if the detected pixels form connected regions, we can compute their bounding boxes.
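The color-segmentation rule above can be sketched in Python. The green hue interval [173.1, 216.3] and the S/V thresholds are taken from the text; the function name and the use of the standard-library colorsys module are illustrative assumptions (the red interval is not given in this excerpt, so only the green branch is shown):

```python
import colorsys

# Thresholds from the text: green hue interval in degrees,
# saturation/value thresholds on a 0..255 scale.
GREEN_HUE = (173.1, 216.3)
S_MIN, V_MIN = 100, 50

def classify_pixel(r, g, b):
    """Classify an 8-bit RGB pixel as 'green' or 'other' (sketch)."""
    # colorsys works on 0..1 floats and returns h, s, v in 0..1.
    h, s, v = colorsys.rgb_to_hsv(r / 255.0, g / 255.0, b / 255.0)
    h_deg, s8, v8 = h * 360.0, s * 255.0, v * 255.0
    if GREEN_HUE[0] <= h_deg <= GREEN_HUE[1] and s8 >= S_MIN and v8 >= V_MIN:
        return 'green'
    return 'other'
```

In a full pipeline, this per-pixel test would be applied to the whole frame, after which connected regions of matching pixels yield the candidate bounding boxes described above.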
Fig. 5
and label the traffic light pixel region with alignment to the location and heading angle of the host vehicle. When we come to the same intersection again, accurate regions for traffic lights can be obtained by linear interpolation based on real-time data from the high-precision INS.
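The interpolation step can be illustrated with a minimal sketch. The 1-D distance parameterization, the (x, y, w, h) bounding-box representation, and all names are assumptions, since the paper does not give the exact formulation:

```python
def interpolate_region(d, d0, box0, d1, box1):
    """Linearly interpolate a labeled bounding box (x, y, w, h)
    between two reference poses d0 and d1 (e.g. distances to the
    intersection reported by the INS) for the current pose d."""
    t = (d - d0) / (d1 - d0)
    return tuple(a + t * (b - a) for a, b in zip(box0, box1))
```

For example, halfway between two labeled poses the sketch returns the box midway between the two labeled boxes.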
3.3 Recognition
Fig. 6
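The HOG features mentioned in the abstract can be illustrated with a highly simplified single-cell orientation histogram. A real system would use a full HOG descriptor (cells, blocks, block normalization) and a trained SVM classifier; this stdlib-only sketch is illustrative only:

```python
import math

def hog_cell_histogram(patch, n_bins=9):
    """Orientation histogram of gradients over one cell (sketch).

    patch: 2-D list of grayscale values.
    Returns n_bins magnitude-weighted bins covering 0..180 degrees.
    """
    h, w = len(patch), len(patch[0])
    hist = [0.0] * n_bins
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            # Central-difference gradients.
            gx = patch[y][x + 1] - patch[y][x - 1]
            gy = patch[y + 1][x] - patch[y - 1][x]
            mag = math.hypot(gx, gy)
            ang = math.degrees(math.atan2(gy, gx)) % 180.0
            hist[int(ang / 180.0 * n_bins) % n_bins] += mag
    return hist
```

Descriptors of this kind, computed over the candidate region, would be fed to the SVM to decide the light state.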
4 Experiments
We ran the proposed algorithm on autonomous vehicles
and made tests in a real urban environment. The computer is equipped with an Intel i7 quad-core processor with 8 GB memory. The images are captured at 1 000 × 1 000 pixels at 30 frames/s.
The experiment was done without high-precision INS at
different times of the day. Therefore, the traffic light
images include varying illumination, scale and pose. The
intersection state (go/stop) detection rate is shown in
Fig. 6.
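The voting scheme mentioned in the abstract can be sketched as a simple majority vote over recent per-frame results; the window size and all names are assumptions:

```python
from collections import Counter

def vote(frame_states, window=5):
    """Return the majority intersection state ('go'/'stop') over
    the last `window` per-frame detection results (sketch)."""
    recent = frame_states[-window:]
    return Counter(recent).most_common(1)[0][0]
```

Such temporal filtering suppresses isolated per-frame misclassifications before the state is passed to the planner.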
Fig. 7
Fig. 8
Fig. 9
Correct   Missing   False alarm
684       35        43
921       64        14
102       17        1
184       22        0
84        12        2
204       19        1
34        6         3
264       31        2
References
1. Luettel T, Himmelsbach M, Wuensche H J. Autonomous ground vehicles: Concepts and a path to the future. Proceedings of the IEEE, 2012, 100(Special Centennial Issue): 1831–1839
2. Levinson J, Askeland J, Becker J, et al. Towards fully autonomous driving: Systems and algorithms. Proceedings of the 2011 IEEE Intelligent Vehicles Symposium (IVS'11), Jun 5–9, 2011, Baden-Baden, Germany. Piscataway, NJ, USA: IEEE, 2011: 163–168
3. Buch N, Velastin S A, Orwell J. A review of computer vision techniques for the analysis of urban traffic. IEEE Transactions on Intelligent Transportation Systems, 2011, 12(3): 920–939
4. de Charette R, Nashashibi F. Real time visual traffic lights recognition based on spot light detection and adaptive traffic lights templates. Proceedings of the 2009 IEEE Intelligent Vehicles Symposium (IVS'09), Jun 3–5, 2009, Xi'an, China. Piscataway, NJ, USA: IEEE, 2009: 358–363
5. Chung Y C, Wang J M, Chen S W. A vision-based traffic light detection system at intersections. Journal of Taiwan Normal University: Mathematics, Science and Technology, 2002, 47(1): 67–86
6. Yung N H C, Lai A H S. An effective video analysis method for detecting red light runners. IEEE Transactions on Vehicular Technology, 2001, 50(4): 1074–1084
7. Gong J W, Jiang Y H, Xiong G M, et al. The recognition and tracking of traffic lights based on color segmentation and CAMSHIFT for intelligent vehicles. Proceedings of the 2010 IEEE Intelligent Vehicles Symposium (IVS'10), Jun 21–24, 2010, San Diego, CA, USA. Piscataway, NJ, USA: IEEE, 2010: 431–435
8. Hwang T H, Joo I H, Cho S I. Detection of traffic lights for vision-based car navigation system. Advances in Image and Video Technology: Proceedings of the 1st Pacific Rim Symposium (PSIVT'06), Dec 10–13, 2006, Hsinchu, China. LNCS 4319. Berlin, Germany: Springer, 2006: 682–691
9. Fairfield N, Urmson C. Traffic light mapping and detection. Proceedings of the 2011 IEEE International Conference on Robotics and Automation (ICRA'11), May 9–13, 2011, Shanghai, China. Piscataway, NJ, USA: IEEE, 2011: 5421–5426