
How autonomous vehicles work

Autonomous vehicles, otherwise known as uninhabited autonomous vehicles (UAV), autopilot vehicles, driverless cars, auto-drive cars, or automated/autonomously guided vehicles (AGV), are regarded as intelligent machines. Alan Turing, in his classic 1950 paper "Computing Machinery and Intelligence," asked, "Can machines think?" Of course, this raises the question of what it means to think. Yet if we set apart mental activity that is done not by a human but by an artificial device, and upon which we rely to take action that normally would follow from our own thoughts, then we have at least some elements of artificial thinking. Recall (memory), computation, and pattern recognition are activities that computers can do better than humans, and we traditionally have regarded these as thinking. Navigating a car can be done entirely by a human thinking, but often better by a computer, simply because of the precision.

A system of systems design goals


No single component of the system that drives such a vehicle is responsible on its own; rather, a collection of components works together.

Typical basic autonomous vehicle configuration using lasers, radar, and camera [1]

Since the 1950s, rapid development of intelligent systems has occurred, including tire monitoring, human-machine interfaces, assisted steering, brake management, navigation, and now semiautonomous and autonomous driving. Numerous different systems have been deployed, giving drivers directions through the Global Positioning System (GPS) as to the best routes, both audibly and by text [2]. The only step remaining was to coordinate the cars with each other, and full automation appeared to be the optimal way of doing this. In 1997, the California Partners for Advanced Transit and Highways (PATH) program demonstrated that a row of cars could be driven automatically down the highway at normal speeds with close spacing [3].

California PATH Project [4]

A very wide network of interconnected solutions is needed, however. Vehicles can be treated as parts of a large organism when viewed all at once, rather than as isolated personal units. That means there has to be an integrated transport system that coordinates traffic flow in order to make the system work optimally. Drivers may have the choice to override the recommendations of a car's computer, but doing so could cause grave complications the integrated system might not be able to handle, as a change in one component, a car, may well affect how the others navigate. Perhaps research from swarm theory will be able to guide us, but at this stage, an integrated transit control system operates not unlike a packet of information being routed over a communications network by sophisticated management software, or like a power grid, where one disruption can cause the whole network to go down. How much independence a driver should have will have to be assessed in such contexts. People may perforce have less of a choice about being in total charge of where and when they will navigate.

We see the emergence of mesh networks, where cars communicate with each other, much in the same way people do when walking in crowded spaces, keeping spaced apart by selecting the most appropriate routes. For traffic management, it may be a matter of necessity to have a person entering a crowded area be switched to a network and driven according to the needs computed by the traffic management software. This is not much different from an air traffic control system, where each airliner lands in turn, given the configuration of airport traffic. High Occupancy Vehicle (HOV) lanes are a precursor to what we may expect in selective autonomous traffic control. At this juncture, it appears almost necessary to have a fully autonomous vehicle that can respond to such a system. How feasible is this, and how would it work?
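The packet-routing analogy can be made concrete: a traffic management system could assign each car a path through the road network with a shortest-path search, much as a router forwards a packet along least-cost links. The following is a minimal sketch in Python, assuming a hypothetical road graph whose edge weights represent current congestion; the function name and toy network are illustrative, not taken from any deployed system.

```python
import heapq

def shortest_route(graph, start, goal):
    """Dijkstra's algorithm: route a car through a road graph the way
    a router forwards a packet, always expanding the cheapest path first.
    graph maps node -> list of (neighbor, travel_cost) pairs."""
    queue = [(0, start, [start])]
    seen = set()
    while queue:
        cost, node, path = heapq.heappop(queue)
        if node == goal:
            return cost, path
        if node in seen:
            continue
        seen.add(node)
        for neighbor, weight in graph.get(node, []):
            if neighbor not in seen:
                heapq.heappush(queue, (cost + weight, neighbor, path + [neighbor]))
    return float("inf"), []

# Toy road network: costs reflect current congestion, not just distance.
roads = {
    "A": [("B", 2), ("C", 5)],
    "B": [("C", 1), ("D", 4)],
    "C": [("D", 1)],
}
cost, path = shortest_route(roads, "A", "D")
```

In a mesh setting, the management software would rerun such a search as congestion weights change, reassigning routes the way a network reroutes packets around a failed link.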

Major system elements in an autonomous vehicle


There are many types of autonomous vehicle systems, each using a different combination of cameras, lasers, sensors, radar, and wireless navigation, such as web-based systems, speech recognition devices, transponders, and satellites, so there is no typical description [5]. However, autonomous navigation and control have been built on previously developed computer-assisted subsystems, including steering, collision avoidance, braking, and navigation information, which usually is routed and displayed via human-machine interfaces. To do all this, it is common to see the use of the Controller Area Network (CAN) protocol, a serial communications bus that carries the data. It is becoming a de facto standard because it uses a single interface, rather than relying upon either analog or digital input-output devices specific to each system component [6].
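Two features of CAN are worth illustrating: a classic data frame carries an 11-bit identifier plus at most eight data bytes, and when several controllers transmit at once, the frame with the numerically lowest identifier wins bus arbitration. The Python sketch below models only those two ideas; the example IDs and payloads are invented for illustration and do not correspond to any real vehicle's message set.

```python
from dataclasses import dataclass, field

@dataclass(order=True)
class CanFrame:
    """Simplified classic CAN data frame: an 11-bit identifier plus up to
    8 data bytes. On the real bus a lower identifier wins arbitration,
    which dataclass ordering mimics here (frames compare by can_id)."""
    can_id: int                               # 11-bit identifier; doubles as priority
    data: bytes = field(compare=False, default=b"")

    def __post_init__(self):
        if not 0 <= self.can_id <= 0x7FF:
            raise ValueError("standard CAN IDs are 11 bits")
        if len(self.data) > 8:
            raise ValueError("classic CAN payload is at most 8 bytes")

# Several controllers try to transmit at once; the bus grants the lowest ID.
pending = [
    CanFrame(0x244, b"\x12\x34"),    # e.g. a wheel-speed report
    CanFrame(0x0A0, b"\x01"),        # e.g. a brake command (higher priority)
    CanFrame(0x5F0, b"\xff" * 8),    # e.g. diagnostic data
]
winner = min(pending)                # arbitration: lowest identifier wins
```

The single shared bus is what lets one interface replace component-specific analog and digital I/O: every subsystem publishes and consumes frames on the same wire, with the identifier scheme deciding who gets through first.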

Human-Machine Interfaces - from Bosch [7]

On-board diagnostic (OBD) systems are integrated, such as those involving brakes, steering control (e.g., Active Front Steering (AFS), power-assisted steering, and anti-skid), tire monitoring, engine and electrical system condition warning, and cooling systems. The car must have both lateral and longitudinal controls: lateral to maintain it on the road, and longitudinal to control its place among other cars, including speed regulation. These systems receive information from the guidance system to move the car. For lateral control, there are Lane Departure Warning Systems (LDWS) and Lane Keeping Assist Systems (LKA); parking assist systems also exist. Lateral control means avoiding sideswipe crashes and veering off course onto and off the shoulder of the road. Longitudinal control involves speed regulation, such as with Adaptive Cruise Control (ACC) and pre-crash brake assist. The adaptive part of ACC depends upon radar or sensors to detect distances, and speed is then adjusted accordingly. Navigation is often accomplished by correlation between a real-time satellite-generated map and sensor data. Route selection can be by the satellite map, web-based, or through a personal digital assistant (PDA). Cloud computing applications are coming into vogue, with constantly updated information being used to plot routes and provide weather and traffic information [8].
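The longitudinal side of this can be sketched simply: ACC holds the driver's set speed until the radar-measured gap to the car ahead falls below a desired headway, then slows in proportion to the shortfall. The Python below is a toy proportional controller under assumed constants (a 1.8-second time gap and a gain of 0.5), not a production control law.

```python
def acc_speed_command(own_speed, gap_m, set_speed, time_gap_s=1.8, kp=0.5):
    """Adaptive-cruise-control sketch: cruise at the driver's set speed
    unless the radar-measured gap (metres) to the lead car falls below
    the desired time-gap headway, then slow proportionally to the gap
    error. Speeds are in m/s; constants are illustrative only."""
    desired_gap = own_speed * time_gap_s       # desired spacing in metres
    if gap_m >= desired_gap:
        return set_speed                       # road clear: hold set speed
    # Too close: reduce speed in proportion to how short the gap is.
    error = desired_gap - gap_m
    return max(0.0, own_speed - kp * error)

# At 25 m/s with a 1.8 s time gap, the car wants 45 m of headway.
cmd_far = acc_speed_command(own_speed=25.0, gap_m=60.0, set_speed=30.0)
cmd_near = acc_speed_command(own_speed=25.0, gap_m=35.0, set_speed=30.0)
```

With a 60 m gap the controller returns the 30 m/s set speed; at 35 m it commands a slowdown, which is the "adaptive" behavior the radar enables.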

From Autonomous Cars and Society [9]

An example of a detection system is Stanford University's Junior project, a winning entry in the U.S. Defense Advanced Research Projects Agency (DARPA) Urban Challenge, in which vehicles are driven autonomously in a somewhat urban environment. The project carries forth research on LIght Detection And Ranging, or Laser Imaging Detection and Ranging (LIDAR), to create maps of a car's environment and use them to navigate. Maps with locations of people and objects are continuously generated from this with centimeter accuracy, and the data are used to determine an actual path. The Stanford project description gives an idea of the complexity of devices: "The current suite of sensors includes a Velodyne HDL-64E S2 rotating 64-beam LIDAR and four cameras from Point Grey: a Ladybug3 spherical camera, two color Flea2s for forward stereo vision, and a Grasshopper for high resolution forward monocular vision. Also present are 6 Bosch automotive radars covering the front, rear, and side views, two SICK LD-LRS LIDAR scanners covering the blind spots, and an Applanix POS-LV 420 inertial GPS navigation system" [10].

One method of guidance involves detecting, via cameras, the general edges of the road by shading. There are obvious cases where lines demarcate the sides of the road, but in darkness or in undefined situations, such as on rural roads or where snow or rain obscures road edges, navigation depends upon being able to discern contrasts of light. Accurate guidance assumes that the central path of the road will be a shade different from the edges. Steering is determined by constant sampling and comparison between that center and what lies on either side, with statistical algorithms calculating the direction the car takes. In addition, sensors such as proximity detectors detect obstacles, there being none in the center line but some, such as vegetation and railings, to the sides. These methods, coupled with GPS and real-time detailed digital maps generated from satellites, enable route selection. While embedding markers in the road that are detectable by car sensors would ensure accurate steering, that method would be impractical, given the number of roads and sensors involved; painting special lines on a road would be a more efficacious way of marking an accurate route.
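The shading comparison can be illustrated with a toy example: given one row of grey-level pixels from a forward camera, mark the pixels whose brightness is close to the known road shade and steer toward the centre of that band. This Python sketch uses made-up intensity values and a hypothetical tolerance; a real system would apply statistical estimation over many samples, as described above.

```python
def steering_offset(scanline, road_shade, tol=20):
    """Shading-based guidance sketch: scan one camera row of grey-level
    pixel intensities, mark those within `tol` of the expected road
    shade, and aim for the centre of that band. Returns the offset in
    pixels from the image centre (negative means steer left), or None
    if no road-shaded pixels are found."""
    road_pixels = [i for i, v in enumerate(scanline) if abs(v - road_shade) <= tol]
    if not road_pixels:
        return None                   # road shade lost: defer to other sensors
    road_centre = sum(road_pixels) / len(road_pixels)
    return road_centre - (len(scanline) - 1) / 2

# Dark verges (intensity 40) flanking a lighter road surface (120).
row = [40] * 3 + [120] * 5 + [40] * 2   # road band occupies indices 3..7
offset = steering_offset(row, road_shade=120)
```

Here the road band sits slightly right of the image centre, so the offset is a small positive number; repeating this comparison on every frame is the "constant sampling" the guidance method relies on.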

The future of autonomous vehicles


Already, we see the computer is an extension of the mind, doing things like sorting data in a more efficient memory than humans have and doing in a microsecond exceedingly complex calculations that ordinarily would take years. As a next step beyond autonomously driven vehicles, engineers led by Raul Rojas at the Free University of Berlin in Germany began testing in 2011 a largely autonomous car that can be controlled by a driver's mind through a set of sensors placed on the driver's head [11]. This certainly goes beyond recently introduced technologies such as speech recognition devices used to control vehicles.

Certain challenges lie ahead for further autonomous vehicle development. As there are a number of different methods for driving autonomous vehicles, there needs to be standardization in order for all of them to work seamlessly in a mesh environment. They must be interoperable. Vehicles must be able to communicate with others and to coordinate and plan actions, independent of human intervention. If a human does intervene, the autonomous system will have to be able to manage that. The issue of verification and validation must be overcome: one must be able to predict what will happen in ordinary situations, and in extraordinary ones, where there is unexpected behavior, or emergence, a safe-fail system must be in place. That is, failure must be managed safely. Autonomous vehicles will have to be able to operate in environments where some components, such as GPS or mapping capabilities, may be compromised or even lost, for example by blockage from buildings or trees. Override systems must exist, whereby errant vehicles can be brought under control in a safe manner. This all means that if an autonomous vehicle is deployed in an environment in which behavior is not 100% predictable, a driver must be present.

Another challenge is regulating human-machine interaction where a decision made by the autonomous system is correct but the human may want to intervene because it does not appear intuitively correct; in this case the human would be wrong and the car correct. An autonomous vehicle must also be able to manage a situation where another car is conventional or semi-autonomous and where human behavior is not predictable. Vehicle-to-vehicle and vehicle-to-net (where the network makes navigation decisions) communication is being developed, but this must be perfected. From a practical point of view, the equipment used to navigate occupies a great deal of space, and the hardware will have to be further miniaturized. The added weight and space consumption increase fuel usage and reduce baggage capacity. For humans, information relevant to vehicle control must be presented in a layperson's manner.
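One of these requirements, degrading safely when GPS or map data is blocked or a human intervenes, can be sketched as a small supervisory state machine. The states, inputs, and policy below are hypothetical, intended only to show the shape of a fail-safe design, not any manufacturer's actual logic.

```python
# Hypothetical supervisor states for an autonomous vehicle.
NORMAL, DEGRADED, SAFE_STOP, MANUAL = "normal", "degraded", "safe_stop", "manual"

def supervise(gps_ok, map_ok, driver_override):
    """Fail-safe sketch: pick an operating mode from sensor health and
    driver input. Degrade gracefully when GPS or map data is lost (e.g.
    blocked by buildings or trees) and stop safely rather than continue
    blind; a driver override is managed, not fought."""
    if driver_override:
        return MANUAL             # human intervention must be accommodated
    if gps_ok and map_ok:
        return NORMAL             # full capability: drive autonomously
    if gps_ok or map_ok:
        return DEGRADED           # one source left: slow down, widen margins
    return SAFE_STOP              # both lost: pull over on local sensing only

mode = supervise(gps_ok=False, map_ok=True, driver_override=False)
```

The point of the sketch is that every combination of failures maps to an explicit, pre-verified behavior, which is what "failure must be managed safely" demands of verification and validation.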

References (Subject is indicated by URL accessed 2 August 2011)


[1] http://www2.ece.ohio-state.edu/citr/Demo97/osu-av.html
[2] http://reviews.cnet.com/best-gps/
[3] https://docs.google.com/viewer?url=http://www.ece.gatech.edu/~magnus/Papers/UGCPaper.pdf&embedded=true&chrome=true
[4] https://docs.google.com/viewer?url=http://www.ece.gatech.edu/~magnus/Papers/UGCPaper.pdf&embedded=true&chrome=true
[5] https://docs.google.com/viewer?url=http://www.ece.gatech.edu/~magnus/Papers/UGCPaper.pdf&embedded=true&chrome=true
[6] http://en.wikipedia.org/wiki/Controller_area_network, https://docs.google.com/viewer?url=http://focus.ti.com/lit/an/sloa101a/sloa101a.pdf&embedded=true&chrome=true, http://zone.ni.com/devzone/cda/tut/p/id/2732
[7] http://www.bosch-presse.de/presseforum/details.htm?locale=en&txtID=5156
[8] http://www.isuppli.com/automotive-infotainment-and-telematics/news/pages/automotivenavigation-heads-into-the-cloud.aspx
[9] https://docs.google.com/viewer?url=http://www.wpi.edu/Pubs/E-project/Available/E-project043007-205701/unrestricted/IQPOVP06B1.pdf&embedded=true&chrome=true
[10] http://people.csail.mit.edu/kolter/lib/exe/fetch.php?media=pubs:levinson-iv2011.pdf - Towards Fully Autonomous Driving...
[11] http://www.newscientist.com/blogs/onepercent/2011/02/mind-control-puts-you-in-charg.html

Resources (Subject is indicated by URL accessed 2 August 2011)


https://docs.google.com/viewer?url=http://www.spacetech.tudelft.nl/fileadmin/Faculteit/LR/Opleidingen/SpaceTech/Central_Case_Project/doc/ST3_Worldwide_Inter-Modal_Navigation_System.pdf&embedded=true&chrome=true

http://e2af.com/trend/080212.shtml

http://www2.ece.ohio-state.edu/citr/Demo97/osu-av.html

http://wn.com/automobile_navigation_system - compares different types

Autonomous Cars and Society - https://docs.google.com/viewer?url=http://www.wpi.edu/Pubs/Eproject/Available/E-project-043007205701/unrestricted/IQPOVP06B1.pdf&embedded=true&chrome=true

http://www.intempora.com/en/projects/automotive/adas/carsense

http://en.wikipedia.org/wiki/Controller_area_network

http://www.transport-research.info/web/projects/project_details.cfm?id=15260&page=outline

Stanford robot car "Junior" in action, DARPA Urban Challenge - http://www.youtube.com/watch?v=BSS0MZvoltw

Neural networks - https://docs.google.com/viewer?url=http://husky.if.uidaho.edu/pubs/2011/IJCNN11_RavMan_HWNeuralAutonVehPathTracking.pdf&embedded=true&chrome=true

Transponder use - http://www.mikechiafulio.com/RIDE/system.html

Path selection method - http://ftp.utcluj.ro/pub/docs/imaging/Autonomous_driving/Articole%20sortate/TUBraunschweig/iv980304.pdf

Lane change algorithm for autonomous vehicles via virtual curvature method - http://findarticles.com/p/articles/mi_m5CYM/is_1_43/ai_n31126282/

Multi-Sensor Lane Finding in Urban Road Networks - https://docs.google.com/viewer?url=http://www.roboticsproceedings.org/rss04/p1.pdf&embedded=true&chrome=true

Automating Acceptance Tests - https://docs.google.com/viewer?url=http://www.se-rwth.de/books/DissBerger.pdf&embedded=true&chrome=true

http://www.eetimes.com/design/embedded-internet-design/4214506/Speech-recognition-in-the-car

http://www.bosch-presse.de/presseforum/details.htm?locale=en&txtID=5156

https://docs.google.com/viewer?url=http://www2.selu.edu/Academics/Faculty/ck/paps/JFR.pdf&embedded=true&chrome=true

Collision avoidance - http://e2af.com/trend/080117.shtml

Excellent history summary of UAVs - http://faculty.washington.edu/jbs/itrans/bishopahs.htm

http://www.ivsource.net/archivep/2000/jul/a000731_carsense.html
