On the road to autonomous driving, sensor technologies such as ultrasound, radar and lidar are constantly evolving. Used to complement one another, they give modern vehicles a 360° view of their surroundings. Even now, that is enabling automated driving functions such as motorway driving.
Hidden behind bodywork parts and plastic covers, modern cars conceal a whole lot of technology, including vehicle sensor systems with their various “sense organs”. Whether it is a GPS receiver for the navigation system or ultrasound for the parking assist – sensor systems help the vehicle and driver cope better with their environment. But how exactly do these various technologies work, and what is still missing from modern cars before they will be able to move through traffic completely independently?
Modern cars come with a variety of sensor devices. Since the early 1990s, these have included ultrasound sensors. Back then, they were fitted in cars primarily as park assist devices. These days, their range of functions has expanded significantly: they can measure parking spaces as you maneuver and recognize vehicles driving in the lane next to you (blind spot assist). The technology is based on acoustic signals, which is why it performs best at short range. Ultrasound sensors are cheap and robust, and are set to retain their place in the arsenal of sensor systems for some time to come.
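The acoustic principle behind ultrasound ranging can be illustrated with a short sketch. This is not any manufacturer's actual implementation, just the underlying time-of-flight arithmetic: the sensor emits a sound pulse, times the echo, and halves the round trip.

```python
# Illustrative sketch of ultrasonic distance estimation (not production code):
# a park-assist sensor emits a sound pulse and measures the echo's
# round-trip time. Distance = speed of sound * time / 2.

SPEED_OF_SOUND_M_S = 343.0  # in air at roughly 20 °C


def echo_distance_m(round_trip_s: float) -> float:
    """Distance to an obstacle, given the echo's round-trip time in seconds."""
    return SPEED_OF_SOUND_M_S * round_trip_s / 2.0


# A round trip of about 5.83 ms corresponds to roughly 1 m.
print(round(echo_distance_m(0.00583), 2))  # → 1.0
```

The halved round trip also makes clear why ultrasound is limited to short range: sound is slow and attenuates quickly in air, so longer distances are the domain of radar and lidar.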
The same goes for radar sensors and cameras, which are fitted in more and more vehicles today. Cameras are used, amongst other things, to identify lane markings, road signs, traffic lights and other road users. For lane recognition, a video camera is mounted behind the windshield, and it is used to monitor lane positioning and to alert the driver in the event of unintentional drifting out of lane. People and animals can be recognized by cameras even in the dark, thanks to infrared technology. And beyond that, for some time now cameras have also been offered as a park assist equipment option. In this case, they allow the driver to have an even better all-round view. Thanks to stereo cameras and modern algorithms, it is even possible to have 3D images from a bird’s-eye perspective.
Radar systems work with electromagnetic signals, which is why they are particularly good at sensing metal objects and well suited to measuring distances between vehicles. The radar sensors are normally fitted to the front and rear of the vehicle. While the backward-facing sensor records cars approaching from behind and overtaking traffic, a long-range radar at the front monitors the traffic traveling ahead. Short-range radar monitors the area immediately around the car.
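The distance measurement works on the same time-of-flight idea as ultrasound, only with radio waves traveling at the speed of light. As a simplified sketch (real automotive radars typically use frequency-modulated continuous-wave techniques rather than timing a single pulse):

```python
# Simplified sketch of radar ranging (real systems use FMCW, this
# illustrates only the time-of-flight principle).

SPEED_OF_LIGHT_M_S = 299_792_458.0


def radar_range_m(round_trip_s: float) -> float:
    """Distance to a target, given the signal's round-trip time in seconds."""
    return SPEED_OF_LIGHT_M_S * round_trip_s / 2.0


# A 1 microsecond round trip puts the vehicle ahead at about 150 m,
# comfortably within long-range radar territory.
print(round(radar_range_m(1e-6), 1))
```

The microsecond timescales involved are why radar hardware is more complex than ultrasound, and why it covers the long-range duties that acoustic sensors cannot.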
Radar, cameras and ultrasound sensors were used in the past for functions that were independent of one another. Nowadays, though, all relevant data can be linked digitally using sensor fusion, which is a key precondition for automated driving. Particular emphasis is given here to functional safety. Redundancies and plausibility checks serve as system-internal monitoring and reliably prevent misinterpretation of the data. For this, the signals from the vehicle sensors are compared against one another. The steering and engine are only controlled if the data is consistent. Hence it is already possible today to enable automated driving functions such as motorway driving.
Lidar systems are based on optical signals. As a result, they can detect obstacles other than metal objects better than radar can. Simpler short-range lidar systems are already being used in emergency braking assists, but the truly high-performance devices are still in development. Alongside a number of smaller manufacturers such as the Intel subsidiary Mobileye, the German industry giant Continental has also been active in this market for a few years now. With its High Resolution 3D Flash Lidar, Continental is looking to enable real-time 3D monitoring of the surroundings, complete with image interpretation. The developers at Mobileye are planning something similar, and in addition to the sensor system they are also researching new data processing hardware. For 2025, the specialist is planning the market launch of a silicon-based “system-on-chip” capable of better processing the huge data volumes generated by lidar systems.
One further technology that can help in the development of autonomous driving is networking cars with each other and with the infrastructure. In this context, experts refer to car-2-X and car-2-car communication. The aim is that, using WLAN or a 5G network, the traffic infrastructure and road users should match their behaviors to each other. As yet, though, it is not fully clear how this hook-up is to be implemented, since there are still no internationally recognized standards. However, it is clear that the range of sensor systems in the car of the future will be even more extensive and capable than today's. That is a development all drivers will benefit from, as cars gain in comfort and safety.
(Stage photo: © BMW)
The IAA MOBILITY is transforming itself from a pure car show to an international mobility platform with four pillars: the Summit, the Conference, the “Blue Lane” and the downtown Munich Open Space. Under the slogan of “What will move us next”, it stands for the digital and climate-neutral mobility of the future. From 7 to 12 September 2021, the car, bike and tech industries come together at IAA MOBILITY in Munich.