
IDTechEx provides trusted independent research on emerging technologies and their markets; this report covers the camera technology needed for autonomous driving.

The jump from ADAS (advanced driver assistance systems) to autonomous driving is considerable. In ADAS, for the most part, sensors are required to detect presence. For example, in adaptive cruise control, sensors scan the road ahead to detect the vehicle in front, feeding this back to the vehicle to maintain a safe distance.

However, in autonomous driving, this sensing has to be operational at all times, in all directions, with far superior object classification required to respond to the different scenarios a driver would face on the road. To this end, the sensor requirement increases in both number and variety. The average number of cameras, radar, and ultrasonic sensors per vehicle rises dramatically from level 2 (most common on today's roads) to level 4 (autonomous driving). Furthermore, the need for excellent depth perception, combined with operation in a variety of conditions, opens the door to other sensors. IDTechEx believes that level 4 autonomy is unachievable without currently less conventional sensors, such as LiDAR or infrared cameras.

IDTechEx believes that the majority of level 4 vehicles will use longwave infrared (LWIR) cameras. In addition to their potential in passenger AEB, LWIR cameras are likely to be used in level 4 private vehicles and robotaxis for the same general reasons: detection of humans and other objects through their heat signature, and the ability to see in low-visibility conditions such as night or fog, where RGB cameras would struggle and radar alone provides no actual image data.

A key theme in all autonomous vehicles is the necessity of redundancy. If one sensor fails, multiple sensors of the same or a different type must provide redundant data so that the vehicle remains operational. To this end, even if LWIR cameras are not the main sensor for image classification or depth perception, their ability to take on these responsibilities temporarily makes them an attractive option.
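The fallback logic described above can be sketched in a few lines. This is a minimal illustration only, with hypothetical sensor names and a deliberately simplified health check; a real perception stack would fuse sensors continuously rather than switch between them.

```python
# Minimal sketch of sensor-redundancy fallback (all names hypothetical).
# If the primary RGB camera reports a fault, perception falls back to the
# LWIR camera so the vehicle can remain operational in a degraded mode.
from dataclasses import dataclass


@dataclass
class Sensor:
    name: str
    healthy: bool


def select_perception_source(primary: Sensor, fallback: Sensor) -> str:
    """Pick the sensor feeding the perception stack, preferring the primary."""
    if primary.healthy:
        return primary.name
    if fallback.healthy:
        return fallback.name
    # Neither camera is usable: the vehicle must execute a safe stop.
    raise RuntimeError("no healthy camera: trigger minimal-risk manoeuvre")


rgb = Sensor("rgb_front", healthy=False)
lwir = Sensor("lwir_front", healthy=True)
print(select_perception_source(rgb, lwir))  # falls back to "lwir_front"
```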

Furthermore, with increased responsibility on the vehicle to perceive its surroundings and make sensible decisions in the absence of an actual driver, edge cases have to be covered by the sensor suite. An example of such an edge case would be a pedestrian crossing the road at night from behind an obstacle. A visible light camera might pick up only the person's head above the obstacle, whereas an LWIR camera could perceive the danger earlier by detecting the characteristic heat signature of a human, as LWIR cameras can work at ranges above 100m. Visible light cameras have a more limited range at night, depending on the headlights and ambient lighting.

Performance requirements at higher SAE levels

Performance-wise, there are some differences between the LWIR cameras likely to be used in ADAS vs autonomous driving. While AEB is most likely to use only one LWIR camera for forward detection, level 4 vehicles could use multiple cameras to cover both sides and the rear, or in a stereovision setup, where the input from two cameras is combined to achieve superior depth perception compared to single-camera imaging. An example of this is the technology used by Foresight Automotive, where both visible light and thermal cameras can be used as part of a stereovision setup for autonomous driving.
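The depth perception gained from a stereovision setup rests on a simple geometric relation: for a calibrated camera pair, depth Z = f·B/d, where f is the focal length in pixels, B the baseline between the two cameras, and d the disparity of a matched feature between the left and right images. The sketch below illustrates the arithmetic with hypothetical values; it is not tied to any specific vendor's system.

```python
# Sketch of stereo depth from disparity, assuming a calibrated camera pair.
# Z = f * B / d  (focal length in pixels, baseline in metres, disparity in pixels)
def stereo_depth(focal_px: float, baseline_m: float, disparity_px: float) -> float:
    """Return the depth in metres of a feature matched across both cameras."""
    if disparity_px <= 0:
        # Zero disparity corresponds to a point at infinity.
        raise ValueError("disparity must be positive for a finite depth")
    return focal_px * baseline_m / disparity_px


# Hypothetical example: 1000 px focal length, 0.3 m baseline, 15 px disparity.
print(stereo_depth(1000.0, 0.3, 15.0))  # 20.0 metres
```

Note how depth resolution degrades with range: at large distances the disparity shrinks toward a pixel or less, which is one reason longer-range perception still leans on LiDAR or radar.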

While most automotive LWIR cameras top out at approximately 0.3Mp (far lower than the tens of Mp achievable with a typical visible-light camera sensor, e.g., Sony IMX), LWIR cameras for autonomous driving are likely to demand greater resolutions for object detection. Furthermore, most commercially available thermal cameras require a shutter system, meaning there are split seconds when the camera captures no image data while it recalibrates. Shutterless thermal cameras, such as those from AdaSky and Valeo (the latter targeting series production by 2027), will be an attractive option for level 4 vehicles.

 
