Our expert team of engineers understands how critical imaging performance is to autonomous driving: these applications must operate accurately 100% of the time. To prove our solutions, we put our image sensors through tough, real-world tests so you can be confident in the products you get from us.
See how our sensors are used in cameras that act as the eyes of the vehicle, informing the crucial decisions it must make. These cameras have to operate in a wide range of lighting scenarios, from a bright sunny day to a blizzard. Watch how autonomous vehicles equipped with our LED flicker mitigation and high dynamic range technology keep operating safely even in poor weather conditions.
Full Transcript
Hi, I’m Bahman Hadji from the Automotive Solutions Division of the Intelligent Sensing Group here at onsemi.
The automotive environment presents a number of different challenges for image sensors.
These sensors are used in the cameras which act as the eyes of the vehicle in safety-critical advanced driver-assistance systems, or ADAS, providing features such as adaptive cruise control and 360-degree surround view systems.
These “eyes” have to be able to operate in extremely low light situations as well as on a bright sunny day.
The image sensor must be able to capture all of the scene detail in these high dynamic range imaging situations, because the brain “behind the wheel” relies on it to make decisions, whether that brain belongs to the driver or to the artificial intelligence made up of the ADAS algorithms.
Traffic lights like the one we just pulled up to here are operated using pulse-width modulation, or PWM.
To save power and control intensity, the LEDs making up the light are being pulsed on and off at a rate that our eyes can’t perceive, which can be 90 times or more every second.
And it isn’t just traffic lights that have adopted this – pulsed LEDs have become prominent in modern vehicle headlight and taillight design, as well as electronic traffic signs or variable messaging systems, which are intended to convey information such as traffic conditions and speed limits to drivers.
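As a concrete illustration of that timing, the short sketch below works through the PWM arithmetic. The 90 Hz pulse rate comes from the description above; the 25% duty cycle is an assumed example value, not an onsemi specification.

```python
# Illustrative PWM timing arithmetic.
# The 90 Hz rate is from the scenario above; the 25% duty cycle is an assumed example.
pwm_frequency_hz = 90
duty_cycle = 0.25

period_ms = 1000.0 / pwm_frequency_hz   # ~11.1 ms between pulses
on_time_ms = period_ms * duty_cycle     # ~2.8 ms the LED is lit per period
off_time_ms = period_ms - on_time_ms    # ~8.3 ms the LED is dark per period

print(f"PWM period:   {period_ms:.1f} ms")
print(f"LED on-time:  {on_time_ms:.1f} ms")
print(f"LED off-time: {off_time_ms:.1f} ms")
```

With these example numbers, an exposure of well under a millisecond can land entirely inside the roughly 8 ms dark window, which is the root of the flicker problem described next.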
To capture a high dynamic range scene, particularly the bright areas of a scene like this, a traditional image sensor would use a short exposure to avoid oversaturation – much shorter than the ON period of these LEDs.
But this results in the appearance of lights flickering in the video as its frames capture the LEDs sometimes while they’re on and sometimes while they’re off.
This undesired effect can be distracting to the driver and confusing to ADAS algorithms.
Exposing for a long enough period to guarantee capturing the LEDs on within each frame would lead to an oversaturated image, which is also not desirable.
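For illustration only, here is a small Python sketch of that trade-off. All of the numbers (90 Hz PWM, 25% duty cycle, 30 fps video, a 0.1 ms short exposure) are assumptions chosen to show the effect rather than onsemi sensor parameters, and the camera and LED are modeled as unsynchronized by drawing a random phase for each frame.

```python
import random

# Assumed, illustrative parameters (not onsemi specifications).
pwm_period_ms = 1000.0 / 90        # ~11.1 ms between LED pulses
on_time_ms = 0.25 * pwm_period_ms  # ~2.8 ms the LED is lit each period
frame_interval_ms = 1000.0 / 30    # 30 fps video
short_exposure_ms = 0.1            # short exposure that avoids oversaturating the LEDs
long_exposure_ms = pwm_period_ms   # spans a full PWM period, so it always sees the LED

def led_captured(start_ms, exposure_ms):
    """Return True if the exposure window overlaps any LED ON interval."""
    phase = start_ms % pwm_period_ms
    # Either the exposure starts while the LED is on, or it runs past the end of
    # this period into the next one, which begins with a new ON interval.
    return phase < on_time_ms or (phase + exposure_ms) > pwm_period_ms

frames = 300
hits = sum(
    led_captured(frame * frame_interval_ms + random.uniform(0, pwm_period_ms),
                 short_exposure_ms)
    for frame in range(frames)
)
print(f"Short exposure: LED visible in {hits}/{frames} frames -> it appears to flicker")
print(f"A {long_exposure_ms:.1f} ms exposure catches every pulse, "
      "but oversaturates the bright areas of the scene")
```

Under these assumptions the LED shows up in only about a quarter of the frames, which is exactly the on/off flickering described above, while stretching the exposure to a full PWM period captures the light every time at the cost of blowing out the highlights.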
At onsemi, when our world-class engineering teams design and characterize our image sensors, we take all of these considerations into account.
While our testing includes controlled laboratory environments, we also put our image sensors to the test on the road to capture real-world scenes.
We've done the testing for you. Discover onsemi automotive image sensors for ADAS, human vision and autonomous vehicle sensing.
Read up on the previous blogs in our On the Road series: