The end goal of fully autonomous driving is to require no human interaction at all. Currently, most production vehicles offer Level 2 automation, meaning the vehicle can drive itself but requires the driver to keep their hands on the wheel. This does not guarantee the driver's eyes are on the road, which opens up scenarios where the driver is distracted or preoccupied. Advanced automated driving systems therefore need to understand the driver's condition and awareness. In-cabin cameras pointed at the driver, combined with advanced driver monitoring algorithms, allow these systems to determine where the driver's gaze is directed, whether their eyes are open or closed, the position of their head, and whether they are using their phone. These solutions require sensors with high sensitivity to near-infrared light (940 nm), small footprints for optimal placement in the cabin, and high global shutter efficiency to track eye and head movements.
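One common building block in the eye-state tracking described above is the eye aspect ratio (EAR), computed from facial landmarks around each eye: the ratio falls toward zero as the eyelids close. The sketch below is a minimal, hedged illustration of that idea, not any vendor's implementation; the landmark ordering, sample coordinates, and the 0.2 closed-eye threshold are assumptions for demonstration.

```python
import math

def eye_aspect_ratio(landmarks):
    """Eye aspect ratio from six (x, y) eye landmarks, ordered as in
    the common 68-point facial landmark layout: p1 and p4 are the
    horizontal eye corners; (p2, p6) and (p3, p5) are the upper/lower
    lid pairs. EAR drops toward 0 as the eye closes."""
    p1, p2, p3, p4, p5, p6 = landmarks

    def dist(a, b):
        return math.hypot(a[0] - b[0], a[1] - b[1])

    return (dist(p2, p6) + dist(p3, p5)) / (2.0 * dist(p1, p4))

# Tunable assumption: below this ratio, treat the eye as closed.
EAR_CLOSED_THRESHOLD = 0.2

# Illustrative landmark sets (hypothetical coordinates, not real data).
open_eye = [(0, 0), (3, 3), (6, 3), (9, 0), (6, -3), (3, -3)]
closed_eye = [(0, 0), (3, 0.5), (6, 0.5), (9, 0), (6, -0.5), (3, -0.5)]

print(eye_aspect_ratio(open_eye))    # ~0.667 -> eye open
print(eye_aspect_ratio(closed_eye))  # ~0.111 -> eye closed
```

In a real driver monitoring pipeline, a drowsiness or distraction alert would typically be raised only when the ratio stays below the threshold for several consecutive frames, to avoid reacting to normal blinks.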
The AR0234AT is a 1/2.6-inch, 2 Mp CMOS digital image sensor with an active-pixel array of 1920 (H) x 1200 (V). It incorporates an innovative global shutter pixel design optimized for accurate, fast capture of moving scenes at full resolution at 120 frames per second.
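The resolution and frame rate quoted above imply a substantial data rate for the downstream interface. The snippet below is a back-of-envelope check using only the figures stated here; the 10-bit raw pixel depth is an assumption (the actual bit depth and link overhead depend on the sensor configuration and should be taken from the datasheet).

```python
# Pixel throughput implied by full-resolution 1920 x 1200 @ 120 fps.
width, height, fps = 1920, 1200, 120
pixels_per_second = width * height * fps
print(f"{pixels_per_second / 1e6:.1f} Mpixel/s")  # 276.5 Mpixel/s

# Assumed raw bit depth -- verify against the sensor's datasheet.
bits_per_pixel = 10
raw_bits_per_second = pixels_per_second * bits_per_pixel
print(f"{raw_bits_per_second / 1e9:.2f} Gbit/s raw")  # 2.76 Gbit/s
```

This kind of estimate helps size the serializer link and SoC image pipeline when placing such a camera in the cabin.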
The AR0135AT sensor incorporates an innovative global shutter pixel design with 10x lower dark current and 4x higher shutter efficiency than previous-generation products. This 1/3-inch format, 1.2 Mp imaging device has been designed to address the challenging requirements of automotive in-cabin cameras.
Almost all Advanced Driver Assistance Systems (ADAS), both today and for the foreseeable future, rely primarily on machine vision to drive the decision process.