In-Cabin

The end goal of fully autonomous driving is to require no human interaction at all. Today, automated driving is at Level 2, meaning the vehicle can drive itself under certain conditions but still requires the driver to keep their hands on the wheel. Keeping hands on the wheel does not guarantee that the driver's eyes are on the road, which opens up scenarios where the driver is distracted or preoccupied. Advanced automated driving systems therefore need to understand the driver's condition and level of awareness. In-cabin cameras pointed at the driver, combined with advanced driver monitoring algorithms, allow these systems to determine where the driver's gaze is directed, whether their eyes are open or closed, the position of their head, and whether they are using their phone. These solutions require sensors with high sensitivity to near-infrared light (940 nm), small footprints for optimal placement in the cabin, and advanced global shutter efficiency to track eye and head movements.
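Once facial landmarks are extracted from the in-cabin camera feed, the eye-state cue described above can be reduced to a simple per-frame signal. The sketch below is a hypothetical illustration, not any vendor's algorithm: it computes an eye aspect ratio from 2-D eye landmarks and flags prolonged eye closure. The landmark detector, the threshold, and the frame window are all assumptions chosen for clarity.

```python
# Minimal sketch of one driver-monitoring signal: prolonged eye closure
# detected via the eye aspect ratio (EAR) computed from 2-D eye landmarks.
# Assumptions: landmarks come from some facial-landmark detector running on
# frames from an in-cabin near-infrared (940 nm) global-shutter camera; the
# 0.21 threshold and 48-frame window are illustrative values only.
import numpy as np

def eye_aspect_ratio(eye: np.ndarray) -> float:
    """eye: (6, 2) array of landmark points ordered around the eye contour."""
    vertical_1 = np.linalg.norm(eye[1] - eye[5])
    vertical_2 = np.linalg.norm(eye[2] - eye[4])
    horizontal = np.linalg.norm(eye[0] - eye[3])
    return (vertical_1 + vertical_2) / (2.0 * horizontal)

class DrowsinessMonitor:
    """Flags possible drowsiness when both eyes stay closed for many frames."""

    def __init__(self, ear_threshold: float = 0.21, closed_frames_limit: int = 48):
        self.ear_threshold = ear_threshold            # below this, eyes treated as closed
        self.closed_frames_limit = closed_frames_limit  # ~1.6 s at 30 fps
        self.closed_frames = 0

    def update(self, left_eye: np.ndarray, right_eye: np.ndarray) -> bool:
        """Call once per frame; returns True when a drowsiness alert should fire."""
        ear = (eye_aspect_ratio(left_eye) + eye_aspect_ratio(right_eye)) / 2.0
        if ear < self.ear_threshold:
            self.closed_frames += 1
        else:
            self.closed_frames = 0
        return self.closed_frames >= self.closed_frames_limit
```

In practice this check would be one of several signals (gaze direction, head pose, phone use) fused by the driver monitoring system before any alert is raised.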
Technical Documents
Evaluating Functional Safety in Automotive Image Sensors
Almost all Advanced Driver Assistance Systems (ADAS), both today and in the foreseeable future, are built primarily on machine vision to drive the decision process.