Multimodal Sensor Fusion for Autonomous Vehicles: Quantifying Unknown Risk and Operational Uncertainty. Moving beyond perception accuracy toward decision confidence, integrating camera, radar, LiDAR, and learned priors to estimate epistemic uncertainty, detect out-of-distribution events, and compute actionable risk metrics for safe planning under incomplete knowledge.
I’m heading to Barcelona for AutoSens Europe 2026 (22-24 September 2026)!
I’ll be presenting on multimodal sensor fusion and uncertainty quantification: in short, enabling intelligent cars to account for the “unknown”. Really looking forward to geeking out over ADAS tech with everyone there. See you soon!
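To give a flavour of what "quantifying the unknown" can mean in practice, here is a toy sketch of one common approach: estimating epistemic uncertainty from the disagreement among an ensemble of perception models (the mutual-information / BALD decomposition), and using it as a simple out-of-distribution flag. The function name, the example probabilities, and the threshold are all illustrative assumptions, not a production recipe; real systems calibrate such thresholds on validation data.

```python
import numpy as np

def ensemble_epistemic_uncertainty(member_probs):
    """Decompose predictive uncertainty for an ensemble.

    member_probs: array of shape (M, C), class probabilities from M
    ensemble members. Returns (mean prediction, epistemic uncertainty),
    where epistemic uncertainty is the mutual information between the
    prediction and the model (total entropy minus mean member entropy).
    """
    member_probs = np.asarray(member_probs, dtype=float)
    mean_p = member_probs.mean(axis=0)
    # Total predictive entropy of the averaged prediction.
    total = -np.sum(mean_p * np.log(mean_p + 1e-12))
    # Average entropy of individual members (aleatoric part).
    aleatoric = -np.sum(member_probs * np.log(member_probs + 1e-12),
                        axis=1).mean()
    # What remains is disagreement between members: epistemic uncertainty.
    return mean_p, total - aleatoric

# Members agree -> low epistemic uncertainty (in-distribution input).
agree = [[0.90, 0.10], [0.88, 0.12], [0.92, 0.08]]
# Members disagree -> high epistemic uncertainty (possible OOD input).
disagree = [[0.95, 0.05], [0.50, 0.50], [0.10, 0.90]]

_, u_low = ensemble_epistemic_uncertainty(agree)
_, u_high = ensemble_epistemic_uncertainty(disagree)

OOD_THRESHOLD = 0.2  # illustrative value; calibrate on held-out data
print(u_low < OOD_THRESHOLD, u_high > OOD_THRESHOLD)
```

The key property for planning is that this signal rises when the models have not seen anything like the current scene, even if each individual model is still confidently (and perhaps wrongly) predicting a class.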