rFpro & Sony Semiconductor Solutions Partnership Aims to Accelerate ADAS & Autonomous Vehicle Development

Driving simulation expert rFpro has announced a partnership with Sony Semiconductor Solutions to develop the next generation of high-fidelity sensor models. The collaboration brings rFpro’s simulation software together with physical CMOS image sensors for the first time in the automotive industry. By fast-tracking the development of ADAS and autonomous driving technology in virtual driving environments, the partnership aims to reduce the industry’s dependence on collecting real-world data.

“By working closely with Sony and integrating its sensor models into our technology, we have achieved a high level of correlation in the simulation,” said Matt Daley, rFpro Operations Director. “Sony Semiconductor Solutions has been a crucial collaborator in developing our recently launched ray tracing technology and rFpro’s Multi-Exposure Camera Technology, which accurately replicates what cameras ‘see’ for the first time.”

Ray Tracing Technology

My colleague Carl Anthony, Managing Editor of AutoVision News, interviewed Daley at AutoSens Detroit 2023. The discussion centered on rFpro’s ray tracing technology, a software-in-the-loop (SIL) solution that traces multiple light rays to render an exacting replica of the natural world, particularly in low-light scenarios. “It’s not just about making lovely daytime images,” Daley said in the interview. “The system needs to identify humans crossing the road at night where there’s not very much ambient light around.”
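
To make the multi-bounce idea concrete, here is a minimal, hypothetical sketch of the Monte Carlo recursion at the heart of path-traced rendering. The scene, reflectance value, and light-hit probability are invented for illustration and are unrelated to rFpro’s actual renderer.

```python
import random

# Illustrative sketch (not rFpro code): each camera ray gathers light not
# just from the first surface it hits but from subsequent reflections too,
# which is what lets a path-traced image resolve dim, indirectly lit scenes.
# The "scene" here is deliberately abstract.

def trace(rng, depth=0, max_depth=4):
    """Return radiance gathered along one ray (arbitrary units)."""
    if depth == max_depth:
        return 0.0
    emitted = 0.05 if rng.random() < 0.1 else 0.0  # chance of hitting a light
    albedo = 0.6                                   # assumed surface reflectance
    # Light seen here plus attenuated light from the next bounce.
    return emitted + albedo * trace(rng, depth + 1, max_depth)

rng = random.Random(0)
samples = [trace(rng) for _ in range(100_000)]
print(sum(samples) / len(samples))  # Monte Carlo estimate of pixel radiance
```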

Ray tracing uses a multi-path technique to simulate the multiple reflections that arrive at a camera sensor. The technique is applied to every element in the simulation to produce the highest-fidelity images possible. Moreover, rFpro’s multi-exposure camera API allows sensor models to “sample” the virtual world using the same exposure periods as in the physical realm.
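
As a rough illustration of what exposure-matched sampling means, the sketch below averages a time-varying pixel radiance over long and short exposure windows and merges them, HDR-style. The function names, exposure durations, and flicker frequency are assumptions made for this example; they are not rFpro’s or Sony’s API.

```python
import numpy as np

# Hypothetical illustration of multi-exposure sampling: the sensor model
# averages scene radiance over the same exposure windows a physical HDR
# sensor would use, then merges the captures. All names and values here
# are invented for the example.

def scene_radiance(t):
    """Stand-in for the renderer: pixel radiance at time t (seconds)."""
    return 100.0 + 80.0 * np.sin(2 * np.pi * 90.0 * t)  # 90 Hz flicker

def capture(t_start, exposure_s, steps=200):
    """Average radiance over one exposure window, like a physical shutter."""
    ts = np.linspace(t_start, t_start + exposure_s, steps)
    return float(np.mean([scene_radiance(t) for t in ts]))

# Long and short exposures taken back-to-back, as in an HDR capture cycle.
long_exp, short_exp = 11e-3, 1e-3
long_capture = capture(0.0, long_exp)
short_capture = capture(long_exp, short_exp)

# Simple HDR merge: keep the long exposure unless it would saturate.
saturation = 160.0
merged = long_capture if long_capture < saturation else short_capture
print(f"long={long_capture:.1f} short={short_capture:.1f} merged={merged:.1f}")
```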

“With the rFpro rendering and Sony’s sensor model pipeline, motion blur, rolling shutter characteristics, color filter characteristics, and signal processing of the physical sensors exist in the simulated images,” Daley added. “It enables the sensor model to interact with the fast flickering LED light sources simulated on rFpro’s high fidelity assets and rendering systems, such as traffic and vehicle brake lights. Accurately replicating this phenomenon is critical to achieving high levels of correlation with the real world.”
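
The LED flicker interaction Daley describes can be sketched in a few lines. A PWM-driven LED, such as a brake light, is only lit for a fraction of each cycle, so whether a camera sees it depends on how the exposure window overlaps the duty cycle. The frequency, duty cycle, and timings below are invented for illustration.

```python
import numpy as np

# Hedged sketch (not rFpro/Sony code) of why LED flicker matters: a short
# exposure can land entirely in the LED's off phase and miss the light,
# while a longer exposure averages over whole cycles and always sees it.

def led_on(t, freq_hz=250.0, duty=0.2):
    """PWM LED: lit for `duty` fraction of each cycle."""
    return (t * freq_hz) % 1.0 < duty

def exposure_sees_led(t_start, exposure_s, steps=1000):
    """Fraction of the exposure window during which the LED is lit."""
    ts = t_start + np.linspace(0.0, exposure_s, steps)
    return float(np.mean([led_on(t) for t in ts]))

# A 0.5 ms exposure starting in the off phase misses the LED entirely...
print(exposure_sees_led(t_start=0.001, exposure_s=0.0005))  # 0.0
# ...while a 10 ms exposure spans several cycles and reads ~ the duty cycle.
print(exposure_sees_led(t_start=0.001, exposure_s=0.010))   # ~0.2
```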

Synthetic Training Data

Expediting the development of perception software running on deep neural networks requires significant amounts of data. At the same time, demand for ADAS functionality and autonomous driving capability in modern cars and EVs continues to grow, posing a considerable problem for development teams. Automotive-grade simulation, however, can generate synthetic data covering virtually limitless driving, traffic, and weather scenarios, enabling the accelerated development of ADAS and autonomous driving features in the virtual world.
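
One way a simulation pipeline might enumerate those limitless scenarios is by sampling scenario parameters from defined ranges. The sketch below shows the idea; the parameter names and ranges are invented for illustration and do not reflect rFpro’s actual configuration format.

```python
import random

# Minimal sketch of randomized scenario generation for synthetic training
# data. Every field here is a hypothetical example, not rFpro's API.

WEATHER = ["clear", "rain", "fog", "snow"]
TIME_OF_DAY = ["dawn", "noon", "dusk", "night"]

def sample_scenario(rng):
    """Draw one randomized driving scenario from the parameter space."""
    return {
        "weather": rng.choice(WEATHER),
        "time_of_day": rng.choice(TIME_OF_DAY),
        "traffic_density": rng.uniform(0.0, 1.0),  # fraction of road capacity
        "pedestrian_count": rng.randint(0, 20),
        "sun_elevation_deg": rng.uniform(-10.0, 60.0),
    }

rng = random.Random(42)  # seeded so every scenario is reproducible
for _ in range(5):
    print(sample_scenario(rng))
```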

“Vehicles can drive thousands of high-value, high-activity virtual miles daily in simulation,” Daley said. “Edge cases can be identified and new iterations generated quickly to exercise sensor systems thoroughly. In addition, it removes the need to wait for exposure in the real world, where most miles driven are relatively uneventful.”