Algolux Shares Key Observations With ADAS Community, Reaffirms Commitment to Safety & AI Technology

Algolux has taken home the top award from the 2020 Tech.AD Europe Conference, based on voting by members of the automotive and autonomous vehicle industries. The company’s Eos Embedded Perception Software won the award for Most Innovative Use of Artificial Intelligence and Machine Learning in the Development of Autonomous Vehicles and Respective Technologies at the awards ceremony.

“Algolux was identified as a finalist from a large pool of nominees providing autonomous vehicle technologies and was presented with the first-place award at the Tech.AD Europe Award 2020 ceremony,” explained Daniel Wolter, CEO & Co-Founder of we.CONECT, organizers of the Automotive Tech.AD Conferences. “We congratulate Algolux on this acknowledgment by the automotive community on the impact Eos can provide to improve the accuracy of perception systems and further increase the safety of vehicles.”

Algolux is using the opportunity to highlight distinct but related challenges facing the ADAS community today. The first is the reliability of vision systems in degraded conditions (dust, haze, and poor lighting, among others). The company is also taking the occasion to explain why a collective approach is best for solving these problems and to offer some critical observations about the industry.

Algolux VP of Marketing & Strategic Partnerships Dave Tokic (middle) accepts the honor on behalf of the company at the Tech.AD Europe Awards 2020. Photo: Algolux.

Are Roads Unsafe? What the Data Says

In a recent blog post, Algolux cites a report from the World Health Organization that more than one million people die on the world’s roadways each year. Algolux notes that while road deaths are down globally, the number of accidents caused by human error remains remarkably high. ADAS technologies in their current form, according to Algolux, cannot entirely “bridge the gap.” That gap raises an important question for Algolux and other engineers in the field: how do we make ADAS systems work in less-than-perfect conditions?

“We have to take a step back and look at some of the key challenges today,” explained Dave Tokic, Vice President of Marketing & Strategic Partnerships for Algolux. “These systems work fairly well in good conditions, say a nice blue-sky sort of day, but fundamentally, when you get into scenarios of darkness and bad weather, these systems tend to fail dramatically.”  

According to research compiled by the National Safety Council, while we do only one-quarter of our driving after dark, 50 percent of traffic deaths happen at night; put differently, the per-mile fatality rate after dark is roughly three times the daytime rate. “Traveling at night will have its own challenges versus during the day,” said Ian Hoey of the California Highway Patrol’s Office of Community Outreach and Media Relations. “Some hazards at night include reduced visibility and the potential for increased wildlife activity in the roadways.”

It doesn’t matter, according to the National Safety Council’s data, whether the road is familiar or not: driving at night is always more dangerous, and especially so for older Americans. The American Optometric Association says a 50-year-old driver may need twice as much light to see as well as a 30-year-old. Drivers 60 and older may also find night driving more challenging, especially if road signs are not clearly lit.

Walking at night is also dangerous. Pedestrian deaths in traffic crashes rose more than three percent in 2018 to their highest total since 1990, according to NHTSA. The highest percentage of pedestrian fatalities occurs between 6:00 p.m. and 8:59 p.m., when light is at a premium, especially during the fall and winter months. “Wear reflective clothing and carry a flashlight when walking at night or during the early morning,” advised Special First Lieutenant Jim Flegel, Traffic Safety Specialist, Michigan State Police. “Use sidewalks whenever available and never cross the street mid-block.”

Video: While ADAS performance at night is a concern, testing from AAA last year found some systems were not up to par even during the day.

What Should the ADAS Community Consider?

While the ultimate goal is to reduce traffic fatalities, Algolux believes there are steps the greater ADAS community can take internally to deepen its understanding and advance the cause. Here are some of the company’s key recent observations.

#1: Not Something We Can Dismiss 

Tokic says existing data about road safety and the current capability of ADAS technology reveal a series of legitimate challenges, not one-off corner cases. Our schedules are such that we sometimes need to drive at night or in bad weather. We may get caught in a downpour, or have to set out for work on a foggy morning. These things are beyond our control.

There are times, in our daily routine, where we bounce back and forth between driver and pedestrian. We may wake up early for a run or bike ride before the sun rises, only to be a motorist a short while later during our morning commute. Each possible scenario, at different times during the day, presents a new challenge for ADAS technologies that must protect both vehicle occupants and pedestrians. Although COVID-19 has slowed us down somewhat, we have, up until this point, been a society on the move both day and night.

“We are using this as a preamble and asking, ‘Why are these systems failing today?’” Tokic said. “And this ties back to the kind of approach we are taking with Algolux.”

#2: Cameras Are Important

Tokic believes all in the ADAS community can capitalize on how far camera technology has come. “Cameras are fantastic,” he said. “They give you a lot of information; they’re dense in terms of the resolution of the surrounding scenes that you’re looking for, and they give a lot of semantic information.”

The takeaway is that camera images are tuned to suit the human eye, not necessarily machine vision algorithms. “For lack of better terms, the algorithm wishes it had all the data it could work with, not just what the human eye would like,” Tokic explained. “The image processing that’s done removes that information through the denoising, deblurring, and other things.”
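To make the point concrete, here is a minimal sketch of how a display-oriented denoising step can erase exactly the faint low-light detail a perception algorithm might need. The data is synthetic and the median filter is a stand-in for ISP noise reduction, not Algolux’s pipeline:

```python
import numpy as np
from scipy.ndimage import median_filter

rng = np.random.default_rng(0)

# Synthetic low-light frame: dim background with sensor noise.
frame = rng.normal(0.10, 0.02, (64, 64))

# A tiny, faint highlight -- think of a distant reflector or tail light at night.
frame[30:32, 30:32] += 0.08

# Median filtering stands in for an ISP-style noise-reduction step; it is
# designed to suppress isolated bright pixels, which is exactly what the
# "reflector" looks like to the filter.
cleaned = median_filter(frame, size=5)

spot = (slice(30, 32), slice(30, 32))
print(f"reflector contrast before denoising: {frame[spot].mean() - 0.10:+.3f}")
print(f"reflector contrast after denoising:  {cleaned[spot].mean() - 0.10:+.3f}")
```

The denoised frame looks cleaner to a human viewer, but the small highlight is gone before any detection algorithm ever sees it.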

#3: Work Together Whenever Possible

Tokic gives the example of an algorithm team that receives the best, highest-quality image possible. While this is a good thing initially, the team ends up going it alone with its neural network approach to computer vision. “They’re based on optimizing, if you will, those networks in their own domain,” he continued. “And then they attach two things optimized in two separate domains and say, ‘okay, this is the best we can get.’”
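For illustration, here is a toy PyTorch sketch of that difference. The two tiny modules are hypothetical stand-ins for an image-processing front end and a perception network, not Algolux’s architecture; the contrast is in which parameters the task loss is allowed to update:

```python
import torch
import torch.nn as nn

# Hypothetical stand-ins: a learnable front end (think: ISP) and a task head
# (think: detector). Real systems are far larger; the training logic is the point.
front_end = nn.Conv2d(1, 1, kernel_size=3, padding=1)
task_head = nn.Sequential(
    nn.Conv2d(1, 8, 3, padding=1), nn.ReLU(),
    nn.AdaptiveAvgPool2d(1), nn.Flatten(), nn.Linear(8, 2),
)

frames = torch.randn(16, 1, 32, 32)    # dummy camera frames
labels = torch.randint(0, 2, (16,))    # dummy task labels

def train_step(trainable_params):
    opt = torch.optim.Adam(trainable_params, lr=1e-3)
    loss = nn.functional.cross_entropy(task_head(front_end(frames)), labels)
    opt.zero_grad()
    loss.backward()
    opt.step()
    return loss.item()

# "Two separate domains": the front end stays fixed (tuned offline for image
# quality) and only the task head learns from the task loss.
train_step(list(task_head.parameters()))

# End-to-end: one task loss, gradients flow through both stages, so the front
# end is optimized for perception accuracy rather than visual appeal.
train_step(list(front_end.parameters()) + list(task_head.parameters()))
```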

#4: Training Data Does Not Come Overnight

Last (but not least) is understanding how much work goes into capturing the data necessary to train a neural network for the task at hand. “Generally, it’s done by grabbing a bunch of open-source data, capturing some real data, probably with older generations of cameras, and then hopefully capturing some data with the new camera for that system,” Tokic said.

Ideally, engineers would like to capture all their training data with the actual camera that will be deployed in the vehicle. “But that’s just impractical today because you have to capture hundreds of thousands of training images and get them all annotated,” Tokic continued. “It’s a long, painful task.”
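One common way to stitch those sources together, shown here purely as an illustration and not as a description of Algolux’s tooling, is to oversample whatever scarce target-camera data does exist when mixing it with the larger, older sources. A minimal PyTorch sketch with placeholder datasets:

```python
import torch
from torch.utils.data import (ConcatDataset, DataLoader, TensorDataset,
                              WeightedRandomSampler)

# Placeholder datasets standing in for the three sources Tokic describes.
open_source = TensorDataset(torch.randn(10_000, 3, 32, 32))  # public data
old_camera  = TensorDataset(torch.randn(2_000, 3, 32, 32))   # prior-generation sensor
new_camera  = TensorDataset(torch.randn(200, 3, 32, 32))     # scarce target camera

combined = ConcatDataset([open_source, old_camera, new_camera])

# Oversample the scarce target-camera images so every batch still reflects
# the sensor the system will actually ship with. Weights are illustrative.
weights = torch.cat([
    torch.full((len(open_source),), 1.0),
    torch.full((len(old_camera),), 2.0),
    torch.full((len(new_camera),), 20.0),
])
sampler = WeightedRandomSampler(weights, num_samples=len(combined), replacement=True)
loader = DataLoader(combined, batch_size=64, sampler=sampler)
```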


Building From the Ground Up

To address the full scope of the challenges facing the ADAS community, Algolux continues to build its team of AI deep learning, computational imaging, and computer vision experts. The objective is to bring that combined expertise under one roof and use it to create the right architecture for each task. “That allows us to understand what’s different about doing free space versus, say, object detection, versus tracking and things like that,” Tokic said. “We bring all those disciplines together to allow us to look at the problem differently.”
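A schematic of the shared-backbone, multi-head pattern that quote alludes to might look like the following; the layers, heads, and dimensions are hypothetical, not Eos’s actual design:

```python
import torch
import torch.nn as nn

class MultiTaskPerception(nn.Module):
    """Shared features, task-specific heads -- a common pattern for combining
    detection, free-space estimation, and tracking features in one network."""

    def __init__(self):
        super().__init__()
        self.backbone = nn.Sequential(          # shared feature extractor
            nn.Conv2d(3, 16, 3, padding=1), nn.ReLU(),
            nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(),
        )
        self.detect = nn.Conv2d(32, 6, 1)       # per-cell box + class scores
        self.free_space = nn.Conv2d(32, 1, 1)   # per-pixel drivable-space logit
        self.embed = nn.Conv2d(32, 8, 1)        # appearance embedding for tracking

    def forward(self, frames):
        feats = self.backbone(frames)
        return {
            "detections": self.detect(feats),
            "free_space": torch.sigmoid(self.free_space(feats)),
            "track_embeddings": self.embed(feats),
        }

outputs = MultiTaskPerception()(torch.randn(1, 3, 64, 64))
print({k: tuple(v.shape) for k, v in outputs.items()})
```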

Algolux’s Eos Embedded Perception Software is foundational to the company’s greater vision for ADAS, autonomy, and driver and pedestrian safety. The company describes it as an end-to-end learning approach, one that addresses perception in harsh conditions while also enabling optimal architectures that reduce complexity and cost.

“Eos Embedded Perception Software is an ensemble of vision capability, meaning both 2D and 3D object detection,” Tokic explained. “It includes things like free space, sign and traffic light recognition, and traffic light state recognition. We do the tracking so that we can understand where an object is, frame-to-frame, and we can do predictions off that. From there, we continue to build our stacks towards additional capabilities. In essence, we cover basically the functionality that’s required from a computer vision and perception standpoint.”
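The shape of such per-frame output might look like the following sketch; the types are hypothetical, written only to mirror the capabilities Tokic lists, not Algolux’s actual API:

```python
from dataclasses import dataclass, field
from enum import Enum
from typing import Dict, List, Optional, Tuple

class LightState(Enum):
    RED = "red"
    YELLOW = "yellow"
    GREEN = "green"

@dataclass
class Detection3D:
    label: str                                 # e.g. "vehicle", "pedestrian", "sign"
    center_xyz: Tuple[float, float, float]     # position in vehicle frame, meters
    size_lwh: Tuple[float, float, float]       # length / width / height, meters
    confidence: float

@dataclass
class Track:
    track_id: int
    history: List[Detection3D] = field(default_factory=list)    # frame-to-frame
    predicted_xyz: Optional[Tuple[float, float, float]] = None  # short-horizon prediction

@dataclass
class PerceptionFrame:
    detections: List[Detection3D]            # 2D/3D objects in this frame
    free_space: List[Tuple[float, float]]    # drivable-area boundary polygon
    traffic_lights: Dict[int, LightState]    # detection index -> light state
    tracks: List[Track]                      # persistent object identities
```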

The Eos Embedded Perception Software is Algolux’s living and breathing solution, one modeled after their observations concerning the difficulties facing ADAS technology. On one hand, it tackles the deteriorating environmental conditions drivers must sometimes navigate; on the other, it accounts for the day-to-day design and development challenges engineers must work through. Following the high honors at the 2020 Tech.AD Europe Conference, Algolux remains committed to these and other initiatives in the interest of road safety.

“We have a very sharp focus on providing, as part of our mission, the industry’s most robust perception for any sensor, in any condition, to address the most safety-critical tasks,” Tokic said. “We take a very open approach, and are engaged with numerous different platforms to enable our customers.”