This Week's Top Advanced Vehicle Design News

April 7, 2021

Human perception isn't necessarily the model for autonomous vehicle sensor systems. But what is?

April 7, 2021—Think about all the different decisions you make, actively or passively, as you drive. Your brain takes in visual, tactile, and auditory cues, all of them input it analyzes to make driving decisions.

How quickly can you merge? Is it safe to make this turn on a crowded block? Which lane do you need to be in for your exit? The individual data points that contribute to these decisions can be complex.

That's a difficult task for computers to replicate in autonomous vehicles. But the goal of autonomous technology developers isn't just to replicate human perception. The key to a successful autonomous vehicle environment is to make the vehicles better than human drivers: flawless in decision-making.

These challenges were covered as part of a recent webinar hosted by Partners for Automated Vehicle Education. The goal, ultimately, is safe driving no matter what the road throws at a vehicle.

“You want to handle these harsh cases and be able to not only infuse redundancy but ensure that you’re building a superhuman driver that goes much beyond the capabilities of what we have today,” said Felix Heide, chief technology officer and cofounder of Algolux, an AI software company for automotive solutions.

Perception and Sensor Fusion

Just as a human driver draws on multiple senses, full autonomy is going to rely on a suite of sensors that can provide redundancy, work in varied conditions, and take in different kinds of information.

That means visible-light cameras, LiDAR units that provide range and target information, radar that backs up spatial sensing, thermal cameras that can see in fog or at night, and more. Working together, these sensors make up what is called "sensor fusion."

That's just the hardware side. The software needs to be equally advanced, able to take the information from all of those sensors and fuse it into a single picture of the environment. The whole system is known as a perception engine.

“The job of the perception engine is to take the various inputs from those sensors and to fuse them into an understanding of those surroundings,” said Hod Finkelstein, chief technology officer of Sense Photonics, a technology company specializing in LiDAR.
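To make the idea concrete, here is a minimal sketch of how a perception engine might fuse per-sensor detections into one picture of the surroundings. Everything in it (the Detection record, the match radius, the confidence-weighted averaging) is a hypothetical illustration for this story, not the actual software from Algolux or Sense Photonics; production systems track objects over time with far more sophisticated probabilistic filters.

```python
# Illustrative sketch only: a toy perception engine that merges
# detections from different sensors. All names and thresholds are
# hypothetical; real systems use probabilistic tracking (e.g.,
# Kalman filters) rather than one-shot averaging.
from dataclasses import dataclass

@dataclass
class Detection:
    sensor: str        # e.g., "camera", "lidar", "radar", "thermal"
    x: float           # estimated position in meters (vehicle frame)
    y: float
    confidence: float  # the sensor's own confidence, in [0, 1]

def fuse_detections(detections, match_radius=1.5):
    """Group detections that likely refer to the same physical object
    (within match_radius meters) and merge each group into one fused
    object whose position is a confidence-weighted average."""
    fused = []
    for det in detections:
        for obj in fused:
            if (det.x - obj["x"]) ** 2 + (det.y - obj["y"]) ** 2 <= match_radius ** 2:
                # Confidence-weighted position update.
                w = obj["confidence"] + det.confidence
                obj["x"] = (obj["x"] * obj["confidence"] + det.x * det.confidence) / w
                obj["y"] = (obj["y"] * obj["confidence"] + det.y * det.confidence) / w
                # Agreement across sensors raises overall confidence.
                obj["confidence"] = min(1.0, w)
                obj["sensors"].add(det.sensor)
                break
        else:
            fused.append({"x": det.x, "y": det.y,
                          "confidence": det.confidence,
                          "sensors": {det.sensor}})
    return fused

if __name__ == "__main__":
    frame = [
        Detection("camera", 12.1, -0.4, 0.80),  # pedestrian seen by camera
        Detection("lidar",  12.3, -0.5, 0.90),  # same pedestrian, LiDAR range hit
        Detection("radar",  40.0,  1.2, 0.60),  # distant vehicle, radar only
    ]
    for obj in fuse_detections(frame):
        print(obj)
```

The payoff of the redundancy the panelists describe shows up in the merge step: an object reported by two or more independent sensor types ends up with higher confidence than anything a single sensor reports on its own.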

To read the entire story and learn more about advances in this field, head over to ADAPT.
