Could thermal cameras help prevent the next fatal autonomous vehicle crash?
April 10, 2018
With 90-95% of road accidents attributed to human error, the potential of autonomous and self-driving vehicles is unquestionable. But today, this novel technology doesn’t yet fulfil the promise of safe driving.
So what’s missing? The technology struggles to detect dangerous situations in non-standard circumstances, or as the industry calls them, “edge cases” – any case where the sensor system of an autonomous car fails to detect danger. Most of these systems today rely on visible-light cameras, lasers, or radars, which are ineffective in certain cases: in environmental conditions like black ice or fog, and in situations where an object is hard to parse (partially concealed or irregularly shaped).
In this article, we will show how thermal cameras could be a game changer for driverless car technology.
The first known person to die using a self-driving feature was Joshua Brown, whose Tesla Model S crashed into a truck that turned across his path in Florida, in May 2016. The car’s computers didn’t see the white truck against the bright sunny sky.
The second confirmed fatal crash on US roads happened just a couple of weeks ago in California. Tesla’s Autopilot system was again controlling the car and failed to see a road separator. The company acknowledged that its system is imperfect and not fully driverless, and that the crash happened because the driver didn’t keep his hands on the wheel as instructed.
Both systems relied on cameras and a radar-based cruise control.
Together with Uber’s crash in Arizona in March, when one of its self-driving cars killed a pedestrian, this series of events shows the need for an overhaul of the existing technologies in the autonomous vehicle industry.
True, the Arizona accident would have been hard to avoid even with a fully awake and aware human driver behind the wheel. Yet this is not how it was supposed to be: autonomous cars promised to be better than humans.
Most importantly, progress toward practical autonomous driving still requires major improvements to the sensors that map a vehicle’s environment. Let’s take a look at some major shortcomings of existing sensor systems.
Some of the limitations and how thermal cameras can help
In recent years, automakers working on self-driving cars have been experimenting with radars, ultrasonic sensors, forward-facing cameras, and other solutions. Each technology has its shortcomings, and it all boils down to the challenge of recognizing an object or phenomenon in the most challenging or unusual conditions.
Lidar sensors, which bounce laser beams off nearby objects to create highly accurate 3-D maps, have become a popular choice and are used by Alphabet, Uber, and Toyota. “Our lidar can see perfectly well in the dark, as well as it sees in daylight, producing millions of points of information,” Marta Hall, president of lidar maker Velodyne, recently commented.
That’s true: lidar can detect objects in considerable detail, day or night. But heavy snow or fog can obscure lidar’s lasers, and its accuracy decreases with range.
Uber’s self-driving vehicles carry multiple cameras and also rely on lidar sensors, but their usefulness is limited at night too – exactly the hours when the cars need to drive passengers safely, as the recent crashes show.
Thermal cameras could be the key to safer self-driving vehicles: they can help in precisely those “edge cases” of autonomous driving where other sensors might fail.
Image source: Extremetech.com
First of all, thermal imaging can enhance the vehicle’s vision in challenging environmental conditions such as black ice on the road, dust, sun glare, smoke, fog, or haze.
While cameras and radar- or laser-based systems may perform well in sunny Florida, it is easy to imagine those sensors losing their way in snow, dust, or even direct sunlight.
For people who drive in northern climates, being able to see variations in temperature in the car’s surroundings is key to safety. In those areas, black ice – a transparent, nearly invisible layer of ice on the road – is a real problem. A thermal system would be able to see the lack of heat on the asphalt – a telltale sign of black ice – and the car could adjust its path.
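As a rough illustration (not any manufacturer’s actual pipeline), the black-ice check described above boils down to a simple rule over a grid of road-surface temperatures read from a thermal camera: flag patches that are near freezing and noticeably colder than the surrounding asphalt. The function name and thresholds below are hypothetical:

```python
def flag_black_ice(road_temps_c, freeze_margin_c=1.0, contrast_c=2.0):
    """Return a same-shaped grid of booleans marking suspected black ice:
    pixels near or below freezing AND noticeably colder than the typical
    road-surface temperature. Thresholds are illustrative, not calibrated."""
    flat = sorted(t for row in road_temps_c for t in row)
    median = flat[len(flat) // 2]  # typical asphalt temperature in this patch
    return [[t <= freeze_margin_c and t <= median - contrast_c for t in row]
            for row in road_temps_c]

# A small patch of road: asphalt around 4 degrees C, one cold spot below zero.
temps = [[4.0, 4.2, 3.9, 4.1],
         [4.0, -1.0, -0.5, 4.0],
         [4.1, 4.0, 3.8, 4.2]]
ice = flag_black_ice(temps)  # the two sub-zero pixels are flagged
```

A production system would of course fuse this with weather data and map context before steering around anything, but the core signal is exactly this temperature contrast.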
Another difficulty for self-driving cars is their inability to differentiate between pictures and real objects. An image of a deer on the back of a truck may be viewed by the car’s algorithms as a real animal, causing the car to stop. By detecting that there’s no heat being emitted by the deer-like object and understanding that this is an inanimate object and an image, the car’s system would keep the car going.
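The picture-versus-real-object check above amounts to a second opinion from the thermal channel: a warm-blooded animal reads several degrees above ambient, while a printed image sits at roughly the scene temperature. A minimal sketch, with hypothetical names and thresholds:

```python
def is_live_animal(thermal_roi_c, ambient_c, min_excess_c=3.0):
    """Given thermal readings (deg C) inside a region the visual detector
    labelled as an animal, decide whether it is actually emitting body heat.
    A painted or printed animal stays close to the ambient temperature."""
    mean_temp = sum(thermal_roi_c) / len(thermal_roi_c)
    return (mean_temp - ambient_c) >= min_excess_c

# The visible-light detector flags "deer" in two regions; thermal arbitrates.
real_deer_roi = [34.8, 35.1, 34.5]    # body heat well above 10 deg C ambient
painted_deer_roi = [10.2, 10.5, 9.9]  # matches the surroundings: just a picture
live = is_live_animal(real_deer_roi, ambient_c=10.0)        # True
decal = is_live_animal(painted_deer_roi, ambient_c=10.0)    # False
```

With a check like this, the car would brake for the real deer and keep going past the truck with the picture on it.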
One of the most critical abilities of self-driving technology is recognizing pedestrians, cyclists, animals, and other actors sharing the road. The algorithms used have become quite good, but still not good enough.
A number of companies have been hard at work developing long-range IR cameras for self-driving vehicles. The system can see and classify objects, people, and animals based on their thermal signature in adverse conditions, where other sensors would fail. Cold-blooded animals, however, would likely pose a different challenge for thermal cameras.
In the last several years, the technology has become more available and cheaper – and thus it is becoming a viable solution for autonomous driving.
Today, manufacturers increasingly experiment with IR sensors in autonomous cars. Some types of thermal sensors have already been installed in numerous traditional vehicles, including those made by General Motors, Peugeot, Mercedes, Audi, and BMW. Car manufacturers are starting to consider adding this technology to self-driving vehicles too.
Opgal’s Sii FG, a multi-spectral fog vision camera, uses a highly sensitive IR sensor and advanced algorithms to deliver a clear picture in extremely bad weather – thick fog, smog, heavy rain, high humidity, or snow. Its sensors can provide an autonomous vehicle with clear images of objects, animals, and people at very long range.
Current autonomous and self-driving solutions all have shortcomings that can be mitigated when they are used in conjunction with thermal imaging. A combination of different sensors, including thermal ones, is likely the key to safe autonomous driving.
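The sensor-combination idea can be sketched as a simple fusion rule: each sensor reports how confident it is that an obstacle is ahead, and a weighted average decides, so that a sensor degraded by its weak spot (glare for cameras, fog for lidar) cannot single-handedly veto the others. This toy voting scheme is our own illustration, not a description of any real vehicle’s software:

```python
def fused_obstacle_confidence(detections, weights=None):
    """Weighted average of per-sensor obstacle confidences in [0, 1].
    'detections' maps a sensor name to its confidence; equal weights
    are assumed unless the caller supplies its own."""
    if weights is None:
        weights = {name: 1.0 for name in detections}
    total = sum(weights[name] for name in detections)
    return sum(detections[name] * weights[name] for name in detections) / total

# Camera blinded by glare, lidar scattered by fog, thermal still sees a pedestrian.
scores = {"camera": 0.1, "lidar": 0.3, "thermal": 0.9}
confidence = fused_obstacle_confidence(scores)
should_brake = confidence > 0.4  # illustrative decision threshold
```

Real systems use far more sophisticated fusion (probabilistic tracking across time, per-condition reliability models), but the principle is the same: the thermal channel adds an independent vote exactly where the other sensors are weakest.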
Of course, the typical driverless prototype is already outfitted with an array of sensors. While adding thermal sensing will cost manufacturers more, the added safety makes a compelling argument. And with time, the cost of this technology will continue to drop.