In 1953, Isaac Asimov penned the short story “Sally”, depicting a farm where autonomous cars go once they’ve been removed from service. The cars can never die and have positronic brains, giving them distinct personalities. According to the story, the first generation of autonomous vehicles was released in 2015: “We even have a ’15 model Mat-O-Mot in working order. One of the original automatics. It was the first car here,” the story says. 2015 was two years ago. It seems that 2015 was a disappointing year for science fiction transportation.
When it comes to autonomous cars, the eternal car trip question is still “Are we there yet?” The short answer is no, and if you ask me, we’re not even close. Even Waymo, Google’s self-driving car, still needs a human driver 20 percent of the time. I believe there are three aspects of driverless cars that aren’t quite there yet: sensors, processing, and pricing.
The most important part of a car is the driver’s eyes. When we take the driver out of the car, something must replace those eyes, and there are currently three types of technology that can do this: cameras, LIDAR and radar. At first glance, cameras seem like the most cost-effective option, but they have several flaws: first, their range is limited, and second, light, shadows and bad weather conditions can all dangerously affect a camera’s perception ability.
Another option in use today is LIDAR. LIDAR, or Light Detection and Ranging, uses lasers to measure distances to the objects surrounding the car. It does this by shining light pulses at outside targets and measuring the reflections that bounce back. LIDAR’s main flaws are its high cost and limited range. Some companies are working to reduce the price and improve the performance. Still, weather conditions remain a problem.
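The ranging principle itself is simple time-of-flight arithmetic: distance is the speed of light multiplied by the pulse’s round-trip time, halved. A minimal sketch (the function name and the example timing are illustrative, not from any real sensor API):

```python
# Time-of-flight ranging, the principle behind LIDAR distance measurement.
# Assumes an ideal, single reflected pulse; real sensors must handle noise,
# multiple returns, and weather-induced scattering.

C = 299_792_458  # speed of light in m/s

def range_from_pulse(round_trip_seconds: float) -> float:
    """Distance to a target from the round-trip time of a light pulse.

    The pulse travels to the target and back, so we halve the total path.
    """
    return C * round_trip_seconds / 2

# A pulse returning after 200 nanoseconds indicates a target about 30 m away.
print(round(range_from_pulse(200e-9), 1))  # 30.0
```

This is why range and cost trade off: longer range means waiting for weaker reflections, which demands more sensitive (and expensive) detectors.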
This brings us to radar, a cheap and effective option, but one that suffers from two main problems: resolution and reliability. Next generation 4D imaging radar is a way to solve these problems and build a high-resolution picture under any weather conditions. It’s a complicated challenge, but without tackling it, true autonomous driving can’t be reached.
If the car’s sensors are its eyes, the processing unit is its brain. Processors take all the data collected by the sensors and turn it into actual driving. We might think that machines would naturally be better at driving than humans, with faster responses, automatic decision-making, and infinite attention and stamina. However, cars still need to “learn” how to drive, and that learning is an offline process. Some companies offer simulations in order to reduce the number of miles that the machines will actually need to drive.
The main question is how to move beyond basic sensor fusion, in which every sensor streams raw data to a central processor that integrates it, toward a smart fusion system. Conventional fusion today requires high-end servers that can’t be used in a mass-produced car. Smart fusion will instead reduce the data at the detection level, so the processor receives detections rather than raw streams.
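One standard way to combine detections from two sensors is an inverse-variance weighted average: the less noisy sensor gets the larger weight, and the fused estimate is more certain than either input. A minimal sketch, with the sensor names and numbers purely illustrative:

```python
# Inverse-variance fusion of two independent estimates of the same quantity,
# e.g. the distance to an obstacle as reported by radar and by a camera.

def fuse(est_a: float, var_a: float, est_b: float, var_b: float):
    """Fuse two independent measurements, weighting each by 1/variance."""
    w_a, w_b = 1 / var_a, 1 / var_b
    fused = (w_a * est_a + w_b * est_b) / (w_a + w_b)
    fused_var = 1 / (w_a + w_b)  # always smaller than either input variance
    return fused, fused_var

# Radar: 39 m with variance 1; camera: 41 m with variance 4 (noisier).
est, var = fuse(41.0, 4.0, 39.0, 1.0)
print(est, var)  # 39.4 0.8
```

The fused result sits closer to the more reliable radar reading, which is the whole point of fusion: each sensor covers the others’ weaknesses.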
Also, the standard for processors is higher when it comes to autonomous cars — if your phone or laptop crashes, you can bang your fists on the table but you’ll still be able to restart it. If the processing system of your autonomous car crashes, it could lead to a car crash as well. Having a crash-proof system that also knows everything we know about driving in every possible scenario takes up space. I predict that car manufacturers will put a server farm in the back of their vehicles, where it will act as the brain of the car.
We will never see commercial cars become autonomous if the necessary components aren’t cost-effective. As stated above, there are high costs associated with sensors and cameras, and until they come down, we can’t fill the roads with shiny new driverless cars. The good news is that companies are racing to bring down their prices, and as time goes by, they will gradually succeed.
We can assume that the customer will pay around $6,000 more for an autonomous suite, which puts the manufacturer’s total hardware cost at around $1,000 for everything: camera, LIDAR, radar, and processing. Early adopters who don’t easily succumb to sticker shock will be the first in line to buy autonomous vehicles, and they’ll ring in the new industry. Just as cars became affordable, so will driverless ones.
So… are we there yet?
Almost. The future is still zooming towards us in high gear. Driverless technology will slowly start to seep into mainstream motor vehicles (like Tesla’s Autopilot), but it will take time before you see it in your next car model.