Traditional carmakers and cutting-edge tech behemoths alike are racing to bring autonomous vehicles to market.
It was a beautiful, sunny day on June 18, 1914, when the brilliant engineer Lawrence Sperry stunned the jury of the Concours de la Sécurité en Aéroplane (Airplane Safety Competition) by flying past their stand with his hands held high. It was the first time the public had ever seen a gyroscopic stabilizer, one of the first autopiloting devices. Over a hundred years later, automatic flight control devices and maritime autopilots are common, while cars still require human operation. Thanks to machine learning, that's about to change.
What is the future of autonomous vehicles?
According to recent reports, autonomous cars are going to disrupt the private, public and freight transportation industries. A recent Deloitte publication reports that society is putting more and more trust in autonomous vehicles. In 2017, 74% of US, 72% of German and 69% of Canadian respondents declared that fully autonomous cars would not be safe. Those rates have since dropped significantly, to 47%, 45% and 44%, respectively. BMW, Nissan and Ford have revealed plans for building self-driving cars, while Uber and the Google-affiliated Waymo are also in the thick of the race. Companies aim to build both urban driving vehicles and autonomous trucks, while a startup scene supporting autonomous technology is emerging. Thanks to the increasing popularity of autonomous cars, up to 40% of mileage could be driven in self-driving vehicles by 2030. But, as always, the devil is in the details.

What is an autonomous car?
To answer that question, the National Highway Traffic Safety Administration uses the autonomous vehicle taxonomy designed by the Society of Automotive Engineers, which lists six levels of automation, from Level 0 to Level 5.
- No automation – the driver performs all driving tasks
- Driver assistance – the car has built-in functions to assist the driver, who nonetheless must remain engaged in the driving process. Cruise control is one of the best examples.
- Partial automation – the vehicle has combined automated functions like acceleration and steering, but the driver must remain engaged. The gyroscopic stabilizer is an example of partial automation.
- Conditional automation – a human driver is necessary in totally unpredictable situations, but is not required to monitor the environment at all times.
- High automation – the car on this level may not even have a steeringwheel within its design. BMW currently has a fleet of about 40 Level 4 cars deployed on testing grounds near Munich and in California.
- Full automation – a Level 5 car can deal with any situation it encounters. Fully autonomous vehicles of this kind do not yet exist.
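The taxonomy above can be sketched as a small data type. This is a minimal illustration, not part of any real driving stack; the enum names and the monitoring rule are paraphrased from the level descriptions above.

```python
from enum import IntEnum

class SAELevel(IntEnum):
    """SAE levels of driving automation, as listed above (sketch)."""
    NO_AUTOMATION = 0
    DRIVER_ASSISTANCE = 1
    PARTIAL_AUTOMATION = 2
    CONDITIONAL_AUTOMATION = 3
    HIGH_AUTOMATION = 4
    FULL_AUTOMATION = 5

def driver_must_monitor(level: SAELevel) -> bool:
    # Up to partial automation (Level 2) the driver must stay engaged
    # and monitor the environment; from conditional automation (Level 3)
    # the system takes over monitoring in supported situations.
    return level <= SAELevel.PARTIAL_AUTOMATION

# Example: a Level 3 car does not need constant human monitoring.
hands_on = driver_must_monitor(SAELevel.CONDITIONAL_AUTOMATION)  # False
```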
To monitor the environment, autonomous cars are equipped with a set of sensors:
- Cameras – detect and track pedestrians and cyclists, monitor free space and traffic lights
- Articulating radars – detect moving vehicles at long range over a wide field of view
- Short-range radars – monitor objects around the vehicle
- Long-range radars – detect vehicles and measure velocity
- Lidars – detect fixed and moving objects with high-precision laser sensors
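The readings from these sensors have to be combined before the car can act on them. The toy fusion step below shows the idea: each object should appear to the planner once, tagged with the sensors that confirmed it. The `Detection` type, the object IDs and the radar-velocity rule are illustrative assumptions, not a real sensor-fusion API.

```python
from dataclasses import dataclass

@dataclass
class Detection:
    sensor: str      # e.g. "camera", "long_range_radar", "lidar" (hypothetical labels)
    obj_id: str      # identity assigned upstream by a tracker (assumed)
    velocity: float  # m/s; meaningful for radar returns, per the list above

def fuse(detections):
    """Group per-object detections so each object is seen once,
    together with the set of sensors that confirmed it."""
    fused = {}
    for d in detections:
        entry = fused.setdefault(d.obj_id, {"sensors": set(), "velocity": None})
        entry["sensors"].add(d.sensor)
        if d.sensor.endswith("radar"):  # radars measure velocity directly
            entry["velocity"] = d.velocity
    return fused

# One frame: a pedestrian seen by camera and lidar, a car seen by radar.
frame = [
    Detection("camera", "ped_1", 0.0),
    Detection("lidar", "ped_1", 0.0),
    Detection("long_range_radar", "car_7", 27.5),
]
fused = fuse(frame)
```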
Two ways autonomous cars work
There are currently two approaches to building the models that control autonomous vehicles.

A component-based system – the controller is built from several independent models and software components, each designed to handle one task, be it road sign recognition, managing the state of the vehicle or interpreting the sensors' signals.
- Pros – dividing the system into subsystems makes building the software easier. Each component can be optimized and developed individually, thus improving the system as a whole.
- Cons – developing the model requires a massive amount of data to be gathered and processed. The image recognition module needs to be fed different data than the engine control device, which makes preparing the training datasets more than a little challenging. What's more, integrating the subsystems may be a challenge in and of itself.
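The component-based approach can be sketched in a few lines: independent functions stand in for subsystems, and a small controller integrates their outputs. The component names, the dict-based input frame and the decision rule are all hypothetical, chosen only to mirror the tasks named above.

```python
def recognize_signs(frame):
    # Stand-in for a road-sign recognition module (developed independently).
    return frame.get("signs", [])

def interpret_sensors(frame):
    # Stand-in for a sensor-interpretation module; 10 m threshold is assumed.
    return {"obstacle_ahead": frame.get("lidar_distance_m", 1e9) < 10.0}

def plan(signs, perception, speed_kmh):
    # The controller integrates the subsystems' outputs into one action.
    limit = min((s["limit"] for s in signs if s["type"] == "speed_limit"),
                default=130)
    if perception["obstacle_ahead"]:
        return "brake"
    return "accelerate" if speed_kmh < limit else "hold"

frame = {"signs": [{"type": "speed_limit", "limit": 50}],
         "lidar_distance_m": 42.0}
action = plan(recognize_signs(frame), interpret_sensors(frame), speed_kmh=40)
```

Because each function has a narrow contract, every module can be tested and swapped out on its own, which is exactly the "pro" of this architecture; the integration logic in `plan` is where the "con" shows up.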
An end-to-end model – a single machine learning model maps the sensors' raw input directly to driving decisions.
- Pros – it is easier to perform all the training within the simulation environment. Modern simulators provide the model with a high-quality, diverse urban environment. Using the simulated environment greatly reduces the cost of gathering data.
- Cons – this type of model may be harder to interpret or reverse-engineer, which can be a significant obstacle when it comes to further tuning the model or reducing the challenge posed by the reality gap (see below).
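A toy version of the end-to-end idea, including the reality gap, fits in a few lines. Here a single linear "policy" is fit on clean simulated data mapping lane offset to a steering command; everything about the simulator, the model and the noise levels is an illustrative assumption, far simpler than the deep networks real systems use.

```python
import random

random.seed(0)

def simulator(n):
    # Clean synthetic driving data: steer proportionally against lane offset.
    xs = [random.uniform(-1.0, 1.0) for _ in range(n)]
    return [(x, -0.5 * x) for x in xs]

def fit(data):
    # Least-squares slope for a zero-intercept linear policy: the whole
    # "end-to-end model" is this one parameter.
    num = sum(x * y for x, y in data)
    den = sum(x * x for x, _ in data)
    return num / den

policy = fit(simulator(1000))  # recovers the simulator's -0.5 exactly

# The reality gap: real sensors are noisier than the simulator, so the
# simulation-trained policy degrades on "real" (noise-corrupted) inputs.
real = [(x + random.gauss(0.0, 0.1), -0.5 * x) for x, _ in simulator(200)]
sim_err = sum(abs(policy * x - y) for x, y in simulator(200)) / 200
real_err = sum(abs(policy * x - y) for x, y in real) / 200
```

On the noiseless simulated data the fitted policy is error-free, while on the noisy "real" data its error is strictly larger, which is the gap that further tuning would have to close.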