To bring more human-like reasoning to autonomous vehicles, scientists at the Massachusetts Institute of Technology have created a system that uses simple maps and visual data to enable driverless cars to navigate routes in new, complex environments.
The new autonomous control system learns the steering patterns of human drivers as they navigate roads in a small area, using only data from video camera feeds and a simple GPS-like map. The trained system can then steer a driverless car along a planned route in a new area by imitating the human driver.
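To illustrate the general idea, the sketch below shows a minimal imitation-learning setup in which a model takes a camera frame and a coarse routed-map patch and predicts a steering command, trained against the human driver's steering. This is not the authors' code; the architecture, tensor sizes, and training target are hypothetical stand-ins.

```python
# Minimal sketch (not the authors' code): map a camera frame plus a coarse
# routed-map patch to a steering command, supervised by human steering.
import torch
import torch.nn as nn

class SteeringNet(nn.Module):
    def __init__(self):
        super().__init__()
        # Separate convolutional encoders for the camera image and the map patch.
        self.image_enc = nn.Sequential(
            nn.Conv2d(3, 16, 5, stride=2), nn.ReLU(),
            nn.Conv2d(16, 32, 5, stride=2), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),
        )
        self.map_enc = nn.Sequential(
            nn.Conv2d(1, 8, 3, stride=2), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),
        )
        # Fused features are regressed to a single steering angle.
        self.head = nn.Sequential(nn.Linear(32 + 8, 64), nn.ReLU(), nn.Linear(64, 1))

    def forward(self, image, map_patch):
        features = torch.cat([self.image_enc(image), self.map_enc(map_patch)], dim=1)
        return self.head(features)

model = SteeringNet()
image = torch.randn(1, 3, 120, 160)       # camera frame (hypothetical size)
map_patch = torch.randn(1, 1, 64, 64)     # coarse routed-map crop (hypothetical size)
steering = model(image, map_patch)        # predicted steering command
human_steering = torch.zeros_like(steering)  # placeholder for the recorded human command
loss = nn.functional.mse_loss(steering, human_steering)
```

Because the map input is deliberately coarse, a model trained this way does not memorize specific roads; it only needs a new map to plan a route in an area it has never seen.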
The system can also detect mismatches between its map and features of the road, which helps it determine whether its position, sensors, or mapping are incorrect, so it can correct the car's course.
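One simple way to picture this correction step is a search over nearby candidate poses, keeping the pose whose map-predicted road heading best agrees with what the camera observes. The sketch below is an illustrative heuristic, not the authors' method; every function and value in it is a hypothetical placeholder.

```python
# Illustrative sketch (not the authors' method): correct a noisy GPS pose by
# finding the nearby map pose that best matches the camera-observed road heading.
import numpy as np

def heading_from_camera(image):
    """Placeholder: road heading (radians) estimated from the camera image."""
    return 0.05  # hypothetical perception output

def heading_from_map(map_data, pose):
    """Placeholder: road heading the map predicts at a given (x, y) pose."""
    x, y = pose
    return 0.01 * x + 0.02 * y  # stand-in for a real map lookup

def correct_pose(image, map_data, gps_pose, search_radius=2.0, step=0.5):
    """Search poses around the GPS estimate; keep the best heading match."""
    observed = heading_from_camera(image)
    offsets = np.arange(-search_radius, search_radius + step, step)
    best_pose, best_err = gps_pose, float("inf")
    for dx in offsets:
        for dy in offsets:
            candidate = (gps_pose[0] + dx, gps_pose[1] + dy)
            err = abs(heading_from_map(map_data, candidate) - observed)
            if err < best_err:
                best_pose, best_err = candidate, err
    return best_pose  # corrected position estimate

print(correct_pose(image=None, map_data=None, gps_pose=(10.0, 5.0)))
```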
“With our system, you don’t need to train on every road beforehand,” says first author Alexander Amini, an MIT graduate student. “You can download a new map for the car to navigate through roads it has never seen before.”
“Our objective is to achieve autonomous navigation that is robust for driving in new environments,” adds co-author Daniela Rus, director of the Computer Science and Artificial Intelligence Laboratory (CSAIL) and the Andrew and Erna Viterbi Professor of Electrical Engineering and Computer Science. “For example, if we train an autonomous vehicle to drive in an urban setting such as the streets of Cambridge, the system should also be able to drive smoothly in the woods, even if that is an environment it has never seen before.”