Researchers demonstrate the limits of driverless car technology

People without an in-depth knowledge of driverless vehicle technology could be forgiven for thinking the hard part of developing fully autonomous vehicles is getting them to navigate lanes and junctions and avoid crashing into other road users.

However, researchers at the University of Washington, Stony Brook University, and UC Berkeley have just shown how much work still needs to be done on driverless technology by exposing a serious weakness in one key part of autonomous-drive systems: road-sign recognition.

While we already have vehicles with systems capable of reading road signs, the researchers showed how easily those systems can be tricked into misreading signs, using nothing more complex than a few well-placed stickers.

The team analyzed the image-classification algorithm used by the vision system, then crafted several attacks that physically alter signs to trick the machine-learning model into reading them incorrectly.

For example, in one case stickers were used to trick the sign-recognition system of an autonomous car into reading a stop sign as a 45 mph speed-limit sign instead.
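To make the idea concrete, the core of such an attack can be framed as an optimization problem: find a perturbation, confined to the sticker regions, that pushes the classifier toward a chosen wrong label. The following Python sketch assumes a differentiable PyTorch classifier; the model, mask, and hyperparameters are illustrative stand-ins, not the researchers' actual code.

```python
# A minimal sketch of a sticker-style adversarial attack, assuming a
# differentiable PyTorch classifier. Model, mask, and hyperparameters
# are illustrative stand-ins, not the researchers' actual method.
import torch
import torch.nn.functional as F

def sticker_attack(model, image, mask, target_class, steps=200, lr=0.1):
    """Optimize a perturbation confined to `mask` (the sticker regions)
    so that `model` reads `image` as `target_class`.

    image: (1, 3, H, W) tensor with values in [0, 1]
    mask:  (1, 1, H, W) tensor, 1 where stickers are allowed, 0 elsewhere
    """
    delta = torch.zeros_like(image, requires_grad=True)
    optimizer = torch.optim.Adam([delta], lr=lr)
    target = torch.tensor([target_class])
    for _ in range(steps):
        # Perturb only inside the sticker mask and stay in valid pixel range.
        adv = torch.clamp(image + delta * mask, 0.0, 1.0)
        # Minimizing cross-entropy against the *target* label pushes the
        # classifier toward the wrong reading (e.g. stop sign -> speed limit).
        loss = F.cross_entropy(model(adv), target)
        optimizer.zero_grad()
        loss.backward()
        optimizer.step()
    return torch.clamp(image + delta.detach() * mask, 0.0, 1.0)
```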

It's not difficult to imagine the real-world implications of such an action.

The team came up with four different ways to fool the recognition systems using just a color printer and a camera, which they detail in a paper entitled "Robust Physical-World Attacks on Machine Learning Models."

Perhaps most worrying is that all the "hacks" look innocuous to the human eye, camouflaged as graffiti or as art incorporated into the sign's imagery.

One method involved printing a full-size poster to cover the sign. Though the sign looked almost normal to a human, other than being a little faded in places, the consequences for the computer's vision were catastrophic. From different angles and distances, the system classified a stop sign as a speed limit sign a worrying 100 percent of the time.
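What makes a physical attack like this difficult is that it must keep working as the camera's viewpoint changes, which is why the researchers tested from different angles and distances. A rough way to measure that robustness is to score the attacked sign under a grid of simulated viewpoints; this sketch assumes a PyTorch classifier, and the angle/scale grid is invented for illustration rather than taken from the researchers' evaluation protocol.

```python
# Hedged sketch: estimate how often an attacked sign fools the classifier
# across simulated viewpoints. The angle/scale grid is an assumption made
# for illustration, not the researchers' actual test setup.
import torch
import torchvision.transforms.functional as TF

def attack_success_rate(model, adv_image, true_class,
                        angles=(-30, -15, 0, 15, 30),
                        scales=(0.5, 0.75, 1.0)):
    """Fraction of simulated viewpoints in which the sign is NOT read
    as `true_class` (i.e. the attack succeeds)."""
    h, w = adv_image.shape[-2:]
    successes, total = 0, 0
    with torch.no_grad():
        for angle in angles:          # simulate viewing angle
            for scale in scales:      # simulate viewing distance
                view = TF.rotate(adv_image, angle)
                view = TF.resize(view, [int(h * scale), int(w * scale)])
                view = TF.resize(view, [h, w])  # back to model input size
                pred = model(view).argmax(dim=1).item()
                successes += int(pred != true_class)
                total += 1
    return successes / total
```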

The good news is that these hacks can be overcome. Systems can be programmed to take context into account, allowing a car to flag a suspicious classification based on criteria such as its location. It would then be able to tell, for instance, that a highway speed sign appearing in an urban area signals that something is wrong.
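As a rough illustration of that kind of defense, a plausibility check could cross-reference each detected sign against the vehicle's known surroundings. The zone categories and sign labels below are invented for the example.

```python
# Hypothetical context check: flag classifications that contradict the
# vehicle's location. Zones and sign labels are made up for illustration.
PLAUSIBLE_SIGNS = {
    "urban":   {"stop", "yield", "speed_25", "speed_35"},
    "highway": {"speed_55", "speed_65", "merge", "exit"},
}

def sign_is_plausible(detected_sign: str, zone: str) -> bool:
    """Return False when a reading makes no sense in context,
    e.g. a 65 mph highway sign detected on an urban street."""
    return detected_sign in PLAUSIBLE_SIGNS.get(zone, set())

# A 65 mph sign "seen" downtown should be treated as suspect.
assert not sign_is_plausible("speed_65", "urban")
```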

Regardless, it's a fascinating insight into just how much has to be considered when bringing truly driverless cars to our roads.