Google's self-driving car breakthrough: Stop sign no longer a problem
Auto auto's new code can also see cyclists' arm-waving, dodge roadworks
Vid Google has updated the software in its self-driving cars after spending the past year running prototypes around its hometown of Mountain View, California, to test the vehicles' performance on hectic city streets.
The advertising giant said its code had spotted hundreds of distinct objects, having logged thousands of miles of rubber on tarmac. As a result, its tech can now, we're told, identify buses, stop signs, pedestrians and cyclists, even picking out hand signals from riders indicating a turn.
And Google claimed that its self-driving car has a much better attention span than a human being, because the system is programmed to track many different details at once without becoming distracted.
Project director Chris Urmson at the web goliath said in a blog post:
As it turns out, what looks chaotic and random on a city street to the human eye is actually fairly predictable to a computer. As we’ve encountered thousands of different situations, we’ve built software models of what to expect, from the likely (a car stopping at a red light) to the unlikely (blowing through it).
We still have lots of problems to solve, including teaching the car to drive more streets in Mountain View before we tackle another town, but thousands of situations on city streets that would have stumped us two years ago can now be navigated autonomously.
To date, Google said it had logged nearly 700,000 autonomous miles with its self-driving cars.
The vid above from the firm shows, among other things, how the vehicles might overcome unexpected construction work on the road. ®