Original URL: https://www.theregister.com/2013/03/27/adas_futures/

Experts agree: Your next car will be smarter than you

Google's dream car? Nope. Head-up displays, parking-spot search, 'platoons', and more

By Rik Myslewski

Posted in Personal Tech, 27th March 2013 06:03 GMT

Feature Forget Google's self-driving car – for a few years, at least. Today's real action in the computer-meets-car arena is in the development of advanced driver assistance systems (ADAS), as was made abundantly clear at last week's GPU Technology Conference.

"We're not going to find ourselves driving in an autonomous car tomorrow," said Ian Riches of the research and consulting firm Strategy Analytics. Instead, as self-driving capabilities begin to appear, they'll first be used for "repetitive and dull and boring" things such as parking and driving in congested traffic.

"It's the sort of thing you never see in car adverts," Riches said. "Generally it's the handsome guy driving on a mountainous, twisty road, the handsome guy phoning his gorgeous girlfriend, the stuff you see in the marketing videos. That's not what real life is like."

One such repetitive, dull, and boring real-life scenario now under investigation, he said, is the European Commission's "Safe Road Trains for the Environment" (SARTRE) project, in which cars would semi-autonomously platoon behind a professional driver – piloting an 18-wheeler, for example – bunching up behind the truck in a tight convoy and allowing their drivers to engage in otherwise illegal activities such as texting or chatting away on their mobile phones.

Riches said that one clear advantage of SARTRE is that it would save money. "The guys behind will be saving some fuel," he said, "so that helps. But also, what do you think is cheaper? Doing this or doubling your road capacity?"
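Neither Riches nor the SARTRE materials went into control-law specifics, but in spirit platooning is cooperative adaptive cruise control: each following car regulates its gap to the vehicle ahead using its own sensors plus data radioed from the lead truck. A toy sketch of such a following law – gains and structure are illustrative, not anything from the SARTRE project:

```python
def platoon_accel(gap_m, target_gap_m, closing_speed_ms, lead_accel_ms2,
                  kp=0.5, kd=0.8, kff=1.0):
    """Toy cooperative-ACC law for a platoon follower: proportional
    feedback on the gap error, damping on the closing speed, plus a
    feedforward term from the lead vehicle's broadcast acceleration.
    Gains are invented for illustration."""
    gap_error = gap_m - target_gap_m           # positive: we've fallen behind
    return kp * gap_error - kd * closing_speed_ms + kff * lead_accel_ms2
```

The feedforward term is what distinguishes cooperative platooning from plain radar-following: the follower reacts to the lead truck's braking the moment it is broadcast, rather than waiting to measure the gap shrinking.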

The SARTRE project: platooning behind a professional driver

Thanks to SARTRE, as C.W. McCall would say, "Mercy sakes alive, looks like we've got us a convoy!"

But before we even get to semi-autonomous cars on our highways, ADAS-enabled vehicles will provide us – and our cars – with road and traffic information, help us park, assist in lane changes, and snap us back to focus should our attention wander from the task at hand: safe driving.

The need for ADAS is especially acute in urban situations, said Audi research engineer Mario Tippelhofer. "We spend a lot of time thinking about how we can improve safety and how we can avoid accidents in urban areas," Tippelhofer said of his team at the Volkswagen Group of America Electronics Research Laboratory in Belmont, California.

"Our approach was to help the driver to be less stressed, more focused, going into those urban areas in a more relaxed manner," he said. "We're trying to paint a vision of what urban mobility can look like for our Audi customers in the near future.

The areas of study that his research group is investigating include prediction of road conditions and congestion, intuitive interfaces for the presentation of information, and other advanced assistance systems.

There's also the need for ADAS systems to be personalized for each individual driver. "Right now," he said, "your car is mostly generic, for a generic driver. But if this car would be really tailored to your needs, it would know about your needs, it could assist you in a much better way."

Part of what's needed in automotive interfaces, Tippelhofer said, is the ability to provide positive suggestions, not merely negative notifications that something has or is about to go wrong. "Right now you have a lot of blinking lights, a lot of warnings in your instrument cluster and infotainment system. But what we really need instead of warning us is helping us to make the right decisions and to stay safe."

Audi's ADAS takes a lot of information from the cloud: traffic status, real-time parking data from sensor-equipped smart meters and parking lots, and weather information, for example. "But we're not only looking at what's happening around us," he said, "we're also looking inside the vehicle."

This personalization includes not only what the driver is doing and focused on in real time, but also what his or her driving patterns and history are. Adding the cloud-based info to the driver-personalization info, he said, will enable Audi to develop applications to help a driver navigate in what Tippelhofer calls "the urban megacity of the future."

One of those applications, he said, will combine both real-time and predictive parking advice for on-street and off-street parking that will direct a driver to open spots and obviate the need for the all-too-familiar urban "let's go around the block one more time" parking-spot search.

Parking-spot sensors have the obvious advantage of telling an urban parking agency when a meter has timed out, so that it can send a meter maid to that spot to write a ticket, Tippelhofer said. "But as a positive effect they also make that information available to companies like us so that we can see and direct a driver where there is an open parking spot."

Audi automotive-assistance distance and object sensors

Audi's sensors include radar, lidar, camera, and adaptive cruise-control (click to enlarge; source: Audi)

Predictive algorithms will also advise as to whether that parking spot will still be available when the driver reaches it, and drop it from the list of available spaces if it's likely to have been filled. "For example," he said, "I would leave from Palo Alto, going to San Francisco" – a 35-mile drive – "and as you can imagine, the parking-spot situation is going to change a lot by the time I actually get to the city."

In addition, the more Audi's ADAS learns about the car's driver, the more it can narrow its selection of suggested parking spots based on the driver's history of choosing spots closer to or farther from his or her final destination, and factor in the price of parking and the driver's preference for on-street or off-street parking.
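Tippelhofer didn't detail Audi's algorithms, but one can imagine the spot selection as a simple scoring function over predicted availability, walking distance, price, and lot type, weighted by the driver's learned preferences. A minimal sketch – every name and weight here is invented:

```python
from dataclasses import dataclass

@dataclass
class Spot:
    walk_m: float          # walking distance to the destination
    price_per_hr: float
    p_open_at_eta: float   # predicted chance the spot is still open on arrival
    off_street: bool

def score(spot, prefs):
    """Rank candidate spots; drop any likely to be gone by arrival time."""
    if spot.p_open_at_eta < prefs["min_availability"]:
        return None                                      # filled before we get there
    s = spot.p_open_at_eta
    s -= prefs["walk_weight"] * spot.walk_m / 100.0      # learned walking tolerance
    s -= prefs["price_weight"] * spot.price_per_hr       # learned price sensitivity
    if spot.off_street != prefs["prefers_off_street"]:
        s -= prefs["lot_type_penalty"]                   # learned lot-type preference
    return s
```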

Predictive modeling will also be used to learn a driver's customary routes to a frequent destination, predict traffic congestion on that route at a specific time or when a traffic-causing event is about to take place, and – without the driver firing up his or her navigation system – reroute the driver when the congestion is bad enough that avoiding it would be more efficient than driving through it, even if the distance traveled might be longer.

"For example, Tippelhofer said, "if you're going to San Francisco and there's a ball game or a 49ers game, there's going to be a big traffic jam around that particular destination. That is known. So we can look into schedules of social events that might affect the traffic flow, and based on our simulations make predictions as to what is most likely the best route for your specific destination."

Navigation cues will also be refined so that they aren't merely limited to such information as street names and distances. Instead, he said, Audi's ADAS will give visual directions such as "turn left at the Starbucks" or "your destination is two blocks past the red church on the right."

In addition, multiple in-car cameras will keep an eye on the driver, checking what he or she is focused on and how long he or she has been looking away from the road – at, say, the car's infotainment system and the like – and will direct the driver's attention back to the road when necessary.

"This needs to be done in a positive human-machine interface," Tippelhofer said, "because we don't want to distract the driver even more if we detect that he's not paying attention."

Not only would the system attempt to gently and non-intrusively suggest to the driver that it might be a good idea to get their eyes back on the road, but it could also kick in an adaptive cruise-control (ACC) system, which would not allow the driver to inadvertently accelerate due to lack of attention, and would keep a safe distance from the car in front.
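Audi didn't spell out the control law, but a textbook way to achieve this is constant-time-headway ACC, which caps speed so the gap to the lead car never falls below a chosen headway. A simplified sketch, with illustrative parameters:

```python
def acc_speed_cap(own_speed_ms, gap_m, lead_speed_ms, headway_s=1.8):
    """Constant-time-headway cruise control, simplified: if the gap is
    shorter than the desired headway distance, scale speed down toward
    the lead car's; otherwise leave the driver's set speed alone.
    Real controllers use smoother dynamics - this shows only the
    safe-distance logic."""
    desired_gap_m = max(own_speed_ms * headway_s, 5.0)   # 5 m standstill margin
    if gap_m < desired_gap_m:
        return max(lead_speed_ms * gap_m / desired_gap_m, 0.0)
    return own_speed_ms
```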

Another aspect of driver assistance that the Audi group is investigating is how to help drivers merge into traffic. "Merging onto urban freeways is a very stressful situation," Tippelhofer said, "because there's a very short amount of time to make decisions, and they have to be the right ones."

To ease this stressful process, Audi has equipped its test vehicles with multiple radar and lidar (light detection and ranging) sensors, along with cameras and ACC sensors. "We fuse all this information into a recommendation to the driver: what is the best possible way for you to merge into that spot that is opening up," he said.

This merge-recommendation system can also be personalized, since some drivers are willing and able to merge into a tighter spot than others. Audi's ADAS will learn your preferences and adapt accordingly.
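The talk stopped short of algorithmic detail, but the core of such a recommender is gap selection over fused tracks in the target lane: find the first gap whose time-gap exceeds the driver's personalized minimum. A sketch under those assumptions – the track structure is invented:

```python
from dataclasses import dataclass

@dataclass
class Track:
    position_m: float   # fused radar/lidar position along the target lane
    length_m: float

def find_merge_gap(lane_tracks, own_speed_ms, min_gap_s):
    """Return the pair of tracked cars to merge between, or None.
    min_gap_s is the per-driver minimum acceptable time gap - the
    'personalization' knob the article describes."""
    cars = sorted(lane_tracks, key=lambda t: t.position_m)
    for rear, front in zip(cars, cars[1:]):
        gap_m = front.position_m - rear.position_m - front.length_m
        if gap_m / max(own_speed_ms, 1.0) >= min_gap_s:
            return rear, front
    return None
```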

Heads up!

Safer merging into traffic is also one of the goals of Victor Ng-Thow-Hing and his team at the Honda Research Institute USA in Mountain View, California. Their system's recommendations, however, are presented on an interface that's at the core of their research: an augmented-reality head-up display viewed through a vehicle's windshield.

Like Tippelhofer and his Audi team, Ng-Thow-Hing and his three fellow researchers at Honda are focused on providing a vehicle's driver with information that will make driving safer and less stressful, despite the fact that cars are being loaded up with more and more distracting technology.

Changing demographics are one spur to their research. At the same time that cars and their multiple displays are becoming more complex, Ng-Thow-Hing said, more and more older drivers are on the road – and they have slower reaction times and problems with perception of such critical driving factors as speed and distance.

"And we also have a lot of young hipsters," he said. "A lot of people have fallen in love with their smartphone devices, and they want to have that connectivity, social media, and all that stuff in their cars" and use them safely. Both populations need help, he said.

What Honda is working on is augmented reality in the car. "And I want to differentiate that from just a simple head-up display. What I mean is that we're looking at a real-world scene, and we're actually augmenting that scene with computer-graphic elements mixed into that."

As with Audi's system, Honda's system – which they call HI-CAR, for Honda Interactive Contextual Augmented Reality – uses information both from sensors and from the internet. As Ng-Thow-Hing explained, however, augmented reality in a car is far different from the augmented reality apps many of us have checked out on our smartphones and tablets.

Smartphones are great for augmented reality, he said, because they're mobile, portable, and all have cameras. Tablets can be even better because they have a larger display with which to interact. "But in both of these cases the augmented-reality experience is not very immersive," he said, "because you're looking at everything through a video overlay – you're not actually seeing it directly in the scene."

Not so with automobiles. "The car is the ultimate mobile device," Ng-Thow-Hing said, because it's connected, portable, loaded with sensors, can have a larger amount of compute power to work with, and the windshield offers a larger field of view. "And you're actually looking through and seeing the real world through your windshield – it's not like a video overlay."

One of the problems that Ng-Thow-Hing sees in current augmented-reality displays is that their designers overlay flat, 2D labels on objects in a 3D world – a method that he argues creates a "cognitive dissonance." His system instead annotates real-world images with labels that have been transformed using perspective – a label identifying a building, for example, is displayed as if it were a sign on the building itself, in the same perspective as the building.

Augmented reality head-up displays for cars – cognitive dissonance problem

2D labels make your brain switch back and forth between perception modes (source: Honda)

Augmented reality head-up displays for cars – cognitive dissonance solution

3D labels keep your brain comfortably in the same perspective (source: Honda)
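Honda didn't publish its rendering pipeline, but the perspective trick itself is a standard homography warp: take the flat label image and map its corners onto the facade's corners in the driver's view. A minimal OpenCV sketch – the corner coordinates are invented, where the real system would take them from its scene model:

```python
import numpy as np
import cv2

label = cv2.imread("label.png")                     # flat 2D label to display
h, w = label.shape[:2]
src = np.float32([[0, 0], [w, 0], [w, h], [0, h]])  # label corners
dst = np.float32([[420, 180], [560, 210],           # facade corners in the
                  [560, 330], [420, 310]])          # view (made-up values)
H = cv2.getPerspectiveTransform(src, dst)           # homography mapping src -> dst
warped = cv2.warpPerspective(label, H, (1280, 720)) # label now sits 'on' the building
```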

Overlaying computer-generated imagery onto a real-world view through a car's windshield also presents two other challenges that overlaying imagery onto a flat video on a smartphone or tablet doesn't have: first, your eyes change focal length when moving from close-in to far-off objects; second, parallax shifts occur and misalign the augmented reality images when you shift your point of view inside the car. According to Ng-Thow-Hing, however, his team is tackling both those problems.
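The parallax half of the problem is geometric: with an eye tracker reporting head position, the display re-projects each glyph by intersecting the eye-to-object ray with the display surface. A simplified planar version – a real windshield combiner is curved, so treat this as illustration only:

```python
import numpy as np

def glyph_position(world_pt, eye_pos, screen_z=0.8):
    """Keep a glyph locked onto a world point as the head moves:
    intersect the ray from the eye through the point with a display
    plane screen_z metres ahead (z = forward). Simplified geometry."""
    eye = np.asarray(eye_pos, dtype=float)
    ray = np.asarray(world_pt, dtype=float) - eye
    t = (screen_z - eye[2]) / ray[2]     # ray parameter at the display plane
    return (eye + t * ray)[:2]           # (x, y) at which to draw the glyph
```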

But even when such technical problems have been overcome, a head-up augmented-reality display must be carefully designed so as not to cause what Ng-Thow-Hing called "inattentional blindness" – distraction when interface-design elements such as blinking red lights cause the driver to look only at them and not at the scene as a whole. Also to be shunned are elements on the display that aren't related to driving – "album covers and things like that" – which can distract the driver.

Such lousy augmented-reality interface design can be disastrous. "You're no longer driving very carefully," he said. "When that happens, it can lead to big accidents and crashes. Dying is a bad user experience."

Quite.

One of the non-intrusive user-interface elements that the Honda team has come up with is a proposed answer to the merging and lane-changing challenge mentioned above. A grid is projected onto the head-up display with the active car in the middle, and other cars around it – front, back, and sides – are indicated by red rectangles. With this information, the driver can know what's approaching from the car's blind spot without having to glance backward and take his or her eyes off the road.

Augmented reality head-up displays for cars – design iteration

An iterative design process led to Honda's grid-based car-location system (click to enlarge; source: Honda)
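In implementation terms the grid is a coarse occupancy map around the ego vehicle: bin each fused detection into one of the eight surrounding cells. A sketch – the cell size is a guess, as the talk gave no dimensions:

```python
def surround_grid(detections, cell_m=5.0):
    """3x3 grid of booleans centred on the driver's car; detections are
    (x, y) in metres, x forward, y left, ego at the origin. An occupied
    cell is what the display draws as a red rectangle."""
    grid = [[False] * 3 for _ in range(3)]
    for x, y in detections:
        row = 0 if x > cell_m / 2 else (2 if x < -cell_m / 2 else 1)  # ahead/beside/behind
        col = 0 if y > cell_m / 2 else (2 if y < -cell_m / 2 else 1)  # left/centre/right
        if (row, col) != (1, 1):       # centre cell is the ego car itself
            grid[row][col] = True
    return grid
```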

Another user interface that the team has developed aids left turns. The speed of oncoming cars is estimated by sensors, and their projected path is indicated by a yellow on-road overlay in the augmented-reality head-up display. The path of the driver's car, turning left, is also indicated. When the oncoming car's path would intersect with the driver's car, that path turns red; when the turn would be safe, the driver's path turns green.
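Reduced to first order, the red/green decision is a time-to-conflict test: will the oncoming car reach the intersection before the turning car has cleared it, with a margin to spare? A sketch of that test – real systems model full 2D paths and sensor uncertainty:

```python
def left_turn_is_safe(oncoming_dist_m, oncoming_speed_ms,
                      time_to_clear_s, margin_s=2.0):
    """Green if the oncoming car arrives only after we have cleared the
    intersection, plus a safety margin; red otherwise. First-order only."""
    if oncoming_speed_ms <= 0.0:
        return True                     # oncoming car stopped or receding
    time_to_conflict_s = oncoming_dist_m / oncoming_speed_ms
    return time_to_conflict_s > time_to_clear_s + margin_s
```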

Before they test their system in an actual vehicle, the Honda team is using a driving simulator. Ng-Thow-Hing emphasized that much care and iteration are needed to ensure that an augmented-reality display adds to the driver's safety rather than distracting from it, as noted above. When the team moves from simulators to an actual vehicle, he said, "Most likely I'll be the test subject for the first car, and I don't want to die while doing it."

It's all about speed – processing speed, that is

One requirement for keeping Ng-Thow-Hing alive, of course, is making sure that an augmented-reality interface operates in real time. "If you think about something that differentiates computer vision for ADAS and computer vision for other areas, you can think of stability and robustness and functionality," said Victor Eruhimov, CTO of the Nizhny Novgorod, Russia–based computer-vision company Itseez. "But another key factor is the speed."

ADAS tasks, Eruhimov emphasized, must run in real time, and they must run on embedded systems. Other embedded tasks such as robotics – another area in which Itseez has been active – aren't as dependent on real-time activity. "And it's challenging to be real-time in the embedded environment," he said. "Their resources are really limited compared to desktop and server environments."

Hardware-software efficiency is obviously key to real-time performance, he said, and to the rescue comes the soon-to-be-released OpenVX hardware abstraction layer from the Khronos Group, which is aiming for provisional specification in the first half of this year, with the final specification to come in the second half of the year.

According to the Khronos Group, "The Khronos OpenVX working group has been formed to drive industry consensus to create a cross-platform API standard to enable hardware vendors to implement and optimize accelerated computer vision algorithms."

The working group, Eruhimov explained, is primarily targeting mobile and embedded use cases such as computational photography, augmented reality, and automotive applications such as ADAS. One of OpenVX's goals is to transparently support the use of the parallel-processing powers of a GPU to accelerate image processing.

OpenVX, by the way, is intended to be a hardware and software acceleration supplement to, and not a replacement of, the industry-standard, cross-platform OpenCV computer-vision library that was originally introduced by Intel in 1999 and is now freely available under a BSD license.

Khronos Group's OpenVX

OpenVX will speed software for CPUs, GPU-enabled parallelism, and dedicated hardware (source: Khronos Group)

Although OpenVX is still under development, Itseez has already managed to achieve some impressive ADAS results using just OpenCV on a single CPU core of an Nvidia Tegra–based embedded system – namely, the ability to recognize US, European, and Russian street signs captured by a camera behind a vehicle's windshield.

Itseez's system scans the entire field of vision of a moving vehicle and extracts sign information from the scene in real time. Although it's a somewhat primitive proof-of-concept effort, able only to recognize speed, stop, and yield signs, it works quite well in low-contrast situations, in bad weather, and when a sign is partially occluded by, for example, tree limbs.
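Itseez hasn't published its detector, but the bare bones of such a pipeline – grab frames from a windshield camera, normalize contrast, run a trained detector, mark the hits – look like this in OpenCV. The cascade file name is a placeholder; training a detector robust to weather and occlusion is the hard part the article describes:

```python
import cv2

cascade = cv2.CascadeClassifier("stop_sign_cascade.xml")  # placeholder model
cap = cv2.VideoCapture(0)                                 # windshield camera

while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    gray = cv2.equalizeHist(gray)        # helps in low-contrast scenes
    hits = cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=4)
    for (x, y, w, h) in hits:
        cv2.rectangle(frame, (x, y), (x + w, y + h), (0, 0, 255), 2)
    cv2.imshow("signs", frame)
    if cv2.waitKey(1) == 27:             # Esc to quit
        break
cap.release()
```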

The current system runs strictly on a single CPU core of a Tegra's ARM Cortex-A9. "When we go to a CUDA-enabled GPU," Eruhimov said, "we get a 6X boost on embedded systems, and when we're allowed to use four cores, we'll get full scalability – a 3X speedup."

If the current single-core, no-GPU, no-CUDA system can recognize three street signs in real time, the addition of those speed-ups – plus, of course, more powerful CPU and GPU cores – will bring sign-recognition to much higher levels of functionality. Add text-to-speech capabilities and pedestrian detection, and ADAS systems with such powers could be quite versatile, indeed.
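For a rough sense of the headroom, take Eruhimov's figures at face value and – a further assumption on our part, not a claim from the talk – treat the multicore and GPU gains as independent:

```python
# Back-of-the-envelope from Eruhimov's numbers; multiplying the two
# gains assumes they are independent, which the talk did not claim.
single_core = 1.0        # baseline: three sign classes in real time
four_cores  = 3.0        # "full scalability - a 3X speedup"
with_gpu    = 6.0        # "a 6X boost" from a CUDA-enabled GPU
combined    = four_cores * with_gpu
print(f"up to ~{combined:.0f}x the single-core compute budget")  # ~18x
```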

So what's the future of these ADAS systems, and what's holding them back from becoming standard equipment on all new cars? According to Ian Riches of Strategy Analytics, that day is coming sooner than you might think.

"If you look at actual growth rates" of OEM spending on the development of automotive electronic systems from engines to chassis to security and more, "these advanced driver assistance systems are the second fastest growing area" between 2012 and 2017, Riches said.

Projected growth in automotive electronics systems

Want to invest in automotive electronics? ADAS looks to be a good bet (click to enlarge; source: Strategy Analytics)

The only area that's projected to grow faster is hybrid-electric and electric systems. "But if you pin me down and say, 'Which of those forecasts are you most confident will hit those high growth rates?'" Riches said, "I'll say the ADAS – there's much higher uncertainty as to the HEV/EV, particularly when you move to the EV end of that."

There are a number of reasons for the projected explosive growth of ADAS, he said: governments worldwide are demanding safer vehicles, consumer interest is growing, and prices are dropping to the point where ADAS systems will cost hundreds of dollars, not the thousands they do today.

"Also, automakers need new features on their vehicles," he said. "They're in a competitive environment. ... They're always looking for the latest features to make their vehicles stand out, to make theirs look more attractive in the marketplace."

Automakers are also becoming more flexible in how they package ADAS systems. "It used to be that if you wanted some of this stuff, you had to take the five- or eight-thousand-dollar 'technology pack' option. Yeah, that's kind of a steep ask, isn't it?"

Riches also said that consumers are easily seduced by ADAS features, using the head-up display as one example. "Once they've tried it, they love it." BMW's head-up display, he said, had a 99 per cent uptake rate once a car buyer gave it a go – admittedly a well-heeled car buyer looking at a top-end bimmer, of course, but that's still an exceptionally high uptake rate.

By 2019, Strategy Analytics projects that OEMs will spend over $20bn on ADAS systems, ranging from simple ultrasonic backup warnings all the way up to head-up displays and night-vision capabilities. "A significant chunk of change," Riches said – though he didn't have much respect for the latter capability. "In general," he said, "if you're looking to spend money on improving your night vision, buy a better headlamp system, not a night-vision system – apologies to any night-vision system vendors here."

ADAS demand at the system level

By 2019, a wide range of ADAS functions should cut down on car wrecks (click to enlarge; source: Strategy Analytics)

Riches admitted that Strategy Analytics' projections are quite "rosy." But what are the barriers that might suppress this growth? For one, he said, a disastrous, public failure of an ADAS system – but that has yet to happen. To illustrate the kind of bad press such an event could engender, he displayed a slide with a Reg article entitled "Satanic Renault takes hapless French bloke on 200km/h joyride".

That article was all over the European press, but he chose The Reg's coverage to highlight because "this is a tech blog in the UK that's quite well known and is slightly more humorous than some, which is why I picked it up, because I tend to like the humorous approach to life," he said.

So do we, Mr. Riches – and thanks for the plug.

Humor is all well and good, but Riches noted that ADAS systems are making what he called "significant interventions" in a car's operations, commanding brake actuations, steering interventions, and the like. "There is great potential for getting it wrong in a big way that could ultimately hurt people," he said.

And that will happen, he said. "Statistically speaking it's bound to happen at some point." And when it does, ADAS uptake will suffer.

To ensure that ADAS systems help drivers rather than distract them and cause catastrophic accidents and their inevitable bad publicity, Riches agreed with Honda's Ng-Thow-Hing that driver-assist systems have to be carefully crafted. They also need to provide information to the driver in subtle ways. "I don't know about you," he said, "but if I'm getting slightly too close to the car in front, I don't need my car to tell my wife."

The growth and acceptance of ADAS systems is inevitable, he believes, but it's going to take some time. "Outsiders always underestimate how long it takes to get into automotive," he said. Automotive OEMs suffer from "not invented here" syndrome, and that reluctance to deal with non-automotive partners will slow ADAS adoption.

That said, plans outlined by the EU's crash-test body, Euro NCAP, will spur ADAS adoption – though as those plans are implemented, the incentive for consumers to reach into their wallets for basic ADAS will, well, crash. Beginning in 2015, Riches said, Euro NCAP will require driver-assist systems on cars before it awards them the coveted five-star rating. The requirements will begin with some form of autonomous braking, then move on to pedestrian detection, lane-departure warnings, and more.

That's all well and good for drivers and their passengers, Riches said, but when a feature becomes a government requirement, customers won't need to pay for it – it'll be up to the OEM to pay for what will become a standard feature.

"How are they going to get that money back?" asked Riches. "They're going to have to sell you something else." But those extra goodies over and above the Euro NCAP requirements could be added ADAS features. "Oh, you want traffic-sign recognition? That'll be an extra $300."

The sweet part of that strategy would be that adding such a feature might be a mere software upgrade or, at worst, a memory upgrade or a slightly more powerful processor – nothing that would cost the OEM anything close to $300.

Finally, if you've followed the rise of digital electronics in the automotive sphere, you've certainly heard of car-to-car communications, in which cars share information such as location and speed – and even text and voice messages – amongst themselves. Riches sees that eventuality as a long way off.

Why? The ol' chicken-and-egg conundrum. "No one has shown me in a car-to-car communication system what me as a potential first-buyer-ever is going to get out of it."

Good point, sir – but The Reg wouldn't bet the farm that car-to-car communication won't eventually appear on one of Strategy Analytics' charts that project potential future markets. ®