Google jumps the shark from search results to your camera: Nest Hub, Pixels, and more from ad giant's coder confab

AI. Privacy. AI. Privacy. AI. Privacy. AI. Priva...

Keynote
Fishy ... an augmented reality demonstration at Google I/O 2019

Google I/O Google on Tuesday gathered developers at the open-air Shoreline Amphitheatre beside its Mountain View, California, headquarters to witness and applaud its latest technological marvels amid natural splendor.

As a prelude to the Google I/O 2019 main event, the Chocolate Factory treated the capacity crowd of code cobblers and associates to DJ Nao Tokui mixing electronic music with the help of a TensorFlow neural network. The first attempt failed, requiring a restart.

After that, the drum machine patterns, loops and horn hits ebbed and flowed fluidly enough that you'd never have known software played a role in the beat matching and selection of soulless sequenced songs, had the algorithmic input not been evident from the projection screen. So why then involve an AI system at all if it's not better than human-driven mixing?

You can watch the I/O keynote yourself in the vid below, starting about four minutes in, or just read our summary instead:

Youtube Video

As Google CEO Sundar Pichai said about the US giant's addition of AR navigation capabilities to the Google I/O app, "It's the type of challenge we love."

Speaking on stage, Pichai framed Google as "a company that helps you get things done," which is a bit more aspirational and less specific than former CEO Larry Page's avowed desire "to create services that people in the world use twice a day just like a toothbrush."

At least Pichai's vision statement avoids the potentially undesirable association with dental work. It also offers soothing sentiment to those who fear Google's incessant data gathering.

AR for all!

Google's attempt to be helpful is being expressed through the integration of augmented reality (AR) with Search and Lens. Later this month, search results will gain a View in 3D option when the result includes a Knowledge Panel, the information box Google and businesses set up to enhance results. The 3D model can be viewed on the searcher's phone and overlaid on a live camera view.

Aparna Chennapragada, VP and general manager of AR products at Google, demonstrated this by showing off a model of a shark superimposed over a camera view of the Google I/O stage.

She also showed how 3D models of clothes on a phone screen can be compared to actual clothes seen through a live camera feed, hinting at potential e-commerce applications.

Google is working with partners including NASA, New Balance, Samsung, Target, Visible Body, Volvo, and Wayfair to make 3D content available with search results.

Lens, the company's image recognition system for mobile device cameras, is gaining the ability to perform visual searches on what the camera sees. Pointed at a restaurant menu, for example, it will highlight popular dishes in the camera display; tap one and it calls up additional search info, such as a photo of the dish. Pointed at a sign in an unfamiliar language, Lens will translate the sign's text and overlay the translation on the original.

Google Go, the company's search app for inexpensive smartphones, will use Lens to read aloud text captured by a camera, highlighting words as they're spoken.

Duplex, the company's robocalling service for restaurant reservations, is evolving into a more general helper, though the augmentation surgery isn't complete yet. Pichai previewed Duplex on the web, which will soon be able to work with the company's Assistant software to reserve rental cars and buy movie tickets on behalf of users.

It will function like Chrome's Autofill feature, but with the ability to pull data from other Google apps, such as Calendar, Gmail, and Chrome itself, to fill in the required fields. According to Pichai, merchants won't need to alter their webpages or write any custom code to take advantage of this system.
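
To make the autofill idea concrete, here's a toy sketch of the general technique: known user data matched against a merchant form's field names, with unknown fields left for the user. The field names and data sources are hypothetical illustrations, not Google's published design.

```python
# Toy sketch of Autofill-style form completion. The field names and the
# idea of sourcing values from Calendar or Gmail are assumptions for
# illustration only.

USER_DATA = {
    "name": "Jane Doe",            # e.g. from the user's account profile
    "email": "jane@example.com",   # e.g. from Gmail
    "pickup_date": "2019-05-10",   # e.g. from a Calendar entry
}

def autofill(form_fields):
    """Fill each recognized field; leave unknown fields blank for the user."""
    return {field: USER_DATA.get(field, "") for field in form_fields}

filled = autofill(["name", "email", "pickup_date", "loyalty_number"])
print(filled["name"])            # Jane Doe
print(filled["loyalty_number"])  # blank: not in the user's known data
```

The point of the design, as described on stage, is that the mapping happens on the user's side, so merchants don't have to change their forms.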

Carolina Milanesi, technology analyst with Creative Strategies, told The Register via Twitter that she was struck by what Google Assistant will be able to do through its connection to device and cloud. Assistant, she said, is moving further ahead of Alexa and Siri.

Pixel gets cheap

Google previewed the Pixel 3a and 3a XL, the latest iterations of its very own smartphone line, selling for $399 and $479 respectively. Available from Sprint, T-Mobile, Verizon, and Google Fi, both feature a 12.2MP (f/1.8) dual-pixel rear camera, an 8MP (f/2.0) front camera, a Qualcomm Snapdragon 670 processor, 4GB of LPDDR4x RAM, and 64GB of storage.

Android Q was discussed but that's another story.

Scott Huffman, VP of engineering, previewed the next generation of Google Assistant, one that runs AI models locally rather than relying on models in the cloud. This is the result of a technical breakthrough, he explained, that allows the roughly 100GB of models hosted in Google's data centers to be shrunk to about 0.5GB, small enough to reside on devices, delivering up to 10x faster performance.
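
Google didn't say how it shrank the models, but one standard technique for this kind of compression is quantization, storing weights as 8-bit integers instead of 32-bit floats. A minimal sketch, purely to illustrate the idea and not Google's method:

```python
import numpy as np

# Post-training weight quantization: map float32 weights to int8 plus a
# single scale factor, cutting storage 4x at the cost of a small,
# bounded rounding error.

def quantize(weights: np.ndarray):
    scale = float(np.abs(weights).max()) / 127.0
    q = np.round(weights / scale).astype(np.int8)
    return q, scale

def dequantize(q: np.ndarray, scale: float) -> np.ndarray:
    return q.astype(np.float32) * scale

weights = np.random.default_rng(0).standard_normal(1000).astype(np.float32)
q, scale = quantize(weights)
restored = dequantize(q, scale)

print(weights.nbytes, q.nbytes)  # 4000 1000: a 4x size reduction
# Rounding error per weight is at most half a quantization step.
print(float(np.abs(weights - restored).max()) <= scale)
```

Real systems combine several such tricks (quantization, pruning, distillation), which is how triple-digit-gigabyte models can plausibly collapse to a fraction of a gigabyte.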

The new Assistant is coming to Pixel phones later this year.

Assistant is also being taught to understand what Google calls Personal References. For example, in a few months, you'll be able to say, "What's the weather like at mom's house?" and have the software identify the reference and use contact data to complete the query. And it will gain Driving Mode, now in preview, to deal with incoming calls and surface relevant information for the task at hand.

And as of today, Google Assistant can end ringing timers using the word "stop," without the "Hey/Ok Google" invocation phrase.


Google's long-neglected Nest division, put under new leadership in 2016 and combined with Google's hardware group last year, has become the company's primary home hardware brand. Calling today's smart home "fragmented and frustrating," Google SVP of hardware Rick Osterloh said all of Google's Home products will inherit the Nest brand.

The Google Home Hub thus becomes the Nest Hub, and with the change comes a price drop to $129, making room for a new camera-equipped model, the Nest Hub Max, at $229. The surveillance chatterbox includes facial recognition – if you want it – to differentiate between users, much as Assistant can distinguish users by voice. Google insists facial recognition models are built and kept locally, not shared, and the company is publishing a set of privacy commitments to explain how its Nest devices handle video and audio data.

Rishi Chandra, VP of product and general manager of Google Home and Nest, perhaps aware of the criticism Amazon has received over how its Alexa-powered speakers store recordings, addressed the rationale for clarifying how Google handles data captured within homes in a blog post:

"We recognize that we’re a guest in your home, and we respect and appreciate that invitation – and these updates are among the many ways we hope to continue to earn your trust." ®
