How Google Assistant, Home, and Lens will completely change the way you search

Google I/O may not have revealed any new hardware, but it did introduce a plethora of ways to integrate Google into everything we do.

If Google I/O 2017 had a tagline, it might be something like “Rise of the Machine Learning.” The biggest thread throughout the keynote was an aggressive push to fully move from a mobile-first world to an AI-first world, changing the way we use Google and our devices to look at everything around us.

It used to be about algorithms, but now it’s about artificial intelligence. Google doesn’t simply want to be the tool we use in our browsers to find something; it wants to be within reach whenever we have a question, whether or not we’re even looking at a screen. We already see it with Google Assistant on Google Home, but now Google is starting to pull search out of our phones and apps in order to make it accessible everywhere.

But that also means changing our expectations. Today we still have to explicitly ask Google to search for something; the new AI push will use our cameras, calendars, and favorite apps to deliver information where and when we need it, even if we don’t precisely know what we’re looking for. And in some instances, we might not even need to ask a question.

Seeing is believing

The only real new product Google unveiled at I/O was Google Lens. It’s not a separate app but rather an underlying platform that supercharges the way our phones integrate with their cameras and our photos. It’s kind of like Google Goggles meets Samsung’s Bixby, with a little augmented reality mixed in for good measure.


Google Lens knows what it’s looking at and can tell you all about it.

And if it works as well as Google says it will, it could dramatically change the way we use search. Instead of opening Google Translate and typing a bit of text into a search field, you’ll be able to point your phone’s camera at the text, and Translate will work its magic. It’s likely similar to the way the Google Translate app uses Word Lens to instantly decipher text—so this is less about introducing a new feature, and more about making something dead simple so we’ll regularly use it.

And eventually it may work for everything else we see, too. Snap a picture of a flower and Google Lens will tell you what species it is—and from there you can use Assistant to learn more about it. This distills all the steps (and time) of a laborious image search down to a single simple action. We’re still Googling, but we’re barely thinking about it.
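Lens itself isn’t something developers call directly, but Google already exposes a comparable “what’s in this picture?” capability through its Cloud Vision API, which gives a rough sense of what a Lens-style lookup boils down to. The sketch below is a minimal illustration using the google-cloud-vision Python client as a stand-in, assuming the package is installed and credentials are configured; it isn’t how Lens is actually implemented.

```python
# Minimal sketch: label a photo the way a Lens-style lookup might,
# using Google's Cloud Vision API as a stand-in for Lens itself.
# Assumes `pip install google-cloud-vision` and that the
# GOOGLE_APPLICATION_CREDENTIALS environment variable is set.
from google.cloud import vision

def identify_subject(image_path):
    client = vision.ImageAnnotatorClient()
    with open(image_path, "rb") as f:
        image = vision.Image(content=f.read())
    response = client.label_detection(image=image)
    # Keep only labels the service is reasonably confident about.
    return [label.description
            for label in response.label_annotations
            if label.score > 0.7]

if __name__ == "__main__":
    # e.g. ['Flower', 'Petal', 'Plant'] for a close-up of a blossom
    print(identify_subject("flower.jpg"))
```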

Home is where the search is

The Google Home smart speaker is already a perfect way to limit our reliance on our phones without reducing our dependency on search (and Google in general). But now Home stands to become a true hub for our digital lives. The big new feature is hands-free calling, which would have been much splashier had Amazon not unveiled the same thing last week with the Echo Show. Fear not, though, because Google is also giving Home more responsibility for organizing our lives.


Now Google Home can tell you if there’s something you need to know before you ask.

A new feature called Proactive Assistant will rummage through your schedule and reminders to let you know if you’re forgetting something. So if there’s heavy traffic and you have to pick up your kid from soccer practice, Google Home will search through your calendar and traffic reports to let you know you should probably leave a little early.

It’s not too intrusive, either. The spinning circle of lights on top of the device will alert you to the notifications. From there, you can say, “Hey Google, what’s up?” and it’ll tell you what it found. Down the line, I could see it extending to other Google Now on Tap services, like breaking news stories and sports updates, and maybe even third-party apps.
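The logic behind that kind of nudge is easy to picture: compare the next calendar entry against a traffic-adjusted travel estimate, and only pipe up when the margin gets tight. The Python sketch below is purely illustrative; every function and class in it is a hypothetical stand-in, not a Google API.

```python
from dataclasses import dataclass
from datetime import datetime, timedelta

# Hypothetical stand-ins (not Google APIs) for the pieces a proactive
# assistant would combine: the next calendar entry, a traffic-adjusted
# drive time, and the speaker's notification light.
@dataclass
class Event:
    title: str
    location: str
    start: datetime

def next_calendar_event() -> Event:
    # Placeholder: a real assistant would read this from your calendar.
    return Event("soccer pickup", "Riverside Field",
                 datetime.now() + timedelta(minutes=25))

def travel_minutes(destination: str) -> int:
    # Placeholder: a real assistant would ask a maps/traffic service.
    return 20

def notify_home(message: str) -> None:
    # Placeholder: in practice this lights up the speaker and queues the
    # message for "Hey Google, what's up?"
    print("Notification queued:", message)

def maybe_nudge(buffer_minutes: int = 10) -> None:
    event = next_calendar_event()
    leave_by = event.start - timedelta(
        minutes=travel_minutes(event.location) + buffer_minutes)
    if datetime.now() >= leave_by:
        notify_home(f"Traffic is heavy; leave now to make {event.title}.")

maybe_nudge()
```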

Granted, it could all get a little too overbearing if Google isn’t judicious about what Google Home delivers, so there needs to be a balance between what’s useful and what’s overkill. But there’s a real opportunity to add a new dimension to search, one that anticipates our needs and delivers timely, relevant content the way a true assistant should.

Personal touch

The mobile-first world that Google is transitioning away from isn’t just driven by traditional search. It’s also driven by apps. Our phones are filled with dozens, nay hundreds, of apps that we only need to open a few times a month. And Google is looking to Assistant as the primary way to access all of that information without actually needing to open, or even download, the apps.


With Transactions, you’ll be able to search, order, and pay for things through Google Assistant.

First, Google is bringing Assistant actions, previously limited to Google Home, to phones. And with a screen, you’ll not only be able to talk to your apps through Assistant, but also interact with them in new ways. Tasks like ordering food and pulling up recipes can be handled swiftly through Google Assistant, and, as with Google Lens, that dramatically cuts down on the time they take.

Once you’ve told Assistant to access a specific app, Assistant will query that app’s service to retrieve whatever you ask for. And soon that will extend to transactions as well, letting us buy products or order food just by asking. Developers can leverage Assistant to create an end-to-end ordering process that walks you through the whole transaction, from searching for something, to customizing your order, to checking out. And if it works as advertised, it will take a fraction of the time it does now, without ever logging into a website or opening an app.
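To make that concrete, here’s a toy sketch of the search, customize, checkout shape such a flow takes. It’s self-contained Python with a made-up menu and a simulated checkout step; it is not the Actions on Google SDK, just an illustration of the conversation a developer would wire up.

```python
# Toy illustration of an end-to-end ordering flow (search -> customize
# -> checkout). The menu, prices, and checkout step are all made up;
# this is not the Actions on Google SDK.
MENU = {"margherita": 12.00, "pepperoni": 14.00, "veggie": 13.00}

def search(query):
    """Return menu items matching the user's request (or everything)."""
    hits = [name for name in MENU if query.lower() in name]
    return hits or list(MENU)

def checkout(item, extras):
    """Simulate the payment hand-off the platform would handle."""
    total = MENU[item] + 1.50 * len(extras)
    extras_text = ", ".join(extras) if extras else "no extras"
    return f"Ordered a {item} with {extras_text} for ${total:.2f}."

def order(query, extras):
    item = search(query)[0]  # take the top match for brevity
    return checkout(item, extras)

print(order("pepperoni", ["mushrooms"]))
# -> Ordered a pepperoni with mushrooms for $15.50.
```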

Google everywhere

Google’s new AI push isn’t about phones or apps or even Android. It’s about bringing Google everywhere it isn’t and using Assistant to fill in the gaps. It’s not a move away from mobile per se, but rather a way to make Google more versatile and expansive as we evolve away from traditional search.


Soon we won’t need a search bar to find something.

But it’s still powered by the two main things that propelled Google to such astronomical heights: speed and accuracy. The three-pronged attack of Google Lens, Home, and Assistant will simultaneously expand our use of search and cut down the time we need to spend with it by delivering prompt, targeted responses.

And pretty soon we won’t need to open Chrome or tap a bar to find out about something. Google will just be there whenever we need it.

Michael Simon, PC World (US online)