Google wants you to ask more complex questions, at least as far as its mobile app is concerned. The company stated as much in an announcement today where it detailed updates made to natural language search.
Natural language search has come a long way since Google was first able to understand and answer voice searches back in 2008.
Since then, Google has developed and introduced the Knowledge Graph in 2012, which is capable of recognizing searches involving individual entities — like celebrities or sports teams.
From there, the next progression was being able to answer questions about individual entities — like how old a celebrity is, or how a sports team did last night.
The next evolution of natural language search brings us to today’s update, where Google says it is “growing up” even more.
The latest update to the Google app for iOS and Android is capable of understanding the meaning behind queries by breaking them down semantically.
To illustrate, Google gives the following examples of types of queries the app is able to answer:
Understanding superlatives and ordered items:
- “Who are the tallest Mavericks players?”
- “What are the largest cities in Texas?”
- “What are the largest cities in Iowa by area?”
Understanding dates and times:
- “What was the population of Singapore in 1965?”
- “What songs did Taylor Swift record in 2014?”
- “What was the Royals roster in 2013?”
Understanding complex combinations of multiple entities:
- “What are some of Seth Gabel’s father-in-law’s movies?”
- “What was the U.S. population when Bernie Sanders was born?”
- “Who was the U.S. President when the Angels won the World Series?”
Google admits that it still hasn’t perfected natural language search, but says that it’s growing and learning (and sometimes making mistakes along the way).
The next time you have a complex question, Google invites you to ask it in its app using your own natural language. The more complex questions you’re able to hit it with, the harder it will try to understand the meaning behind what you’re asking.