
Google is about to get a lot better at understanding your search queries

Google has announced it's rolling out a major upgrade to its search engine, which should mean it gets better at understanding your queries in context – rather than just matching keywords, it'll actually figure out what you're trying to say.

Thanks to a new machine learning process, Google says, you won't have to stuff your searches with keywords to get the right results first. Instead, you can be more natural and conversational.

"Particularly for longer, more conversational queries, or searches where prepositions like 'for' and 'to' matter a lot to the meaning, Search will be able to understand the context of the words in your query," says Google's Pandu Nayak.

"No matter what you’re looking for, or what language you speak, we hope you’re able to let go of some of your keyword-ese and search in a way that feels natural for you," adds Nayak.

Seek and you will find

Let's give you some of Google's examples to show how the changes work. Previously, in a "2019 brazil traveler to usa need a visa" search, Google would ignore the "to" because it's so common, and return results for US citizens going to Brazil. Now it recognizes the "to" and that the query is about travelers going from Brazil to the US.

On a search like "do estheticians stand a lot at work", Google previously wouldn't understand the context "stand" was used in (i.e. relating to the physical demands of the job). With the new update, it'll realize what you're trying to say.
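To see why keyword matching trips up on the Brazil example, here's a deliberately simplified sketch of the kind of stopword-stripping keyword extraction older approaches tend to rely on. It's purely illustrative and not Google's actual pipeline, but it shows how the preposition "to" – the word that carries the direction of travel – is exactly the sort of thing that gets thrown away.

```python
# Purely illustrative: a naive keyword extractor of the kind older,
# keyword-matching approaches lean on. This is not Google's pipeline.

STOPWORDS = {"a", "an", "the", "to", "for", "do", "at", "of", "in"}

def extract_keywords(query: str) -> list[str]:
    """Keep the 'content' words and discard common function words."""
    return [word for word in query.lower().split() if word not in STOPWORDS]

print(extract_keywords("2019 brazil traveler to usa need a visa"))
# ['2019', 'brazil', 'traveler', 'usa', 'need', 'visa']
# The direction of travel (Brazil -> USA) was carried by "to",
# which this kind of processing simply discards.
```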

These improvements are substantial enough that Google is calling this its most important search upgrade in five years. To begin with, the new technology is only being applied to searches in US English, but it will expand to more languages over time.

For a more detailed look at the neural network innovations underpinning this improvement – specifically a language modeling technique called Bidirectional Encoder Representations from Transformers, or BERT – head over to Google's blog post on the changes.
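As a rough illustration of what that "bidirectional" context means in practice, the sketch below uses the publicly released bert-base-uncased model via the Hugging Face transformers library (an assumption for illustration – Google hasn't said which exact model powers Search) to show that the same word gets a different vector representation depending on the words around it, which is what lets context inform results.

```python
# Illustrative sketch only: uses the public bert-base-uncased model via the
# Hugging Face `transformers` library, not Google's production Search stack.
import torch
from transformers import BertTokenizer, BertModel

tokenizer = BertTokenizer.from_pretrained("bert-base-uncased")
model = BertModel.from_pretrained("bert-base-uncased")
model.eval()

def embedding_for(sentence: str, word: str) -> torch.Tensor:
    """Return BERT's contextual embedding for `word` inside `sentence`."""
    inputs = tokenizer(sentence, return_tensors="pt")
    with torch.no_grad():
        outputs = model(**inputs)
    tokens = tokenizer.convert_ids_to_tokens(inputs["input_ids"][0].tolist())
    idx = tokens.index(word)                  # position of the word in the token sequence
    return outputs.last_hidden_state[0, idx]  # its context-dependent vector

# The same surface word, two different meanings.
physical = embedding_for("do estheticians stand a lot at work", "stand")
opinion = embedding_for("where does the senator stand on the issue", "stand")

# A static, keyword-style representation would treat both uses of "stand"
# identically; BERT's contextual vectors differ.
print(torch.cosine_similarity(physical, opinion, dim=0).item())
```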



from TechRadar - All the latest technology news https://ift.tt/2p60xar
