Bidirectional Encoder Representations from Transformers
Google now understands more conversational search queries
Google Search has just gotten better at deciphering your sometimes conversational, sometimes awkwardly phrased queries. The improvement comes from a neural network-based language-processing technique called Bidirectional Encoder Representations from Transformers, or BERT, which lets Search take into account how word order and context shape a query's meaning. The company says it's the product's "biggest leap forward in the past five years, and one of the biggest leaps forward in the history of Search."
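The point about word order can be sketched with a toy contrast in Python. To be clear, this is an illustration of why order matters, not of BERT itself: BERT learns such context sensitivity from data rather than from hand-written rules like these.

```python
def bag_of_words(query):
    """Order-blind representation: just the set of words in the query."""
    return set(query.lower().split())

def word_sequence(query):
    """Order-aware representation: the exact sequence of words."""
    return tuple(query.lower().split())

a = "flights from new york to london"
b = "flights from london to new york"

# An order-blind model cannot tell these two queries apart...
print(bag_of_words(a) == bag_of_words(b))    # True
# ...while a sequence-aware model can.
print(word_sequence(a) == word_sequence(b))  # False
```

The two queries ask for opposite trips, yet they contain exactly the same words; only a model that attends to sequence, as BERT does, can distinguish them.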