Modern search engines are powered by very large language models (e.g., BERT) that excel at question-answering. Current language models have already passed the one-trillion-parameter mark and are poised to grow bigger and smarter. These language models (LMs) are trained on massive quantities of unstructured text. One of the main reasons large pre-trained LMs are so successful is that they learn highly effective contextual representations.
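To see what "contextual representation" means in practice, here is a toy, hedged sketch (not BERT itself): a single self-attention step over random static embeddings, showing that the same word ("bank") ends up with different vectors depending on its neighbors. All names and values here are illustrative assumptions, not part of any real model.

```python
import numpy as np

rng = np.random.default_rng(0)
d = 8
vocab = {"river": 0, "bank": 1, "loan": 2}
E = rng.normal(size=(len(vocab), d))  # static (context-free) token embeddings

# Random projection matrices standing in for learned attention weights
Wq, Wk, Wv = (rng.normal(size=(d, d)) for _ in range(3))

def contextual(tokens):
    """One self-attention pass: each token's output mixes in its context."""
    X = E[[vocab[t] for t in tokens]]
    Q, K, V = X @ Wq, X @ Wk, X @ Wv
    scores = Q @ K.T / np.sqrt(d)
    attn = np.exp(scores) / np.exp(scores).sum(-1, keepdims=True)
    return attn @ V

# "bank" has one static embedding E[1], but two different contextual vectors:
bank_river = contextual(["river", "bank"])[1]
bank_loan = contextual(["bank", "loan"])[0]
```

The static embedding of "bank" is identical in both sentences, yet the attention output differs, which is the property that lets models like BERT disambiguate word senses for search queries.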

https://analyticsindiamag.com/rethinking-search-age-of-ai-search-engine/

#bert #nlp #machine-learning

Beyond BERT: Rethinking Search In The Age of AI