Curated by Saijo George

Thursday, 9 Jan 2020

Google Open-Sources ALBERT Natural Language Model

Google AI has open-sourced ALBERT (A Lite BERT), a deep-learning natural language processing (NLP) model that uses 89% fewer parameters than the state-of-the-art BERT model, with little loss of accuracy. The model can also be scaled up to achieve new state-of-the-art performance on NLP benchmarks.

The research team described the model in a paper to be presented at the International Conference on Learning Representations. ALBERT uses two optimizations to reduce model size: a factorization of the embedding layer and parameter-sharing across the hidden layers of the network. Combining these two approaches yields a baseline model with only 12M parameters, compared to BERT's 108M, while averaging 80.1% accuracy across several NLP benchmarks versus BERT's 82.3% average. The team also trained a "double-extra-large" ALBERT model with 235M parameters that outperformed the "large" BERT model, with its 334M parameters, on those benchmarks.
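The two optimizations above can be illustrated with some back-of-the-envelope arithmetic. The sketch below is not the official ALBERT code; it is a toy illustration, and the dimensions (vocabulary 30,000, hidden size 768, embedding size 128) are assumptions based on typical BERT/ALBERT configurations rather than figures from the article.

```python
import numpy as np

# 1) Embedding factorization: instead of a full V x H embedding table,
#    learn a small V x E table plus an E x H projection.
V, H, E = 30000, 768, 128          # assumed sizes, not from the article
full_embedding = V * H             # BERT-style table: 23,040,000 params
factorized = V * E + E * H         # ALBERT-style:      3,938,304 params (~83% fewer)

# 2) Cross-layer parameter sharing: one weight matrix reused at every depth,
#    shown here with a toy transformation in place of a real transformer layer.
rng = np.random.default_rng(0)
shared_W = rng.standard_normal((H, H)) / np.sqrt(H)

def forward(x, n_layers=12):
    out = x
    for _ in range(n_layers):
        out = np.tanh(out @ shared_W)   # the same parameters serve every layer
    return out

x = rng.standard_normal((1, H))
y = forward(x)                          # 12 "layers" deep, shape unchanged

# An unshared 12-layer stack would cost 12x the per-layer parameters:
shared_params = shared_W.size           # H * H = 589,824
unshared_params = 12 * shared_W.size    # 7,077,888
```

The factorization matters because the vocabulary term `V` dominates: once `E` is much smaller than `H`, the `V * E` table shrinks the embedding cost by an order of magnitude, while sharing removes the per-layer multiplier from the encoder stack.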

Google has released a TensorFlow-based implementation of ALBERT as well as models trained on an English-language corpus and a Chinese-language corpus. The ALBERT code and models are available on GitHub.
