Facebook Open Sources Natural Language Processing Model
Written by Kay Ewbank   
Thursday, 26 September 2019

Facebook has made a new natural language processing model called RoBERTa available as open source. The model is an optimized version of Google's BERT model.

The Facebook researchers describe their model as a robustly optimized method for pretraining natural language processing (NLP) systems that improves on Bidirectional Encoder Representations from Transformers, or BERT, the self-supervised method released by Google in 2018.


BERT has become known for the impressive results it has achieved on a range of NLP tasks while relying on un-annotated text drawn from the web. Most comparable NLP systems are based on text that has been labeled specifically for a given task.

Facebook's new optimized method, RoBERTa, produces state-of-the-art results on the widely used NLP benchmark, General Language Understanding Evaluation (GLUE).

RoBERTa is implemented in PyTorch. The team modified key hyperparameters in BERT, including removing BERT's next-sentence pretraining objective, and trained RoBERTa with much larger mini-batches and learning rates. The developers say this allows RoBERTa to improve on the masked language modeling objective compared with BERT and leads to better downstream task performance.
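Because the model is trained on the masked language modeling objective alone, a quick way to try the released checkpoint is to ask it to fill in a masked word. The following is a minimal sketch, assuming fairseq is installed and the roberta.large checkpoint can be fetched through PyTorch Hub; the prompt sentence is purely illustrative.

```python
import torch

# Load the pretrained RoBERTa checkpoint released with fairseq
# (downloads roberta.large on first use).
roberta = torch.hub.load('pytorch/fairseq', 'roberta.large')
roberta.eval()  # disable dropout for deterministic predictions

# Masked language modeling: ask RoBERTa to fill in the <mask> token.
top3 = roberta.fill_mask('RoBERTa was released as open source by <mask>.', topk=3)
print(top3)  # list of (filled-in sentence, probability, predicted token) candidates
```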

After implementing these design changes, the Facebook model showed notably better performance on the MNLI, QNLI, RTE, STS-B, and RACE tasks and a sizable performance improvement on the GLUE benchmark. With a score of 88.5, RoBERTa reached the top position on the GLUE leaderboard, matching the performance of the previous leader, XLNet-Large. The team says these results highlight the importance of previously unexplored design choices in BERT training and help disentangle the relative contributions of data size, training time, and pretraining objectives.
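Alongside the pretrained model, fine-tuned checkpoints for some of these tasks were also published. The sketch below, assuming the roberta.large.mnli checkpoint is available via PyTorch Hub and the label ordering is unchanged, shows how a fine-tuned model can classify the relationship between a pair of example sentences (the sentences are made up for illustration).

```python
import torch

# Load RoBERTa fine-tuned on MNLI (natural language inference).
roberta = torch.hub.load('pytorch/fairseq', 'roberta.large.mnli')
roberta.eval()

# Encode a premise/hypothesis pair and predict their relationship.
tokens = roberta.encode(
    'RoBERTa is an optimized version of BERT.',  # premise
    'RoBERTa is unrelated to BERT.'              # hypothesis
)
label = roberta.predict('mnli', tokens).argmax().item()
# In the released MNLI head: 0 = contradiction, 1 = neutral, 2 = entailment
print(label)
```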

There's a full description of RoBERTa and the research carried out in a paper published on arXiv.


More Information

RoBERTa On GitHub

RoBERTa's technical details

Related Articles

Rule-Based Matching In Natural Language Processing  

Zalando Flair NLP Library Updated

Intel Open Sources NLP Architect

Google SLING: An Open Source Natural Language Parser

Spark Gets NLP Library

Microsoft Expands Cognitive Services APIs

To be informed about new articles on I Programmer, sign up for our weekly newsletter, subscribe to the RSS feed and follow us on Twitter, Facebook or LinkedIn.




Last Updated ( Thursday, 26 September 2019 )