Facebook Open Sources Natural Language Processing Model
Written by Kay Ewbank   
Thursday, 26 September 2019

Facebook has made a new natural language processing model called RoBERTa available as open source. The model is an optimized version of Google's BERT model.

The Facebook researchers describe their model as a robustly optimized method for pretraining natural language processing (NLP) systems that improves on Bidirectional Encoder Representations from Transformers, or BERT, the self-supervised method released by Google in 2018.


BERT has become known for the impressive results the technique has achieved on a range of NLP tasks while relying on un-annotated text drawn from the web. Most similar NLP systems are based on text that has been labeled specifically for a given task.
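
To see roughly how un-annotated text can supply its own training signal, consider a toy sketch of the masked-token idea in Python; the mask symbol, masking rate, and function below are purely illustrative and are not taken from Facebook's or Google's code:

import random

MASK_TOKEN = "<mask>"  # illustrative mask symbol; real models use their tokenizer's mask id

def make_masked_example(tokens, mask_prob=0.15):
    # Turn a plain, unlabeled token sequence into a self-supervised training
    # pair: the model sees the masked input and must predict the original
    # tokens at the masked positions.
    inputs, targets = [], []
    for tok in tokens:
        if random.random() < mask_prob:
            inputs.append(MASK_TOKEN)
            targets.append(tok)   # loss is computed only at masked positions
        else:
            inputs.append(tok)
            targets.append(None)  # this position contributes no loss
    return inputs, targets

print(make_masked_example("the cat sat on the mat".split()))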

Facebook's new optimized method, RoBERTa, produces state-of-the-art results on the widely used NLP benchmark, General Language Understanding Evaluation (GLUE).

RoBERTa is implemented in PyTorch. The team modified key hyperparameters in BERT, removed BERT's next-sentence pretraining objective, and trained with much larger mini-batches and learning rates. The developers say this allows RoBERTa to improve on the masked language modeling objective compared with BERT, leading to better downstream task performance.
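
The open-sourced model can be tried from PyTorch; the snippet below is a minimal sketch assuming the fairseq package and its published PyTorch Hub interface for RoBERTa are installed, so check the GitHub repository for current model names and usage:

import torch

# Load the pretrained model through PyTorch Hub (requires fairseq to be installed).
roberta = torch.hub.load('pytorch/fairseq', 'roberta.large')
roberta.eval()  # disable dropout for inference

# The masked language modeling objective in action: predict the hidden token.
print(roberta.fill_mask('RoBERTa was open sourced by <mask>.', topk=3))

# Encode a sentence and extract features for a downstream task.
tokens = roberta.encode('Hello world!')
features = roberta.extract_features(tokens)
print(features.shape)  # (1, sequence length, 1024) for the large model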

After implementing these design changes, the Facebook model showed notably better performance on the MNLI, QNLI, RTE, STS-B, and RACE tasks and a sizable performance improvement on the GLUE benchmark. With a score of 88.5, RoBERTa reached the top position on the GLUE leaderboard, matching the performance of the previous leader, XLNet-Large. The team says these results highlight the importance of previously unexplored design choices in BERT training and help disentangle the relative contributions of data size, training time, and pretraining objectives.

There's a full description of RoBERTa and the research carried out in a paper published on arXiv.


More Information

RoBERTa On GitHub

RoBERTa's technical details

Related Articles

Rule-Based Matching In Natural Language Processing  

Zalando Flair NLP Library Updated

Intel Open Sources NLP Architect

Google SLING: An Open Source Natural Language Parser

Spark Gets NLP Library

Microsoft Expands Cognitive Services APIs



Last Updated ( Thursday, 26 September 2019 )