
Facebook researchers propose ‘pre-fine-tuning’ to improve language model performance

Machine learning researchers have achieved remarkable success with language model pretraining, which uses self-supervision, a training technique that doesn’t require labeled data. Pretraining refers to training a model on one task so that it learns patterns applicable to a range of other tasks. In this way, pretraining imitates how human beings process new knowledge: by reusing parameters learned on earlier tasks, models adapt to new and unfamiliar ones.

For many natural language tasks, however, training examples for related problems exist. In an attempt to leverage these, researchers at Facebook propose “pre-fine-tuning,” a methodology that inserts an additional large-scale learning stage between pretraining and fine-tuning, performed on around 50 classification, summarization, question-answering, and commonsense reasoning datasets totaling over 4.8 million training examples. They claim that pre-fine-tuning consistently improves performance for pretrained models while also significantly improving sample efficiency during fine-tuning.

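In practice, pre-fine-tuning amounts to a single large multi-task learning stage inserted between pretraining and task-specific fine-tuning: a shared pretrained encoder with one lightweight head per task, trained on batches drawn from a different task each step. The sketch below illustrates that general pattern in PyTorch; the task names, label counts, and task-sampling scheme are illustrative assumptions, not the paper’s exact recipe.

# Minimal multi-task pre-fine-tuning sketch (PyTorch + Hugging Face
# Transformers). A shared pretrained encoder feeds one small head per
# task; each optimization step samples a task, so gradients mix across
# datasets. Task names and label counts below are placeholders.
import random
import torch
import torch.nn as nn
from transformers import AutoModel, AutoTokenizer

encoder = AutoModel.from_pretrained("roberta-base")
tokenizer = AutoTokenizer.from_pretrained("roberta-base")

# One classification head per task (hypothetical label spaces).
heads = nn.ModuleDict({
    "nli": nn.Linear(encoder.config.hidden_size, 3),
    "sentiment": nn.Linear(encoder.config.hidden_size, 2),
    "commonsense": nn.Linear(encoder.config.hidden_size, 4),
})

optimizer = torch.optim.AdamW(
    list(encoder.parameters()) + list(heads.parameters()), lr=1e-5)
loss_fn = nn.CrossEntropyLoss()

def train_step(task, texts, labels):
    batch = tokenizer(texts, padding=True, truncation=True,
                      return_tensors="pt")
    hidden = encoder(**batch).last_hidden_state[:, 0]  # <s> token
    loss = loss_fn(heads[task](hidden), labels)
    loss.backward()
    optimizer.step()
    optimizer.zero_grad()
    return loss.item()

# Training loop: sample a task, then a batch from that task's loader.
# loaders = {"nli": nli_loader, "sentiment": sst_loader, ...}
# task = random.choice(list(loaders))
# texts, labels = next(iter(loaders[task]))
# train_step(task, texts, labels)
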
It’s an approach that has been attempted before, often with success. In a 2019 study, researchers at the Allen Institute noticed that pre-fine-tuning a BERT model on a multiple-choice question dataset appeared to teach the model something about multiple-choice questions in general. A subsequent study found that pre-fine-tuning increased a model’s robustness to name swaps, in which the names of different people are exchanged in a sentence the model must answer questions about.

To ensure that their pre-fine-tuning stage incorporated general language representations, the researchers included tasks in four different domains: classification, commonsense reasoning, machine reading comprehension, and summarization. They call their pre-fine-tuned models MUPPET, which roughly stands for “Massive Multi-task Representations with Pre-Fine-tuning.”

After pre-fine-tuning RoBERTa and BART, two popular pretrained models for natural language understanding, the researchers tested their performance on widely used benchmarks including RTE, BoolQ, RACE, SQuAD, and MNLI. Interestingly, the results show that pre-fine-tuning can hurt performance when only a few tasks are used, up to a critical point of roughly 15 tasks. Beyond this point, performance improves in proportion to the number of language tasks: MUPPET models outperform their vanilla pretrained counterparts, and leveraging representations learned across 34 to 40 tasks enables the models to reach even higher accuracies with less data than a baseline RoBERTa model.

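Downstream use is then unchanged: load the pre-fine-tuned weights in place of the vanilla pretrained ones and fine-tune as usual on the target task. A minimal sketch, assuming a MUPPET checkpoint published under the name facebook/muppet-roberta-base on the Hugging Face hub (treat the identifier as an assumption):

# Fine-tuning starts from the pre-fine-tuned checkpoint instead of the
# vanilla pretrained one; everything else is standard. The checkpoint
# name is illustrative.
from transformers import AutoModelForSequenceClassification, AutoTokenizer

model = AutoModelForSequenceClassification.from_pretrained(
    "facebook/muppet-roberta-base", num_labels=2)
tokenizer = AutoTokenizer.from_pretrained("facebook/muppet-roberta-base")
# From here, train on the (small) labeled downstream dataset exactly as
# one would fine-tune roberta-base.
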
“These [performance] gains are particularly strong in the low-resource regime, where there is relatively little labeled data for fine-tuning,” the researchers wrote in a paper describing their work. “We show that we can effectively learn more robust representations through multitask learning at scale. Our work shows how even seemingly very different datasets, for example, summarization and extractive QA, can help each other by improving the model’s representations.”

By VentureBeat
