Meta releases machine learning language model with 175 billion parameters


Meta is making a trained language model publicly available to researchers. The Open Pretrained Transformer is an NLP model with 175 billion parameters that machine learning researchers can use.

The model is called OPT-175B and was built by Meta's artificial intelligence division. The company is making the model available to other scientists and researchers. Meta says it is doing so to help scientists better understand how natural language processing models work. Currently, academics often have to pay for API access to use such models.

OPT-175B is a language model with 175 billion parameters, trained on publicly available datasets. In addition to the full model, Meta is also releasing a number of smaller models so that scientists can study 'the effect of scaling up'. These include models with 125 million and 350 million parameters, as well as models with 1.3, 2.7, 6.7, 13 and 30 billion parameters. A model with 66 billion parameters will follow later.
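As an illustration, the sketch below loads two of the smaller checkpoints and compares their parameter counts, the kind of comparison a scaling study would start from. It assumes the models are distributed on the Hugging Face Hub under names like facebook/opt-125m and uses the transformers library; neither detail comes from Meta's announcement.

# Hypothetical sketch: comparing two sizes of the OPT family to study
# 'the effect of scaling up'. Assumes the checkpoints are available on
# the Hugging Face Hub as facebook/opt-125m and facebook/opt-350m.
from transformers import AutoModelForCausalLM

for name in ["facebook/opt-125m", "facebook/opt-350m"]:
    model = AutoModelForCausalLM.from_pretrained(name)
    n_params = sum(p.numel() for p in model.parameters())
    print(f"{name}: {n_params / 1e6:.0f}M parameters")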

In addition to the models themselves, Meta is releasing the code that scientists can use to train the models. According to Meta, this is the first time this has happened at such a scale. The company is also making notes and logs from its training runs available. Meta says the model is also much more energy efficient than other NLP models such as GPT-3: OPT-175B is said to have about one-seventh the carbon footprint.
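To give a rough idea of what working with one of the released models could look like, here is a minimal text-generation sketch. It again assumes a facebook/opt-1.3b checkpoint name on the Hugging Face Hub and uses the transformers library rather than Meta's own training code, so treat the names as placeholders.

# Hypothetical sketch: generating text with a smaller OPT checkpoint.
# The model name facebook/opt-1.3b is an assumption, not confirmed by
# the announcement.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_name = "facebook/opt-1.3b"
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForCausalLM.from_pretrained(model_name)

inputs = tokenizer("Large language models are", return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=30, do_sample=True, top_p=0.9)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))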

The data is made available under a non-commercial license. Meta says it wants to prevent misuse and 'maintain integrity'. Researchers affiliated with governments, civil rights organizations and academic institutions can use the model.
