11 May 2021
License: https://creativecommons.org/licenses/by-nc-nd/4.0/ (open access)
Mauro Bennici, "ghostwriter19 @ ATE_ABSITA: Zero-Shot and ONNX to speed up BERT on sentiment analysis tasks at EVALITA 2020", Accademia University Press, DOI: 10.4000/books.aaccademia.6889
With the arrival of BERT in 2018, NLP research took a significant step forward. However, the computing power required has grown accordingly. Various distillation and optimization techniques have been adopted, but their cost-benefit ratio remains unfavorable, and the most significant improvements still come from ever more complex models with more layers and parameters. In this research, we show how, by combining transfer learning, zero-shot learning, and the ONNX runtime, we can harness the power of BERT immediately, optimizing time and resources and achieving noticeable results from day one.
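To make the zero-shot idea concrete, the following is a minimal, self-contained sketch: a text is classified without any task-specific training by scoring it against a natural-language description of each candidate label. The toy character-trigram embedding and the example label descriptions are illustrative assumptions; in the paper's setting the embeddings would come from a BERT model served through the ONNX runtime.

```python
import math
from collections import Counter

def embed(text):
    # Hypothetical stand-in for a BERT sentence embedding: a sparse
    # bag of character trigrams. Only for illustration of the idea.
    text = text.lower()
    return Counter(text[i:i + 3] for i in range(len(text) - 2))

def cosine(a, b):
    # Cosine similarity between two sparse vectors.
    dot = sum(v * b[k] for k, v in a.items() if k in b)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def zero_shot(text, labels):
    # Zero-shot classification: no labeled training data for the task;
    # the text is compared against a description of each label.
    t = embed(text)
    scores = {lab: cosine(t, embed(desc)) for lab, desc in labels.items()}
    return max(scores, key=scores.get), scores

# Illustrative label descriptions for a sentiment task.
labels = {
    "positive": "this product is great, I love it",
    "negative": "this product is terrible, I hate it",
}
best, scores = zero_shot("I really love this great product", labels)
```

In practice one would replace `embed` with an actual sentence encoder exported to ONNX and executed with ONNX Runtime, which is where the reported inference speed-up comes from.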