Abstract of LVBERT: Transformer-Based Model for Latvian Language Understanding

Artūrs Znotiņš, Guntis Barzdins

  • This paper presents LVBERT, the first publicly available monolingual language model pre-trained for Latvian. We show that LVBERT improves the state-of-the-art for three Latvian NLP tasks including Part-of-Speech tagging, Named Entity Recognition and Universal Dependency parsing. We release LVBERT to facilitate future research and downstream applications for Latvian NLP.
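
Since the abstract points to downstream applications such as POS tagging and NER, the following is a minimal sketch of how a released BERT-style Latvian checkpoint could be loaded for token classification with the Hugging Face transformers library. The model identifier, the number of labels, and the example sentence are illustrative assumptions, not details from the paper; substitute the actual identifier or local path of the published LVBERT checkpoint.

  # Minimal sketch: loading a monolingual BERT-style model for Latvian
  # token classification (e.g. POS tagging or NER) with transformers.
  import torch
  from transformers import AutoTokenizer, AutoModelForTokenClassification

  MODEL_ID = "AiLab-IMCS-UL/lvbert"  # assumed identifier; adjust to the actual release

  tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
  # num_labels should match the tag set of the downstream task (e.g. UD POS tags).
  model = AutoModelForTokenClassification.from_pretrained(MODEL_ID, num_labels=17)

  sentence = "Rīga ir Latvijas galvaspilsēta."
  inputs = tokenizer(sentence, return_tensors="pt")
  with torch.no_grad():
      logits = model(**inputs).logits
  predicted_ids = logits.argmax(dim=-1)  # one predicted tag id per subword token

In practice the classification head would first be fine-tuned on labelled Latvian data for the chosen task; the snippet only shows the loading and inference path.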


Fundación Dialnet

Dialnet Plus

  • Más información sobre Dialnet Plus