Abstract of Large Language Models for Latvian Named Entity Recognition

Rinalds Vīksna, Inguna Skadiņa

  • Transformer-based language models pre-trained on large corpora have demonstrated good results on multiple natural language processing tasks for widely used languages, including named entity recognition (NER). In this paper, we investigate the role of BERT models in the NER task for Latvian. We introduce a BERT model pre-trained on large Latvian corpora and demonstrate that it achieves better results than multilingual BERT (81.91 F1-measure on average vs. 78.37 for M-BERT on a dataset with nine named entity types, and 79.72 vs. 78.83 on another dataset with seven types) and outperforms previously developed Latvian NER systems.
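The comparison in the abstract rests on a standard token-classification setup: a pre-trained BERT encoder with a classification head, fine-tuned on BIO-tagged text. Below is a minimal sketch of that setup using Hugging Face Transformers; the checkpoint name, the tagset, and the toy Latvian sentence are illustrative assumptions, not the paper's actual resources.

```python
# Minimal sketch of BERT fine-tuning for NER (illustrative; not the paper's code).
import torch
from transformers import AutoTokenizer, AutoModelForTokenClassification

# Assumed BIO tagset for illustration; the paper's datasets use seven and nine entity types.
LABELS = ["O", "B-PER", "I-PER", "B-ORG", "I-ORG", "B-LOC", "I-LOC"]
label2id = {l: i for i, l in enumerate(LABELS)}

# The multilingual baseline from the abstract; a Latvian BERT checkpoint would be swapped in here.
name = "bert-base-multilingual-cased"
tokenizer = AutoTokenizer.from_pretrained(name)
model = AutoModelForTokenClassification.from_pretrained(
    name, num_labels=len(LABELS),
    id2label=dict(enumerate(LABELS)), label2id=label2id)

# One toy training example: word-level BIO tags aligned to subword tokens.
words = ["Rīga", "ir", "Latvijas", "galvaspilsēta", "."]
tags = ["B-LOC", "O", "B-LOC", "O", "O"]

enc = tokenizer(words, is_split_into_words=True, return_tensors="pt")
word_ids = enc.word_ids(0)

# Label only the first subword of each word; mask the rest with -100 so the loss ignores them.
aligned, prev = [], None
for wid in word_ids:
    if wid is None or wid == prev:
        aligned.append(-100)
    else:
        aligned.append(label2id[tags[wid]])
    prev = wid

out = model(**enc, labels=torch.tensor([aligned]))
out.loss.backward()  # one step's gradient signal; a real run would loop over the training set
print(float(out.loss))
```

The entity-level F1 figures reported above are conventionally computed with a chunk-based scorer such as seqeval over the predicted BIO sequences, rather than per-token accuracy.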


Fundación Dialnet

Dialnet Plus

  • Más información sobre Dialnet Plus