Home ➤ Research Talks ➤ 050 · 26 04 2023

Exploring Cross-sentence Contexts for Named Entity Recognition with BERT

Sivaganeshan Aravinth
Slides Video Paper

Named Entity Recognition has until recently been treated as a sequence labelling task with each sentence as an independent input, but in many cases the information needed to recognise an entity may be found elsewhere in the text. Recent self-attention models such as BERT provide a way to capture long-distance relationships within text and to add cross-sentence context to NLP tasks. This paper presents a systematic study of the use of cross-sentence information for NER with BERT models in five languages.
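One simple way to add cross-sentence context, sketched below, is to pad each target sentence with its neighbouring sentences until a token budget is reached, while tracking where the target sentence sits in the window so that NER labels are only predicted for its tokens. This is an illustrative sketch, not the paper's exact construction: it splits on whitespace for brevity, whereas a real pipeline would use BERT's subword tokenizer and its special tokens.

```python
def build_context_window(sentences, idx, max_tokens=128):
    """Surround sentences[idx] with neighbouring sentences up to max_tokens.

    Returns the token window and the (start, end) span of the target
    sentence inside it, so labels can be restricted to those tokens.
    Whitespace tokenisation is a stand-in for a BERT subword tokenizer.
    """
    window = sentences[idx].split()
    start = 0                      # offset of the target sentence in the window
    left, right = idx - 1, idx + 1
    grew = True
    while grew:
        grew = False
        # Try to prepend the previous sentence if it still fits the budget.
        if left >= 0:
            toks = sentences[left].split()
            if len(window) + len(toks) <= max_tokens:
                window = toks + window
                start += len(toks)
                left -= 1
                grew = True
        # Try to append the following sentence if it still fits the budget.
        if right < len(sentences):
            toks = sentences[right].split()
            if len(window) + len(toks) <= max_tokens:
                window = window + toks
                right += 1
                grew = True
    end = start + len(sentences[idx].split())
    return window, (start, end)
```

For example, with a budget large enough for all three sentences, `build_context_window(["a b", "c d e", "f g"], 1)` returns the seven-token window with target span `(2, 5)`; with a tight budget, only the target sentence survives.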
