Home ➤ Research Talks ➤ 050 26 04 2023
Exploring Cross-sentence Contexts for Named Entity Recognition with BERT
Sivaganeshan Aravinth
Named Entity Recognition (NER) has until recently been treated as a sequence labelling task with each sentence as an input, but in many cases the information relevant to NER may be found elsewhere in the text. Recent self-attention models like BERT provide a way to capture long-distance relationships within text and to add cross-sentence context for NLP tasks. This paper presents a systematic study exploring the use of cross-sentence information for NER using BERT models in five languages.
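As a rough illustration of the idea, one way to add cross-sentence context is to pack a target sentence together with its neighbouring sentences into a single model input, while a mask marks which tokens belong to the target sentence and should receive NER predictions. This is a minimal sketch, not the paper's implementation; the function name, the alternating left/right windowing policy, and the token budget are illustrative assumptions:

```python
def build_context_window(sentences, target_idx, max_tokens=128):
    """Pack the target sentence and surrounding sentences into one input.

    sentences: list of tokenised sentences (each a list of tokens).
    Returns (tokens, mask) where mask[i] == 1 marks tokens of the
    target sentence (the ones an NER model would be scored on).
    """
    tokens = list(sentences[target_idx])
    mask = [1] * len(tokens)
    left, right = target_idx - 1, target_idx + 1
    # Alternately prepend preceding and append following sentences
    # until the token budget is exhausted or no neighbour fits.
    while left >= 0 or right < len(sentences):
        progressed = False
        if left >= 0 and len(tokens) + len(sentences[left]) <= max_tokens:
            tokens = list(sentences[left]) + tokens
            mask = [0] * len(sentences[left]) + mask
            progressed = True
        left -= 1
        if right < len(sentences) and len(tokens) + len(sentences[right]) <= max_tokens:
            tokens = tokens + list(sentences[right])
            mask = mask + [0] * len(sentences[right])
            progressed = True
        right += 1
        if not progressed:
            break
    return tokens, mask


# Example: three tokens of context fit around the target sentence.
sents = [["a", "b"], ["c"], ["d", "e", "f"], ["g"]]
window, target_mask = build_context_window(sents, target_idx=2, max_tokens=6)
# window      -> ["c", "d", "e", "f", "g"]
# target_mask -> [0, 1, 1, 1, 0]
```

In a real pipeline the token lists would come from a subword tokenizer and the window would be fed to a BERT token-classification model, with the loss or predictions restricted to positions where the mask is 1.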