This open research paper was presented at CLEI 2019 in Panama.

It uses 10 years of parliamentary discussions from Argentina and Costa Rica, collected for the IE4OpenData project, to answer the research question: how important are dialect differences when using a large, multilingual pretrained contextual model (BERT)? The answer: dialect matters, but the task matters much more.

The paper itself is available, together with its code and data.