[1906.01502] How multilingual is Multilingual BERT?
9 days ago by arsyed
In this paper, we show that Multilingual BERT (M-BERT), released by Devlin et al. (2018) as a single language model pre-trained from monolingual corpora in 104 languages, is surprisingly good at zero-shot cross-lingual model transfer, in which task-specific annotations in one language are used to fine-tune the model for evaluation in another language. To understand why, we present a large number of probing experiments, showing that transfer is possible even to languages in different scripts, that transfer works best between typologically similar languages, that monolingual corpora can train models for code-switching, and that the model can find translation pairs. From these results, we can conclude that M-BERT does create multilingual representations, but that these representations exhibit systematic deficiencies affecting certain language pairs.

Tags: nlp transfer-learning multilingual bert