TAPAS (Table Parser) is a weakly supervised Transformer-based question answering model that reasons over tables without generating logical forms. Instead,...
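Instead of producing a logical form, TAPAS predicts a set of table cells plus an aggregation operator (such as NONE, COUNT, SUM or AVERAGE) and computes the answer directly. A minimal sketch of that final step, with a made-up cell selection standing in for real model output:

```python
# Hedged sketch: how TAPAS-style predictions (selected cells plus an
# aggregation operator) yield an answer without any logical form.
# The selected cells and operator below are illustrative, not model output.

def aggregate(cells, op):
    """Apply a TAPAS-style aggregation operator to selected cell values."""
    if op == "NONE":      # the answer is the selected cell(s) themselves
        return cells
    if op == "COUNT":
        return len(cells)
    if op == "SUM":
        return sum(cells)
    if op == "AVERAGE":
        return sum(cells) / len(cells)
    raise ValueError(f"unknown operator: {op}")

# Imagine the model selected these cells from a "medals" column for the
# question "How many medals in total?" and predicted the SUM operator:
selected_cells = [3, 1, 2]
print(aggregate(selected_cells, "SUM"))      # 6
print(aggregate(selected_cells, "AVERAGE"))  # 2.0
```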
Machine Learning is making rapid progress in NLP. It can be applied to many language tasks, primarily thanks...
Masked Language Modeling (MLM) is a pretraining task that is very common in today's Transformer architectures. It involves masking part of the...
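In BERT-style MLM, roughly 15% of input positions are chosen as prediction targets; of those, 80% are replaced with a `[MASK]` token, 10% with a random token, and 10% are left unchanged. A small self-contained sketch of that corruption step (the toy vocabulary is illustrative):

```python
import random

def mask_tokens(tokens, vocab, mask_prob=0.15, seed=0):
    """BERT-style MLM corruption: pick ~15% of positions as prediction
    targets; replace 80% of those with [MASK], 10% with a random token
    from the vocabulary, and leave 10% unchanged.
    Returns (corrupted_tokens, target_positions)."""
    rng = random.Random(seed)  # seeded for reproducibility in this sketch
    corrupted = list(tokens)
    targets = []
    for i in range(len(tokens)):
        if rng.random() < mask_prob:
            targets.append(i)
            r = rng.random()
            if r < 0.8:
                corrupted[i] = "[MASK]"
            elif r < 0.9:
                corrupted[i] = rng.choice(vocab)
            # else: keep the original token (the model must still predict it)
    return corrupted, targets

sentence = "the quick brown fox jumps over the lazy dog".split()
corrupted, targets = mask_tokens(sentence, vocab=sentence, seed=42)
print(corrupted, targets)
```

During pretraining, the model only computes its loss over the target positions, which is what makes the objective self-supervised: the labels come from the text itself.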
Convolutional BERT (ConvBERT) improves on the original BERT by replacing some Multi-headed Self-attention segments with cheaper and naturally local operations, so-called...
Transformer architectures have attracted a great deal of attention in the field of Natural Language Processing. Ever since the original Transformer...
Wav2vec 2 is the successor to the Wav2vec model and was developed by Facebook AI. It can be used to perform speech...
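Wav2vec 2 is typically fine-tuned for speech recognition with a CTC head: the model emits one label per audio frame, and decoding collapses consecutive repeats and drops blank labels. A minimal greedy CTC decoder, using `_` as the blank symbol (the frame sequence below is hand-made for illustration):

```python
def ctc_greedy_decode(frame_labels, blank="_"):
    """Greedy CTC decoding: collapse consecutive repeated labels,
    then drop blanks. `frame_labels` would be the per-frame argmax
    of the model's CTC output; here it is a toy example."""
    out = []
    prev = None
    for label in frame_labels:
        if label != prev and label != blank:
            out.append(label)
        prev = label
    return "".join(out)

# Repeated frames and blanks collapse to the underlying transcript:
print(ctc_greedy_decode(list("__hhe_ll_lloo__")))  # "hello"
```

The blank label is what lets CTC represent genuinely doubled letters: the two `l`s in "hello" survive because a blank separates their frame runs.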
Transformers have significantly changed how Natural Language Processing tasks are performed. This architecture, which trumps the...
The Bidirectional and Auto-Regressive Transformer or BART is a Transformer that combines the Bidirectional Encoder (i.e. BERT like) with an...
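BART is pretrained as a denoising autoencoder: the input text is corrupted and the decoder learns to reconstruct the original. One of its corruption schemes is text infilling, where a contiguous span of tokens is replaced by a single mask token. A small sketch of that corruption (span length and mask token are illustrative choices):

```python
import random

def text_infill(tokens, mask="[MASK]", span=2, seed=0):
    """BART-style text infilling sketch: replace one contiguous span of
    tokens with a single mask token. The model is then trained to
    reconstruct the full original sequence, so it must also infer how
    many tokens the mask hides."""
    rng = random.Random(seed)  # seeded for reproducibility in this sketch
    start = rng.randrange(len(tokens) - span + 1)
    return tokens[:start] + [mask] + tokens[start + span:]

tokens = "bart learns to reconstruct corrupted text".split()
print(text_infill(tokens, seed=7))
```

In the real model, span lengths are drawn from a Poisson distribution and several spans may be masked; the fixed-length single span here keeps the sketch short.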
The Text-to-Text Transfer Transformer or T5 is a type of Transformer that can be trained on a variety of tasks with a...
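T5's trick is to cast every task as text-to-text: each input is prepended with a task prefix, and the output is always a string. A sketch of that framing, using prefixes in the style of the T5 paper (the helper function itself is hypothetical):

```python
def to_text_to_text(task, text):
    """T5-style task casting: every task becomes 'prefix: input', and the
    model is trained to emit the answer as plain text. The prefixes below
    follow the conventions used in the T5 paper."""
    prefixes = {
        "summarize": "summarize: ",
        "translate_en_de": "translate English to German: ",
        "cola": "cola sentence: ",  # grammatical-acceptability task
    }
    return prefixes[task] + text

print(to_text_to_text("summarize", "The quick brown fox..."))
# "summarize: The quick brown fox..."
```

Because inputs and outputs are always text, a single model with a single loss can be trained jointly on translation, summarization, and classification without task-specific heads.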
Deep learning approaches have boosted the field of Natural Language Processing in recent years. A variety of tasks can now...