Transformer-based models implemented in TensorFlow 2.x (using Keras).
[NeurIPS 2023 Main Track] This is the repository for the paper titled "Don’t Stop Pretraining? Make Prompt-based Fine-tuning Powerful Learner"
Source code for the paper "Empower Entity Set Expansion via Language Model Probing", published at ACL 2020.
Recent Advances in Vision-Language Pre-training!
Sample tutorials for training Natural Language Processing Models with Transformers
Code to reproduce experiments from the paper "Continual Pre-Training Mitigates Forgetting in Language and Vision" https://arxiv.org/abs/2205.09357
Comparing Selective Masking Methods for Depression Detection in Social Media
A transformer-based language model trained on politics-related Twitter data. This repo is the official resource of the paper "PoliBERTweet: A Pre-trained Language Model for Analyzing Political Content on Twitter", LREC 2022
A Context-Aware Approach for Generating Natural Language Attacks.
Grammar test suite for masked language models
😷 The Fill-Mask Association Test (FMAT): Measuring Propositions in Natural Language.
Final assignment for the "Gestione dell'Informazione" ("Search Engines") course @ UniMoRe
Measuring Biases in Masked Language Models for PyTorch Transformers. Support for multiple social biases and evaluation measures.
Training the first Cypriot Large Language Model on the Masked Language Modeling objective, predicting a masked word token within a given context.
Score masked language models on grammar test suites
Unscrambles shuffled letters in a word sequence.
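The repo's own method isn't shown here, but a minimal dictionary-based sketch of this idea (assumed approach, not the repo's code): index vocabulary words by their sorted letters, then look each shuffled token up by the same key.

```python
def build_index(vocabulary):
    """Map each word's sorted letters to the candidate words sharing them."""
    index = {}
    for word in vocabulary:
        index.setdefault("".join(sorted(word)), []).append(word)
    return index

def unscramble(sequence, index):
    """Replace each shuffled token with a dictionary match, if one exists."""
    return [index.get("".join(sorted(tok)), [tok])[0] for tok in sequence]

# Hypothetical vocabulary and input, for illustration only
index = build_index(["masked", "language", "model"])
print(unscramble(["dkeasm", "gaugenla", "delom"], index))
# → ['masked', 'language', 'model']
```

Real systems would additionally rank the candidates for an ambiguous key (e.g. with a language model), since several words can share the same letters.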
Transformers Intuition
BERT Attention Visualization is a web application powered by Streamlit, offering intuitive visualization of attention weights generated by BERT-based models.
Transformer pre-training with the MLM objective: an encoder-only model implemented and trained from scratch on a Wikipedia dataset.
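For context on what the MLM objective involves, here is a minimal sketch of BERT-style token masking (the standard 80/10/10 rule; this is the generic recipe, not necessarily this repo's exact implementation): roughly 15% of tokens are selected as prediction targets, of which 80% become `[MASK]`, 10% are swapped for a random token, and 10% are left unchanged.

```python
import random

def mask_tokens(tokens, mask_token="[MASK]", mlm_prob=0.15, seed=0):
    """BERT-style masking: select ~mlm_prob of tokens as targets;
    replace 80% with [MASK], 10% with a random token, keep 10% as-is."""
    rng = random.Random(seed)
    masked, labels = [], []
    for tok in tokens:
        if rng.random() < mlm_prob:
            labels.append(tok)          # model must predict the original token
            r = rng.random()
            if r < 0.8:
                masked.append(mask_token)
            elif r < 0.9:
                masked.append(rng.choice(tokens))  # random replacement
            else:
                masked.append(tok)      # unchanged, but still a target
        else:
            masked.append(tok)
            labels.append(None)         # ignored in the loss
    return masked, labels
```

The "keep 10% unchanged" case matters: it forces the encoder to produce useful representations even for tokens that are not visibly corrupted.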
Data pipelines for both TensorFlow and PyTorch!