
nicola-decao / KnowledgeEditor

License: MIT
Code for Editing Factual Knowledge in Language Models

Programming Languages

Python: 139,335 projects (#7 most used programming language)
Shell: 77,523 projects

Projects that are alternatives to or similar to KnowledgeEditor

Nlp Architect
A model library for exploring state-of-the-art deep learning topologies and techniques for optimizing Natural Language Processing neural networks
Stars: ✭ 2,768 (+3118.6%)
Mutual labels:  transformers
thermostat
Collection of NLP model explanations and accompanying analysis tools
Stars: ✭ 126 (+46.51%)
Mutual labels:  transformers
jax-models
Unofficial JAX implementations of deep learning research papers
Stars: ✭ 108 (+25.58%)
Mutual labels:  transformers
COCO-LM
[NeurIPS 2021] COCO-LM: Correcting and Contrasting Text Sequences for Language Model Pretraining
Stars: ✭ 109 (+26.74%)
Mutual labels:  transformers
gpl
Powerful unsupervised domain adaptation method for dense retrieval. Requires only unlabeled corpus and yields massive improvement: "GPL: Generative Pseudo Labeling for Unsupervised Domain Adaptation of Dense Retrieval" https://arxiv.org/abs/2112.07577
Stars: ✭ 216 (+151.16%)
Mutual labels:  transformers
KB-ALBERT
A Korean ALBERT model specialized for the economic/financial domain, provided by KB Kookmin Bank
Stars: ✭ 215 (+150%)
Mutual labels:  transformers
Nn
🧑‍🏫 50! Implementations/tutorials of deep learning papers with side-by-side notes 📝; including transformers (original, xl, switch, feedback, vit, ...), optimizers (adam, adabelief, ...), gans (cyclegan, stylegan2, ...), 🎮 reinforcement learning (ppo, dqn), capsnet, distillation, ... 🧠
Stars: ✭ 5,720 (+6551.16%)
Mutual labels:  transformers
Basic-UI-for-GPT-J-6B-with-low-vram
A repository to run GPT-J-6B on low-VRAM machines (4.2 GB minimum VRAM for a 2000-token context, 3.5 GB for a 1000-token context). Loading the model requires 12 GB of free RAM.
Stars: ✭ 90 (+4.65%)
Mutual labels:  transformers
SnowflakeNet
(TPAMI 2022) Snowflake Point Deconvolution for Point Cloud Completion and Generation with Skip-Transformer
Stars: ✭ 74 (-13.95%)
Mutual labels:  transformers
TransQuest
Transformer-based translation quality estimation
Stars: ✭ 85 (-1.16%)
Mutual labels:  transformers
KoBERT-Transformers
KoBERT on 🤗 Huggingface Transformers 🤗 (with Bug Fixed)
Stars: ✭ 162 (+88.37%)
Mutual labels:  transformers
Transformer-Implementations
A library of transformer implementations: vanilla Transformer, ViT, DeiT, BERT, GPT
Stars: ✭ 34 (-60.47%)
Mutual labels:  transformers
naru
Neural Relation Understanding: neural cardinality estimators for tabular data
Stars: ✭ 76 (-11.63%)
Mutual labels:  transformers
nlp-papers
Must-read papers on Natural Language Processing (NLP)
Stars: ✭ 87 (+1.16%)
Mutual labels:  transformers
Transformer-in-PyTorch
Transformer/Transformer-XL/R-Transformer examples and explanations
Stars: ✭ 21 (-75.58%)
Mutual labels:  transformers
Pytorch Sentiment Analysis
Tutorials on getting started with PyTorch and TorchText for sentiment analysis.
Stars: ✭ 3,209 (+3631.4%)
Mutual labels:  transformers
LIT
[AAAI 2022] This is the official PyTorch implementation of "Less is More: Pay Less Attention in Vision Transformers"
Stars: ✭ 79 (-8.14%)
Mutual labels:  transformers
clip-italian
CLIP (Contrastive Language–Image Pre-training) for Italian
Stars: ✭ 113 (+31.4%)
Mutual labels:  transformers
text
Using Transformers from HuggingFace in R
Stars: ✭ 66 (-23.26%)
Mutual labels:  transformers
STAM-pytorch
Implementation of STAM (Space Time Attention Model), a pure and simple attention model that reaches SOTA for video classification
Stars: ✭ 109 (+26.74%)
Mutual labels:  transformers

KnowledgeEditor

Code for Editing Factual Knowledge in Language Models (https://arxiv.org/abs/2104.08164).
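At a glance, the method works as follows: a hyper-network is trained to predict a parameter update that makes the model produce a revised answer for one fact, while a KL constraint keeps its predictions on other inputs approximately unchanged. Below is a minimal sketch of that underlying constrained-editing objective only, written against a toy classifier interface; it is not this repository's API, and edit_step, edit_input, new_label, retain_inputs, and kl_weight are illustrative names.

import copy
import torch
import torch.nn.functional as F

def edit_step(model, edit_input, new_label, retain_inputs, kl_weight=1.0):
    # One gradient step of a simplified editing objective: push the model
    # to predict `new_label` on `edit_input`, while a KL term keeps its
    # predictions on unrelated `retain_inputs` close to the unedited model.
    frozen = copy.deepcopy(model).eval()  # snapshot of the unedited model
    for p in frozen.parameters():
        p.requires_grad_(False)

    # Loss on the revised fact: make the model output the new answer.
    edit_loss = F.cross_entropy(model(edit_input), new_label)

    # KL(unedited || edited) on unrelated inputs, to limit collateral drift.
    with torch.no_grad():
        ref_logp = F.log_softmax(frozen(retain_inputs), dim=-1)
    cur_logp = F.log_softmax(model(retain_inputs), dim=-1)
    retain_loss = F.kl_div(cur_logp, ref_logp, log_target=True,
                           reduction="batchmean")

    loss = edit_loss + kl_weight * retain_loss
    loss.backward()  # caller applies the optimizer step
    return loss.item()

In the paper, this objective is not optimized with per-fact gradient descent at test time; instead, a trained hyper-network predicts the weight shift for a given edit in one shot.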

@inproceedings{decao2021editing,
  title={Editing Factual Knowledge in Language Models},
  author={Nicola De Cao and Wilker Aziz and Ivan Titov},
  booktitle={Proceedings of the 2021 Conference on Empirical Methods in
             Natural Language Processing (EMNLP 2021)},
  url={https://arxiv.org/abs/2104.08164},
  year={2021},
}

Please consider citing our work if you use code from this repository.

Models and Data

This folder contains the datasets and the base models used in this work.
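For orientation, the base models are of the kind hosted on the Hugging Face Hub: the paper's experiments edit a BERT classifier for fact checking (FEVER) and a BART seq2seq model for question answering (zsRE). A hypothetical loading example follows; the checkpoint names are stand-ins, and the exact models shipped in this folder may differ.

from transformers import (AutoModelForSeq2SeqLM,
                          AutoModelForSequenceClassification, AutoTokenizer)

# Classifier for the fact-checking setup (stand-in checkpoint name).
fc_tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
fc_model = AutoModelForSequenceClassification.from_pretrained(
    "bert-base-uncased", num_labels=2)

# Seq2seq model for the question-answering setup (stand-in checkpoint name).
qa_tokenizer = AutoTokenizer.from_pretrained("facebook/bart-base")
qa_model = AutoModelForSeq2SeqLM.from_pretrained("facebook/bart-base")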
