thunlp / KernelGAT
License: MIT
The source code for Fine-grained Fact Verification with Kernel Graph Attention Network.
Stars: ✭ 92
Kernel Graph Attention Network (KGAT)
This repository contains the source code for Fine-grained Fact Verification with Kernel Graph Attention Network.
More information about the FEVER 1.0 shared task can be found on its website.
😃 What's New
Fact Extraction and Verification with SCIFACT
This shared task introduces scientific claim verification, helping scientists, clinicians, and the public verify the credibility of scientific claims, especially claims related to COVID-19.
>> Reproduce Our Results >> About SCIFACT Dataset >> Our Paper
Requirement
- Python 3.X
- fever_score
- PyTorch
- pytorch_pretrained_bert
- transformers
Data and Checkpoint
- All data and BERT-based checkpoints can be found at Ali Drive.
- RoBERTa-based models and checkpoints can be found at Ali Drive.
Retrieval Model
- BERT-based ranker.
- Go to the retrieval_model folder for more information.
Pretrain Model
- Pre-train BERT with claim-evidence pairs.
- Go to the pretrain folder for more information.
KGAT Model
- Our KGAT model.
- Go to the kgat folder for more information.
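As a rough illustration of the kernel attention idea behind KGAT (a sketch under our own assumptions, not the repository's implementation): token-level similarities between a claim and an evidence node are summarized by a bank of Gaussian (RBF) kernels, producing soft-match features that the attention layers consume. Function names and kernel settings below are illustrative.

```python
import numpy as np

def kernel_mus(n_kernels):
    """One exact-match kernel (mu = 1.0) plus evenly spaced soft-match kernels on [-1, 1]."""
    mus = [1.0]
    bin_size = 2.0 / (n_kernels - 1)
    mus.append(1.0 - bin_size / 2)
    for i in range(1, n_kernels - 1):
        mus.append(mus[i] - bin_size)
    return np.array(mus)

def kernel_pooling(sim, mus, sigmas):
    """Pool a claim-evidence similarity matrix into a K-dim kernel feature vector.

    sim:    [m, n] cosine similarities between m claim tokens and n evidence tokens
    mus:    [K] kernel centers, sigmas: [K] kernel widths
    Returns [K] log-pooled soft term-frequency features.
    """
    # Gaussian kernel response for every token pair and every kernel: [m, n, K]
    k = np.exp(-((sim[..., None] - mus) ** 2) / (2 * sigmas ** 2))
    soft_tf = k.sum(axis=1)                                    # pool over evidence tokens: [m, K]
    return np.log(np.clip(soft_tf, 1e-10, None)).sum(axis=0)  # pool over claim tokens: [K]
```

With 11 kernels (one exact match plus ten soft bins), each evidence node is reduced to an 11-dimensional match signature regardless of sentence length, which is what makes the fine-grained per-node attention tractable.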
Results
All results are from the Codalab leaderboard.
User | Pre-train Model | Label Accuracy | FEVER Score |
---|---|---|---|
GEAR_single | BERT (Base) | 0.7160 | 0.6710 |
a.soleimani.b | BERT (Large) | 0.7186 | 0.6966 |
KGAT | RoBERTa (Large) | 0.7407 | 0.7038 |
KGAT performance with different pre-trained language models:
Pre-train Model | Label Accuracy | FEVER Score |
---|---|---|
BERT (Base) | 0.7281 | 0.6940 |
BERT (Large) | 0.7361 | 0.7024 |
RoBERTa (Large) | 0.7407 | 0.7038 |
CorefBERT (RoBERTa Large) | 0.7596 | 0.7230 |
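For reference, the two columns measure different things: Label Accuracy only checks the predicted label, while the FEVER Score also requires that at least one complete gold evidence set be retrieved (NOT ENOUGH INFO claims are exempt from the evidence requirement). A minimal sketch of that distinction (not the official fever_score implementation; the tuple layout is our own assumption):

```python
def fever_metrics(predictions):
    """Compute (label accuracy, FEVER score) over a list of predictions.

    Each prediction is (pred_label, gold_label, pred_evidence, gold_evidence_sets),
    where evidence items are hashable (e.g. (page, sentence_id) tuples).
    """
    correct_label = 0
    fever_correct = 0
    for pred_label, gold_label, pred_ev, gold_sets in predictions:
        label_ok = pred_label == gold_label
        # NEI claims need no evidence; otherwise some full gold set must be covered
        ev_ok = gold_label == "NOT ENOUGH INFO" or any(
            set(s) <= set(pred_ev) for s in gold_sets)
        correct_label += label_ok
        fever_correct += label_ok and ev_ok
    n = len(predictions)
    return correct_label / n, fever_correct / n
```

Because of the evidence requirement, the FEVER Score is always less than or equal to Label Accuracy, as the tables above show.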
Citation
@inproceedings{liu2020kernel,
title={Fine-grained Fact Verification with Kernel Graph Attention Network},
author={Liu, Zhenghao and Xiong, Chenyan and Sun, Maosong and Liu, Zhiyuan},
booktitle={Proceedings of ACL},
year={2020}
}
@inproceedings{liu2020adapting,
title = {Adapting Open Domain Fact Extraction and Verification to COVID-FACT through In-Domain Language Modeling},
author = {Liu, Zhenghao and Xiong, Chenyan and Dai, Zhuyun and Sun, Si and Sun, Maosong and Liu, Zhiyuan},
booktitle = {Findings of the Association for Computational Linguistics: EMNLP 2020},
year={2020}
}
Contact
If you have questions, suggestions, or bug reports, please email:
[email protected]