Black-Box-Tuning
ICML'2022: Black-Box Tuning for Language-Model-as-a-Service
Stars: ✭ 99 (+17.86%)
awesome-huggingface
🤗 A list of wonderful open-source projects & applications integrated with Hugging Face libraries.
Stars: ✭ 436 (+419.05%)
LM-CNLC
Chinese Natural Language Correction via Language Model
Stars: ✭ 15 (-82.14%)
X-Transformer
X-Transformer: Taming Pretrained Transformers for eXtreme Multi-label Text Classification
Stars: ✭ 127 (+51.19%)
converse
Conversational text analysis using various NLP techniques
Stars: ✭ 147 (+75%)
Ask2Transformers
A framework for textual-entailment-based zero-shot text classification
Stars: ✭ 102 (+21.43%)
XENA
XENA is a managed remote-administration platform for botnet creation and development, powered by blockchain and machine learning. It aims to provide an ecosystem that serves bot herders, favoring secrecy and resiliency over performance. It is micro-service oriented, allowing for specialization and a lower footprint. Join the community of the ulti…
Stars: ✭ 127 (+51.19%)
mlp-gpt-jax
A GPT, made only of MLPs, in Jax
Stars: ✭ 53 (-36.9%)
dasher-web
Dasher text entry in HTML, CSS, JavaScript, and SVG
Stars: ✭ 34 (-59.52%)
Robotics-Planning-Dynamics-and-Control
RPDC: This contains all my MATLAB code for Robotics, Planning, Dynamics and Control. The implementations model various kinds of manipulators and mobile robots for position-control, trajectory-planning and path-planning problems.
Stars: ✭ 171 (+103.57%)
PDDL.jl
Julia parser, interpreter and compiler interface for the Planning Domain Definition Language (PDDL). Planners not included.
Stars: ✭ 52 (-38.1%)
Angry-HEX
An artificial player for the popular video game Angry Birds
Stars: ✭ 16 (-80.95%)
manipulathor
ManipulaTHOR, a framework that facilitates visual manipulation of objects using a robotic arm
Stars: ✭ 64 (-23.81%)
DocSum
A tool to automatically summarize documents abstractively using the BART or PreSumm machine-learning models.
Stars: ✭ 58 (-30.95%)
Transformer-MM-Explainability
[ICCV 2021, Oral] Official PyTorch implementation of "Generic Attention-model Explainability for Interpreting Bi-Modal and Encoder-Decoder Transformers", a novel method to visualize any Transformer-based network. Includes examples for DETR and VQA.
Stars: ✭ 484 (+476.19%)
interval
This PHP library provides some tools to handle intervals. For instance, you can compute the union or intersection of two intervals.
Stars: ✭ 25 (-70.24%)
KnowledgeEditor
Code for "Editing Factual Knowledge in Language Models"
Stars: ✭ 86 (+2.38%)
Basic-UI-for-GPT-J-6B-with-low-vram
A repository to run GPT-J-6B on low-VRAM machines (4.2 GB minimum VRAM for a 2,000-token context, 3.5 GB for a 1,000-token context). Loading the model requires 12 GB of free RAM.
Stars: ✭ 90 (+7.14%)
Text-Summarization
Abstractive and extractive text summarization using Transformers.
Stars: ✭ 38 (-54.76%)
swig-srilm
SWIG wrapper for the SRILM toolkit
Stars: ✭ 33 (-60.71%)
WellcomeML
Repository for machine-learning utils at the Wellcome Trust
Stars: ✭ 31 (-63.1%)
gap-text2sql
GAP-text2SQL: Learning Contextual Representations for Semantic Parsing with Generation-Augmented Pre-Training
Stars: ✭ 83 (-1.19%)
Transformers-Tutorials
This repository contains demos I made with the Transformers library by Hugging Face.
Stars: ✭ 2,828 (+3266.67%)
awesome-probabilistic-planning
A curated list of online resources for probabilistic planning: papers, software and research groups around the world!
Stars: ✭ 45 (-46.43%)
pytorch-vit
An Image is Worth 16x16 Words: Transformers for Image Recognition at Scale
Stars: ✭ 250 (+197.62%)
panther
Perception-Aware Trajectory Planner in Dynamic Environments
Stars: ✭ 115 (+36.9%)
gpt-neo-fine-tuning-example
Fine-tune EleutherAI GPT-Neo and GPT-J-6B to generate Netflix movie descriptions using Hugging Face and DeepSpeed
Stars: ✭ 157 (+86.9%)
mongolian-nlp
Useful resources for Mongolian NLP
Stars: ✭ 119 (+41.67%)
subword-lstm-lm
LSTM language model with subword-unit input representations
Stars: ✭ 45 (-46.43%)
LIT
[AAAI 2022] This is the official PyTorch implementation of "Less is More: Pay Less Attention in Vision Transformers"
Stars: ✭ 79 (-5.95%)
TransQuest
Transformer-based translation quality estimation
Stars: ✭ 85 (+1.19%)
PCPM
A collection of pretrained models: links to pretrained models in NLP and voice.
Stars: ✭ 21 (-75%)
personality-prediction
Experiments in automated personality detection using language models and psycholinguistic features on several well-known personality datasets, including the Big-Five Essays dataset
Stars: ✭ 109 (+29.76%)
modules
The official repository for our paper "Are Neural Nets Modular? Inspecting Functional Modularity Through Differentiable Weight Masks". We develop a method for analyzing emerging functional modularity in neural networks based on differentiable weight masks and use it to point out important issues in current-day neural networks.
Stars: ✭ 25 (-70.24%)
CharLM
Character-aware neural language model implemented in PyTorch
Stars: ✭ 32 (-61.9%)
transformers-interpret
Model explainability that works seamlessly with 🤗 Transformers. Explain your transformer model in just two lines of code.
Stars: ✭ 861 (+925%)
Codex
Free note-taking software for programmers and computer-science students
Stars: ✭ 242 (+188.1%)
bert-squeeze
🛠️ Tools for Transformers compression using PyTorch Lightning ⚡
Stars: ✭ 56 (-33.33%)
awesome-codex
A list dedicated to products, demos and articles related to 🤖 OpenAI's Codex.
Stars: ✭ 115 (+36.9%)
pysentimiento
A Python multilingual toolkit for sentiment analysis and social NLP tasks
Stars: ✭ 274 (+226.19%)
inline-code
Inline-code tool for Editor.js 2.0
Stars: ✭ 32 (-61.9%)
molecule-attention-transformer
PyTorch reimplementation of the Molecule Attention Transformer, which uses a transformer to tackle the graph-like structure of molecules
Stars: ✭ 46 (-45.24%)
asr24
24-hour Automatic Speech Recognition
Stars: ✭ 27 (-67.86%)
transformer_generalization
The official repository for our paper "The Devil is in the Detail: Simple Tricks Improve Systematic Generalization of Transformers". We significantly improve the systematic generalization of transformer models on a variety of datasets using simple tricks and careful considerations.
Stars: ✭ 58 (-30.95%)
CoLAKE
COLING'2020: CoLAKE: Contextualized Language and Knowledge Embedding
Stars: ✭ 86 (+2.38%)
thermostat
Collection of NLP model explanations and accompanying analysis tools
Stars: ✭ 126 (+50%)
SnowflakeNet
(TPAMI 2022) Snowflake Point Deconvolution for Point Cloud Completion and Generation with Skip-Transformer
Stars: ✭ 74 (-11.9%)
gpl
Powerful unsupervised domain-adaptation method for dense retrieval. Requires only an unlabeled corpus and yields large improvements: "GPL: Generative Pseudo Labeling for Unsupervised Domain Adaptation of Dense Retrieval" https://arxiv.org/abs/2112.07577
Stars: ✭ 216 (+157.14%)
scrum-planning-poker
Please feel free to try it and give feedback by searching for Scrum敏捷估算 ("Scrum Agile Estimation") in the WeChat mini-program.
Stars: ✭ 30 (-64.29%)
scrumonline
Always-up-to-date scrumonline Docker build
Stars: ✭ 18 (-78.57%)
sr_plan
Save and restore query plans in PostgreSQL
Stars: ✭ 57 (-32.14%)
ml
machine learning
Stars: ✭ 29 (-65.48%)
Vaaku2Vec
Language modeling and text classification in Malayalam using ULMFiT
Stars: ✭ 68 (-19.05%)
prompts-ai
Advanced playground for GPT-3
Stars: ✭ 156 (+85.71%)