Product-Categorization-NLP: Multi-class text classification of products from their descriptions, using machine learning algorithms and neural networks (MLP, CNN, DistilBERT).
Stars: ✭ 30 (-40%)
Vol3xp: Volatility Explorer Suite.
Stars: ✭ 31 (-38%)
remixer-pytorch: Implementation of the Remixer block from the Remixer paper, in PyTorch.
Stars: ✭ 37 (-26%)
mh: A memory editor for iOS/macOS with JavaScript support.
Stars: ✭ 35 (-30%)
minicons: Utility for analyzing Transformer-based representations of language.
Stars: ✭ 28 (-44%)
gctoolkit: Tool for parsing GC logs.
Stars: ✭ 1,127 (+2154%)
CHKV: Consistent-hashing-based key-value memory storage.
Stars: ✭ 20 (-60%)
golgotha: Contextualised embeddings and language modelling with BERT and friends, in R.
Stars: ✭ 39 (-22%)
label-studio-transformers: Label data using Hugging Face's transformers and automatically get a prediction service.
Stars: ✭ 117 (+134%)
iPerceive: Applying common-sense reasoning to multi-modal dense video captioning and video question answering | Python 3 | PyTorch | CNNs | Causality | Reasoning | LSTMs | Transformers | Multi-Head Self-Attention | Published at the IEEE Winter Conference on Applications of Computer Vision (WACV) 2021.
Stars: ✭ 52 (+4%)
semi-memory: TensorFlow implementation of the ECCV 2018 paper "Semi-Supervised Deep Learning with Memory".
Stars: ✭ 49 (-2%)
text2class: Multi-class text categorization using state-of-the-art pre-trained contextualized language models, e.g. BERT.
Stars: ✭ 15 (-70%)
Text-Summarization: Abstractive and extractive text summarization using Transformers.
Stars: ✭ 38 (-24%)
Transformers-Tutorials: This repository contains demos made with the Transformers library by Hugging Face.
Stars: ✭ 2,828 (+5556%)
small-text: Active learning for text classification in Python.
Stars: ✭ 241 (+382%)
lessram: Pure PHP implementation of array data structures that use less memory.
Stars: ✭ 20 (-60%)
Pytorch-NLU: A Chinese text classification and sequence annotation toolkit. Supports multi-class and multi-label classification of Chinese long and short texts, and sequence annotation tasks such as Chinese named entity recognition, part-of-speech tagging, and word segmentation.
Stars: ✭ 151 (+202%)
gd.py: An API wrapper for Geometry Dash written in Python.
Stars: ✭ 87 (+74%)
CPU-MEM-monitor: A simple script to log Linux CPU and memory usage over time (using the top or pidstat command) and output an Excel- or OpenOffice-Calc-friendly report.
Stars: ✭ 41 (-18%)
elastic transformers: Making BERT stretchy; semantic Elasticsearch with Sentence Transformers.
Stars: ✭ 153 (+206%)
Reloaded.Assembler: Minimal .NET wrapper around the simple, easy-to-use Flat Assembler written by Tomasz Grysztar. Supports both x64 and x86 development.
Stars: ✭ 17 (-66%)
pool: A generic C memory pool.
Stars: ✭ 81 (+62%)
ParsBigBird: Persian BERT for long-range sequences.
Stars: ✭ 58 (+16%)
deepfrog: An NLP suite powered by deep learning.
Stars: ✭ 16 (-68%)
backprop: Backprop makes it simple to use, fine-tune, and deploy state-of-the-art ML models.
Stars: ✭ 229 (+358%)
smaller-transformers: Load what you need — smaller multilingual Transformers for PyTorch and TensorFlow 2.0.
Stars: ✭ 66 (+32%)
RETRO-pytorch: Implementation of RETRO, DeepMind's retrieval-based attention net, in PyTorch.
Stars: ✭ 473 (+846%)
language-planner: Official code for "Language Models as Zero-Shot Planners: Extracting Actionable Knowledge for Embodied Agents".
Stars: ✭ 84 (+68%)
profiling: Non-discriminatory profiling of Ruby code leveraging the ruby-prof gem.
Stars: ✭ 12 (-76%)
Nhaama: Multi-purpose .NET memory-editing library.
Stars: ✭ 25 (-50%)
molecule-attention-transformer: PyTorch reimplementation of Molecule Attention Transformer, which uses a transformer to tackle the graph-like structure of molecules.
Stars: ✭ 46 (-8%)
transformers-lightning: A collection of models, datasets, DataModules, callbacks, metrics, losses, and loggers to better integrate pytorch-lightning with Transformers.
Stars: ✭ 45 (-10%)
nuwa-pytorch: Implementation of NÜWA, a state-of-the-art attention network for text-to-video synthesis, in PyTorch.
Stars: ✭ 347 (+594%)
Ask2Transformers: A framework for textual-entailment-based zero-shot text classification.
Stars: ✭ 102 (+104%)
converse: Conversational text analysis using various NLP techniques.
Stars: ✭ 147 (+194%)
HelvetaCS: Modern C++ CS:GO base.
Stars: ✭ 41 (-18%)
NoSpawnChunks: Helps manage server memory by dynamically unloading chunks.
Stars: ✭ 21 (-58%)
v8-inspector-api: A simple Node module to access the V8 inspector, plus some tools to export and read the data.
Stars: ✭ 43 (-14%)
redis-key-dashboard: A tool for quick analysis of the number of keys in Redis and the memory they use, helping you spot overlooked keys and notice overuse.
Stars: ✭ 42 (-16%)
question generator: An NLP system for generating reading-comprehension questions.
Stars: ✭ 188 (+276%)
stress: Single-purpose tools to stress resources.
Stars: ✭ 24 (-52%)
cpu monitor: ROS node that publishes all nodes' CPU and memory usage.
Stars: ✭ 52 (+4%)
robustness-vit: Contains code for the paper "Vision Transformers are Robust Learners" (AAAI 2022).
Stars: ✭ 78 (+56%)
nlp workshop odsc europe20: Extensive tutorials for the Advanced NLP Workshop at the Open Data Science Conference Europe 2020, leveraging machine learning, deep learning, and deep transfer learning for popular NLP tasks including NER, classification, recommendation / information retrieval, summarization, language translation, Q&A and T…
Stars: ✭ 127 (+154%)
gnn-lspe: Source code for GNN-LSPE (Graph Neural Networks with Learnable Structural and Positional Representations), ICLR 2022.
Stars: ✭ 165 (+230%)
Transformer-MM-Explainability: [ICCV 2021, Oral] Official PyTorch implementation of Generic Attention-model Explainability for Interpreting Bi-Modal and Encoder-Decoder Transformers, a novel method to visualize any Transformer-based network, with examples for DETR and VQA.
Stars: ✭ 484 (+868%)
spark-transformers: Library for exporting Apache Spark MLlib models so they can be used in any Java application with no other dependencies.
Stars: ✭ 39 (-22%)
DocSum: A tool to automatically summarize documents abstractively using the BART or PreSumm machine learning models.
Stars: ✭ 58 (+16%)
text2text: Cross-lingual natural language processing and generation toolkit.
Stars: ✭ 188 (+276%)
BERT-NER: Using pre-trained BERT models for Chinese and English NER with 🤗 Transformers.
Stars: ✭ 114 (+128%)
rmem: MTuner SDK memory-profiling library.
Stars: ✭ 25 (-50%)