
cs230-stanford / CS230 Code Examples

Licence: other
Code examples in PyTorch and TensorFlow for CS230

Programming Languages

python
139335 projects - #7 most used programming language

Projects that are alternatives of or similar to CS230 Code Examples

Commonsense Rc
Code for Yuanfudao at SemEval-2018 Task 11: Three-way Attention and Relational Knowledge for Commonsense Machine Comprehension
Stars: ✭ 112 (-93.42%)
Mutual labels:  natural-language-processing
Dat8
General Assembly's 2015 Data Science course in Washington, DC
Stars: ✭ 1,516 (-10.88%)
Mutual labels:  natural-language-processing
Pymetamap
Python wrapper for MetaMap
Stars: ✭ 119 (-93%)
Mutual labels:  natural-language-processing
Lingo
package lingo provides the data structures and algorithms required for natural language processing
Stars: ✭ 113 (-93.36%)
Mutual labels:  natural-language-processing
Unified Summarization
Official code for the paper: A Unified Model for Extractive and Abstractive Summarization using Inconsistency Loss.
Stars: ✭ 114 (-93.3%)
Mutual labels:  natural-language-processing
Flair
A very simple framework for state-of-the-art Natural Language Processing (NLP)
Stars: ✭ 11,065 (+550.5%)
Mutual labels:  natural-language-processing
Nlp Papers
Papers and books to look at when starting NLP 📚
Stars: ✭ 111 (-93.47%)
Mutual labels:  natural-language-processing
Nlpcc Wordseg Weibo
NLPCC 2016 Weibo word segmentation evaluation project
Stars: ✭ 120 (-92.95%)
Mutual labels:  natural-language-processing
Cogcomp Nlpy
CogComp's light-weight Python NLP annotators
Stars: ✭ 115 (-93.24%)
Mutual labels:  natural-language-processing
Pytextrank
Python implementation of TextRank for phrase extraction and summarization of text documents
Stars: ✭ 1,675 (-1.53%)
Mutual labels:  natural-language-processing
Declutr
The corresponding code from our paper "DeCLUTR: Deep Contrastive Learning for Unsupervised Textual Representations". Do not hesitate to open an issue if you run into any trouble!
Stars: ✭ 111 (-93.47%)
Mutual labels:  natural-language-processing
Rbert
Implementation of BERT in R
Stars: ✭ 114 (-93.3%)
Mutual labels:  natural-language-processing
Dynamic Coattention Network Plus
Dynamic Coattention Network Plus (DCN+) TensorFlow implementation. Question answering using Deep NLP.
Stars: ✭ 117 (-93.12%)
Mutual labels:  natural-language-processing
Deep Nlp Seminars
Materials for deep NLP course
Stars: ✭ 113 (-93.36%)
Mutual labels:  natural-language-processing
Discobert
Code for paper "Discourse-Aware Neural Extractive Text Summarization" (ACL20)
Stars: ✭ 120 (-92.95%)
Mutual labels:  natural-language-processing
Opus Mt
Open neural machine translation models and web services
Stars: ✭ 111 (-93.47%)
Mutual labels:  natural-language-processing
Stanford Tensorflow Tutorials
This repository contains code examples for Stanford's course: TensorFlow for Deep Learning Research.
Stars: ✭ 10,098 (+493.65%)
Mutual labels:  natural-language-processing
Dialoglue
DialoGLUE: A Natural Language Understanding Benchmark for Task-Oriented Dialogue
Stars: ✭ 120 (-92.95%)
Mutual labels:  natural-language-processing
Scattertext
Beautiful visualizations of how language differs among document types.
Stars: ✭ 1,722 (+1.23%)
Mutual labels:  natural-language-processing
Nonautoreggenprogress
Tracking the progress in non-autoregressive generation (translation, transcription, etc.)
Stars: ✭ 118 (-93.06%)
Mutual labels:  natural-language-processing

CS230 Code Examples

Tutorials

We are happy to introduce some code examples that you can use for your CS230 projects. The code contains examples for TensorFlow and PyTorch, covering both vision and NLP. The structure of the repository is as follows:

README.md
pytorch/
    vision/
        README.md
    nlp/
        README.md
tensorflow/
    vision/
        README.md
    nlp/
        README.md

You'll find a README.md in each sub-directory.
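
The exact contents differ per sub-directory (each README.md has the details), but the PyTorch examples follow the familiar model/loss/optimizer training pattern. The snippet below is a minimal illustrative sketch of that pattern only, not code from this repository; the model, input shape, batch size, and hyperparameters are made up for the example:

import torch
import torch.nn as nn
import torch.optim as optim

# Illustrative sketch only: a tiny classifier and a single training step in the
# style of the PyTorch examples. The real code in pytorch/vision is organized
# into separate model, data-loading, and training modules (see its README.md).
class Net(nn.Module):
    def __init__(self, num_classes=6):
        super().__init__()
        # Hypothetical 64x64 RGB input, flattened into a single linear layer.
        self.fc = nn.Linear(64 * 64 * 3, num_classes)

    def forward(self, x):
        return self.fc(x.view(x.size(0), -1))

model = Net()
optimizer = optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.CrossEntropyLoss()

# Dummy batch standing in for a real DataLoader.
images = torch.randn(8, 3, 64, 64)
labels = torch.randint(0, 6, (8,))

optimizer.zero_grad()
loss = loss_fn(model(images), labels)
loss.backward()
optimizer.step()
print(f"loss: {loss.item():.4f}")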
