
huangwl18 / language-planner

License: MIT
Official Code for "Language Models as Zero-Shot Planners: Extracting Actionable Knowledge for Embodied Agents"

Programming Languages

Jupyter Notebook

Projects that are alternatives of or similar to language-planner

Haystack
πŸ” Haystack is an open source NLP framework that leverages Transformer models. It enables developers to implement production-ready neural search, question answering, semantic document search and summarization for a wide range of applications.
Stars: ✭ 3,409 (+3958.33%)
Mutual labels:  transformers, language-model
Tokenizers
πŸ’₯ Fast State-of-the-Art Tokenizers optimized for Research and Production
Stars: ✭ 5,077 (+5944.05%)
Mutual labels:  transformers, language-model
gpt-j
A GPT-J API for Python 3 to generate text, blogs, code, and more
Stars: ✭ 101 (+20.24%)
Mutual labels:  language-model, gpt-3
wechsel
Code for WECHSEL: Effective initialization of subword embeddings for cross-lingual transfer of monolingual language models.
Stars: ✭ 39 (-53.57%)
Mutual labels:  transformers, language-model
KB-ALBERT
A Korean ALBERT model specialized for the economics/finance domain, provided by KB Kookmin Bank
Stars: ✭ 215 (+155.95%)
Mutual labels:  transformers, language-model
Clue
Chinese Language Understanding Evaluation Benchmark: datasets, baselines, pre-trained models, corpus and leaderboard
Stars: ✭ 2,425 (+2786.9%)
Mutual labels:  transformers, language-model
minicons
Utility for analyzing Transformer based representations of language.
Stars: ✭ 28 (-66.67%)
Mutual labels:  transformers, language-model
COCO-LM
[NeurIPS 2021] COCO-LM: Correcting and Contrasting Text Sequences for Language Model Pretraining
Stars: ✭ 109 (+29.76%)
Mutual labels:  transformers, language-model
backprop
Backprop makes it simple to use, finetune, and deploy state-of-the-art ML models.
Stars: ✭ 229 (+172.62%)
Mutual labels:  transformers, language-model
gpt-j-api
API for the GPT-J language model 🦜, including a FastAPI backend and a Streamlit frontend
Stars: ✭ 248 (+195.24%)
Mutual labels:  language-model, gpt-3
gnn-lspe
Source code for GNN-LSPE (Graph Neural Networks with Learnable Structural and Positional Representations), ICLR 2022
Stars: ✭ 165 (+96.43%)
Mutual labels:  transformers
Black-Box-Tuning
ICML'2022: Black-Box Tuning for Language-Model-as-a-Service
Stars: ✭ 99 (+17.86%)
Mutual labels:  language-model
converse
Conversational text analysis using various NLP techniques
Stars: ✭ 147 (+75%)
Mutual labels:  transformers
Robotics-Planning-Dynamics-and-Control
RPDC: This contains all my MATLAB code for Robotics, Planning, Dynamics and Control. The implementations model various kinds of manipulators and mobile robots for position control, trajectory planning and path planning problems.
Stars: ✭ 171 (+103.57%)
Mutual labels:  planning
Deep-NLP-Resources
Curated list of all NLP Resources
Stars: ✭ 65 (-22.62%)
Mutual labels:  language-model
fix
Allows you to use OpenAI Codex to fix errors in the command line.
Stars: ✭ 72 (-14.29%)
Mutual labels:  codex
PyTorch-Model-Compare
Compare neural networks by their feature similarity
Stars: ✭ 119 (+41.67%)
Mutual labels:  transformers
DocSum
A tool to automatically summarize documents abstractively using the BART or PreSumm machine learning models.
Stars: ✭ 58 (-30.95%)
Mutual labels:  transformers
xpandas
Universal 1d/2d data containers with Transformers functionality for data analysis.
Stars: ✭ 25 (-70.24%)
Mutual labels:  transformers
pytorch-vit
An Image is Worth 16x16 Words: Transformers for Image Recognition at Scale
Stars: ✭ 250 (+197.62%)
Mutual labels:  transformers

Open in Colab

Language Models as Zero-Shot Planners:
Extracting Actionable Knowledge for Embodied Agents

[Project Page] [Paper] [Video]

Wenlong Huang1, Pieter Abbeel1, Deepak Pathak*2, Igor Mordatch*3 (*equal advising)

1University of California, Berkeley, 2Carnegie Mellon University, 3Google Brain

This is the official demo code for our Language Models as Zero-Shot Planners paper. The code demonstrates how large language models, such as GPT-3 and Codex, can generate action plans for complex human activities (e.g. "make breakfast"), even without any further training. It provides a common interface to any language model available through the OpenAI API or Hugging Face Transformers.
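For illustration, here is a minimal sketch of what such a common interface could look like. The helper name `generate`, the model choices, and the defaults are hypothetical, and the OpenAI call assumes the legacy (pre-1.0) `openai` Python client that was current when this code was released.

```python
# Hypothetical sketch of a common generation interface (not the repo's actual
# API). Assumes `pip install "openai<1.0" transformers` and an API key set via
# openai.api_key for the OpenAI branch.
import openai
from transformers import pipeline

def generate(prompt, source="huggingface", max_tokens=64, temperature=0.6):
    """Query either the OpenAI API or a Hugging Face model with one interface."""
    if source == "openai":
        # Legacy (pre-1.0) Completion API, as used circa 2022.
        response = openai.Completion.create(
            engine="davinci",  # or a Codex engine, e.g. "code-davinci-002"
            prompt=prompt,
            max_tokens=max_tokens,
            temperature=temperature,
        )
        return response["choices"][0]["text"]
    # Any causal LM from the Hugging Face hub; GPT-2 keeps the demo small.
    generator = pipeline("text-generation", model="gpt2")
    out = generator(prompt, max_new_tokens=max_tokens, do_sample=True,
                    temperature=temperature, return_full_text=False)
    return out[0]["generated_text"]

print(generate("Task: Make breakfast\nStep 1:"))
```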

If you find this work useful in your research, please cite using the following BibTeX:

@article{huang2022language,
  title={Language Models as Zero-Shot Planners: Extracting Actionable Knowledge for Embodied Agents},
  author={Huang, Wenlong and Abbeel, Pieter and Pathak, Deepak and Mordatch, Igor},
  journal={arXiv preprint arXiv:2201.07207},
  year={2022}
}

Local Setup or Open in Colab

Requirements

  • Python=3.6.13
  • CUDA=11.3

Setup Instructions

git clone https://github.com/huangwl18/language-planner.git
cd language-planner/
conda create --name language-planner-env python=3.6.13
conda activate language-planner-env
pip install --upgrade pip
pip install -r requirements.txt

Running Code

See demo.ipynb (or Open in Colab) for a complete walk-through of our method. Feel free to experiment with any household task you can come up with (or tasks beyond the household domain, if you provide the necessary actions in available_actions.json)!
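At its core, the method pairs a planning LM (which generates free-form steps) with a translation LM (which maps each generated step to the closest admissible action via embedding similarity). The sketch below condenses that translation step. It assumes sentence-transformers is installed and that available_actions.json holds a flat JSON list of action strings; treat the file format and the model choice as assumptions rather than the repo's exact code.

```python
# Condensed sketch of the translation step: map a free-form step produced by
# the planning LM to the closest admissible action by cosine similarity.
# Assumes available_actions.json is a flat JSON list of action strings.
import json
from sentence_transformers import SentenceTransformer, util

translation_lm = SentenceTransformer("stsb-roberta-large")

with open("available_actions.json") as f:
    actions = json.load(f)
action_embs = translation_lm.encode(actions, convert_to_tensor=True)

def translate(step):
    """Return the admissible action most similar to a generated step."""
    step_emb = translation_lm.encode(step, convert_to_tensor=True)
    scores = util.cos_sim(step_emb, action_embs)[0]
    return actions[scores.argmax().item()]

print(translate("Go to the kitchen"))  # e.g. "walk to kitchen"
```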

Note:

  • We observe that best results are obtained with larger language models. If you cannot run Hugging Face Transformers models locally or on Google Colab due to memory constraints, we recommend registering an OpenAI API account and using GPT-3 or Codex (as of 01/2022, new accounts receive $18 in free credits, and the Codex series is free once you are admitted from the waitlist).
  • Because language models are highly sensitive to sampling hyperparameters, you may need to tune them per model to obtain the best results; a small illustrative sweep is sketched after this list.
  • The code uses the list of available actions supported by VirtualHome 1.0's Evolving Graph Simulator, stored in available_actions.json. These actions should cover a wide variety of household tasks. However, you may modify or replace this file if you are interested in a different set of actions or a different domain of tasks (beyond the household domain).
  • A subset of the manually annotated examples originally collected by the VirtualHome paper is used as the available examples in the prompt. They are transformed into natural-language format and stored in available_examples.json. Feel free to change this file for a different set of available examples.
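As a starting point for the hyperparameter tuning mentioned above, a small sweep like the one below makes the effect of sampling settings visible; the grid values are illustrative examples, not recommended settings.

```python
# Illustrative sweep over sampling hyperparameters for a Hugging Face model;
# the grid values are arbitrary examples, not recommendations.
from transformers import pipeline

generator = pipeline("text-generation", model="gpt2")
prompt = "Task: Make breakfast\nStep 1:"

for temperature in (0.1, 0.6, 1.0):
    for top_p in (0.8, 0.95):
        out = generator(prompt, max_new_tokens=30, do_sample=True,
                        temperature=temperature, top_p=top_p,
                        return_full_text=False)
        print(f"T={temperature}, top_p={top_p}: {out[0]['generated_text']!r}")
```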