ipyexperiments

jupyter/ipython experiment containers and utils for profiling and reclaiming GPU and general RAM, and detecting memory leaks.

About

This module's main purpose is to help calibrate hyperparameters in deep learning notebooks to fit the available GPU and general RAM, but, of course, it can be useful in any other situation where memory limits are a constant issue. It is also useful for detecting memory leaks in your code. Over time, other goodies that help with running machine learning experiments have been added.

This package is slowly evolving into a suite of helper modules designed to help diagnose memory leaks and make debugging them easy.

Currently the package contains several modules:

  1. IpyExperiments - a smart container for ipython/jupyter experiments (documentation / demo)
  2. CellLogger - per cell memory profiler and more features (documentation / demo)
  3. ipython utils - workarounds for ipython memory leakage on exception (documentation)
  4. memory debugging and profiling utils (documentation)
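At its core, the memory-leak detection these utils provide comes down to comparing memory usage before and after a block of code runs. Here is a minimal, hypothetical sketch of that idea using only python's stdlib tracemalloc (the real utils in this package also track GPU RAM and report far more detail):

```python
import tracemalloc

# Measure how much memory a block of code is still holding on to
# after it finishes running.
tracemalloc.start()
before, _ = tracemalloc.get_traced_memory()

leaked = [bytes(1000) for _ in range(1000)]  # simulated leak: ~1MB kept alive

after, _ = tracemalloc.get_traced_memory()
tracemalloc.stop()
print(f"still holding roughly {after - before} bytes")
```

If the delta stays high after the block's variables should have gone out of scope, something is still holding references to them.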

Using this framework you can run multiple consecutive experiments without needing to restart the kernel every time you run out of GPU memory - the all-too-familiar "cuda: out of memory" error. When this happens you simply go back to the notebook cell where you started the experiment, change the hyperparameters, and re-run the updated experiment until it fits the available memory. This is much more efficient and less error-prone than constantly restarting the kernel and re-running the whole notebook.

As an extra bonus you get access to the memory consumption data, so you can use it to automate the discovery of hyperparameters that suit your hardware's unique memory limits.
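For instance, the memory data could drive a binary search for the largest hyperparameter value (say, batch size) that still fits. Here is a generic sketch, where `fits()` is a hypothetical stand-in for "run the experiment and check the reported memory consumption":

```python
def largest_fitting(param_max, fits):
    """Binary-search the largest value in [1, param_max] for which
    fits(value) is True. Assumes fits() is monotonic (if a value
    fits, every smaller value fits too) and that fits(1) is True."""
    lo, hi = 1, param_max
    while lo < hi:
        mid = (lo + hi + 1) // 2
        if fits(mid):
            lo = mid  # mid fits - search higher
        else:
            hi = mid - 1  # mid doesn't fit - search lower
    return lo

# toy stand-in: pretend anything up to batch size 48 fits in memory
print(largest_fitting(256, lambda bs: bs <= 48))
```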

The idea behind this module is very simple: it implements function-like behavior, where local variables get destroyed at the end of the run, giving the memory back - except it works across multiple jupyter notebook cells (or ipython commands). In addition, it runs gc.collect() to immediately release badly behaved variables with circular references and reclaim general and GPU RAM. It also helps to discover memory leaks, and performs various other useful things behind the scenes.
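The container idea can be boiled down to a few lines: remember which names exist when the experiment starts, and delete any new ones when it ends. This is a deliberately simplified sketch, not the real ipyexperiments implementation (the class name and `finish()` method here are illustrative):

```python
import gc

class ExperimentScope:
    """Toy sketch of an experiment container: snapshot the namespace
    at creation, and destroy any names added afterwards."""

    def __init__(self, namespace):
        self.ns = namespace
        self.preserved = set(namespace)  # names that existed before

    def finish(self):
        # delete every name created during the experiment
        for name in set(self.ns) - self.preserved:
            del self.ns[name]
        gc.collect()  # reclaim badly behaved objects with circular refs

ns = {"keep_me": 1}
exp = ExperimentScope(ns)
ns["big"] = [0] * 1_000_000  # created inside the experiment
exp.finish()
print(sorted(ns))  # only the pre-existing name survives
```

The real package hooks into the ipython namespace so this happens transparently when you `del` the experiment object.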

If you need more fine-grained memory profiling, the CellLogger sub-system reports RAM usage at a per-cell level when used with jupyter, or per line of code in ipython. You get the resource usage report automatically as soon as a command or cell finishes executing. It includes other features, such as resetting the RNG seed in python/numpy/pytorch if you need a reproducible result when re-running the whole notebook or just one cell.

Currently this sub-system logs GPU RAM, general RAM, and execution time, but it can be expanded to track other important things. While there are various similar loggers out there, the main focus of this implementation is tracking GPU usage, since GPU RAM is the scarcest resource.
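The RNG-seed-resetting feature mentioned above amounts to re-seeding the generators before each run so a re-executed cell produces the same numbers. A minimal stdlib-only sketch (the real sub-system also seeds numpy and pytorch when they are available):

```python
import random

def reset_seed(seed=42):
    """Re-seed the RNG so that re-running a cell is reproducible."""
    random.seed(seed)

reset_seed()
first_run = [random.random() for _ in range(3)]

reset_seed()  # as if the cell were re-executed
second_run = [random.random() for _ in range(3)]

print(first_run == second_run)  # identical sequences
```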

Usage demo

Installation

  • pypi:

    pip install ipyexperiments
    
  • conda:

    conda install -c conda-forge -c stason ipyexperiments
    
  • dev:

    pip install git+https://github.com/stas00/ipyexperiments.git
    

Usage

Here is an example using code from the fastai v1 library, spread across 8 jupyter notebook cells:

# cell 1
from ipyexperiments import IPyExperimentsPytorch
exp1 = IPyExperimentsPytorch() # new experiment
# cell 2
learn1 = language_model_learner(data_lm, bptt=60, drop_mult=0.25, pretrained_model=URLs.WT103)
# cell 3
learn1.lr_find()
# cell 4
del exp1
# cell 5
exp2 = IPyExperimentsPytorch() # new experiment
# cell 6
learn2 = language_model_learner(data_lm, bptt=70, drop_mult=0.3, pretrained_model=URLs.WT103)
# cell 7
learn2.lr_find()
# cell 8
del exp2

Demo

See the demo notebook for how this system works.

Documentation

  1. IPyExperiments
  2. CellLogger sub-system
  3. ipython utils
  4. memory debug/profiling utils

Contributing and Testing

Please see CONTRIBUTING.md.

Caveats

Google Colab

As of this writing colab runs a really old version of ipython (5.5.0), which doesn't support the modern ipython events API.

To solve this problem automatically, so that you never have to think about it again, always add this cell as the very first one in each colab notebook:

# This magic cell should be put first in your colab notebook.
# It'll automatically upgrade colab's really antique ipython/ipykernel to their
# latest versions which are required for packages like ipyexperiments
from packaging import version
import IPython, ipykernel
if version.parse(IPython.__version__) <= version.parse("5.5.0"):
    !pip install -q --upgrade ipython
    !pip install -q --upgrade ipykernel

    import os
    import signal
    os.kill(os.getpid(), signal.SIGTERM)
print(f"ipykernel=={ipykernel.__version__}")
print(f"IPython=={IPython.__version__}")

If you're on the default old ipykernel/ipython, this cell will update them and then crash the current session. Colab will automatically restart the session, after which the code will work normally.

History

A detailed history of changes can be found here.

Related Projects

(If you know of a related pytorch gpu memory profiler please send a PR to add the link. Thank you!)
