
ynshung / blender-colab

License: MIT
Render Blender 3.x and 2.9x scenes with Google Colaboratory

Programming Languages

Jupyter Notebook

Projects that are alternatives of or similar to blender-colab

Rd Blender Docker
A collection of Docker containers for running Blender headless or distributed ✨
Stars: ✭ 111 (+42.31%)
Mutual labels:  blender, rendering
Armory
3D Engine with Blender Integration
Stars: ✭ 2,350 (+2912.82%)
Mutual labels:  blender, rendering
Bitwrk
Bitcoin-fueled Peer-to-Peer Blender Rendering (and more)
Stars: ✭ 114 (+46.15%)
Mutual labels:  blender, rendering
book-ml
A repository distributing the source code for the book "Machine Learning and Deep Learning Image Recognition Programming Recipes to Try Right Now" (「今すぐ試したい!機械学習・深層学習(ディープラーニング)画像認識プログラミングレシピ」).
Stars: ✭ 29 (-62.82%)
Mutual labels:  google-colab, google-colaboratory
Google-Colab-QBittorrent
This is my version of https://github.com/MinorMole/RcloneLab
Stars: ✭ 44 (-43.59%)
Mutual labels:  google-colab, google-colaboratory
steam-stylegan2
Train a StyleGAN2 model on Colaboratory to generate Steam banners.
Stars: ✭ 30 (-61.54%)
Mutual labels:  google-colab, google-colaboratory
Appleseed
A modern open source rendering engine for animation and visual effects
Stars: ✭ 1,824 (+2238.46%)
Mutual labels:  blender, rendering
Herebedragons
A basic 3D scene implemented with various engines, frameworks or APIs.
Stars: ✭ 1,616 (+1971.79%)
Mutual labels:  blender, rendering
Blender Cli Rendering
Python scripts for rendering images using Blender 2.83 from command-line interface
Stars: ✭ 241 (+208.97%)
Mutual labels:  blender, rendering
Sheepit Client
Client for the free and distributed render farm "SheepIt Render Farm"
Stars: ✭ 244 (+212.82%)
Mutual labels:  blender, rendering
Deep-Learning-with-GoogleColab
Deep Learning Applications (Darknet - YOLOv3, YOLOv4 | DeOldify - Image Colorization, Video Colorization | Face-Recognition) with Google Colaboratory - on the free Tesla K80/Tesla T4/Tesla P100 GPU - using Keras, Tensorflow and PyTorch.
Stars: ✭ 63 (-19.23%)
Mutual labels:  google-colab, google-colaboratory
RefRESH
Create RefRESH data: dataset tools for Learning Rigidity in Dynamic Scenes with a Moving Camera for 3D Motion Field Estimation (ECCV 2018)
Stars: ✭ 51 (-34.62%)
Mutual labels:  blender, rendering
carla-colab
How to run CARLA simulator on colab
Stars: ✭ 81 (+3.85%)
Mutual labels:  google-colab, google-colaboratory
DuBLF DuBlast
Quick Playblast tool for Blender
Stars: ✭ 18 (-76.92%)
Mutual labels:  blender, rendering
Hdcycles
Cycles Hydra Delegate
Stars: ✭ 197 (+152.56%)
Mutual labels:  blender, rendering
osci-render
〰📺🔊 Software for making music by drawing objects on an oscilloscope using audio.
Stars: ✭ 135 (+73.08%)
Mutual labels:  blender, rendering
export multi
Use the multi-exporter for Blender and check in (and tweak) various scenes step by step.
Stars: ✭ 31 (-60.26%)
Mutual labels:  blender, rendering
Crystal-Caustics
💎 Approximated crystal caustics effect in Unity.
Stars: ✭ 60 (-23.08%)
Mutual labels:  rendering
uvlayout bridge
Blender Add-On: A bridge between Headus UVlayout and Blender
Stars: ✭ 20 (-74.36%)
Mutual labels:  blender
io pdx mesh
Import/Export files for the Clausewitz game engine
Stars: ✭ 49 (-37.18%)
Mutual labels:  blender

blender-colab

Open In Colab

This is a Python script that lets you render Blender 3.0+ (and older) scenes using Google Colaboratory. You can provide the blend file via direct upload, Google Drive, or a URL, and the rendered frames can be downloaded directly or through Google Drive. The script provides only basic functionality, so feel free to modify it to suit your needs.
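At its core, a script like this shells out to Blender's command-line renderer in background mode. A minimal sketch of how such a command might be assembled (the paths are illustrative and the helper is hypothetical, not the notebook's actual code):

```python
from typing import List

def build_render_command(blend_path: str, output_pattern: str,
                         start_frame: int, end_frame: int,
                         blender_bin: str = "blender") -> List[str]:
    """Assemble a headless Blender render command.

    Flag meanings: -b runs without the UI, -o sets the output path
    pattern (#### is replaced by the frame number), -s/-e set the
    start/end frames, and -a renders the animation range.
    Note that -o, -s, and -e must appear before -a on Blender's CLI.
    """
    return [
        blender_bin, "-b", blend_path,
        "-o", output_pattern,
        "-s", str(start_frame),
        "-e", str(end_frame),
        "-a",
    ]

# Example: render frames 1-250 of a scene into /content/output
cmd = build_render_command("/content/scene.blend",
                           "/content/output/frame_####", 1, 250)
print(" ".join(cmd))
```

In the notebook this command would be passed to `subprocess.run` (or a `!` shell cell) after Blender has been downloaded onto the Colab VM.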

Usage

Upload type

  • direct: Upload your blend file in the next cell.
  • google_drive: The blend file will be downloaded from Google Drive. Specify the path to the blend/zip file in drive_path.
  • url: Provide a direct link to the blend file in url_blend.
  • gdrive_relative: The Google Drive folder specified in drive_path will be copied over directly (as if it were a zipped file).
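The upload options above could be dispatched with a small helper. This is a hypothetical sketch that only computes the local path each option would resolve to; the real notebook also performs the actual upload, download, or copy (the variable names drive_path and url_blend come from the options above):

```python
import os

def resolve_uploaded_file(upload_type: str,
                          drive_path: str = "",
                          url_blend: str = "") -> str:
    """Map an upload_type setting to the local path the notebook would use.

    Illustrative only: no files are actually transferred here.
    """
    if upload_type == "direct":
        # files.upload() in Colab saves into the current working directory;
        # the filename here is a placeholder
        return "/content/uploaded.blend"
    if upload_type in ("google_drive", "gdrive_relative"):
        # Google Drive is mounted under /content/drive in Colab
        return os.path.join("/content/drive/MyDrive", drive_path)
    if upload_type == "url":
        # the file would be fetched from url_blend, keeping its basename
        return os.path.join("/content", os.path.basename(url_blend))
    raise ValueError(f"unknown upload_type: {upload_type!r}")

print(resolve_uploaded_file("url", url_blend="https://example.com/scene.blend"))
```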

Download type

  • direct: Output files are automatically downloaded by your browser (may not work reliably with multiple files).
  • google_drive: The output files are copied to the specified drive_output_path once rendering is finished.
  • gdrive_relative: Frames are rendered directly into the specified drive_output_path.
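Since direct browser downloads of many individual frames can be unreliable, one simple workaround is to bundle the output directory into a single zip archive first. A sketch of that idea (the helper name is illustrative; the demo directory stands in for the render output):

```python
import pathlib
import shutil
import tempfile

def package_frames(output_dir: str) -> str:
    """Zip a directory of rendered frames into a single archive.

    Returns the path to the created .zip file, so a 'direct'
    download only has to fetch one file.
    """
    base_name = output_dir.rstrip("/")
    return shutil.make_archive(base_name, "zip", root_dir=output_dir)

# Demo with a throwaway directory standing in for the render output
demo = tempfile.mkdtemp()
pathlib.Path(demo, "frame_0001.png").write_bytes(b"fake frame data")
zip_path = package_frames(demo)
print(zip_path)
```

In Colab, the resulting archive could then be handed to `google.colab.files.download` as a single file.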

A few notes

  1. You need a Google account.
  2. A notebook can run for at most 12 hours (24 hours with Google Colab Pro), and even that is not guaranteed.
  3. EEVEE rendering is not supported on the virtual machine.
  4. This script has not been fully tested yet; expect some errors.
  5. Your GPU access may be limited or blocked if you render for many hours.
  6. This script is intended for those who have no access to a high-end GPU for rendering. Please use it responsibly!

FAQ

An error occurred!

Check which section of the code failed and identify the error (such as a misspelled filename or path). If you don't understand the error, try re-running the cell with the play button at its side. If it still fails, go to Runtime > Restart and run all, or try Runtime > Factory reset runtime. If all else fails, open an issue on GitHub with the error log and the details of your setup attached.

Common errors:

  • MessageError: TypeError: Failed to fetch while downloading: the browser tab must stay open so that the frames can be downloaded.

Disclaimer

Google Colab is aimed at researchers and students running AI/ML tasks, data analysis, and educational workloads, not at rendering 3D scenes. Because the computing power is provided for free, usage limits, idle timeouts, and rendering speed may vary over time. Colab Pro and Colab Pro+ are available for those who want a more powerful GPU and longer runtimes. See the FAQ for more info. In some cases it might be faster to use an online Blender render farm.
