
aws-samples / Amazon Sagemaker Script Mode

License: Apache-2.0
Amazon SageMaker examples for prebuilt framework mode containers, a.k.a. Script Mode, and more (BYO containers and models etc.)

Projects that are alternatives of or similar to Amazon Sagemaker Script Mode

Cs231n
Stanford cs231n'18 assignment
Stars: ✭ 82 (+0%)
Mutual labels:  jupyter-notebook
Tf playground
Stars: ✭ 82 (+0%)
Mutual labels:  jupyter-notebook
Animecolordeoldify
Colorise Anime/Manga Sketches with DeOldify
Stars: ✭ 83 (+1.22%)
Mutual labels:  jupyter-notebook
Fonduer Tutorials
A collection of simple tutorials for using Fonduer
Stars: ✭ 82 (+0%)
Mutual labels:  jupyter-notebook
Pyepr
Powerful, automated analysis and design of quantum microwave chips & devices [Energy-Participation Ratio and more]
Stars: ✭ 81 (-1.22%)
Mutual labels:  jupyter-notebook
Neural Networks
brief introduction to Python for neural networks
Stars: ✭ 82 (+0%)
Mutual labels:  jupyter-notebook
Deterministic Variational Inference
Sample code for running deterministic variational inference to train Bayesian neural networks
Stars: ✭ 82 (+0%)
Mutual labels:  jupyter-notebook
Ml pocket reference
Resources for Machine Learning Pocket Reference
Stars: ✭ 83 (+1.22%)
Mutual labels:  jupyter-notebook
Unsupervised anomaly detection
A notebook where I implement different anomaly detection algorithms on a simple example. The goal was just to understand how the different algorithms work and their different characteristics.
Stars: ✭ 82 (+0%)
Mutual labels:  jupyter-notebook
Imageclassification
Deep Learning: Image classification, feature visualization and transfer learning with Keras
Stars: ✭ 83 (+1.22%)
Mutual labels:  jupyter-notebook
Nbconflux
nbconflux converts Jupyter Notebooks to Atlassian Confluence pages
Stars: ✭ 82 (+0%)
Mutual labels:  jupyter-notebook
Nasnet Keras
Keras implementation of NASNet-A
Stars: ✭ 82 (+0%)
Mutual labels:  jupyter-notebook
Tensorflow Demo
Local AI demo and distributed AI demo using TensorFlow
Stars: ✭ 82 (+0%)
Mutual labels:  jupyter-notebook
Yolo resnet
Implementing YOLO using ResNet as the feature extraction network
Stars: ✭ 82 (+0%)
Mutual labels:  jupyter-notebook
Coronabr
Historical time series of COVID-19 data, based on information from Brazil's Ministry of Health (Ministério da Saúde)
Stars: ✭ 83 (+1.22%)
Mutual labels:  jupyter-notebook
Intro To Text Analytics
Introduction to text analytics in Python; training for ODSC West 2018
Stars: ✭ 82 (+0%)
Mutual labels:  jupyter-notebook
Language Translation
Neural machine translator for English2German translation.
Stars: ✭ 82 (+0%)
Mutual labels:  jupyter-notebook
Rsn
Learning to Exploit Long-term Relational Dependencies in Knowledge Graphs, ICML 2019
Stars: ✭ 83 (+1.22%)
Mutual labels:  jupyter-notebook
Machine Learning Portfolio
Machine learning portfolio
Stars: ✭ 83 (+1.22%)
Mutual labels:  jupyter-notebook
Ydf Recsys2015 Challenge
Solution of RecSys Challenge 2015
Stars: ✭ 82 (+0%)
Mutual labels:  jupyter-notebook

Amazon SageMaker Script Mode Examples

This repository contains examples and related resources for Amazon SageMaker Script Mode and SageMaker Processing. With Script Mode, you can use training scripts much like those you would run outside SageMaker, together with SageMaker's prebuilt containers for frameworks such as TensorFlow, PyTorch, and Apache MXNet. Similarly, in SageMaker Processing you can supply ordinary data preprocessing scripts in almost any language or technology you wish to use, such as the R programming language.
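As a minimal sketch of the Script Mode pattern with the SageMaker Python SDK (v2): the role ARN, S3 URI, entry point, and framework versions below are placeholders, and the snippet requires an AWS account to actually run.

```python
# Script Mode: an ordinary training script plus a prebuilt framework container.
from sagemaker.tensorflow import TensorFlow

estimator = TensorFlow(
    entry_point='train.py',          # your ordinary training script
    source_dir='code',               # directory uploaded alongside it
    role='arn:aws:iam::123456789012:role/SageMakerRole',  # placeholder
    instance_count=1,
    instance_type='ml.p3.2xlarge',
    framework_version='2.3.1',
    py_version='py37',
    hyperparameters={'epochs': 10, 'batch_size': 128},
)

# Inside the container, the 'train' channel is mounted under
# /opt/ml/input/data/train and hyperparameters arrive as CLI arguments.
estimator.fit({'train': 's3://my-bucket/train'})
```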

Currently this repository has resources for TensorFlow, Bring Your Own (BYO models, plus a Script Mode-style experience with your own containers), and Miscellaneous (a Script Mode-style experience for SageMaker Processing, etc.). There is also an Older Resources section with examples using older framework versions.

For those new to SageMaker, there is a set of 2-hour workshops covering the basics at Amazon SageMaker Workshops.

  • TensorFlow Resources:

    • TensorFlow 2 Sentiment Analysis: SageMaker's prebuilt TensorFlow 2 container is used in this example to train a custom sentiment analysis model. Distributed hosted training in SageMaker is performed on a multi-GPU instance, and SageMaker Batch Transform is used for asynchronous, large-scale inference/batch scoring. PREREQUISITES: From the tf-sentiment-script-mode directory, upload ONLY the Jupyter notebook sentiment-analysis.ipynb.
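The Batch Transform part of the flow can be sketched as follows, assuming an `estimator` already trained as above; bucket names and instance types are placeholders and an AWS account is required to run it.

```python
# Asynchronous batch scoring with SageMaker Batch Transform.
transformer = estimator.transformer(
    instance_count=1,
    instance_type='ml.c5.xlarge',
    output_path='s3://my-bucket/batch-predictions',  # placeholder
)

transformer.transform(
    data='s3://my-bucket/batch-input',  # placeholder input location
    content_type='application/json',
    split_type='Line',                  # one JSON record per line
)
transformer.wait()                      # results land in output_path
```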

    • TensorFlow 2 Workflow with SageMaker Pipelines: This example shows a complete workflow for TensorFlow 2, starting with prototyping followed by automation with Amazon SageMaker Pipelines. To begin, SageMaker Processing is used to transform the dataset. Next, Local Mode training and Local Mode endpoints are demonstrated for prototyping training and inference code, respectively. Automatic Model Tuning is used to automate the hyperparameter tuning process. Finally, the workflow is automated with SageMaker Pipelines. PREREQUISITES: For the Local Mode sections of the example, use a SageMaker Notebook Instance rather than SageMaker Studio. From the tf-2-workflow-smpipelines directory, upload ONLY the Jupyter notebook tf-2-workflow-smpipelines.ipynb.
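The rough shape of a SageMaker Pipelines definition, as a hedged sketch: the `estimator`, parameter names, and S3 paths are placeholders, and the real notebook's pipeline includes processing and tuning steps as well.

```python
# Minimal single-step pipeline with the SageMaker Python SDK (v2).
from sagemaker.inputs import TrainingInput
from sagemaker.workflow.parameters import ParameterString
from sagemaker.workflow.pipeline import Pipeline
from sagemaker.workflow.steps import TrainingStep

input_data = ParameterString(name='InputData',
                             default_value='s3://my-bucket/train')

train_step = TrainingStep(
    name='TrainModel',
    estimator=estimator,    # e.g. a TensorFlow Script Mode estimator
    inputs={'train': TrainingInput(s3_data=input_data)},
)

pipeline = Pipeline(
    name='tf2-workflow',
    parameters=[input_data],
    steps=[train_step],
)
# pipeline.upsert(role_arn=role); pipeline.start()
```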

    • TensorFlow Highly Performant Batch Inference & Training: The focus of this example is highly performant batch inference using TensorFlow Serving, along with Horovod distributed training. To transform the input image data for inference, a preprocessing script is used with the Amazon SageMaker TensorFlow Serving container. PREREQUISITES: be sure to upload all files in the tf-batch-inference-script directory (including the subdirectory code and files) to the directory where you will run the related Jupyter notebook.
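The preprocessing hook for the SageMaker TensorFlow Serving container is a pair of functions in an inference script. A minimal sketch of their shape (the JSON passthrough below is illustrative; the notebook's actual handler decodes image data):

```python
import json

def input_handler(data, context):
    """Deserialize the request body into the JSON format that
    TensorFlow Serving's REST API expects: {"instances": [...]}."""
    if context.request_content_type == 'application/json':
        payload = json.loads(data.read().decode('utf-8'))
        return json.dumps({'instances': payload})
    raise ValueError(
        'Unsupported content type: {}'.format(context.request_content_type))

def output_handler(response, context):
    """Pass the TensorFlow Serving response straight through, along
    with the content type the client accepts."""
    return response.content, context.accepts
```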

    • TensorFlow Text Classification with Word Embeddings: In this example, TensorFlow's tf.keras API is used with Script Mode for a text classification task. An important aspect of the example is showing how to load preexisting word embeddings such as GloVe in Script Mode. Other features demonstrated include Local Mode endpoints as well as Local Mode training. PREREQUISITES: (1) Use a GPU-based (P3 or P2) SageMaker notebook instance, and (2) be sure to upload all files in the tf-word-embeddings directory (including subdirectory code) to the directory where you will run the related Jupyter notebook.
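Loading pretrained embeddings boils down to parsing GloVe's text format and building a matrix aligned with the tokenizer's word index, which the notebook would then hand to a tf.keras Embedding layer with trainable=False. A plain-Python sketch of that step:

```python
def load_glove(lines):
    """Parse GloVe's text format: one token per line followed by its
    vector components, space-separated."""
    embeddings = {}
    for line in lines:
        parts = line.rstrip().split(' ')
        embeddings[parts[0]] = [float(x) for x in parts[1:]]
    return embeddings

def embedding_matrix(word_index, embeddings, dim):
    """Build a (vocab_size + 1) x dim matrix aligned with a Keras-style
    word index (indices start at 1); unknown words stay as zero rows."""
    matrix = [[0.0] * dim for _ in range(len(word_index) + 1)]
    for word, i in word_index.items():
        if word in embeddings:
            matrix[i] = embeddings[word]
    return matrix
```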

    • TensorFlow with Horovod & Inference Pipeline: Script Mode with TensorFlow is used for a computer vision task, in a demonstration of Horovod distributed training and doing batch inference in conjunction with an Inference Pipeline for transforming image data before inputting it to the model container. This is an alternative to the previous example, which uses a preprocessing script with the Amazon SageMaker TensorFlow Serving Container rather than an Inference Pipeline. PREREQUISITES: be sure to upload all files in the tf-horovod-inference-pipeline directory (including the subdirectory code and files) to the directory where you will run the related Jupyter notebook.
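Enabling Horovod in Script Mode comes down to the estimator's `distribution` argument; a configuration sketch, with the role, versions, and process counts as placeholders:

```python
# Horovod (MPI) distributed training via the distribution argument.
from sagemaker.tensorflow import TensorFlow

estimator = TensorFlow(
    entry_point='train.py',
    source_dir='code',
    role='arn:aws:iam::123456789012:role/SageMakerRole',  # placeholder
    instance_count=2,
    instance_type='ml.p3.8xlarge',
    framework_version='1.15.2',
    py_version='py37',
    distribution={
        'mpi': {
            'enabled': True,
            'processes_per_host': 4,  # e.g. one per GPU
        }
    },
)
```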

  • Bring Your Own (BYO) Resources:

    • lightGBM BYO: In this repository, most samples use Amazon SageMaker prebuilt framework containers for TensorFlow and other frameworks. For this example, however, we'll show how to BYO container to create a Script Mode-style experience similar to a prebuilt SageMaker framework container, using lightGBM, a popular gradient boosting framework. PREREQUISITES: From the lightgbm-byo directory, upload the Jupyter notebook lightgbm-byo.ipynb.
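A Script Mode-style BYO container works by honoring SageMaker's training container contract: configuration and data are mounted under /opt/ml, with hyperparameters serialized as strings in /opt/ml/input/config/hyperparameters.json. A sketch of the entry-point's parsing step (the specific hyperparameter names are illustrative):

```python
import json
import os

# SageMaker mounts configuration and data under /opt/ml in the container.
PREFIX = '/opt/ml'

def read_hyperparameters(prefix=PREFIX):
    """Read SageMaker-provided hyperparameters. All values arrive as
    strings, so cast the ones the training code needs."""
    path = os.path.join(prefix, 'input', 'config', 'hyperparameters.json')
    with open(path) as f:
        raw = json.load(f)
    return {
        'num_leaves': int(raw.get('num_leaves', '31')),
        'learning_rate': float(raw.get('learning_rate', '0.1')),
    }
```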

    • Deploy Pretrained Models: SageMaker's prebuilt PyTorch container is used to demonstrate how you can quickly take a pretrained or locally trained model and deploy it as a SageMaker hosted API endpoint. There are examples for both OpenAI's GPT-2 and BERT. PREREQUISITES: From the deploy-pretrained-model directory, upload the entire BERT or GPT2 folder's contents, depending on which model you select. Run either Deploy_BERT.ipynb or Deploy_GPT2.ipynb.
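Deploying an already trained model with the prebuilt PyTorch container can be sketched as follows; the S3 model archive, role, and versions are placeholders, and an AWS account is required to run it.

```python
# Wrap an existing model artifact and deploy it as a hosted endpoint.
from sagemaker.pytorch import PyTorchModel

model = PyTorchModel(
    model_data='s3://my-bucket/model/model.tar.gz',   # placeholder
    role='arn:aws:iam::123456789012:role/SageMakerRole',
    entry_point='inference.py',   # defines model_fn / predict_fn
    framework_version='1.6.0',
    py_version='py3',
)

predictor = model.deploy(
    initial_instance_count=1,
    instance_type='ml.m5.xlarge',
)
# predictor.predict(...) then predictor.delete_endpoint() when done.
```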

  • Miscellaneous Resources:

    • K-means clustering: Most of the samples in this repository involve supervised learning tasks in Amazon SageMaker Script Mode. For this example, by contrast, we'll undertake an unsupervised learning task, and do so with the Amazon SageMaker K-means built-in algorithm rather than Script Mode. PREREQUISITES: From the k-means-clustering directory, upload the Jupyter notebook k-means-clustering.ipynb.
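Using the K-means built-in algorithm from the SageMaker Python SDK looks roughly like this; the role, bucket, number of clusters, and the random training data are placeholders.

```python
# The built-in algorithm takes a float32 array via record_set(),
# rather than a user-supplied training script.
import numpy as np
from sagemaker import KMeans

kmeans = KMeans(
    role='arn:aws:iam::123456789012:role/SageMakerRole',  # placeholder
    instance_count=1,
    instance_type='ml.c5.xlarge',
    k=10,
    output_path='s3://my-bucket/kmeans-output',
)

train_data = np.random.rand(1000, 8).astype('float32')
kmeans.fit(kmeans.record_set(train_data))
```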

    • R in SageMaker Processing: In this example, R is used to perform some operations on a dataset and generate a plot within SageMaker Processing. The job results including the plot image are retrieved and displayed, demonstrating how R can be easily used within a SageMaker workflow. PREREQUISITES: From the r-in-sagemaker-processing directory, upload the Jupyter notebook r-in-sagemaker_processing.ipynb.
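Running R in SageMaker Processing amounts to pointing a ScriptProcessor at a container image that has Rscript installed; the image URI, role, and S3 paths below are placeholders.

```python
# A ScriptProcessor runs an arbitrary script with a chosen interpreter.
from sagemaker.processing import (ProcessingInput, ProcessingOutput,
                                  ScriptProcessor)

processor = ScriptProcessor(
    image_uri='123456789012.dkr.ecr.us-east-1.amazonaws.com/r-processing:latest',
    command=['Rscript'],
    role='arn:aws:iam::123456789012:role/SageMakerRole',  # placeholder
    instance_count=1,
    instance_type='ml.m5.xlarge',
)

processor.run(
    code='preprocessing.R',
    inputs=[ProcessingInput(source='s3://my-bucket/raw',
                            destination='/opt/ml/processing/input')],
    outputs=[ProcessingOutput(source='/opt/ml/processing/output')],
)
```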

  • Older Resources:

    • TensorFlow 2 Workflow with the AWS Step Functions Data Science SDK: NOTE: This example has been superseded by the TensorFlow 2 Workflow with SageMaker Pipelines example above. This example shows a complete workflow for TensorFlow 2 with automation by the AWS Step Functions Data Science SDK, an older alternative to Amazon SageMaker Pipelines. To begin, SageMaker Processing is used to transform the dataset. Next, Local Mode training and Local Mode endpoints are demonstrated for prototyping training and inference code, respectively. Automatic Model Tuning is used to automate the hyperparameter tuning process. PREREQUISITES: From the tf-2-workflow directory, upload ONLY the Jupyter notebook tf-2-workflow.ipynb.

    • TensorFlow Distributed Training Options: NOTE: Besides the options listed here for TensorFlow 1.x, there are additional options for TensorFlow 2, including [A] built-in SageMaker Distributed Training for both data and model parallelism, and [B] native distribution strategies such as MirroredStrategy as demonstrated in the TensorFlow 2 Sentiment Analysis example above. This TensorFlow 1.x example demonstrates two other distributed training options for SageMaker's Script Mode: (1) parameter servers, and (2) Horovod. PREREQUISITES: From the tf-distribution-options directory, upload ONLY the Jupyter notebook tf-distributed-training.ipynb.

    • TensorFlow Eager Execution: NOTE: This TensorFlow 1.x example has been superseded by the TensorFlow 2 Workflow example above. This example shows how to use Script Mode with Eager Execution mode in TensorFlow 1.x, a more intuitive and dynamic alternative to the original graph mode of TensorFlow. It is the default mode of TensorFlow 2. Local Mode and Automatic Model Tuning also are demonstrated. PREREQUISITES: From the tf-eager-script-mode directory, upload ONLY the Jupyter notebook tf-boston-housing.ipynb.

License

The contents of this repository are licensed under the Apache 2.0 License except where otherwise noted.

Note that the project description data, including the texts, logos, images, and/or trademarks, for each open source project belongs to its rightful owner. If you wish to add or remove any projects, please contact us at [email protected].