aws / SageMaker Training Toolkit

License: Apache-2.0
Train machine learning models within a 🐳 Docker container using 🧠 Amazon SageMaker.

Programming Languages

python
139335 projects - #7 most used programming language

Projects that are alternatives of or similar to SageMaker Training Toolkit

Amazon Sagemaker Examples
Example 📓 Jupyter notebooks that demonstrate how to build, train, and deploy machine learning models using 🧠 Amazon SageMaker.
Stars: ✭ 6,346 (+4217.01%)
Mutual labels:  aws, training
Infrastructure As Code Training
Materials for learning how to use infrastructure-as-code
Stars: ✭ 268 (+82.31%)
Mutual labels:  aws, training
Mongodb For Python Developers
MongoDB for Python developers course handouts from Talk Python Training
Stars: ✭ 141 (-4.08%)
Mutual labels:  training
Selfie2anime
Anime2Selfie Backend Services - Lambda, Queue, API Gateway and traffic processing
Stars: ✭ 146 (-0.68%)
Mutual labels:  aws
Dynomite
⚡🦀 🧨 make your Rust types fit DynamoDB and vice versa
Stars: ✭ 144 (-2.04%)
Mutual labels:  aws
Aws Csa Pro 2019
AWS Certified Solutions Architect Professional (SAP-C01)
Stars: ✭ 143 (-2.72%)
Mutual labels:  aws
Bmw Labeltool Lite
This repository provides an easy-to-use labeling tool for state-of-the-art deep learning training purposes.
Stars: ✭ 145 (-1.36%)
Mutual labels:  training
Hollowtrees
A ruleset-based watchguard to keep spot/preemptible instance based clusters safe, with plugins for VMs, Kubernetes, Prometheus and Pipeline
Stars: ✭ 141 (-4.08%)
Mutual labels:  aws
Serverless Sentry Plugin
This plugin adds automatic forwarding of errors and exceptions to Sentry (https://sentry.io) and Serverless (https://serverless.com)
Stars: ✭ 146 (-0.68%)
Mutual labels:  aws
Refunc
A library that makes it easy to build AWS Lambda compatible layers
Stars: ✭ 144 (-2.04%)
Mutual labels:  aws
Awesome Kubernetes
A curated list of awesome Kubernetes sources 🚢🎉
Stars: ✭ 12,306 (+8271.43%)
Mutual labels:  aws
Terraform Aws Vpc
Terraform module which creates VPC resources on AWS
Stars: ✭ 2,043 (+1289.8%)
Mutual labels:  aws
Serverless Sam
Serverless framework plugin to export AWS SAM templates for a service
Stars: ✭ 143 (-2.72%)
Mutual labels:  aws
Serverless Ide Vscode
Serverless IDE: Enhanced support for AWS SAM and CloudFormation in VS Code
Stars: ✭ 145 (-1.36%)
Mutual labels:  aws
Bootstrap4
Repository for my tutorial course: Bootstrap 4 Essential Training on LinkedIn Learning and Lynda.com.
Stars: ✭ 142 (-3.4%)
Mutual labels:  training
Telescopes
Telescopes is a cloud instance type and full cluster layout recommender covering on-demand and spot/preemptible AWS EC2, Google, Azure, Oracle and Alibaba cloud instances.
Stars: ✭ 146 (-0.68%)
Mutual labels:  aws
Serverless Dynamodb Autoscaling
Serverless plugin for Amazon DynamoDB Auto Scaling configuration.
Stars: ✭ 142 (-3.4%)
Mutual labels:  aws
Expressjs
This is the repository for my course, Building a Website with Node.js and Express.js, on LinkedIn Learning and Lynda.com.
Stars: ✭ 143 (-2.72%)
Mutual labels:  training
Aws Openbsd
AWS OpenBSD image builder (AMI) and cloud-init replacement
Stars: ✭ 144 (-2.04%)
Mutual labels:  aws
Jets
Ruby on Jets
Stars: ✭ 1,915 (+1202.72%)
Mutual labels:  aws

SageMaker Training Toolkit

Train machine learning models within a Docker container using Amazon SageMaker.

📚 Background

Amazon SageMaker is a fully managed service for data science and machine learning (ML) workflows. You can use Amazon SageMaker to simplify the process of building, training, and deploying ML models.

To train a model, you can include your training script and dependencies in a Docker container that runs your training code. A container provides an effectively isolated environment, ensuring a consistent runtime and reliable training process.

The SageMaker Training Toolkit can be easily added to any Docker container, making it compatible with SageMaker for training models. If you use a prebuilt SageMaker Docker image for training, this library may already be included.

For more information, see the Amazon SageMaker Developer Guide sections on using Docker containers for training.

🛠 Installation

To install this library in your Docker image, add the following line to your Dockerfile:

RUN pip3 install sagemaker-training

💻 Usage

The following are brief how-to guides. For complete, working examples of custom training containers built with the SageMaker Training Toolkit, please see the example notebooks.

Create a Docker image and train a model

  1. Write a training script (e.g., train.py).

  2. Define a container with a Dockerfile that includes the training script and any dependencies.

    The training script must be located in the /opt/ml/code directory. The environment variable SAGEMAKER_PROGRAM defines which file inside the /opt/ml/code directory to use as the training entry point. When training starts, the interpreter executes the entry point defined by SAGEMAKER_PROGRAM. Python and shell scripts are both supported.

    FROM yourbaseimage:tag
    
    # install the SageMaker Training Toolkit 
    RUN pip3 install sagemaker-training
    
    # copy the training script inside the container
    COPY train.py /opt/ml/code/train.py
    
    # define train.py as the script entry point
    ENV SAGEMAKER_PROGRAM train.py
    
  3. Build and tag the Docker image.

    docker build -t custom-training-container .
    
  4. Use the Docker image to start a training job using the SageMaker Python SDK.

    from sagemaker.estimator import Estimator
    
    estimator = Estimator(image_name="custom-training-container",
                          role="SageMakerRole",
                          train_instance_count=1,
                          train_instance_type="local")
    
    estimator.fit()
    

    To train a model using the image on SageMaker, push the image to ECR and start a SageMaker training job with the image URI.
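    A sketch of starting the remote job with the SageMaker Python SDK, assuming the image has already been pushed to a (hypothetical) ECR repository; the account ID, region, repository name, role, and S3 path below are placeholders:

    from sagemaker.estimator import Estimator

    # hypothetical ECR image URI produced by `docker tag` and `docker push`
    ecr_image = "123456789012.dkr.ecr.us-east-1.amazonaws.com/custom-training-container:latest"

    estimator = Estimator(image_name=ecr_image,
                          role="SageMakerRole",
                          train_instance_count=1,
                          train_instance_type="ml.m5.xlarge")

    estimator.fit("s3://bucket/path/to/training/data")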

Pass arguments to the entry point using hyperparameters

Any hyperparameters provided by the training job are passed to the entry point as script arguments. The SageMaker Python SDK uses this feature to pass special hyperparameters to the training job, including sagemaker_program and sagemaker_submit_directory. The complete list of SageMaker hyperparameters is available here.

  1. Implement an argument parser in the entry point script. For example, in a Python script:

    import argparse
    
    if __name__ == "__main__":
      parser = argparse.ArgumentParser()
    
      parser.add_argument("--learning-rate", type=int, default=1)
      parser.add_argument("--batch-size", type=int, default=64)
      parser.add_argument("--communicator", type=str)
      parser.add_argument("--frequency", type=int, default=20)
    
      args = parser.parse_args()
      ...
    
  2. Start a training job with hyperparameters.

    {"HyperParameters": {"batch-size": 256, "learning-rate": 0.0001, "communicator": "pure_nccl"}}
    

Read additional information using environment variables

An entry point often needs additional information not available in hyperparameters. The SageMaker Training Toolkit writes this information as environment variables that are available from within the script. For example, this training job includes the channels training and testing:

from sagemaker.pytorch import PyTorch

estimator = PyTorch(entry_point="train.py", ...)

estimator.fit({"training": "s3://bucket/path/to/training/data", 
               "testing": "s3://bucket/path/to/testing/data"})

The environment variables SM_CHANNEL_TRAINING and SM_CHANNEL_TESTING provide the paths to the channels:

import argparse
import os

if __name__ == "__main__":
  parser = argparse.ArgumentParser()

  ...

  # reads input channels training and testing from the environment variables
  parser.add_argument("--training", type=str, default=os.environ["SM_CHANNEL_TRAINING"])
  parser.add_argument("--testing", type=str, default=os.environ["SM_CHANNEL_TESTING"])

  args = parser.parse_args()

  ...

When training starts, the SageMaker Training Toolkit prints all available environment variables. See the reference on environment variables for the full list.
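For illustration, a short sketch of reading a few of the commonly provided variables directly from the entry point (variable names as listed in that reference; all values arrive as strings):

import json
import os

model_dir = os.environ.get("SM_MODEL_DIR", "/opt/ml/model")  # directory where the model should be saved
num_gpus = int(os.environ.get("SM_NUM_GPUS", "0"))           # number of GPUs available on the instance
hps = json.loads(os.environ.get("SM_HPS", "{}"))             # all hyperparameters as a single JSON object
current_host = os.environ.get("SM_CURRENT_HOST", "")         # name of the current host, e.g. "algo-1"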

Get information about the container environment

To get information about the container environment, initialize an Environment object. Environment provides access to aspects of the environment relevant to training jobs, including hyperparameters, system characteristics, filesystem locations, environment variables and configuration settings. It is a read-only snapshot of the container environment during training, and it doesn't contain any form of state.

import os

import numpy as np
from tensorflow import keras                        # assumes TensorFlow's bundled Keras
from tensorflow.keras.applications import ResNet50

from sagemaker_training import environment

env = environment.Environment()

# get the path of the channel "training" from the `inputdataconfig.json` file
training_dir = env.channel_input_dirs["training"]

# get the hyperparameter "training_data_file" from the `hyperparameters.json` file
file_name = env.hyperparameters["training_data_file"]

# get the folder where the model should be saved
model_dir = env.model_dir

# train the model
data = np.load(os.path.join(training_dir, file_name))
x_train, y_train = data["features"], keras.utils.to_categorical(data["labels"])
model = ResNet50(weights="imagenet")
...
model.fit(x_train, y_train)

# save the model to model_dir at the end of training
model.save(os.path.join(model_dir, "saved_model"))

Execute the entry point

To execute the entry point, call entry_point.run().

from sagemaker_training import entry_point, environment

env = environment.Environment()

# read hyperparameters as script arguments
args = env.to_cmd_args()

# get the environment variables
env_vars = env.to_env_vars()

# execute the entry point
entry_point.run(uri=env.module_dir,
                user_entry_point=env.user_entry_point,
                args=args,
                env_vars=env_vars)

If the entry point execution fails, trainer.train() will write the error message to /opt/ml/output/failure. Otherwise, it will write to the file /opt/ml/success.
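For containers that call entry_point.run() directly, the same convention can be followed by hand. A minimal sketch, assuming the standard SageMaker behavior of surfacing the contents of /opt/ml/output/failure as the training job's failure reason (the error handling below is illustrative, not the toolkit's own implementation):

import sys
import traceback

from sagemaker_training import entry_point, environment

env = environment.Environment()

try:
    entry_point.run(uri=env.module_dir,
                    user_entry_point=env.user_entry_point,
                    args=env.to_cmd_args(),
                    env_vars=env.to_env_vars())
except Exception:
    # write the traceback where SageMaker looks for a failure reason
    with open("/opt/ml/output/failure", "w") as f:
        f.write(traceback.format_exc())
    sys.exit(1)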

📜 License

This library is licensed under the Apache 2.0 License. For more details, please take a look at the LICENSE file.

๐Ÿค Contributing

Contributions are welcome! Please read our contributing guidelines if you'd like to open an issue or submit a pull request.
