
sooftware / pytorch-lr-scheduler

License: MIT
PyTorch implementation of some learning rate schedulers for deep learning researchers.

Programming Languages

python
139335 projects - #7 most used programming language

Projects that are alternatives of or similar to pytorch-lr-scheduler

rx-scheduler-transformer
RxJava scheduler transformer tools for Android
Stars: ✭ 15 (-76.92%)
Mutual labels:  scheduler, transformer
Jobs
A distributed task scheduling platform
Stars: ✭ 245 (+276.92%)
Mutual labels:  scheduler
Rufus Scheduler
scheduler for Ruby (at, in, cron and every jobs)
Stars: ✭ 2,223 (+3320%)
Mutual labels:  scheduler
Schedulis
Schedulis is a high-performance workflow task scheduling system that supports high availability and multi-tenant financial-grade features, integrates with the Linkis computing middleware, and has been incorporated into the data application development portal DataSphere Studio
Stars: ✭ 222 (+241.54%)
Mutual labels:  scheduler
Quartzite
Quartzite is a thin, idiomatic Clojure layer on top of the Quartz Scheduler
Stars: ✭ 194 (+198.46%)
Mutual labels:  scheduler
Sundial
A Light-weight Job Scheduling Framework
Stars: ✭ 230 (+253.85%)
Mutual labels:  scheduler
React Timeline 9000
React Timeline
Stars: ✭ 184 (+183.08%)
Mutual labels:  scheduler
thain
Thain is a distributed flow scheduling platform.
Stars: ✭ 81 (+24.62%)
Mutual labels:  scheduler
Kue Scheduler
A job scheduler utility for Kue, backed by Redis and built for Node.js
Stars: ✭ 240 (+269.23%)
Mutual labels:  scheduler
Wexflow
An easy and fast way to build automation and workflows on Windows, Linux, macOS, and the cloud.
Stars: ✭ 2,435 (+3646.15%)
Mutual labels:  scheduler
Minicron
🕰️ Monitor your cron jobs
Stars: ✭ 2,351 (+3516.92%)
Mutual labels:  scheduler
Threadly
A library of tools to assist with safe concurrent Java development, providing unique priority-based thread pools and ways to distribute threaded work safely.
Stars: ✭ 196 (+201.54%)
Mutual labels:  scheduler
Laravel Cronless Schedule
Run the Laravel scheduler without relying on cron
Stars: ✭ 231 (+255.38%)
Mutual labels:  scheduler
Scheduler
GPL version of JavaScript Event Calendar
Stars: ✭ 190 (+192.31%)
Mutual labels:  scheduler
Powerjob
Enterprise job scheduling middleware with distributed computing ability.
Stars: ✭ 3,231 (+4870.77%)
Mutual labels:  scheduler
Cacule Cpu Scheduler
The CacULE CPU scheduler is based on an interactivity score mechanism inspired by the ULE scheduler (the FreeBSD scheduler).
Stars: ✭ 185 (+184.62%)
Mutual labels:  scheduler
Sfdx Mass Action Scheduler
🚀 Declaratively schedule Process Builder, Flows, Quick Actions, Email Alerts, Workflow Rules, or Apex to process records from Reports, List Views, SOQL, or Apex.
Stars: ✭ 200 (+207.69%)
Mutual labels:  scheduler
Jqwidgets
UI widgets for Angular, Vue, React, Web Components, Blazor, JavaScript, jQuery, and ASP.NET Framework
Stars: ✭ 227 (+249.23%)
Mutual labels:  scheduler
Advanced-xv6
Modern improvements for MIT's xv6 OS
Stars: ✭ 26 (-60%)
Mutual labels:  scheduler
Shardingsphere Elasticjob Cloud
Stars: ✭ 248 (+281.54%)
Mutual labels:  scheduler

pytorch-lr-scheduler

PyTorch implementation of some learning rate schedulers for deep learning researchers.

Usage

WarmupReduceLROnPlateauScheduler

  • Visualize

  • Example code
import torch

from lr_scheduler.warmup_reduce_lr_on_plateau_scheduler import WarmupReduceLROnPlateauScheduler

if __name__ == '__main__':
    max_epochs, steps_in_epoch = 10, 10000
    warmup_steps = 30000

    model = [torch.nn.Parameter(torch.randn(2, 2, requires_grad=True))]
    optimizer = torch.optim.Adam(model, 1e-10)

    scheduler = WarmupReduceLROnPlateauScheduler(
        optimizer,
        init_lr=1e-10,
        peak_lr=1e-4,
        warmup_steps=warmup_steps,
        patience=1,
        factor=0.3,
    )

    step_count = 0
    for epoch in range(max_epochs):
        for timestep in range(steps_in_epoch):
            ...
            ...
            # Step once per batch during warmup; `timestep` resets every
            # epoch, so a global step counter is used for the comparison.
            step_count += 1
            if step_count <= warmup_steps:
                scheduler.step()

        # Once per epoch, step with the validation loss so the plateau
        # logic can decide whether to decay the learning rate.
        val_loss = validate()
        scheduler.step(val_loss)
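
To monitor what a scheduler is doing during training, the current learning rate can be read straight off the optimizer. A minimal helper (not part of this library) could look like:

def get_current_lr(optimizer):
    # PyTorch stores the learning rate in each parameter group.
    return [group['lr'] for group in optimizer.param_groups]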

TransformerLRScheduler

  • Visualize

  • Example code
import torch

from lr_scheduler.transformer_lr_scheduler import TransformerLRScheduler

if __name__ == '__main__':
    max_epochs, steps_in_epoch = 10, 10000

    model = [torch.nn.Parameter(torch.randn(2, 2, requires_grad=True))]
    optimizer = torch.optim.Adam(model, 1e-10)

    scheduler = TransformerLRScheduler(
        optimizer=optimizer, 
        init_lr=1e-10, 
        peak_lr=0.1,
        final_lr=1e-4, 
        final_lr_scale=0.05,
        warmup_steps=3000, 
        decay_steps=17000,
    )

    for epoch in range(max_epochs):
        for timestep in range(steps_in_epoch):
            ...
            ...
            # Step once per batch; warmup and decay are handled internally.
            scheduler.step()
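
For reference, the schedule from "Attention Is All You Need" (Vaswani et al., 2017) derives the learning rate from the model dimension, lr(t) = d_model^(-0.5) * min(t^(-0.5), t * warmup_steps^(-1.5)), whereas this scheduler is parameterized by explicit initial, peak, and final learning rates. A sketch of the paper's rule, shown for comparison only:

def vaswani_lr(step, d_model=512, warmup_steps=4000):
    # Learning rate rule from "Attention Is All You Need";
    # for comparison, not what TransformerLRScheduler computes.
    step = max(step, 1)  # avoid 0 ** -0.5 at the first step
    return d_model ** -0.5 * min(step ** -0.5, step * warmup_steps ** -1.5)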

TriStageLRScheduler

  • Visualize

  • Example code
import torch

from lr_scheduler.tri_stage_lr_scheduler import TriStageLRScheduler

if __name__ == '__main__':
    max_epochs, steps_in_epoch = 10, 10000

    model = [torch.nn.Parameter(torch.randn(2, 2, requires_grad=True))]
    optimizer = torch.optim.Adam(model, 1e-10)

    scheduler = TriStageLRScheduler(
        optimizer, 
        init_lr=1e-10, 
        peak_lr=1e-4, 
        final_lr=1e-7, 
        init_lr_scale=0.01, 
        final_lr_scale=0.05,
        warmup_steps=30000, 
        hold_steps=70000, 
        decay_steps=100000,
        total_steps=200000,
    )

    for epoch in range(max_epochs):
        for timestep in range(steps_in_epoch):
            ...
            ...
            # Step once per batch; the scheduler advances through the
            # warmup, hold, and decay stages internally.
            scheduler.step()
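
The three stages give the scheduler its name: warm up to peak_lr, hold, then decay to final_lr. A rough sketch of this shape, assuming a fairseq-style policy with linear warmup and exponential decay (the library's exact interpolation may differ):

import math

def tri_stage_lr(step, init_lr, peak_lr, final_lr, warmup_steps, hold_steps, decay_steps):
    # Illustrative tri-stage shape; not necessarily this library's exact rule.
    if step < warmup_steps:
        # Stage 1: linear warmup from init_lr to peak_lr.
        return init_lr + (peak_lr - init_lr) * step / warmup_steps
    step -= warmup_steps
    if step < hold_steps:
        # Stage 2: hold at peak_lr.
        return peak_lr
    # Stage 3: exponential decay from peak_lr toward final_lr.
    step = min(step - hold_steps, decay_steps)
    decay_rate = -math.log(final_lr / peak_lr) / decay_steps
    return peak_lr * math.exp(-decay_rate * step)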

ReduceLROnPlateauScheduler

  • Visualize

  • Example code
import torch

from lr_scheduler.reduce_lr_on_plateau_lr_scheduler import ReduceLROnPlateauScheduler

if __name__ == '__main__':
    max_epochs, steps_in_epoch = 10, 10000

    model = [torch.nn.Parameter(torch.randn(2, 2, requires_grad=True))]
    optimizer = torch.optim.Adam(model, 1e-4)

    scheduler = ReduceLROnPlateauScheduler(optimizer, patience=1, factor=0.3)

    for epoch in range(max_epochs):
        for timestep in range(steps_in_epoch):
            ...
            ...

        # Step once per epoch with the validation loss; the learning rate
        # is multiplied by `factor` after `patience` epochs without improvement.
        val_loss = validate()
        scheduler.step(val_loss)
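
The behavior mirrors PyTorch's built-in torch.optim.lr_scheduler.ReduceLROnPlateau: once the monitored metric stops improving for `patience` epochs, the learning rate is multiplied by `factor`. For comparison, the equivalent built-in call (assuming a loss that should decrease) would be:

from torch.optim.lr_scheduler import ReduceLROnPlateau

scheduler = ReduceLROnPlateau(optimizer, mode='min', patience=1, factor=0.3)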

WarmupLRScheduler

  • Visualize

  • Example code
import torch

from lr_scheduler.warmup_lr_scheduler import WarmupLRScheduler

if __name__ == '__main__':
    max_epochs, steps_in_epoch = 10, 10000

    model = [torch.nn.Parameter(torch.randn(2, 2, requires_grad=True))]
    optimizer = torch.optim.Adam(model, 1e-10)

    scheduler = WarmupLRScheduler(
        optimizer, 
        init_lr=1e-10, 
        peak_lr=1e-4, 
        warmup_steps=4000,
    )

    for epoch in range(max_epochs):
        for timestep in range(steps_in_epoch):
            ...
            ...
            # Step once per batch until warmup completes.
            scheduler.step()
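
With these arguments the learning rate ramps from init_lr to peak_lr over warmup_steps steps and then stays at peak_lr. Assuming a linear ramp (the common choice for warmup, though this library's exact curve is not guaranteed), the schedule is roughly:

def warmup_lr(step, init_lr=1e-10, peak_lr=1e-4, warmup_steps=4000):
    # Linear warmup assumption; illustrative, not this library's exact rule.
    if step >= warmup_steps:
        return peak_lr
    return init_lr + (peak_lr - init_lr) * step / warmup_steps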

Installation

git clone git@github.com:sooftware/pytorch-lr-scheduler.git
cd pytorch-lr-scheduler
pip install .
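
To verify the installation, the modules used in the usage examples above should import cleanly:

# Quick smoke test (assumes the package installs the `lr_scheduler` module
# that the usage examples above import from).
from lr_scheduler.warmup_lr_scheduler import WarmupLRScheduler
from lr_scheduler.transformer_lr_scheduler import TransformerLRScheduler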

Troubleshoots and Contributing

If you have any questions, bug reports, or feature requests, please open an issue on GitHub.

I appreciate any kind of feedback or contribution. Feel free to tackle small issues such as bug fixes and documentation improvements. For major contributions and new features, please discuss with the collaborators in the corresponding issues first.

Code Style

I follow PEP 8 for code style. The style of docstrings is especially important, since the documentation is generated from them.
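
For example, a PEP 8-compliant function with a documentation-friendly docstring might look like the sketch below (the exact docstring convention used in this repository is an assumption here):

def clip_lr(lr, min_lr, max_lr):
    """
    Clamp a learning rate into the closed interval [min_lr, max_lr].

    Args:
        lr (float): Learning rate to clamp.
        min_lr (float): Lower bound.
        max_lr (float): Upper bound.

    Returns:
        float: The clamped learning rate.
    """
    return max(min_lr, min(lr, max_lr))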

License

This project is licensed under the MIT license; see the LICENSE.md file for details.
