
Google drive upload


Google drive upload is a collection of shell scripts runnable on all POSIX-compatible shells ( sh / ksh / dash / bash / zsh / etc ).

It utilizes the Google Drive API v3 and Google OAuth 2.0 to generate access tokens and authorize the application to upload files/folders to your Google Drive.

  • Minimal
  • Upload or Update files/folders
  • Recursive folder uploading
  • Sync your folders
    • Overwrite or skip existing files.
  • Resume Interrupted Uploads
  • Share files/folders
    • To anyone or a specific email.
  • Config file support
    • Easy to use on multiple machines.
  • Uses the latest gdrive api, i.e. v3
  • Pretty logging
  • Easy to install and update
    • Self update
    • Auto update
    • Can be installed per-user and invoked per-shell, hence no root access required; or install globally with root access.
  • An additional sync script for background synchronisation jobs. Read the Synchronisation section for more info.

Compatibility

As this is a collection of shell scripts, there aren't many dependencies. See the Native Dependencies section for the list of explicitly required programs.

Linux or MacOS

For Linux or MacOS, you hopefully don't need to configure anything extra; it should work out of the box.

Android

Install Termux and you're done.

It's fully tested for all use cases of this script.

iOS

Install iSH

It has not been officially tested, but it should work given the description of the app. Report if you get it working by creating an issue.

Windows

Use Windows Subsystem

Again, it has not been officially tested on Windows, but there shouldn't be anything preventing it from working. Report if you get it working by creating an issue.

Installing and Updating

Native Dependencies

This repo contains two types of scripts, POSIX-compatible and Bash-compatible.

These programs are required by both the Bash and POSIX scripts.

Program Role In Script
curl All network requests
file or mimetype Mimetype generation for extension-less files
find To find files and folders for recursive folder uploads
xargs For parallel uploading
mkdir To create folders
rm To remove files and folders
grep Miscellaneous
sed Miscellaneous
mktemp To generate temporary files ( optional )
sleep Self-explanatory
ps To manage different processes

If Bash is not available, or the available Bash version is less than 4.x, then the programs below are also required:

Program Role In Script
awk For URL encoding in api requests
date For installation, update and Miscellaneous
cat Miscellaneous
stty or zsh or tput To determine column size ( optional )

These are the additional programs needed for synchronisation script:

Program Role In Script
tail To show indefinite logs
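The installation script's dependency check can be approximated with command -v; a minimal sketch ( the function name and program list are illustrative, not the script's actual code ):

```shell
# Sketch of a POSIX dependency check like the one install.sh performs;
# check_deps is a made-up name, the program list comes from the tables above.
check_deps() {
    missing=""
    for prog in "$@"; do
        command -v "${prog}" > /dev/null 2>&1 || missing="${missing} ${prog}"
    done
    [ -z "${missing}" ] && return 0
    printf "Error: missing programs:%s\n" "${missing}" >&2
    return 1
}

check_deps grep sed find mkdir || exit 1
```

command -v is the POSIX way to test for a program, so this works on all the shells listed above.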

Installation

You can install the script with the automatic installation script provided in the repository.

This will also install the synchronisation script provided in the repo.

The installation script also checks for the native dependencies.

Default values set by the automatic installation script ( all changeable ):

Repo: labbots/google-drive-upload

Command name: gupload

Sync command name: gsync

Installation path: $HOME/.google-drive-upload

Source: release { can be branch }

Source value: latest { can be branchname }

Shell file: .bashrc or .zshrc or .profile

For custom command names, repo, shell file, etc, see advanced installation method.

Now, there are two ways to run the automatic install script:

Basic Method

To install google-drive-upload in your system, you can run the below command:

curl --compressed -Ls https://github.com/labbots/google-drive-upload/raw/master/install.sh | sh -s

and done.

Advanced Method

This section provides information on how to utilise the install.sh script for custom use cases.

These are the flags that are available in the install.sh script:

  • -p | --path <dir_name>

    Custom path where you want to install the script.

    Note: For global installs, give a path outside of the home dir, like /usr/bin, and it must already be in the executable path.


  • -c | --cmd <command_name>

    Custom command name; after installation, the script will be available under the given name.

    To change the sync command name, use install.sh -c gupload sync='gsync'


  • -r | --repo <Username/reponame>

    Install the script from your custom repo, e.g --repo labbots/google-drive-upload; make sure your repo's file structure is the same as the official repo.


  • -B | --branch <branch_name>

    Specify branch name for the github repo, applies to custom and default repo both.


  • -R | --release <tag/release_tag>

    Specify tag name for the github repo, applies to custom and default repo both.


  • -t | --time 'no of days'

    Specify a custom auto-update time ( the given input is taken as a number of days ) after which the script will try to automatically update itself.

    Default: 5 ( 5 days )


  • -s | --shell-rc <shell_file>

    Specify a custom rc file where PATH is appended; by default the script detects .bashrc, .zshrc and .profile.


  • --sh | --posix

    Force install the POSIX scripts even if the system has a compatible Bash binary present.


  • -q | --quiet

    Only show critical error/success logs.


  • -U | --uninstall

    Uninstall the script and remove related files.


  • -D | --debug

    Display script command trace.


  • -h | --help

    Display usage instructions.


Now, run the script and use flags according to your use case.

E.g:

curl --compressed -Ls https://github.com/labbots/google-drive-upload/raw/master/install.sh | sh -s -- -r username/reponame -p somepath -s shell_file -c command_name -B branch_name

Updation

If you have followed the automatic method to install the script, then you can automatically update the script.

There are two methods:

  1. Use the script itself to update the script.

    gupload -u or gupload --update

    This will update the script where it is installed.

    If you use this flag without actually installing the script,

    e.g just by sh upload.sh -u, then it will install the script, or update it if already installed.

  2. Run the installation script again.

    Yes, just run the installation script again as we did in the install section, and voila, it's done.

  3. Automatic updates

    By default, the script checks for updates every 5 days. Use the -t / --time flag of install.sh to modify the interval.

Note: The above methods always obey the values set by the user during advanced installation, e.g if you have installed the script from a different repo, say myrepo/gdrive-upload, then updates will also be fetched from that repo.

Usage

First, we need to obtain our oauth credentials, here's how to do it:

Generating Oauth Credentials

  • Follow Enable Drive API section.
  • Open google console.
  • Click on "Credentials".
  • Click "Create credentials" and select "OAuth client ID".
  • Select Application type "Desktop app" or "Other".
  • Provide a name for the new credentials. ( anything )
  • This will provide a new Client ID and Client Secret.
  • Download your credentials.json by clicking on the download button.

Now that we have obtained our credentials, move to the First run section to use them:

Enable Drive API

  • Log into google developer console at google console.
  • Click the project selector to the right of "Google Cloud Platform" in the upper left of the window.

If you cannot see the project, try accessing https://console.cloud.google.com/cloud-resource-manager.

You can also create a new project there. When you create a new project, click the icon ( three horizontal lines ) to the left of "Google Cloud Platform".

This opens a side bar; there, select "API & Services" -> "Library". After this, follow the below steps:

  • Click "NEW PROJECT" and input the "Project Name".
  • Click "CREATE" and open the created project.
  • Click "Enable APIs and get credentials like keys".
  • Go to "Library"
  • Input "Drive API" in "Search for APIs & Services".
  • Click "Google Drive API" and click "ENABLE".

Go back to oauth credentials setup

First Run

On first run, the script asks for all the required credentials, which we have obtained in the previous section.

Execute the script: gupload filename

Now, it will ask for the following credentials:

Client ID: Copy and paste from credentials.json

Client Secret: Copy and paste from credentials.json

Refresh Token: If you have previously generated a refresh token authenticated to your account, enter it; otherwise leave it blank. If you don't have a refresh token, the script outputs a URL in the terminal; open that URL in a web browser and tap allow. Copy the code and paste it in the terminal.

Root Folder: Gdrive folder url/id from your account which you want to set as the root folder. You can leave it blank and the drive root folder is used by default.

If everything went fine, all the required credentials have been set; read the next section on how to upload a file/folder.
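For the curious, the access-token handling behind the scenes boils down to a POST against Google's OAuth 2.0 token endpoint plus a little response parsing. A hand-written sketch, not the script's actual code ( refresh_access_token and json_value are made-up names; CLIENT_ID, CLIENT_SECRET and REFRESH_TOKEN are the values discussed above ):

```shell
# Sketch: exchange the refresh token for a fresh access token at Google's
# OAuth 2.0 token endpoint. The variables are placeholders from the config.
refresh_access_token() {
    curl -s --request POST "https://oauth2.googleapis.com/token" \
        --data "client_id=${CLIENT_ID}" \
        --data "client_secret=${CLIENT_SECRET}" \
        --data "refresh_token=${REFRESH_TOKEN}" \
        --data "grant_type=refresh_token"
    # The JSON response contains "access_token" and "expires_in".
}

# Naive sed-based extraction of a quoted string value from a one-line
# JSON blob, e.g. the "access_token" field.
json_value() {
    sed -n 's/.*"'"$1"'"[[:space:]]*:[[:space:]]*"\([^"]*\)".*/\1/p'
}

# Usage sketch: ACCESS_TOKEN="$(refresh_access_token | json_value access_token)"
```

Note the parsing is deliberately naive; it only handles simple single-line responses, which is all this endpoint returns.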

Config

After first run, the credentials are saved in config file. By default, the config file is ${HOME}/.googledrive.conf.

To change the default config file or use a different one temporarily, see -z / --config custom in Upload Script Custom Flags.

This is the format of a config file:

CLIENT_ID="client id"
CLIENT_SECRET="client secret"
REFRESH_TOKEN="refresh token"
SYNC_DEFAULT_ARGS="default args of gupload command for gsync"
ROOT_FOLDER_NAME="root folder name"
ROOT_FOLDER="root folder id"
ACCESS_TOKEN="access token"
ACCESS_TOKEN_EXPIRY="access token expiry"

You can use a config file on multiple machines; the only values that are explicitly required are CLIENT_ID, CLIENT_SECRET and REFRESH_TOKEN.

If ROOT_FOLDER is not set, then it is asked for when running in an interactive terminal; otherwise root is used.

ROOT_FOLDER_NAME, ACCESS_TOKEN and ACCESS_TOKEN_EXPIRY are automatically generated using REFRESH_TOKEN.

SYNC_DEFAULT_ARGS is optional.

A pre-generated config file can also be used where interactive terminal access is not possible, like Continuous Integration, docker, jenkins, etc.

Just print the values to "${HOME}/.googledrive.conf", e.g:

printf "%s\n" "CLIENT_ID=\"client id\"
CLIENT_SECRET=\"client secret\"
REFRESH_TOKEN=\"refresh token\"
" >| "${HOME}/.googledrive.conf"

Note: Don't skip those backslashes before the double quotes; they are necessary to handle spacing.
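Because the config is plain shell variable assignments, reading it back is just a matter of sourcing the file; a minimal sketch ( not the script's actual loading code ):

```shell
# Sketch: load the config by sourcing it; the file is plain shell
# variable assignments, and the path shown is the default one.
CONFIG="${HOME}/.googledrive.conf"
if [ -r "${CONFIG}" ]; then
    . "${CONFIG}"
fi
```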

Upload

For uploading files/remote gdrive files, the syntax is simple:

gupload filename/foldername/file_id/file_link -c gdrive_folder_name

where filename/foldername is input file/folder and gdrive_folder_name is the name of the folder on gdrive, where the input file/folder will be uploaded.

and file_id/file_link is the accessible gdrive file link or id which will be uploaded without downloading.

If gdrive_folder_name is present on gdrive, then the script will upload there, otherwise it will create a folder with that name.

Note: It's not mandatory to use -c / -C / --create-dir flag.

Apart from basic usage, this script provides many flags for custom use cases, like parallel uploading, skipping upload of existing files, overwriting, etc.
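As a point of reference, parallel uploading of this kind is typically built on xargs with its -P option; a tiny stand-alone illustration that echoes instead of uploading:

```shell
# Fan out up to 4 concurrent workers with xargs -P; each worker just
# echoes here, but an uploader would fork an upload per file the same way.
# Output order is not deterministic.
printf "%s\n" file1 file2 file3 file4 |
    xargs -n 1 -P 4 echo uploading
```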

Upload Script Custom Flags

These are the custom flags that are currently implemented:

  • -z | --config

    Override default config file with custom config file.

    Default Config: ${HOME}/.googledrive.conf

    If you want to change the default value of the config path, then use this format,

    gupload --config default=your_config_file_path


  • -c | -C | --create-dir

    Option to create a directory. Will provide the folder id. Can be used to specify a workspace folder for uploading files/folders.

    If this option is used, then input file is optional.


  • -r | --root-dir <google_folderid>

    Google folder id or url to which the file/directory will be uploaded.

    If you want to change the default value of the rootdir stored in config, then use this format,

    gupload --root-dir default=root_folder_[id/url]


  • -s | --skip-subdirs

    Skip creation of sub folders and upload all the files inside the INPUT folder and its sub-folders directly; use this along with the -p/--parallel option to speed up the uploads.


  • -p | --parallel <no_of_files_to_parallely_upload>

    Upload multiple files in parallel, Max value = 10, use with folders.

    Note:

    • This option is only helpful if you are uploading many files which aren't big enough to utilise your full bandwidth; using it otherwise will not speed up your upload and may even cause errors,
    • a value of 1 - 6 is recommended, but you can use up to 10. If it errors with a high value, use a smaller number.
    • Be aware, this isn't magic; it comes at the cost of increased cpu/ram utilisation as it forks multiple shell processes to upload ( google how xargs works with the -P option ).

  • -o | --overwrite

    Overwrite the files with the same name, if present in the root folder/input folder, also works with recursive folders and single/multiple files.

    Note: If you use this flag along with -d/--skip-duplicates, the skip duplicates flag is preferred.


  • -d | --skip-duplicates

    Do not upload the files with the same name, if already present in the root folder/input folder, also works with recursive folders.


  • -f | --file/folder

    Specify files and folders explicitly in one command, use multiple times for multiple folder/files.

    For uploading multiple input into the same folder:

    • Use the -C / --create-dir option ( e.g ./upload.sh -f file1 -f folder1 -f file2 -C <folder_where_to_upload> ).

    • Give two initial arguments, where the second argument is the folder you want to upload to ( e.g: ./upload.sh filename <folder_where_to_upload> -f filename -f foldername ).

      This flag can also be used for uploading files/folders which have a - character in their name; normally that won't work, because of flag parsing, but using -f -[file|folder]namewithhyphen works. The same applies to -C/--create-dir.

      Also, as the long flags ( --[file|folder] ) indicate, you can simultaneously upload a folder and a file.

      In case multiple -f flags have duplicate arguments, the last duplicate of each argument is uploaded, in the order provided.


  • -cl | --clone

    Upload a gdrive file without downloading it; requires an accessible gdrive link or id as argument.


  • -S | --share <optional_email_address>

    Share the uploaded input file/folder, grant reader permission to provided email address or to everyone with the shareable link.


  • --speed 'speed'

    Limit the download speed, supported formats: 1K, 1M and 1G.


  • -R | --retry 'num of retries'

    Retry the file upload if it fails; takes a positive integer as argument. Currently only for file uploads.


  • -in | --include 'pattern'

    Only include the files with the given pattern to upload - Applicable for folder uploads.

    e.g: gupload local_folder --include "1", will only include the files with pattern '1' in the name.

    Note: Only provide patterns which are supported by find -name option.


  • -ex | --exclude 'pattern'

    Exclude the files with the given pattern from uploading - Applicable for folder uploads.

    e.g: gupload local_folder --exclude "1", will exclude all the files with pattern '1' in the name.

    Note: Only provide patterns which are supported by the find -name option.


  • --hide

    This flag will prevent the script from printing sensitive information like the root folder id or the drive link.


  • -q | --quiet

    Suppress the normal output; only show success/error upload messages for files, and one extra line at the beginning for folders, showing the number of files and sub-folders.


  • -v | --verbose

    Display detailed messages ( only for non-parallel uploads ).


  • -V | --verbose-progress

    Display detailed messages and detailed upload progress ( only for non-parallel uploads ).


  • --skip-internet-check

    Do not check for internet connection, recommended to use in sync jobs.


  • -i | --save-info <file_to_save_info>

    Save uploaded files' info to the given filename.


  • -u | --update

    Update the installed script in your system; if not installed, then install it.


  • --uninstall

    Uninstall the script from your system.


  • --info

    Show detailed info, only if script is installed system wide.


  • -h | --help

    Display usage instructions.


  • -D | --debug

    Display script command trace.
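For reference, the -in/--include and -ex/--exclude flags above hand their patterns to find's -name test; a stand-alone illustration of that matching, using throwaway files with made-up names:

```shell
# Illustration of find -name matching as used by --include/--exclude.
# Quote the pattern so the shell doesn't expand it first.
dir="$(mktemp -d)"
touch "${dir}/report1.txt" "${dir}/notes.md"
find "${dir}" -type f -name '*1*'    # matches report1.txt only
rm -rf "${dir}"
```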


Multiple Inputs

For using multiple inputs at a single time, you can use the -f/--file/--folder or -cl/--clone flag as explained above.

Now, to achieve multiple inputs without flags, you can use globs or just give them as arguments.

e.g:

  • gupload a b c d

    a/b/c/d will be treated as file/folder/gdrive_link_or_id.


  • gupload *mp4 *mkv

    This will upload all the mp4 and mkv files in the folder, if any.

    To upload all files, just use *. For more info, google how globs work in shell.


  • gupload a b -d c d -c e

    a/b/c/d will be treated as file/folder/gdrive_link_or_id and e as gdrive_folder.
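To see exactly what the shell hands to the command when you use globs, here is a tiny stand-alone illustration ( the filenames are made up ):

```shell
# The shell expands globs before the command runs, so gupload simply
# receives plain filenames as arguments. Demonstrated with throwaway files.
dir="$(mktemp -d)" && cd "${dir}"
touch a.mp4 b.mkv c.txt
printf "%s\n" *mp4 *mkv    # expands to: a.mp4 b.mkv
cd / && rm -rf "${dir}"
```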


Resuming Interrupted Uploads

Uploads interrupted either due to a bad internet connection or manual interruption can be resumed from the same position.

  • The script checks 3 things: filesize, name and workspace folder. If an upload was interrupted, the resumable upload link is saved in "$HOME/.google-drive-upload/"; later, when the same command is run again, the upload resumes from the same position, if applicable.
  • Small files cannot be resumed: the file must be larger than 1 MB, and the amount already uploaded must be more than 1 MB.
  • No progress bars for resumable uploads, as they mess up the output.
  • You can interrupt as many times as you want; it will resume ( hopefully ).

Additional Usage

Synchronisation

This repo also provides an additional script ( sync.sh ) to utilise upload.sh for synchronisation jobs, i.e background jobs.

Basic Usage

To create a sync job, just run

gsync folder_name -d gdrive_folder

Here, folder_name is the local folder you want to sync and gdrive_folder is google drive folder name.

In the local folder, all the contents present or added in the future will be automatically uploaded.

Note: Giving gdrive_folder is optional; if you don't specify a name with the -d/--directory flag, then it will upload to the root folder set by the gupload command.

Also, gdrive folder creation works in the same way as gupload command.

Default wait time: 3 secs ( amount of time to wait before checking new files ).

Default gupload arguments: None ( see -a/--arguments section below ).
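Conceptually, the sync job is a poll-and-upload cycle; a simplified, hypothetical sketch of a single cycle ( sync_once is a made-up name, and echo stands in for the real gupload call ):

```shell
# Simplified sketch of one sync cycle: find files newer than a marker
# file, hand them to an uploader ( echo here, gupload in reality ),
# then refresh the marker.
sync_once() {
    dir="$1"
    marker="${dir}/.last_sync"
    [ -f "${marker}" ] || : > "${marker}"
    find "${dir}" -type f -newer "${marker}" ! -name .last_sync \
        -exec echo upload {} \;
    touch "${marker}"
}

# The real job repeats this: while :; do sync_once "folder"; sleep 3; done
```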

Sync Script Custom Flags

Read this section thoroughly to fully utilise the sync script; feel free to open an issue if you have any doubts regarding the usage.

  • -d | --directory

    Specify gdrive folder name, if not specified then local folder name is used.


  • -j | --jobs

    See all background jobs that were started and still running.

    Use -j/--jobs v/verbose to show additional information for jobs.

    Additional information includes: CPU usage & Memory usage and No. of failed & successful uploads.


  • -p | --pid

    Specify a pid number, used for --jobs or --kill or --info flags, multiple usage allowed.


  • -i | --info

    Print information for a specific job. These are the methods to do it:

    • By specifying local folder and gdrive folder of an existing job,

      e.g: gsync local_folder -d gdrive_folder -i

    • By specifying pid number,

      e.g: gsync -i -p pid_number

    • To show info of multiple jobs, use this flag multiple times,

      e.g: gsync -i pid1 -p pid2 -p pid3. You can also use it with multiple inputs by adding this flag.


  • -k | --kill

    Kill background jobs, following are methods to do it:

    • By specifying local_folder and gdrive_folder,

      e.g. gsync local_folder -d gdrive_folder -k, will kill that specific job.

    • pid ( process id ) number can be used as an additional argument to kill that specific job,

      e.g: gsync -k -p pid_number.

    • To kill multiple jobs, use this flag multiple times,

      e.g: gsync -k pid1 -p pid2 -p pid3. You can also use it with multiple inputs with this flag.

    • This flag can also be used to kill all the jobs,

      e.g: gsync -k all. This will stop all the background jobs running.


  • -t | --time time_in_seconds

    The amount of time that sync will wait before checking new files in the local folder given to sync job.

    e.g: gsync -t 4 local_folder, here 4 is the wait time.

    To set a default time, use gsync local_folder -t default=4; it will be stored in your default config.


  • -l | --logs

    Show the logs after starting a job, or show the log of an existing job.

    This option can also be used to make a job sync in the foreground rather than in the background, so that ctrl + c or ctrl + z can exit the job.

    • By specifying local_folder and gdrive_folder,

      e.g. gsync local_folder -d gdrive_folder -l, will show logs of that specific job.

    • pid ( process id ) number can be used as an additional argument to show logs of a specific job,

      e.g: gsync -l -p pid_number.

    Note: If used with multiple inputs or pid numbers, then only first pid/input log is shown, as it goes on indefinitely.


  • -a | --arguments

    As the script uses gupload, you can specify custom gupload flags for the background job,

    e.g: gsync local_folder -a '-q -p 4 -d'

    To set some arguments by default, use gsync -a default='-q -p 4 -d'.

    In this example, gupload will run quietly, skip files that already exist, and upload 4 files in parallel for folders.


  • -fg | --foreground

    This will run the job in foreground and show the logs.

    Note: An already running job cannot be resumed in the foreground; this will just show the existing logs.


  • -in | --include 'pattern'

    Only include the files with the given pattern to upload.

    e.g: gsync local_folder --include "1", will only include the files with pattern '1' in the name.

    Note: Only provide patterns which are supported by grep with the -E option.


  • -ex | --exclude 'pattern'

    Exclude the files with the given pattern from uploading.

    e.g: gsync local_folder --exclude "1", will exclude all the files with pattern '1' in the name.

    Note: Only provide patterns which are supported by grep with the -E option.


  • -c | --command command_name

    In case the gupload command is installed under a different name, or for use in a systemd service, which requires the full path.


  • --sync-detail-dir 'dirname'

    Directory where job information will be stored.

    Default: ${HOME}/.google-drive-upload

  • -s | --service 'service name'

    To generate systemd service file to setup background jobs on boot.

    Note: If this flag is used, then only the service files are created; no other work is done.


  • -d | --debug

    Display script command trace, use before all the flags to see maximum script trace.


Note: Flags that use pid number as input should be used at last, if you are not intending to provide pid number, say in case of a folder name with positive integers.

Background Sync Jobs

There are basically two ways to start a background job, first one we already covered in the above section.

It will run indefinitely until the host machine is rebooted.

Now, a systemd service can also be created which will start the sync job after boot.

  1. To generate a systemd unit file, run the sync command with --service service_name at the end.

    e.g: gsync foldername -d drive_folder --service myservice, where myservice can be any desired name.

    This will generate a script and print the next required commands to start/stop/enable/disable the service.

    The commands that will be printed are explained below:

  2. Start the service sh gsync-test.service.sh start, where gsync-test is the service name

    This is the same as starting a sync job with the command itself, as mentioned in the previous section.

    To stop: sh gsync-test.service.sh stop

  3. If you want the job to automatically start on boot, run sh gsync-test.service.sh enable

    To disable: sh gsync-test.service.sh disable

  4. To see logs after a job has been started.

    sh gsync-test.service.sh logs

  5. To remove a job from system, sh gsync-test.service.sh remove

You can use multiple commands at once, e.g: sh gsync-test.service.sh start logs, will start and show the logs.

Note: The script is merely a wrapper, it uses systemctl to start/stop/enable/disable the service and journalctl is used to show the logs.

Extras: A sample service file is provided in the repo for reference; it is recommended to use gsync to generate the service file.

Uninstall

If you have followed the automatic method to install the script, then you can automatically uninstall the script.

There are two methods:

  1. Use the script itself to uninstall the script.

    gupload -U or gupload --uninstall

    This will remove the script related files and remove path change from shell file.

  2. Run the installation script again with -U/--uninstall flag

    curl --compressed -Ls https://github.com/labbots/google-drive-upload/raw/master/install.sh | sh -s -- --uninstall
    

    Yes, just run the installation script again with the flag and voila, it's done.

Note: Above methods always obey the values set by user in advanced installation.

Reporting Issues

Use the GitHub issue tracker for any bugs or feature suggestions.

Before creating an issue, make sure to follow the guidelines specified in CONTRIBUTION.md

Contributing

Submit patches to code or documentation as GitHub pull requests! Check out the contribution guide

Contributions must be licensed under the MIT license. The contributor retains the copyright.

Inspired By

  • github-bashutils - soulseekah/bash-utils
  • deanet-gist - Uploading File into Google Drive
  • Bash Bible - A collection of pure bash alternatives to external processes
  • sh bible - A collection of posix alternatives to external processes

License

MIT

Treeware

Buy us a tree

This package is Treeware. You are free to use this package, but if you use it in production, then we would highly appreciate you buying the world a tree to thank us for our work. By contributing to the Treeware forest you’ll be creating employment for local families and restoring wildlife habitats.
