
PyFilesystem / S3fs

License: MIT
Amazon S3 filesystem for PyFilesystem2

Programming Languages

Python

Projects that are alternatives of or similar to S3fs

Juicefs
JuiceFS is a distributed POSIX file system built on top of Redis and S3.
Stars: ✭ 4,262 (+3739.64%)
Mutual labels:  s3, filesystem
S3rver
A fake S3 server written in NodeJs
Stars: ✭ 410 (+269.37%)
Mutual labels:  s3, amazon
Aws.s3
Amazon Simple Storage Service (S3) API Client
Stars: ✭ 302 (+172.07%)
Mutual labels:  s3, amazon
benji
📁 This library is a Scala reactive DSL for object storage (e.g. S3/Amazon, S3/CEPH, Google Cloud Storage).
Stars: ✭ 18 (-83.78%)
Mutual labels:  amazon, s3
Aws Toolkit Vscode
AWS Toolkit for Visual Studio Code, an extension for working with AWS services including AWS Lambda.
Stars: ✭ 823 (+641.44%)
Mutual labels:  s3, amazon
Shrine
File Attachment toolkit for Ruby applications
Stars: ✭ 2,903 (+2515.32%)
Mutual labels:  s3, filesystem
Infinit
The Infinit policy-based software-defined storage platform.
Stars: ✭ 363 (+227.03%)
Mutual labels:  s3, filesystem
chicon-rs
A file abstraction system for Rust
Stars: ✭ 55 (-50.45%)
Mutual labels:  filesystem, s3
S3fs Fuse
FUSE-based file system backed by Amazon S3
Stars: ✭ 5,733 (+5064.86%)
Mutual labels:  s3, filesystem
S5cmd
Parallel S3 and local filesystem execution tool.
Stars: ✭ 565 (+409.01%)
Mutual labels:  s3, filesystem
storage
Go package for abstracting local, in-memory, and remote (Google Cloud Storage/S3) filesystems
Stars: ✭ 49 (-55.86%)
Mutual labels:  filesystem, s3
S3scanner
Scan for open AWS S3 buckets and dump the contents
Stars: ✭ 1,319 (+1088.29%)
Mutual labels:  s3, amazon
punic
Punic is a remote cache CLI built for Carthage and Apple .xcframework
Stars: ✭ 25 (-77.48%)
Mutual labels:  amazon, s3
Flydrive
☁️ Flexible and Fluent framework-agnostic driver based system to manage storage in Node.js
Stars: ✭ 275 (+147.75%)
Mutual labels:  s3, filesystem
acid-store
A library for secure, deduplicated, transactional, and verifiable data storage
Stars: ✭ 48 (-56.76%)
Mutual labels:  filesystem, s3
Goofys
a high-performance, POSIX-ish Amazon S3 file system written in Go
Stars: ✭ 3,932 (+3442.34%)
Mutual labels:  s3, filesystem
CloudHunter
Find unreferenced AWS S3 buckets which have CloudFront CNAME records pointing to them
Stars: ✭ 31 (-72.07%)
Mutual labels:  amazon, s3
go-fsimpl
Go io/fs.FS filesystem implementations for various URL schemes
Stars: ✭ 225 (+102.7%)
Mutual labels:  filesystem, s3
Aws
A collection of bash shell scripts for automating various tasks with Amazon Web Services using the AWS CLI and jq.
Stars: ✭ 493 (+344.14%)
Mutual labels:  s3, amazon
Objstore
A Multi-Master Distributed Caching Layer for Amazon S3.
Stars: ✭ 69 (-37.84%)
Mutual labels:  s3, amazon

S3FS

S3FS is a PyFilesystem interface to Amazon S3 cloud storage.

As a PyFilesystem concrete class, S3FS allows you to work with S3 in the same way as any other supported filesystem.

Installing

You can install S3FS with pip as follows:

pip install fs-s3fs

Opening an S3FS

Open an S3FS by explicitly using the constructor:

from fs_s3fs import S3FS
s3fs = S3FS('mybucket')

Or with a FS URL:

from fs import open_fs
s3fs = open_fs('s3://mybucket')

Downloading Files

To download files from an S3 bucket, open a file on the S3 filesystem for reading, then write the data to a file on the local filesystem. Here's an example that copies a file example.mov from S3 to your hard drive:

from fs.tools import copy_file_data
with s3fs.open('example.mov', 'rb') as remote_file:
    with open('example.mov', 'wb') as local_file:
        copy_file_data(remote_file, local_file)

It is generally preferable, however, to use the higher-level functionality in the fs.copy module. Here's an example:

from fs.copy import copy_file
copy_file(s3fs, 'example.mov', './', 'example.mov')

Uploading Files

You can upload files in the same way. Simply copy a file from a source filesystem to the S3 filesystem. See Moving and Copying for more information.

ExtraArgs

S3 objects have additional properties beyond those of a traditional filesystem. These can be set via the upload_args and download_args parameters, which are passed to the underlying upload and download methods, as appropriate, for the lifetime of the filesystem instance.

For example, to set the cache-control header of all objects uploaded to a bucket:

import fs, fs.mirror
s3fs = S3FS('example', upload_args={"CacheControl": "max-age=2592000", "ACL": "public-read"})
fs.mirror.mirror('/path/to/mirror', s3fs)

See the Boto3 documentation for more information.

The acl and cache_control arguments are exposed explicitly for convenience and can be used in FS URLs. Be sure to URL-escape the cache_control value in a URL, as it may contain special characters.

import fs, fs.mirror
with fs.open_fs('s3://example?acl=public-read&cache_control=max-age%3D2592000%2Cpublic') as s3fs:
    fs.mirror.mirror('/path/to/mirror', s3fs)

S3 URLs

You can get a public URL to a file on an S3 bucket as follows:

movie_url = s3fs.geturl('example.mov')
