
matrix-org / synapse-s3-storage-provider

Licence: Apache-2.0
Synapse storage provider to fetch and store media in Amazon S3

Programming Languages

python

Projects that are alternatives of or similar to synapse-s3-storage-provider

Cloud Volume
Read and write Neuroglancer datasets programmatically.
Stars: ✭ 63 (+8.62%)
Mutual labels:  matrix, s3
simple-matrix-bot-lib
An easy to use bot library for the Matrix ecosystem written in Python. https://matrix.to/#/#simplematrixbotlib:matrix.org
Stars: ✭ 27 (-53.45%)
Mutual labels:  matrix
s3manager
A Web GUI for your S3 buckets
Stars: ✭ 39 (-32.76%)
Mutual labels:  s3
fluency
High throughput data ingestion logger to Fluentd, AWS S3 and Treasure Data
Stars: ✭ 135 (+132.76%)
Mutual labels:  s3
laravel-s3-tools
This Laravel package contains additional functionality not currently in Laravel for interfacing with Amazon's S3 service (including managing versioned objects).
Stars: ✭ 31 (-46.55%)
Mutual labels:  s3
WebServerCloudBackups
Automatic backups of your web projects' databases and files to the cloud via WebDAV.
Stars: ✭ 20 (-65.52%)
Mutual labels:  s3
docker-s3fs
S3FS Docker image
Stars: ✭ 18 (-68.97%)
Mutual labels:  s3
dotfiles
💻 🍚 🔳 🔲 My riced-up Kali dotfiles – off-white | dark leet | chrome lambo
Stars: ✭ 55 (-5.17%)
Mutual labels:  matrix
storage
Go package for abstracting local, in-memory, and remote (Google Cloud Storage/S3) filesystems
Stars: ✭ 49 (-15.52%)
Mutual labels:  s3
terraform-aws-serverless-pypi
Serverless PyPI backed by S3
Stars: ✭ 33 (-43.1%)
Mutual labels:  s3
django-docker-s3
Django + S3
Stars: ✭ 43 (-25.86%)
Mutual labels:  s3
docker base images
Vlad's Base Images for Docker
Stars: ✭ 61 (+5.17%)
Mutual labels:  s3
ph-commons
Java 1.8+ Library with tons of utility classes required in all projects
Stars: ✭ 23 (-60.34%)
Mutual labels:  matrix
verneuil
Verneuil is a VFS extension for SQLite that asynchronously replicates databases to S3-compatible blob stores.
Stars: ✭ 169 (+191.38%)
Mutual labels:  s3
benji
📁 This library is a Scala reactive DSL for object storage (e.g. S3/Amazon, S3/CEPH, Google Cloud Storage).
Stars: ✭ 18 (-68.97%)
Mutual labels:  s3
Matswift
Matrix computation library in Swift
Stars: ✭ 17 (-70.69%)
Mutual labels:  matrix
numerl
Small matrix library for Erlang.
Stars: ✭ 22 (-62.07%)
Mutual labels:  matrix
s3-server
Generic S3 server implementation
Stars: ✭ 27 (-53.45%)
Mutual labels:  s3
journal
a blogging platform built on [matrix]
Stars: ✭ 71 (+22.41%)
Mutual labels:  matrix
picgo-plugin-s3
S3 plugin for PicGo
Stars: ✭ 13 (-77.59%)
Mutual labels:  s3

Synapse S3 Storage Provider

This module can be used by synapse as a storage provider, allowing it to fetch and store media in Amazon S3.

Usage

s3_storage_provider.py should be on the PYTHONPATH when starting Synapse.
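
For example, if Synapse runs inside a virtualenv, installing this package into that environment is usually the simplest way to get the module onto the path. The paths below, and the assumption that the PyPI package name matches the repository name, are only an illustration:

# Install into the virtualenv that runs Synapse (path and package name are examples)
/opt/synapse/env/bin/pip install synapse-s3-storage-provider

# Or, when running from a checkout of this repository, extend PYTHONPATH instead
export PYTHONPATH="$PYTHONPATH:/opt/synapse-s3-storage-provider"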

Example of entry in synapse config:

media_storage_providers:
- module: s3_storage_provider.S3StorageProviderBackend
  store_local: True
  store_remote: True
  store_synchronous: True
  config:
    bucket: <S3_BUCKET_NAME>
    # All of the below options are optional, for use with non-AWS S3-like
    # services, or to specify access tokens here instead of some external method.
    region_name: <S3_REGION_NAME>
    endpoint_url: <S3_LIKE_SERVICE_ENDPOINT_URL>
    access_key_id: <S3_ACCESS_KEY_ID>
    secret_access_key: <S3_SECRET_ACCESS_KEY>

    # The object storage class used when uploading files to the bucket.
    # Default is STANDARD.
    #storage_class: "STANDARD_IA"

    # The maximum number of concurrent threads which will be used to connect
    # to S3. Each thread manages a single connection. Default is 40.
    #
    #threadpool_size: 20

This module uses boto3, so if the access keys are not set in the config above, credentials should be supplied in one of the standard ways described in the boto3 documentation (for example environment variables, a shared credentials file, or an IAM role).
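
For instance, with plain AWS S3 the standard boto3 environment variables can be exported in the environment that starts Synapse (values are placeholders):

# boto3's standard credential environment variables (placeholder values);
# a ~/.aws/credentials file or an instance IAM role works equally well.
export AWS_ACCESS_KEY_ID="AKIAXXXXXXXXXXXXXXXX"
export AWS_SECRET_ACCESS_KEY="xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx"
export AWS_DEFAULT_REGION="eu-west-1"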

Regular cleanup job

There is additionally a script at scripts/s3_media_upload which can be run as a regular job to upload content to S3 and then delete it from the local disk. Used together with the storage provider configuration above, this lets Synapse pull media from S3 on demand while uploading it asynchronously.

Once the package is installed, the script should be run somewhat like the following. We suggest running it inside tmux or screen, as these commands can take a long time on larger servers.
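
For example, a detachable session can be started first (the session name is arbitrary):

tmux new -s s3_media_upload    # or: screen -S s3_media_upload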

database.yaml should contain the keys that would be passed to psycopg2 to connect to your database. They can be found in the contents of the database.args parameter in your homeserver.yaml.
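
For illustration, a minimal database.yaml for a PostgreSQL homeserver might look like the following; all values are placeholders to be replaced with those from database.args:

# database.yaml - connection arguments handed to psycopg2
# (copy the real values from database.args in homeserver.yaml)
user: synapse_user
password: itsasecret
database: synapse
host: localhost
port: 5432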

More options are available in the command help.

> cd s3_media_upload
# cache.db will be created if absent. database.yaml is required to
# contain PG credentials
> ls
cache.db database.yaml
# Update cache from /path/to/media/store looking for files not used
# within 2 months
> s3_media_upload update /path/to/media/store 2m
Syncing files that haven't been accessed since: 2018-10-18 11:06:21.520602
Synced 0 new rows
100%|█████████████████████████████████████████████████████████████| 1074/1074 [00:33<00:00, 25.97files/s]
Updated 0 as deleted

> s3_media_upload upload /path/to/media/store matrix_s3_bucket_name --storage-class STANDARD_IA --delete
# prepare to wait a long time
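
To turn this into the regular job mentioned above, the same commands can be scheduled from cron; the sketch below assumes illustrative paths for the working directory and media store:

# /etc/cron.d/s3-media-upload - illustrative schedule and paths
# Nightly at 03:00: mark files unused for 2 months, upload them, then delete the local copies
0 3 * * * synapse cd /home/synapse/s3_media_upload && s3_media_upload update /var/lib/synapse/media_store 2m && s3_media_upload upload /var/lib/synapse/media_store matrix_s3_bucket_name --storage-class STANDARD_IA --delete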

Packaging and release

For maintainers:

  1. Update the __version__ in setup.py. Commit. Push.
  2. Create a release on GitHub for this version.
  3. When published, a GitHub action workflow will build the package and upload to PyPI.