Sbt S3 resolver

This is an sbt plugin that lets you resolve dependencies from, and publish artifacts to, Amazon S3 buckets (private or public).

It can publish artifacts in Maven or Ivy style, but it can resolve only Ivy artifacts:

                   Ivy artifacts           Maven artifacts
                   publish    resolve      publish    resolve
  public bucket      yes        yes          yes        no*
  private bucket     yes        yes          yes        no

* Public Maven artifacts can still be resolved without this plugin, using ordinary sbt resolvers over HTTPS (see the Public Maven artifacts section below).

Usage

Plugin sbt dependency

In project/plugins.sbt:

resolvers += Resolver.jcenterRepo
addSbtPlugin("ohnosequences" % "sbt-s3-resolver" % "<version>")

(replace <version> with the latest release version; see the version badge in the project's README or its releases page)

Notes

  • Since v0.17.0 this plugin is compiled and published only for sbt-1.+. If you need it for sbt-0.13, use v0.16.0.
  • If you are using Java 9 and encounter problems, see issue #58.

Settings

  • awsProfile: AWS configuration profile
  • s3credentials: AWS credentials provider to access S3
  • s3region: AWS Region for your S3 resolvers
  • s3acl: Controls whether published artifacts are accessible publicly via http(s) or not
  • s3storageClass: Controls storage class for the published S3 objects
  • s3overwrite: Controls whether publishing resolver can overwrite artifacts
  • s3sse: Controls whether publishing resolver will use server side encryption

Key              Type                              Default
awsProfile       Option[String]                    None
s3credentials    AWSCredentialsProvider            DefaultAWSCredentialsProviderChain
s3region         Region                            DefaultAwsRegionProviderChain
s3acl            Option[CannedAccessControlList]   Some(PublicRead)
s3storageClass   StorageClass                      Standard
s3overwrite      Boolean                           isSnapshot.value
s3sse            Boolean                           false

These defaults are added to your project automatically, so if they suit you, you don't need to do anything special: just set the resolver and publish. Otherwise, you can tune the settings by overriding them in your build.sbt.

You can use the s3resolver setting key, which takes a name and an S3 bucket URL and returns an S3Resolver that is implicitly converted to sbt.Resolver.
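
For example, a build.sbt that overrides a few of the defaults and sets a publishing resolver could look like this (a minimal sketch; the bucket name and the chosen overrides are only illustrative):

// build.sbt: override some of the defaults listed above (illustrative values)
s3acl       := Some(com.amazonaws.services.s3.model.CannedAccessControlList.Private) // keep published objects private
s3sse       := true                                                                  // use server-side encryption
s3overwrite := true                                                                  // allow overwriting existing artifacts

// and point publishing at an S3 bucket (the bucket name is hypothetical)
publishTo := Some(s3resolver.value("My private S3 bucket", s3("artifacts.example.com")))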

AWS configuration profiles

If you have different configuration profiles, you can choose the one you need by setting

awsProfile := Some("my-profile")

If you haven't touched the s3region and s3credentials settings, they will both use this profile's region and credentials.

By default, awsProfile is set to None, which means that both region and credentials are taken from the default provider chains. See below for details.

Credentials

The s3credentials key has the AWSCredentialsProvider type from the AWS Java SDK. Different kinds of providers look for credentials in different places, and they can be chained. DefaultAWSCredentialsProviderChain looks in:

  1. Environment Variables:
    • AWS_ACCESS_KEY_ID and AWS_SECRET_ACCESS_KEY
    • or AWS_ACCESS_KEY and AWS_SECRET_KEY
  2. Java System Properties: aws.accessKeyId and aws.secretKey
  3. Credential profiles file: ~/.aws/credentials shared by all AWS SDKs and the AWS CLI
  4. ECS container credentials loaded from the Amazon ECS if the environment variable AWS_CONTAINER_CREDENTIALS_RELATIVE_URI is set
  5. Instance profile credentials delivered through the Amazon EC2 metadata service

You can find other types of credentials providers in the AWS Java SDK docs.

If you have changed the awsProfile setting, the default credentials provider becomes that of the corresponding profile. To check which credentials the plugin uses, run the showS3Credentials task.
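
If you would rather bypass the default chain, you can assign any credentials provider from the AWS Java SDK to s3credentials. A minimal sketch (the profile name "ci-deploy" is hypothetical):

// build.sbt: take credentials from a specific shared-credentials profile
s3credentials := new com.amazonaws.auth.profile.ProfileCredentialsProvider("ci-deploy")

// or chain several providers; the first one that can supply credentials wins
s3credentials := new com.amazonaws.auth.AWSCredentialsProviderChain(
  new com.amazonaws.auth.EnvironmentVariableCredentialsProvider(),
  new com.amazonaws.auth.profile.ProfileCredentialsProvider("ci-deploy")
)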

Region

You can set the s3region setting in a number of ways.

By default, it is set to the DefaultAwsRegionProviderChain, which is similar to the default credentials provider chain and includes:

  1. Environment Variable: AWS_REGION
  2. Java System Property: aws.region
  3. Profiles configuration file: ~/.aws/config
  4. EC2 instance metadata service

If you have changed the awsProfile setting, the default region provider becomes that of the corresponding profile.
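
If you prefer to pin the region explicitly instead of relying on a provider chain, you can set it directly. A sketch, assuming s3region accepts a com.amazonaws.regions.Region from the AWS Java SDK (check the plugin source for your version if the type differs):

// build.sbt: fix the region instead of using the default region provider chain
s3region := com.amazonaws.regions.Region.getRegion(com.amazonaws.regions.Regions.EU_WEST_1)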

Publishing

A common practice is to use different repositories for snapshots and releases, choosing one depending on the version. For example, here is such a publishing resolver with Ivy-style patterns:

publishMavenStyle := false

publishTo := {
  val prefix = if (isSnapshot.value) "snapshots" else "releases"
  Some(s3resolver.value(s"My ${prefix} S3 bucket", s3(s"${prefix}.cool.bucket.com")) withIvyPatterns)
}

You can also switch between repositories for public and private artifacts by choosing your bucket's URL based on whatever condition fits your build. The s3 constructor here takes the name of your S3 bucket (you don't need to worry about the s3:// prefix).
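
For instance, the choice could hang off a custom flag in your build (isPrivateArtifact below is a hypothetical setting, not something the plugin provides):

// build.sbt: a hypothetical flag deciding which bucket to publish to
lazy val isPrivateArtifact = settingKey[Boolean]("Publish to the private bucket?")
isPrivateArtifact := true

publishTo := {
  val bucket = if (isPrivateArtifact.value) "private.cool.bucket.com" else "public.cool.bucket.com"
  Some(s3resolver.value("My S3 bucket", s3(bucket)))
}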

Resolving

You can add a sequence of S3 resolvers just like this:

resolvers ++= Seq[Resolver](
  s3resolver.value("Releases resolver", s3("releases.bucket.com")),
  s3resolver.value("Snapshots resolver", s3("snapshots.bucket.com"))
)

Note that you have to write Seq[Resolver] explicitly, so that the S3Resolvers are converted to sbt.Resolver before being appended.

Public Maven artifacts

If your Maven artifacts are public, you can resolve them with the usual sbt resolvers by transforming your s3://my.bucket.com URL to

"My S3 bucket" at "https://s3-<region>.amazonaws.com/my.bucket.com"

i.e. without using this plugin. Or if you're using it anyway, you can write:

"My S3 bucket" at s3("my.bucket.com").toHttps(s3region.value)

Patterns

You can set patterns using the .withPatterns(...) method of S3Resolver. The default is Maven-style patterns (just as in sbt), but you can switch to Ivy-style with the convenience method .withIvyPatterns.
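
For example, a resolver that uses Ivy-style patterns (the bucket name is illustrative; if the implicit conversion to sbt.Resolver doesn't kick in, ascribe the type as shown in the Resolving section above):

// build.sbt: an Ivy-style S3 resolver
resolvers += s3resolver.value("My Ivy S3 bucket", s3("ivy.bucket.com")).withIvyPatterns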

S3 IAM policy

If you want to publish and resolve artifacts in an S3 bucket, your AWS user or role needs at least these permissions:

{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Effect": "Allow",
            "Action": [
                "s3:ListBucket"
            ],
            "Resource": "arn:aws:s3:::mybucket"
        },
        {
            "Effect": "Allow",
            "Action": [
                "s3:PutObject",
                "s3:PutObjectAcl",
                "s3:GetObject"
            ],
            "Resource": "arn:aws:s3:::mybucket/*"
        }
    ]
}

In theory, s3:CreateBucket may also be needed in the first statement if you publish to a bucket that doesn't exist yet.
