defmodule SimpleS3Upload do
  @moduledoc """
  Dependency-free S3 Form Upload using HTTP POST sigv4

  https://docs.aws.amazon.com/AmazonS3/latest/API/sigv4-post-example.html
  """

  @doc """
  Signs a form upload.

  The configuration is a map which must contain the following keys:

    * `:region` - The AWS region, such as "us-east-1"
    * `:access_key_id` - The AWS access key id
    * `:secret_access_key` - The AWS secret access key

  Returns a map of form fields to be used on the client via the JavaScript
  `FormData` API.

  ## Options

    * `:key` - The required key of the object to be uploaded.
    * `:max_file_size` - The required maximum allowed file size in bytes.
    * `:content_type` - The required MIME type of the file to be uploaded.
    * `:expires_in` - The required expiration time in milliseconds from now
      before the signed upload expires.

  ## Examples

      config = %{
        region: "us-east-1",
        access_key_id: System.fetch_env!("AWS_ACCESS_KEY_ID"),
        secret_access_key: System.fetch_env!("AWS_SECRET_ACCESS_KEY")
      }

      {:ok, fields} =
        SimpleS3Upload.sign_form_upload(config, "my-bucket",
          key: "public/my-file-name",
          content_type: "image/png",
          max_file_size: 10_000,
          expires_in: :timer.hours(1)
        )

  """
  def sign_form_upload(config, bucket, opts) do
    key = Keyword.fetch!(opts, :key)
    max_file_size = Keyword.fetch!(opts, :max_file_size)
    content_type = Keyword.fetch!(opts, :content_type)
    expires_in = Keyword.fetch!(opts, :expires_in)

    expires_at = DateTime.add(DateTime.utc_now(), expires_in, :millisecond)
    amz_date = amz_date(expires_at)
    credential = credential(config, expires_at)

    encoded_policy =
      Base.encode64("""
      {
        "expiration": "#{DateTime.to_iso8601(expires_at)}",
        "conditions": [
          {"bucket": "#{bucket}"},
          ["eq", "$key", "#{key}"],
          {"acl": "public-read"},
          ["eq", "$Content-Type", "#{content_type}"],
          ["content-length-range", 0, #{max_file_size}],
          {"x-amz-server-side-encryption": "AES256"},
          {"x-amz-credential": "#{credential}"},
          {"x-amz-algorithm": "AWS4-HMAC-SHA256"},
          {"x-amz-date": "#{amz_date}"}
        ]
      }
      """)

    fields = %{
      "key" => key,
      "acl" => "public-read",
      "content-type" => content_type,
      "x-amz-server-side-encryption" => "AES256",
      "x-amz-credential" => credential,
      "x-amz-algorithm" => "AWS4-HMAC-SHA256",
      "x-amz-date" => amz_date,
      "policy" => encoded_policy,
      "x-amz-signature" => signature(config, expires_at, encoded_policy)
    }

    {:ok, fields}
  end

  defp amz_date(time) do
    time
    |> NaiveDateTime.to_iso8601()
    |> String.split(".")
    |> List.first()
    |> String.replace("-", "")
    |> String.replace(":", "")
    |> Kernel.<>("Z")
  end

  defp credential(%{} = config, %DateTime{} = expires_at) do
    "#{config.access_key_id}/#{short_date(expires_at)}/#{config.region}/s3/aws4_request"
  end

  defp signature(config, %DateTime{} = expires_at, encoded_policy) do
    config
    |> signing_key(expires_at, "s3")
    |> sha256(encoded_policy)
    |> Base.encode16(case: :lower)
  end

  defp signing_key(%{} = config, %DateTime{} = expires_at, service) when service in ["s3"] do
    amz_date = short_date(expires_at)
    %{secret_access_key: secret, region: region} = config

    ("AWS4" <> secret)
    |> sha256(amz_date)
    |> sha256(region)
    |> sha256(service)
    |> sha256("aws4_request")
  end

  defp short_date(%DateTime{} = expires_at) do
    expires_at
    |> amz_date()
    |> String.slice(0..7)
  end

  # Note: :crypto.hmac/3 was removed in OTP 24.
  # On OTP 22.1+ use: :crypto.mac(:hmac, :sha256, secret, msg)
  defp sha256(secret, msg), do: :crypto.hmac(:sha256, secret, msg)
end
@clarkware Sure! shipit!
Thanks!
I tried this with GCS. As far as I know, the x-amz extensions should work with GCS as well, but I always get this error:

XHR POST https://storage.googleapis.com/my-bucket
CORS Missing Allow Origin

<?xml version='1.0' encoding='UTF-8'?><Error><Code>SignatureDoesNotMatch</Code><Message>The request signature we calculated does not match the signature you provided. Check your Google secret key and signing method.</Message><StringToSign>ewogI...</StringToSign></Error>

Actually, I have been stuck on this problem for months already.
Here is the gist that I tried for GCS: https://gist.github.com/dgigafox/c293a252d2ad97f5cdeb4c3759313ba5
The notable changes I made are that I used `DateTime.utc_now()` in `amz_date = amz_date(DateTime.utc_now())`, and that I removed `x-amz-server-side-encryption` from both the policy and the fields, as it is not defined as a field for the GCS policy.
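For reference, the first change described above (deriving `x-amz-date` from the current time rather than the expiration time) can be sketched standalone; `amz_date_fmt` is just the gist's `amz_date/1` reproduced as an anonymous function:

```elixir
# Standalone reproduction of the gist's amz_date/1 helper, applied to the
# current time as described in the comment above.
amz_date_fmt = fn %DateTime{} = t ->
  t
  |> NaiveDateTime.to_iso8601()
  |> String.split(".")
  |> List.first()
  |> String.replace("-", "")
  |> String.replace(":", "")
  |> Kernel.<>("Z")
end

# Produces a compact timestamp such as "20240101T093042Z".
amz_date = amz_date_fmt.(DateTime.utc_now())
```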
This has been super helpful! Thank you!
I'm working on a new LiveView project with Elixir 1.12.0 and OTP 24, and I ran into an issue with this line:
`defp sha256(secret, msg), do: :crypto.hmac(:sha256, secret, msg)`
which raises a `function :crypto.hmac/3 is undefined or private` error. It looks like updating the line to this may solve the issue:
`defp sha256(secret, msg), do: :crypto.mac(:hmac, :sha256, secret, msg)`
I found this in plug_crypto; does that look right to you?
@genevievecurry I'm also using that line in my LiveView apps that are doing file uploads.
defp sha256(secret, msg), do: :crypto.mac(:hmac, :sha256, secret, msg)
It would be great to see this added as a comment, @chrismccord, for Erlang/OTP 24+.
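If you need to support both old and new OTP releases at once, one option is to pick the call at compile time. A sketch, assuming only that OTP 22.1+ exposes `:crypto.mac/4` while earlier releases still have `:crypto.hmac/3` (the module name `CompatHMAC` is made up for this example):

```elixir
defmodule CompatHMAC do
  # Choose the HMAC implementation at compile time: :crypto.mac/4 exists
  # on OTP 22.1+, while :crypto.hmac/3 was removed in OTP 24.
  if Code.ensure_loaded?(:crypto) and function_exported?(:crypto, :mac, 4) do
    def sha256(secret, msg), do: :crypto.mac(:hmac, :sha256, secret, msg)
  else
    def sha256(secret, msg), do: :crypto.hmac(:sha256, secret, msg)
  end
end
```

Dropping this `sha256/2` into the gist in place of the private helper keeps the rest of the module unchanged.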
Does anyone know a way to set the cache header on the file? I couldn't figure it out.
It looks like some setting has changed; I'm getting access denied.
@chrismccord What would be the proper way to test this with automatic tests?
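One low-tech approach, sketched here: since the signature is deterministic given the secret, the date scope, and the encoded policy, a test can recompute it from the fields returned by `sign_form_upload/3` and compare, without ever hitting S3. `SignatureCheck` and `expected_signature/4` are hypothetical names for this example; the derivation chain mirrors the gist's `signing_key/3` and `signature/3`:

```elixir
defmodule SignatureCheck do
  # Recomputes the expected "x-amz-signature" for a given Base64-encoded
  # policy, using the same sigv4 key-derivation chain as the gist.
  def expected_signature(secret, short_date, region, encoded_policy) do
    ("AWS4" <> secret)
    |> hmac(short_date)
    |> hmac(region)
    |> hmac("s3")
    |> hmac("aws4_request")
    |> hmac(encoded_policy)
    |> Base.encode16(case: :lower)
  end

  # :crypto.mac/4 is the OTP 24+ spelling; on older OTP use :crypto.hmac/3.
  defp hmac(key, msg), do: :crypto.mac(:hmac, :sha256, key, msg)
end
```

In an ExUnit test you could then assert that `fields["x-amz-signature"]` equals `SignatureCheck.expected_signature/4` called with your test config and the short date embedded in `fields["x-amz-credential"]`.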
Small gotcha that might be worth knowing about if someone else runs into the same problem.
Setting `{"acl": "public-read"}` in the policy and `"acl" => "public-read"` in the fields will cause an error when submitting the upload if your S3 bucket has "Block public access (BlockPublicAcls)" enabled. This is a common setting to have if the bucket is behind a CloudFront distribution and not intended to be accessed directly.
If this applies to you, simply remove the two lines of code mentioned above and things should work fine.
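In code terms, the point is that the policy conditions and the submitted form fields have to stay in sync: drop the ACL entry from both. A trivial sketch of the fields side (key names taken from the gist's example):

```elixir
# Fields as built by the gist, minus the ACL for buckets with
# BlockPublicAcls enabled. The matching {"acl": "public-read"} line must
# also be removed from the policy document, or the two fall out of sync.
fields = %{
  "key" => "public/my-file-name",
  "acl" => "public-read",
  "content-type" => "image/png"
}

private_fields = Map.delete(fields, "acl")
```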
> This has been super helpful! Thank you!
> I'm working on a new LiveView project with Elixir 1.12.0 and OTP 24 and I ran into an issue with this line:
> `defp sha256(secret, msg), do: :crypto.hmac(:sha256, secret, msg)`
> which raises a `function :crypto.hmac/3 is undefined or private` error. It looks like updating the line to this may solve the issue:
> `defp sha256(secret, msg), do: :crypto.mac(:hmac, :sha256, secret, msg)`
> I found this in plug_crypto, does that look right to you?
A good read I've found related to said issue: https://www.erlang.org/doc/apps/crypto/new_api.html
Similar to this, I made a module that can be used to generate a presigned URL. Use at your own risk though; there are probably some bugs. https://gist.github.com/denvaar/66721b7a2f54f90592a509d29f57f831
> Small gotcha that might be worth knowing about if someone else runs into the same problem.
> Setting `{"acl": "public-read"}` and `"acl" => "public-read"` will cause an error when submitting the upload if your S3 bucket has "Block public access (BlockPublicAcls)" enabled. This is a common thing to have if the bucket is behind a CloudFront distribution and not intended to be accessed directly. If this applies to you, simply remove the two lines of code mentioned above and things should work fine.
This helped me with the CORS issue; I could not really understand why it was happening.
Hey, Chris. I love that this is dependency-free.
Since it's not officially licensed, I wanted to ask if you'd grant permission for me to use it in my LiveView course (crediting you as the creator). I'm not planning to walk through the details, just include it wholesale in the course's code so it's easy for folks to use.
Would that be cool with you?