Amazon S3 presigned POST
// ON THE SERVER
import { generatePresignedPost } from "./s3";

const { url, fields } = await generatePresignedPost(...); // fill with your own parameters.
// send the above `url` and `fields` to the client.
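// For example (hypothetical key, content type, and metadata -- adapt to your app):
// const { url, fields } = await generatePresignedPost(
//   "some-prefix/avatars/123.png", // object key; must satisfy the "starts-with" condition below
//   "image/png",                   // Content-Type the client will upload
//   { userId: "123" }              // sent to S3 as the x-amz-meta-user-id field
// );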
// ON THE CLIENT/BROWSER
// We must POST to the `url`, and set all the entries in `fields` as entries
// in our FormData body. The file to be uploaded must have the key `"file"`,
// and it must be the last entry in the form.
const fd = new FormData();
Object.entries(fields).forEach(([k, v]) => fd.set(k, v));
fd.set("file", yourFile);
const res = await fetch(url, {
  // Do NOT set a Content-Type header manually: fetch sets
  // `multipart/form-data` with the correct boundary automatically for FormData bodies.
  body: fd,
  method: "POST"
});
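// A successful S3 POST upload returns 204 No Content by default (no
// success_action_status field is set here), so checking `res.ok` is enough.
// This check is not part of the original gist, but failed policy conditions
// come back as an XML error document worth surfacing:
if (!res.ok) {
  throw new Error(`Upload failed (${res.status}): ${await res.text()}`);
}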
// Converts a string to kebab-case (e.g. "userId" -> "user-id").
// From https://youmightnotneed.com/lodash/
const re = /([0-9]+|([A-Z][a-z]+)|[a-z]+|([A-Z]+)(?![a-z]))/g;
const kebabCase = (str: string) =>
  (String(str ?? "").match(re) || []).map((x) => x.toLowerCase()).join("-");

export default kebabCase;
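// Quick sanity check (hypothetical usage, not part of the original gist):
// metadata keys like "userId" become valid x-amz-meta-* suffixes.
import kebabCase from "./kebabCase";

console.assert(kebabCase("userId") === "user-id");
console.assert(kebabCase("ContentType") === "content-type");
console.assert(kebabCase("HTMLParser") === "html-parser");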
import { S3Client } from "@aws-sdk/client-s3";
import { createPresignedPost } from "@aws-sdk/s3-presigned-post";
import kebabCase from "./kebabCase";

// Reads S3_REGION, S3_ENDPOINT, S3_ACCESS_KEY and S3_SECRET_ACCESS_KEY
// (plus S3_BUCKET below) from the environment.
const s3 = new S3Client({
  region: Bun.env.S3_REGION,
  endpoint: Bun.env.S3_ENDPOINT,
  forcePathStyle: false,
  credentials: {
    accessKeyId: Bun.env.S3_ACCESS_KEY!,
    secretAccessKey: Bun.env.S3_SECRET_ACCESS_KEY!,
  },
});
export function generatePresignedPost(
  key: string,
  type: string,
  metadata: Record<string, string>,
  maxSizeInBytes = 5 * 1024 * 1024, // 5MiB
  minSizeInBytes = 1
) {
  // User-defined metadata must be sent as `x-amz-meta-*` form fields,
  // so prefix and kebab-case every key.
  const formattedMetadata = Object.fromEntries(
    Object.entries(metadata).map((i) => [`x-amz-meta-${kebabCase(i[0])}`, i[1]])
  );
  const metadataConditions = Object.entries(formattedMetadata).map((i) => [
    "eq",
    ...i,
  ]) as ["eq", string, string][];

  return createPresignedPost(s3, {
    Bucket: Bun.env.S3_BUCKET!,
    Key: key,
    Conditions: [
      { acl: "public-read" },
      { bucket: Bun.env.S3_BUCKET! },
      ["starts-with", "$key", "some-prefix/"], // make sure files are only uploaded to the some-prefix/ folder
      { "Content-Type": type },
      ["content-length-range", minSizeInBytes, maxSizeInBytes],
      ...metadataConditions,
    ],
    // Content-Type is included in the returned fields so the client submits it
    // and the `{ "Content-Type": type }` condition above is satisfied.
    Fields: { acl: "public-read", "Content-Type": type, ...formattedMetadata },
    Expires: 600, // the presigned POST expires after 10 minutes
  });
}
export default s3;
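Usage sketch (not part of the original gist): a minimal Bun HTTP endpoint that returns a presigned POST to the browser. The route path, key naming, and metadata below are hypothetical -- adapt them to your app.

// server.ts -- hypothetical example
import { generatePresignedPost } from "./s3";

Bun.serve({
  port: 3000,
  async fetch(req) {
    const { pathname } = new URL(req.url);
    if (req.method === "POST" && pathname === "/presign") {
      // In a real app, derive these from the authenticated user/request.
      const { filename, contentType } = await req.json();
      const { url, fields } = await generatePresignedPost(
        `some-prefix/${crypto.randomUUID()}-${filename}`, // matches the "starts-with" condition
        contentType,
        { "uploaded-by": "anonymous" }
      );
      return Response.json({ url, fields });
    }
    return new Response("Not found", { status: 404 });
  },
});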