@danalloway
Last active March 29, 2024 06:19
Migrate Google Cloud Storage (GCS) to Supabase Storage
/**
* 1.) Make sure you have the `GOOGLE_APPLICATION_CREDENTIALS` environment variable
* set to the path of your local service account credentials.
* @see https://cloud.google.com/storage/docs/reference/libraries#setting_up_authentication
*
* 2.) Make sure you have the `SUPABASE_URL` and `SUPABASE_KEY` environment variables set
* with the proper values from your Supabase Project API page.
*
* Install dependencies: `npm install --save node-fetch form-data @google-cloud/storage`
*/
const fetch = require("node-fetch");
const FormData = require("form-data");
const { Storage } = require("@google-cloud/storage");
const SUPABASE_API_URL = process.env.SUPABASE_URL;
const SUPABASE_AUTH_TOKEN = process.env.SUPABASE_KEY;
const SUPABASE_BUCKET_ID = "YOUR_SUPABASE_BUCKET_ID";
const GCS_BUCKET_ID = "YOUR_GCS_BUCKET_ID";
const GCS_BUCKET_PREFIX = "SUB_FOLDER_TO_MIGRATE/";
const gcs = new Storage();
const gcsBucket = gcs.bucket(GCS_BUCKET_ID);
console.debug("Starting...");
gcsBucket.getFiles({ prefix: GCS_BUCKET_PREFIX }).then(async (data) => {
  const files = data[0];
  for (const file of files) {
    try {
      await upload(file);
    } catch (err) {
      console.error(err);
    }
  }
  console.debug("Done!");
});
/**
* Stream an upload to Supabase.
*
* @returns {Promise}
*/
function upload(file) {
  const path = file.name;
  const { contentType, size } = file.metadata;
  const form = new FormData();
  form.append("file", file.createReadStream(), {
    contentType,
    filepath: path,
    knownLength: size,
  });
  return fetch(
    `${SUPABASE_API_URL}/storage/v1/object/${SUPABASE_BUCKET_ID}/${path}`,
    {
      method: "POST",
      body: form,
      headers: {
        Authorization: `Bearer ${SUPABASE_AUTH_TOKEN}`,
      },
    }
  ).then((res) => {
    // Surface HTTP failures instead of silently reporting success.
    if (!res.ok) {
      throw new Error(`Upload failed for ${path}: ${res.status} ${res.statusText}`);
    }
    console.debug(`Processed file: ${path}`);
  });
}
@danalloway (Author)

> This is awesome @danalloway!
> One piece of feedback: you actually don't need the apiKey here; the Authorization header alone is enough.

updated
@inian commented Apr 2, 2021

Nice! We need to support this natively within supabase-js. Been thinking of how to support it without bloating the library too much. As in this example, form-data is required in Node.js but not required if you are using supabase-js purely on the client side.

@danalloway (Author) commented Apr 2, 2021

> Nice! We need to support this natively within supabase-js. Been thinking of how to support it without bloating the library too much. As in this example, form-data is required in Node.js but not required if you are using supabase-js purely on the client side.

What if the Supabase Storage API supported accepting a file stream as the POST body?

https://github.com/node-fetch/node-fetch#post-data-using-a-file-stream

This would remove the need for form-data, and you could rely on node-fetch alone, which is already present since I think I saw you were using cross-fetch.
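A minimal sketch of what that stream-as-body approach could look like, assuming the Supabase Storage endpoint were to accept a raw file stream as the POST body (which the comment above is proposing, not something the API is confirmed to support). `buildStreamUploadOptions` is a helper invented here for illustration:

```javascript
// Hypothetical: build fetch options for a raw-stream upload, replacing the
// multipart form-data body used in the gist above. node-fetch accepts a
// readable stream directly as `body`.
function buildStreamUploadOptions(stream, { contentType, size, token }) {
  return {
    method: "POST",
    body: stream,
    headers: {
      "Content-Type": contentType,
      "Content-Length": String(size),
      Authorization: `Bearer ${token}`,
    },
  };
}

// Usage sketch (untested): pipe a GCS read stream straight into the request.
// fetch(
//   `${SUPABASE_API_URL}/storage/v1/object/${SUPABASE_BUCKET_ID}/${path}`,
//   buildStreamUploadOptions(file.createReadStream(), {
//     contentType,
//     size,
//     token: SUPABASE_AUTH_TOKEN,
//   })
// );
```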

@DeepakDonde-GWL commented Mar 29, 2024

Great! Is it possible to move everything from the GCS bucket to the Supabase bucket (with the exact subfolder path) without specifying GCS_BUCKET_PREFIX?
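A sketch of one answer (not from the gist author): since the script uses `file.name` as the destination path, and GCS object names already encode "folders" as slash-separated prefixes, listing without a prefix should migrate every object with its subfolder structure intact. `destinationPath` below is a hypothetical helper that just makes that mapping explicit:

```javascript
// Hypothetical helper: the Supabase destination path is simply the GCS
// object name, so nested "folders" carry over unchanged.
function destinationPath(gcsObjectName) {
  return gcsObjectName;
}

// To migrate the whole bucket, drop the prefix option entirely:
// gcsBucket.getFiles().then(async ([files]) => {
//   for (const file of files) {
//     await upload(file); // uploads to destinationPath(file.name)
//   }
// });
```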
