@puf
Last active March 28, 2023 16:37
Firebase Hosting Deploy Single File

This script may no longer work. Have a look at its (more official) replacement: https://github.com/firebase/firebase-tools/tree/master/scripts/examples/hosting/update-single-file

This utility script deploys a single local file to an existing Firebase Hosting site. Other files that are already deployed are left unmodified.

The difference from firebase deploy is that this script does not require a local snapshot of all hosted files; you only need the one file that you want to add or update.

USE AT YOUR OWN RISK. NO WARRANTY IS PROVIDED.

Usage

node deployFile.js <site_name> <file_to_deploy> [commit]

Installation / Example

To use this script, you must have a signed-in installation of the Firebase CLI.
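If the Firebase CLI isn't installed and signed in yet, something along these lines should be enough (a minimal sketch; adjust to your own setup):

npm install -g firebase-tools
firebase login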

git clone https://gist.github.com/e00c34dd82b35c56e91adbc3a9b1c412.git firebase-hosting-deploy-file
cd firebase-hosting-deploy-file
npm install

# perform a dry run, make sure you're not doing something you'll regret
node deployFile.js contentsite /index.html

# do the deployment for real
node deployFile.js contentsite /index.html commit
const fs = require("fs");
const path = require("path");
const zlib = require("zlib");const crypto = require('crypto');
const request = require("request");
const { getGlobalDefaultAccount } = require('firebase-tools/lib/auth'); // see https://gist.github.com/puf/e00c34dd82b35c56e91adbc3a9b1c412#gistcomment-3718572
const api = require('firebase-tools/lib/api');
const isDryRun = process.argv[4] !== "commit";
if (!process.argv[2] || !process.argv[3]) {
console.error(`
ERROR: Must supply a site name and file to deploy. Usage:
node deployFile.js <site_name> <file_to_deploy> [commit]`);
process.exit(1);
}
const site = process.argv[2];
const file = process.argv[3];
requireAuth(getGlobalDefaultAccount(), ['https://www.googleapis.com/auth/cloud-platform']).then(async () => {
try {
// Steps in this script:
// 1. Determine version of the latest release
// 2. Get list of files in that version
// 3. Determine the hash of our local file
// 4. Create a new version
// 5. Send list of files from previous version, with our own local file in there too
// 6. Upload our local file if the Hosting server requests it
// 7. Finalize our new version
// 8. Create a release on this new version
// Determine the latest release
console.log("Determining latest release...")
var response = await api.request('GET', `/v1beta1/sites/${site}/releases`, { auth: true, origin: api.hostingApiOrigin });
let releases = response.body.releases;
releases.forEach((release) => { console.log(/*release.name, */release.version.status, release.version.createTime, release.version.fileCount, release.version.name); })
let latestVersion = releases[0].version.name;
// Get the files in the latest version
console.log("Getting files in latest version...")
response = await api.request('GET', `/v1beta1/${latestVersion}/files`, { auth: true, origin: api.hostingApiOrigin });
console.log(response.body);
var files = {};
response.body.files.forEach(file => {
files[file.path] = file.hash;
})
// prep our own file that we're uploading
const hasher = crypto.createHash("sha256");
const gzipper = zlib.createGzip({ level: 9 });
var zipstream = fs.createReadStream(process.cwd()+file).pipe(gzipper);
zipstream.pipe(hasher);
files[file] = await new Promise(function(resolve, reject) {
zipstream.on("end", function() {
resolve(hasher.read().toString("hex"));
});
zipstream.on("error", reject);
});
console.log(files[file]);
// Create a new version
console.log("Creating new version...")
response = await api.request('POST', `/v1beta1/sites/${site}/versions`, { auth: true, origin: api.hostingApiOrigin });
console.log(response.body);
let version = response.body.name;
// Send file info for the new version to the server, to hear what we need to upload
console.log("Sending file listing for new version...")
response = await api.request('POST', `/v1beta1/${version}:populateFiles`, {
auth: true,
origin: api.hostingApiOrigin,
data: { files: files }
})
console.log(response.body);
let requiredHashes = response.body.uploadRequiredHashes;
let uploadUrl = response.body.uploadUrl;
if (requiredHashes && requiredHashes.indexOf(files[file]) >= 0) {
console.log(`Uploading ${file}...`)
let reqOpts = await api.addRequestHeaders({
url: uploadUrl +"/"+ files[file],
})
await new Promise(function(resolve, reject) {
fs.createReadStream(process.cwd()+file).pipe(zlib.createGzip({ level: 9 })).pipe(
request.post(reqOpts, function(err, res) {
if (err) {
return reject(err);
} else if (res.statusCode !== 200) {
console.error(
"HTTP ERROR",
res.statusCode,
":",
res.headers,
res.body
);
return reject(new Error("Unexpected error while uploading file."));
}
resolve();
})
);
});
}
if (!isDryRun) {
console.log("Finalizing new version...");
response = await api.request('PATCH', `/v1beta1/${version}?updateMask=status`, {
origin: api.hostingApiOrigin,
auth: true,
data: { status: "FINALIZED" },
})
console.log(response.body);
console.log("Releasing new version...");
response = await api.request('POST', `/v1beta1/sites/${site}/releases?version_name=${version}`, {
auth: true,
origin: api.hostingApiOrigin,
data: { message: "Deployed from test.js" || null },
}
);
console.log(response.body);
}
else {
console.log("Dry run only.")
// Delete the version we just created, just to be nice
console.log("Deleting new version...")
response = await api.request('DELETE', `/v1beta1/${version}`, { auth: true, origin: api.hostingApiOrigin });
console.log(response.body);
}
} catch (error) {
console.error(error);
}
});

package.json

{
  "dependencies": {
    "firebase-tools": "^9.9.0",
    "request": "^2.88.2"
  },
  "engines": {
    "node": ">= 8.0.0"
  }
}
@yzalvov

yzalvov commented Apr 2, 2019

Hi @puf, thanks for this clean script! Please advise: I use this approach in Cloud Functions to sync a file from another service (Contentful). Thanks to steps 1-2 and 5, all the project files are fine. But my experiments show that if I then run firebase deploy on my local dev machine, the file from Contentful is gone.
Is there a way to handle this incremental deploy using the firebase CLI? Or is also using this script locally my only option?
Many thanks for considering my request.

P.S. I'm kinda new to dev ops and Firebase deploys in particular, yet somehow I've managed to code steps 3 to 8 of your logic (juggling a good number of different auth and gapis options... :) before finding your answer on SO: Programatically Write to Firebase Hosting. The funniest part is that I don't have the 'reputation' to ask there in the comments. That's why I'm asking here, if you don't mind.

@yzalvov

yzalvov commented Apr 3, 2019

Well, I ended up writing a ~20-line predeploy script locally to sync down those files. Anyway, thanks a lot for the deployFile.js script, Frank!
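For reference, a sync-down step like that can typically be wired into firebase.json as a Hosting predeploy hook, so it runs before every firebase deploy. A rough sketch, where sync-down.js stands in for whatever script pulls the files down (the name is hypothetical):

{
  "hosting": {
    "public": "public",
    "predeploy": ["node ./sync-down.js"]
  }
}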

@mateja176

mateja176 commented Apr 15, 2019

Thanks for the script, Frank! Unfortunately I received this error:

(node:17429) UnhandledPromiseRejectionWarning: Error
    at new FirebaseError (/home/mateja/Documents/firebase/firebase-hosting-deploy-file/node_modules/firebase-tools/lib/error.js:11:16)
    at Object.<anonymous> (/home/mateja/Documents/firebase/firebase-hosting-deploy-file/node_modules/firebase-tools/lib/requireAuth.js:14:18)
    at Module._compile (internal/modules/cjs/loader.js:721:30)
    at Object.Module._extensions..js (internal/modules/cjs/loader.js:732:10)
    at Module.load (internal/modules/cjs/loader.js:620:32)
    at tryModuleLoad (internal/modules/cjs/loader.js:560:12)
    at Function.Module._load (internal/modules/cjs/loader.js:552:3)
    at Module.require (internal/modules/cjs/loader.js:657:17)
    at require (internal/modules/cjs/helpers.js:22:18)
    at Object.<anonymous> (/home/mateja/Documents/firebase/firebase-hosting-deploy-file/deployFile.js:6:21)
(node:17429) UnhandledPromiseRejectionWarning: Unhandled promise rejection. This error originated either by throwing inside of an async function without a catch block, or by rejecting a promise which was not handled with .catch(). (rejection id: 1)
(node:17429) [DEP0018] DeprecationWarning: Unhandled promise rejections are deprecated. In the future, promise rejections that are not handled will terminate the Node.js process with a non-zero exit code.

After running the following commands with my own site and file name replaced

git clone https://gist.github.com/e00c34dd82b35c56e91adbc3a9b1c412.git firebase-hosting-deploy-file
cd firebase-hosting-deploy-file
npm install

# perform a dry run, make sure you're not doing something you'll regret
node deployFile.js contentsite /index.html

Do you have any suggestions?

@mikehardy

@puf I think this works for small files out of luck: creating the hash is fast enough that the hasher wins the race between hashing and the new-version create/compare. But when I attempt to use it to publish big files, the hash hasn't finished before the version is created, so nothing is published.

I altered the hashing part like so and it seems to work for me (which is awesome, thank you):

    var zipstream = fs.createReadStream(process.cwd()+file).pipe(gzipper);
    zipstream.pipe(hasher);

    var end = new Promise(function(resolve, reject) {
      zipstream.on("end", function() {
        // console.log(hasher.read().toString("hex")); 
        resolve(hasher.read().toString("hex"));
      });
      zipstream.on("error", reject);
    });
    files[file] = await end;
    console.log('should have a new file / file hash now');
    console.log(files);

@mikehardy

mikehardy commented Sep 25, 2019

@puf - one more tiny thing as I integrate this in my CI/CD pipeline - this line is missing an await - https://gist.github.com/puf/e00c34dd82b35c56e91adbc3a9b1c412#file-deployfile-js-L117

But I continue to be thankful as that part was aesthetic really, and this thing works...

(And a side comment for anyone else using this: with the new multi-site abilities, you can have one "normal" Firebase Hosting site that is your typical "deploy it all every time" site, and a separate site that is only for single-file uploads like this. To me at least, that should be a pretty safe setup as long as you normally deploy with --except hosting, so you don't nuke your single-upload site; see the sketch below.)
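A rough sketch of that setup, with hypothetical target and site names; the idea is to map each Hosting site to a deploy target once, and then keep hosting out of routine deploys:

# one-time: associate deploy targets with the two sites
firebase target:apply hosting main my-main-site
firebase target:apply hosting singlefile my-singlefile-site

# routine deploys skip hosting, so the single-upload site is never overwritten
firebase deploy --except hosting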

@gspassky

@puf, many thanks for your script!
I'm making a PWA (progressive web app) in Angular using StackBlitz and Firebase. StackBlitz can deploy the app directly to Firebase Hosting, and it's amazing, but it can't deploy static files and unusual scripts, including the service worker JS; they are simply ignored. I've used your script to upload the service worker to the hosting site, and everything worked fine, thank you very much!!!

But there was one problem. When your script publishes a new version, it doesn't preserve hosting settings, including rewrites. I have a single-page app with only index.html, and hosting must redirect all requests (for example, to https://mydomain.com/somefolder) to index.html. StackBlitz accounts for this, but your script does not.

So in my situation I added some code to line 65 of deployFile.js:
response = await api.request('POST', `/v1beta1/sites/${site}/versions`, { auth: true, origin: api.hostingApiOrigin, data: { "config": { "rewrites": [ { "glob": "**", "path": "/index.html" } ] } } });

Now it seems to work fine ;)

@mikehardy

@puf With firebase-tools releases >= 8.x this line:
https://gist.github.com/puf/e00c34dd82b35c56e91adbc3a9b1c412#file-deployfile-js-L6
needs to go from

const requireAuth = require('firebase-tools/lib/requireAuth');

to

const { requireAuth } = require('firebase-tools/lib/requireAuth');

The script keeps chugging along though :-) - @puf, would you be offended if I hosted a fully up-to-date version as a gist of my own? It seems this one isn't getting updated, but it's super useful and just needs a couple of tweaks.

@puf
Author

puf commented Apr 21, 2020

Hey Mike.

I'm definitely slow on updating, as I haven't had a chance to test this since publishing the initial version. I should've made it a regular repo, so you could've just submitted PRs. Hindsight...

  1. I just updated the script with your latest change to require.
  2. For your previous change to the console.log line, are you saying it needs to await the body? I couldn't find any reason for needing that in looking at the code.
  3. I think I already processed your change before that to the hasher. Was there still a mistake in there?

Thanks for submitting these fixes! 🔥


@mikehardy

Hey @puf - not sure what is going on with point 2 referenced above; I checked my copy and I'm not awaiting anything, so there's no action to take and I'm not sure what past-me was thinking. Points 1 and 3 look great, so AFAICS this script is ready to go on current tools, for anyone else that needs it.

@puf
Author

puf commented Apr 22, 2020 via email

@Gerschtli

@mikehardy You mentioned above that you use multi-site hosting for this. I don't know your use case, but this way it would be impossible to serve files from both hosting sites on one domain, or did I miss something? My use case would be to deploy static assets and pages in one hosting site, and generated HTML files (from Cloud Functions), like index.html, in a separate one.

Either way, thank you all for this script. It's a shame that this is not supported without such a hack.

@mikehardy

@Gerschtli that's correct: if you do multi-site, the "single file upload site" will be controlled by this script, and on a different domain. I do that just because it's where I put auto-updates for my app, so it's basically a computer interface, not a human one. The benefit is just that I don't have to worry about accidentally publishing an old version of the file if I publish the whole site with a stale copy of the single file I'm uploading. This is likely pretty use-case specific to me.

@JanOschii

JanOschii commented Aug 19, 2020

@puf Are you sure that the hash you're generating for a file locally is the same on the server?
In my project the resulting hashes for a file in a version are different if I send the same file via firebase deploy and if I use the script.
I also think it is wrong to hash the compressed file, too: the hash then depends on the compression level, and the server delivers the file uncompressed in the end.
I'm just trying to understand the logic.

@puf
Author

puf commented Aug 19, 2020

This script worked when I wrote it, which means the hashes must have matched.
I've never compressed files myself though, but instead leave that to Firebase Hosting.

@kitfit-dave

Thanks for this!

Just a short note; I'm not sure if this is a change or was always an issue:

It appears that the code as-is will result in any rewrites that were contained in your firebase.json file (and probably any config like redirects, headers, etc.) being removed by the API when you create a version like this. Fortunately, you can optionally pass the configuration object to the API call:

response = await api.request('POST', `/v1beta1/sites/${site}/versions`, {
	auth: true,
	origin: api.hostingApiOrigin,
	data: {
		config: {
			rewrites: [
				{
					glob: '/ping',
					function: 'ping'
				},
				{
					glob: '**',
					path: '/index.html'
				}
			]
		}
	}
})	

Hopefully this might save someone else a little time.

@puf
Author

puf commented Nov 11, 2020 via email

@kitfit-dave

> That's interesting David. I would've expected the new version to have the same config as the existing version, but you're saying that it doesn't?

Yep, that is exactly what I am seeing. All of our rewrites were gone after the file was uploaded and released. I would have expected the same thing as you, for the existing config to remain.

@puf
Author

puf commented Nov 11, 2020

Darn. I'm not sure if that's a change in behavior, because I likely never tried with a non-default config. I hope others find your comment/fix, and will merge it in if there are more reports. Thanks again!

@josalvmo

I am getting:

requireAuth({}, ['https://www.googleapis.com/auth/cloud-platform']).then(async () => {
^

TypeError: requireAuth is not a function
at Object.<anonymous> (C:\Dev\Repos\firebase-hosting-deploy-file\deployFile.js:21:1)
at Module._compile (internal/modules/cjs/loader.js:1138:30)
at Object.Module._extensions..js (internal/modules/cjs/loader.js:1158:10)
at Module.load (internal/modules/cjs/loader.js:986:32)
at Function.Module._load (internal/modules/cjs/loader.js:879:14)
at Function.executeUserEntryPoint [as runMain] (internal/modules/run_main.js:71:12)
at internal/main/run_main_module.js:17:47

Any idea?

@mikehardy

With firebase-tools >= 9.9.0 the requireAuth function has changed and now requires an Account object to be passed in; it won't automatically grab one.

These changes work (I'll propose the same here as edits):

// add this new import at the top
const { getGlobalDefaultAccount } = require('firebase-tools/lib/auth');

// use the import to get the current default account (assuming that is the account you want...)
const defaultAccount = getGlobalDefaultAccount();

// now send that account in, instead of the empty object before
//requireAuth({}, ['https://www.googleapis.com/auth/cloud-platform'])  // old version, no account
requireAuth(defaultAccount, ['https://www.googleapis.com/auth/cloud-platform'])  // new version, works

upstream issue firebase/firebase-tools#3308

@mikehardy

@puf I just realized I cannot actually PR to this file, but if you look at the comment immediately above ☝️ you will see a code edit required for firebase-tools >= 9.9.0

@puf
Author

puf commented Apr 24, 2021

Thanks for the update Mike. 👍 I merged the changes.
Does this mean it is no longer possible to run with the user that is logged in to the Firebase CLI? Or did we already lose that ability before?

@mikehardy

As far as I can tell, this is just a change / new feature in firebase-tools that allows more than one user to have stored credentials (if you use the new APIs) and lets you select among them. What I did with this specific change was use the new API that gets the "global default user" (aka what we had before) and pass that in, since some account has to be given to requireAuth now (it won't go find one if you don't provide one). And I think my change is backwards compatible with regard to behavior, though of course if people try to use this on firebase-tools <= 9.8.0 it will now fail.

So, to be precise and less wordy: this change should continue deploying single files using the user currently logged in to the Firebase CLI, same as ever (it does in my testing, anyway).

@puf
Author

puf commented Apr 24, 2021 via email

@puf
Author

puf commented Aug 25, 2022

If you find this script not working for you anymore, have a look at its (more official) replacement here: https://github.com/firebase/firebase-tools/tree/master/scripts/examples/hosting/update-single-file
