@adactio
Last active August 18, 2023 09:15
An attempt at a minimal viable service worker.
// Licensed under a CC0 1.0 Universal (CC0 1.0) Public Domain Dedication
// http://creativecommons.org/publicdomain/zero/1.0/

// HTML files: try the network first, then the cache.
// Other files: try the cache first, then the network.
// Both: cache a fresh version if possible.
// (beware: the cache will grow and grow; there's no cleanup)

const cacheName = 'files';

addEventListener('fetch', fetchEvent => {
  const request = fetchEvent.request;
  if (request.method !== 'GET') {
    return;
  }
  fetchEvent.respondWith(async function() {
    const fetchPromise = fetch(request);
    fetchEvent.waitUntil(async function() {
      const responseFromFetch = await fetchPromise;
      const responseCopy = responseFromFetch.clone();
      const myCache = await caches.open(cacheName);
      return myCache.put(request, responseCopy);
    }());
    if (request.headers.get('Accept').includes('text/html')) {
      try {
        return await fetchPromise;
      }
      catch(error) {
        return caches.match(request);
      }
    } else {
      const responseFromCache = await caches.match(request);
      return responseFromCache || fetchPromise;
    }
  }());
});
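
Not shown in the gist: a page still needs to register the worker before any of the above runs. A minimal sketch, assuming the script above is served from the site root at a path like /serviceworker.js (the filename here is a placeholder):

// Register the service worker from a regular page script.
// '/serviceworker.js' is a placeholder path; use wherever the worker file is actually served from.
if ('serviceWorker' in navigator) {
  navigator.serviceWorker.register('/serviceworker.js');
}
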
@jakearchibald

Line 8: Typo, should be addEventListener('fetch', fetchEvent => {. I didn't even notice this until I pasted it into VSCode & it complained 😄.

Line 14: This try block doesn't have a catch/finally.

Line 15: Awaiting the fetch here means you're blocking on the network, even when you're returning cache-first.

Line 16: .clone is sync, so you don't need to await it, although it does no harm. Given that responseCopy is only used for caching, I'd put it within the waitUntil block, since that's the only place it's used.

Line 22: I'm kinda undecided, but is request.mode === 'navigate' a better signal here?

Incorporating the above:

const cacheName = 'files';

addEventListener('fetch', fetchEvent => {
  const { request } = fetchEvent;
  
  if (request.method !== 'GET') {
    return;
  }

  fetchEvent.respondWith(async function() {
    const fetchPromise = fetch(request);

    fetchEvent.waitUntil(async function () {
      const responseCopy = (await fetchPromise).clone();
      const myCache = await caches.open(cacheName);
      await myCache.put(request, responseCopy);
    }());

    if (request.mode === 'navigate') {
      try {
        return await fetchPromise;
      }
      catch {
        return caches.match(request);
      }
    } else {
      const responseFromCache = await caches.match(request);
      return responseFromCache || fetchPromise;
    }
  }());
});

@adactio

adactio commented Jan 17, 2018

Ah, right! Thank you so much, Jake. I'll have another stab based on your feedback.

@adactio

adactio commented Jan 18, 2018

Okay, bearing in mind that I’m quite dim, I’m hoping to better understand this bit:

Awaiting the fetch here means you're blocking on the network, even when you're returning cache-first.

So there's a difference between saying fetch(request) and saying await fetch(request), right? But in this scenario, no matter what happens, I want to fetch the resource (to return it, to cache it, or both). Now when you say that including await means I’m blocking on the network, does that mean the fetch happens and then the cache.match happens? I thought everything would be asynchronous anyway... so I could safely start fetching the resource even before I know whether I want to return it or not (because I'm definitely going to put it in the cache).

I guess I’m trying to figure out how the presence or absence of await before that fetch changes the order of events.

@jakearchibald

jakearchibald commented Jan 18, 2018

Here's a function that creates a promise that resolves after some milliseconds:

function wait(ms) {
  return new Promise(r => setTimeout(r, ms));
}

This function takes two seconds to complete:

async function demo() {
  await wait(1000);
  await wait(1000);
  console.log('Done!');
}

The async function yields for a second when it hits the first wait(1000), then it continues and yields for another second when it hits the second wait(1000).

Whereas this function takes one second to complete:

async function demo() {
  const waitPromise = wait(1000);
  await wait(1000);
  await waitPromise;
  console.log('Done!');
}

It gets a promise for a one-second wait, but it doesn't await it; instead it goes straight to the next line, await wait(1000);, where it does wait one second. Then it awaits the completion of waitPromise, but that started a second ago, so it's already done.

Async functions are brilliant, but you need to ensure that you're still allowing things to happen in parallel, and avoid waiting on stuff you don't want to wait on.

addEventListener('fetch', event => {
  event.respondWith(async function() {
    // With this next line, the function yields until it can provide a response object,
    // which needs HTTP response headers.
    const networkResponse = await fetch(event.request);
    
    event.waitUntil(async function() {
      // Imagine the caching of the networkResponse happens here
    }());
    
    // Cache-first approach
    const cachedResponse = await caches.match(event.request);
    return cachedResponse || networkResponse;
  }());
});

The above isn't quite right, as the function yields on awaiting a network response before it can return the final value, so even the cache-first approach is blocked on the network.

addEventListener('fetch', event => {
  event.respondWith(async function() {
    // This time, we don't await the network response, but we do start it
    const networkResponsePromise = fetch(event.request);
    
    event.waitUntil(async function() {
      const networkResponseClone = (await networkResponsePromise).clone();
      // Imagine the caching of the networkResponseClone happens here
    }());
    
    // Cache-first approach
    const cachedResponse = await caches.match(event.request);
    return cachedResponse || (await networkResponsePromise);
  }());
});

The above example doesn't have the same problem.

@adactio

adactio commented Jan 18, 2018

Got it!

And for bonus points, that last line can be simplified to return cachedResponse || networkResponsePromise;.
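
For reference, here is the cache-first branch from the example above with that simplification applied; inside the async function, returning the promise and returning its awaited value behave the same here, since this branch has no try/catch that needs to see a rejection:

    // Cache-first approach: return the cached response if there is one,
    // otherwise hand back the (possibly still pending) network promise.
    const cachedResponse = await caches.match(event.request);
    return cachedResponse || networkResponsePromise;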

@paulyabsley

Thank you for this. I have the caching working and used the offline fallback example from the 2018.ampersand site.

Just wondered if you or @jakearchibald have a strategy for dealing with the requests that Google Analytics makes to https://www.google-analytics.com/collect and a way to stop them from being added to the cache?

@adactio

adactio commented Jun 17, 2018

@paulyabsley One option is to add a check that skips over any requests for files that aren't from your own domain, similar to how we're ignoring any non-GET requests. Right before or after that bit, you could add something like:

const url = new URL(request.url);
if (url.origin !== location.origin) {
    return;
}
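
In context, that check sits alongside the existing method guard at the top of the fetch handler. A sketch combining the two early returns from the examples above:

addEventListener('fetch', fetchEvent => {
  const request = fetchEvent.request;

  // Ignore anything that isn't a GET request.
  if (request.method !== 'GET') {
    return;
  }

  // Ignore cross-origin requests (such as the Google Analytics beacon),
  // so they never reach the cache.
  const url = new URL(request.url);
  if (url.origin !== location.origin) {
    return;
  }

  // ...respondWith / caching logic as above...
});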

@paulyabsley

Ah cool, thanks!

@dracos

dracos commented Dec 18, 2019

Just to note that the main gist here is missing an await in the "return fetchPromise" line (as given in @jakearchibald's update), which means that, as-is, it won't respond from the cache for HTML pages: the promise itself can always be returned, so it will be, regardless of whether it later resolves or rejects, and the catch never gets a chance to run.
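
To spell out why that matters (a sketch using the names from the gist): inside the try block, return fetchPromise hands back the still-pending promise immediately, so a later network failure rejects outside the try and the catch never runs; return await fetchPromise waits for the promise to settle inside the try, so a failure falls through to the cached copy.

try {
  // return fetchPromise;       // rejection happens after we've already returned; catch never fires
  return await fetchPromise;    // rejection surfaces here, inside the try
}
catch (error) {
  return caches.match(request); // so the cache fallback can actually run
}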
