Cache-limiting in Service Workers

When I was documenting my first Service Worker I mentioned that every time a user requests a page, I store that page in a cache for later (offline) use:

Right now I’m stashing any HTML pages the user visits into the cache. I don’t think that will get out of control—I imagine most people only ever visit just a handful of pages on my site. But there’s the chance that the cache could get quite bloated. Ideally I’d have some way of keeping the cache nice and lean.

I was thinking: maybe I should have a separate cache for HTML pages, and limit the number in that cache to, say, 20 or 30 items. Every time I push something new into that cache, I could pop the oldest item out.

I could imagine doing something similar for images: keeping a cache of just the most recent 10 or 20.

Well I’ve done that now. Here’s the updated Service Worker code.

I’ve got a function in there called stashInCache that takes a few arguments: which cache to use, the maximum number of items that should be in there, the request (URL), and the response:

var stashInCache = function(cacheName, maxItems, request, response) {
    caches.open(cacheName)
        .then(function (cache) {
            cache.keys()
                .then(function (keys) {
                    if (keys.length < maxItems) {
                        cache.put(request, response);
                    } else {
                        cache.delete(keys[0])
                            .then(function() {
                                cache.put(request, response);
                            });
                    }
                })
        });
};

It looks to see if the current number of items in the cache is less than the specified maximum:

if (keys.length < maxItems)

If so, go ahead and cache the item:

cache.put(request, response);

Otherwise, delete the first item from the cache and then put the item in the cache:

cache.delete(keys[0])
  .then(function() {
    cache.put(request, response);
  });

For HTML requests, I limit the cache to 35 items:

var copy = response.clone();
var cacheName = version + pagesCacheName;
var maxItems = 35;
stashInCache(cacheName, maxItems, request, copy);
return response;

For images, I’m limiting the cache to 20 items:

var copy = response.clone();
var cacheName = version + imagesCacheName;
var maxItems = 20;
stashInCache(cacheName, maxItems, request, copy);
return response;

Here’s my updated Service Worker.

The cache-limiting seems to be working for pages. But for some reason the images cache has blown past its allotted maximum of 20 (you can see the items in the caches under the “Resources” tab in Chrome under “Cache Storage”).

This is almost certainly because I’m doing something wrong or have completely misunderstood how the caching works. If you can spot what I’m doing wrong, please let me know.


Responses

Kartik Prabhu

So the cache is a queue—oldest items first—not a stack! Never realised this before. Thanks for sharing your code.

Aaron Gustafson

User experience encompasses more than just the interface. Download speed, render performance, and the cost of accessing a site are often overlooked areas when it comes to the practice of UX, but they all affect how users experience what we build on the Web.

I’m always looking for ways to improve these aspects of my own site. And, since it’s my own personal playground, I often use it as a test-bed for new technologies, ideas, and techniques. My latest adventure was inspired by a bunch of articles and posts I’ve linked to recently.

After reading these pieces, I decided to see how much I could do to improve the performance of this site, especially on posts with a lot of images and embedded code samples, like my recent post on form labels.

Using Resource Hints

To kick things off, I followed Malte’s advice and used Resource Hints to prime the pump for any third-party servers hosting assets I use frequently (e.g. Disqus, Twitter, etc.). I used the code Malte references in the AMP Project as my starting point and added two new methods (preconnect() and prefetch()) to my global AG object. With that library code in place, I can call those methods as necessary from within my other JavaScript files. Here’s a simplified extract from my Disqus integration script:

if ( 'AG' in window && 'preconnect' in window.AG ) {
    window.AG.preconnect( '//disqus.com/' );
    window.AG.prefetch( '//' + disqus_shortname + '.disqus.com/count.js' );
}

While a minor addition, the speed improvement in supporting browsers was noticeable.1
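If you’re curious what such helpers might look like, here’s a minimal sketch. The function name, signature, and the explicit doc parameter are my own assumptions for illustration (and to keep the helper exercisable outside a browser), not Aaron’s actual AG code:

```javascript
// Hypothetical sketch of a resource-hint helper: it injects a <link>
// element such as <link rel="preconnect" href="//disqus.com/"> into the
// page. The doc argument stands in for the browser's document object so
// the function can also be exercised in tests.
function addResourceHint(doc, rel, url) {
    var link = doc.createElement('link');
    link.rel = rel;   // 'preconnect', 'prefetch', 'dns-prefetch', ...
    link.href = url;
    doc.head.appendChild(link);
    return link;
}

// In the browser:
// addResourceHint(document, 'preconnect', '//disqus.com/');
// addResourceHint(document, 'prefetch', '//example.disqus.com/count.js');
```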

Integrating Service Worker

With that in the bag, I set about making my first Service Worker. I started off gently, using Dean’s piece as a guide. I added a WebP conversion bit to my image processing Gulp task to get the files in place and then I created the Service Worker. By default, Dean’s code converts all JPG and PNG requests to WebP responses, so I set it up to limit the requests to only those files being requested directly from my server. I have no way of knowing if WebP equivalents of every JPG and PNG exist on the open web (probably not), but I know they exist on my server. Here’s the updated code:

"use strict";

self.addEventListener('fetch', function(event) {
    var request = event.request,
        url = request.url,
        url_object = new URL( url ),
        re_jpg_or_png = /\.(?:jpg|png)$/,
        supports_webp = false, // pessimism
        webp_url;

    // Check if the image is a local jpg or png
    if ( re_jpg_or_png.test( request.url ) && url_object.origin == location.origin ) {
        // console.log('WORKER: caught a request for a local PNG or JPG');

        // Inspect the accept header for WebP support
        if ( request.headers.has('accept') ) {
            supports_webp = request.headers.get('accept').includes('webp');
        }

        // Browser supports WebP
        if ( supports_webp ) {
            // Make the new URL
            webp_url = url.substr(0, url.lastIndexOf('.')) + '.webp';

            event.respondWith(
                fetch( webp_url, { mode: 'no-cors' } )
            );
        }
    }
});
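The decision that worker makes boils down to a small pure function: is this a local JPG or PNG, and does the browser advertise WebP support in its Accept header? Here’s that logic pulled out on its own as a sketch (the helper name and signature are mine, purely for illustration):

```javascript
// Given a request URL, the page's origin, and the request's Accept header,
// decide whether a .webp equivalent should be served instead.
// Hypothetical helper -- the worker above inlines this same logic.
function webpUrlFor(url, origin, acceptHeader) {
    var isLocalRaster = /\.(?:jpg|png)$/.test(url) && url.indexOf(origin) === 0;
    var supportsWebp = (acceptHeader || '').indexOf('webp') !== -1;
    if (isLocalRaster && supportsWebp) {
        return url.substr(0, url.lastIndexOf('.')) + '.webp';
    }
    return null; // let the request proceed untouched
}
```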

When I began digging into the caching possibilities of Service Workers, following Nicolas’ and Jeremy’s posts, I opted to tweak Nicolas’ caching setup a bit. I’m still not completely thrilled with it, but it’s a work in progress. I’m sure I will tweak it as I get more familiar with the technology.

To keep my Service Worker code modularized (like my other JavaScript code), I opted to break it up into separate files and am using Gulp to merge them all together and move the combined file into the root of the site. If you’d like to follow a similar path, feel free to adapt this Gulp task (which builds all of my JavaScript):

var gulp = require('gulp'),
    path = require('path'),
    folder = require('gulp-folders'),
    gulpIf = require('gulp-if'),
    insert = require('gulp-insert'),
    concat = require('gulp-concat'),
    uglify = require('gulp-uglify'),
    notify = require('gulp-notify'),
    rename = require('gulp-rename'),
    //handleErrors = require('handleErrors'),
    source_folder = 'source/_javascript',
    destination_root = 'source',
    destination_folder = destination_root + '/j',
    public_root = 'public',
    public_folder = public_root + '/j',
    rename_serviceworker = rename({ dirname: "../" });

gulp.task('scripts', folder(source_folder, function(folder){
    return gulp.src(path.join(source_folder, folder, '*.js'))
        .pipe(concat(folder + '.js'))
        .pipe(insert.transform(function(contents, file){
            // insert a build time variable
            var build_time = (new Date()).getTime() + '';
            return contents.replace( '{{BUILD_TIME}}', build_time );
        }))
        .pipe(gulp.dest(destination_folder))
        .pipe(gulp.dest(public_folder))
        .pipe(rename({suffix: '.min'}))
        .pipe(uglify())
        .pipe(gulpIf(folder == 'serviceworker', rename_serviceworker))
        .pipe(gulp.dest(destination_folder))
        .pipe(gulp.dest(public_folder))
        .pipe(notify({ message: 'Scripts task complete' }));
        //.on('error', handleErrors);
}));

As most of the walkthroughs recommended versioning your Service Worker if you’re doing any caching, I set mine up to be auto-versioned by having the Gulp task above insert a timestamp into my Service Worker header file:

var gulp = require('gulp'),
    path = require('path'),
    folder = require('gulp-folders'),
    gulpIf = require('gulp-if'),
    insert = require('gulp-insert'),
    concat = require('gulp-concat'),
    uglify = require('gulp-uglify'),
    notify = require('gulp-notify'),
    rename = require('gulp-rename'),
    //handleErrors = require('handleErrors'),
    source_folder = 'source/_javascript',
    destination_root = 'source',
    destination_folder = destination_root + '/j',
    public_root = 'public',
    public_folder = public_root + '/j',
    rename_serviceworker = rename({ dirname: "../" });

gulp.task('scripts', folder(source_folder, function(folder){
    return gulp.src(path.join(source_folder, folder, '*.js'))
        .pipe(concat(folder + '.js'))
        .pipe(insert.transform(function(contents, file){
            // insert a build time variable
            var build_time = (new Date()).getTime() + '';
            return contents.replace( '{{BUILD_TIME}}', build_time );
        }))
        .pipe(gulp.dest(destination_folder))
        .pipe(gulp.dest(public_folder))
        .pipe(rename({suffix: '.min'}))
        .pipe(uglify())
        .pipe(gulpIf(folder == 'serviceworker', rename_serviceworker))
        .pipe(gulp.dest(destination_folder))
        .pipe(gulp.dest(public_folder))
        .pipe(notify({ message: 'Scripts task complete' }));
        //.on('error', handleErrors);
}));
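I don’t have the header file itself to show, but the idea is simple enough to sketch: a template token sits in the source, and the insert.transform() step swaps it for a timestamp at build time. The file contents below are a guess at the shape for illustration, not the real thing:

```javascript
// Hypothetical sketch of the build-time versioning step. In the real
// setup, contents would be the Service Worker header file read from disk;
// here it's inlined so the replacement can be demonstrated directly.
var contents = "var version = 'v{{BUILD_TIME}}:';";
var build_time = (new Date()).getTime() + '';
var versioned = contents.replace('{{BUILD_TIME}}', build_time);
// versioned now reads something like: var version = 'v1448841600000:';
```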

Service Workers are still pretty new (and modestly supported), but it’s definitely interesting to see what’s possible using them. Like Jeremy, I want to do a bit more exploration into caching and how it may actually increase the monetary cost of accessing a website if not used properly. Like any powerful tool, we need to wield it wisely.

Making Gists Static

On particularly code-heavy posts (yes, like this one), I make liberal use of Gists. They’re quite useful, but the Gist plugin for Jekyll, while good, still requests a script from Github in order to load the pretty printed version of the Gist. On some posts, that can mean 5 or more additional network requests, not to mention execution time for the JavaScript. It’s yet another dependency that could prohibit you from quickly getting to the content you’re looking for. Additionally, if JavaScript should be available, but isn’t, you get nothing (since the noscript content is only evaluated if JavaScript support isn’t available or if a user turns it off).

With all of this in mind, I decided to revise the plugin and make it capable of downloading the JavaScript code directly. It then extracts the HTML markup that the JavaScript would be writing into the page and just embeds it directly. It also caches the result, which is handy for speeding up the build process.
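As a rough idea of how that extraction can work: the embed script GitHub serves is a series of document.write() calls, so a static version can fetch the script once and pull the written markup out of the string literals instead of executing them. This is an illustrative sketch in JavaScript (the actual plugin is a Jekyll/Ruby fork, and the real escaping rules are messier than this):

```javascript
// Hypothetical sketch: extract the HTML that a Gist embed script would
// document.write() into the page, by matching the string literals and
// undoing the most common escape sequences.
function extractGistHtml(scriptSource) {
    var html = '';
    var re = /document\.write\('([\s\S]*?)'\)/g;
    var match;
    while ((match = re.exec(scriptSource)) !== null) {
        html += match[1]
            .replace(/\\n/g, '\n')
            .replace(/\\\//g, '/')
            .replace(/\\"/g, '"')
            .replace(/\\\\/g, '\\');
    }
    return html;
}
```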

You can grab my fork of the Gist Jekyll Plugin as, well, a Gist. It’s also in the source of this site on Github.

(Hopefully) A Little Faster

All told, these changes have gotten the render time of this site down significantly across the board.2 Even more so on browsers that support Service Workers and Resource Hints. I’ll likely continue tweaking as I go, but I wanted to share my process, code, and thoughts in case any of it might be useful to you in your own work. In the end, it’s all about creating better experiences for our users. How our sites perform is a big part of that.

  1. Sadly I forgot to run some speed tests prior to rolling out this change and I didn’t feel like rolling back the site, so I don’t have solid numbers for you. That said, it seemed to shave nearly 2 seconds off of the load time on heavy pages like the post I mentioned.

  2. Again, I don’t have the numbers, but I am routinely seeing DOMContentLoaded reached between 400-600ms with Service Worker caching in play.

brandonrozek.com

Summary: I rewrote how cache limiting works to address a few problems listed later in this post. Check out the gist for the updated code.

I wrote a function in my previous service worker post to help limit the cache. Here’s a reminder of what it looked like.


var limitCache = function(cache, maxItems) {
    cache.keys().then(function(items) {
        if (items.length > maxItems) {
            cache.delete(items[0]);
        }
    })
}

The Problem

Jeremy Keith updated the service worker on his site and noticed that the images cache had blown past the amount he allocated for it (post). Looking back at my service worker, I noticed that mine had the same shortcoming as well. So what happened? Service workers function in an asynchronous manner, meaning one can be processing not just one but many fetch events at the same time. This conflicts with step-by-step instructions such as deleting the first item from the cache before adding a new one, which Jeremy describes in his follow-up post.
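The failure mode is easy to reproduce with a toy model of the cache. In this sketch, plain arrays and Promises stand in for the real Cache API (the names here are mine, for illustration only): two concurrent “fetch events” each read the same snapshot of the keys, each tries to delete the same oldest entry (so the second delete is a no-op), and both then add their item, leaving the cache over its limit.

```javascript
// Toy model of the race: the array stands in for a cache, and the
// Promise gap between "read the keys" and "write" mirrors cache.keys()
// followed by cache.put() in a real Service Worker.
function stashInCache(cache, maxItems, item) {
    return Promise.resolve(cache.slice()).then(function (keys) {
        if (keys.length < maxItems) {
            cache.push(item);
        } else {
            // Delete the oldest key we saw. Another "event" may already
            // have deleted it, in which case this does nothing.
            var i = cache.indexOf(keys[0]);
            if (i !== -1) {
                cache.splice(i, 1);
            }
            cache.push(item);
        }
    });
}

var cache = ['a', 'b', 'c']; // already at the limit of 3

// Two fetch events race: both snapshot ['a', 'b', 'c'] before either writes.
Promise.all([
    stashInCache(cache, 3, 'd'),
    stashInCache(cache, 3, 'e')
]).then(function () {
    console.log(cache.length); // 4: one more than maxItems
});
```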

A Solution

Jeremy wrote a function to help trim the cache and asked when it would be appropriate to apply it.


var trimCache = function (cacheName, maxItems) {
    caches.open(cacheName)
        .then(function (cache) {
            cache.keys()
                .then(function (keys) {
                    if (keys.length > maxItems) {
                        cache.delete(keys[0])
                            .then(trimCache(cacheName, maxItems));
                    }
                });
        });
};

And that got me thinking. In what situations is this problem more likely to occur? This particular problem happens when a lot of files are being requested asynchronously; it doesn’t occur when only one file is being loaded. So when do we load a bunch of files? During page load. During page load, the browser might request CSS, JavaScript, images, etc., which for most websites is a lot of files. Let’s now move our focus back to the humble script.js. Before, the only role it played with service workers was registering the worker. However, if we can get the script to notify the service worker when the page is done loading, then the service worker will know when to trim the cache.


if ('serviceWorker' in navigator) {
    navigator.serviceWorker.register('https://yourwebsite.com/serviceworker.js', {scope: '/'});
}
window.addEventListener("load", function() {
    if (navigator.serviceWorker.controller != null) {
        navigator.serviceWorker.controller.postMessage({"command":"trimCache"});
    }
});

Why if (navigator.serviceWorker.controller != null)? Service Workers don’t take control of the page immediately, but only on subsequent page loads, as Jake Archibald explains. When the service worker does have control of the page, however, we can use the postMessage API to send it a message. Here, I provide a JSON-style object with a “command” of “trimCache”. Since we send this message to the service worker, we need to make sure that it can receive it.


self.addEventListener("message", function(event) {
    var data = event.data;
    if (data.command == "trimCache") {
        trimCache(version + "pages", 25);
        trimCache(version + "images", 10);
        trimCache(version + "assets", 30);
    }
});

Once it receives the command, it goes on to trim all of the caches.

Conclusion

So whenever you download a bunch of files, make sure to run navigator.serviceWorker.controller.postMessage({"command":"trimCache"}); in the main JavaScript file to trim the cache. A downside to this method is that since Service Workers don’t take control during the first page load, the cache isn’t trimmed until the second page load. If you can find a way to make this happen on the first page load, tell me about it/write a blog post. 🙂

Update: To get the service worker to take control of the page immediately, call self.skipWaiting() after the install event and self.clients.claim() after the activate event.

Current code for our humble service worker:


var version = 'v2.0.24:';

var offlineFundamentals = [
    '/',
    '/offline/'
];

//Add core website files to cache during serviceworker installation
var updateStaticCache = function() {
    return caches.open(version + 'fundamentals').then(function(cache) {
        return Promise.all(offlineFundamentals.map(function(value) {
            var request = new Request(value);
            var url = new URL(request.url);
            if (url.origin != location.origin) {
                request = new Request(value, {mode: 'no-cors'});
            }
            return fetch(request).then(function(response) {
                var cachedCopy = response.clone();
                return cache.put(request, cachedCopy);
            });
        }))
    })
};

//Clear caches with a different version number
var clearOldCaches = function() {
    return caches.keys().then(function(keys) {
        return Promise.all(
            keys
                .filter(function (key) {
                    return key.indexOf(version) != 0;
                })
                .map(function (key) {
                    return caches.delete(key);
                })
        );
    })
}

/*
    trims the cache
    If cache has more than maxItems then it removes the excess items starting from the beginning
*/
var trimCache = function (cacheName, maxItems) {
    caches.open(cacheName)
        .then(function (cache) {
            cache.keys()
                .then(function (keys) {
                    if (keys.length > maxItems) {
                        cache.delete(keys[0])
                            .then(trimCache(cacheName, maxItems));
                    }
                });
        });
};

//When the service worker is first added to a computer
self.addEventListener("install", function(event) {
    event.waitUntil(updateStaticCache()
        .then(function() {
            return self.skipWaiting();
        })
    );
})

self.addEventListener("message", function(event) {
    var data = event.data;

    //Send this command whenever many files are downloaded (ex: a page load)
    if (data.command == "trimCache") {
        trimCache(version + "pages", 25);
        trimCache(version + "images", 10);
        trimCache(version + "assets", 30);
    }
});

//Service worker handles networking
self.addEventListener("fetch", function(event) {

    //Fetch from network and cache
    var fetchFromNetwork = function(response) {
        var cacheCopy = response.clone();
        if (event.request.headers.get('Accept').indexOf('text/html') != -1) {
            caches.open(version + 'pages').then(function(cache) {
                cache.put(event.request, cacheCopy);
            });
        } else if (event.request.headers.get('Accept').indexOf('image') != -1) {
            caches.open(version + 'images').then(function(cache) {
                cache.put(event.request, cacheCopy);
            });
        } else {
            caches.open(version + 'assets').then(function add(cache) {
                cache.put(event.request, cacheCopy);
            });
        }

        return response;
    }

    //Fetch from network failed
    var fallback = function() {
        if (event.request.headers.get('Accept').indexOf('text/html') != -1) {
            return caches.match(event.request).then(function (response) {
                return response || caches.match('/offline/');
            })
        } else if (event.request.headers.get('Accept').indexOf('image') != -1) {
            return new Response('Offlineoffline', { headers: { 'Content-Type': 'image/svg+xml' }});
        }
    }

    //This service worker won't touch non-get requests
    if (event.request.method != 'GET') {
        return;
    }

    //For HTML requests, look for file in network, then cache if network fails.
    if (event.request.headers.get('Accept').indexOf('text/html') != -1) {
        event.respondWith(fetch(event.request).then(fetchFromNetwork, fallback));
        return;
    }

    //For non-HTML requests, look for file in cache, then network if no cache exists.
    event.respondWith(
        caches.match(event.request).then(function(cached) {
            return cached || fetch(event.request).then(fetchFromNetwork, fallback);
        })
    )
});

//After the install event
self.addEventListener("activate", function(event) {
    event.waitUntil(clearOldCaches()
        .then(function() {
            return self.clients.claim();
        })
    );
});

if ('serviceWorker' in navigator) {
    navigator.serviceWorker.register('https://brandonrozek.com/serviceworker.js', {scope: '/'});
}
window.addEventListener("load", function() {
    if (navigator.serviceWorker.controller != null) {
        navigator.serviceWorker.controller.postMessage({"command":"trimCache"});
    }
});

# Monday, November 30th, 2015 at 12:00am

brandonrozek.com

Summary: I rewrote how cache limiting works to address a few problems listed later in this post. Check out the gist for the updated code.

I wrote a function in my previous service worker post to help limit the cache. Here’s a reminder of what it looked like.


var limitCache = function(cache, maxItems) { cache.keys().then(function(items) { if (items.length > maxItems) { cache.delete(items[0]); } })
}

The Problem

Jeremy Keith updated the service worker on his site and noticed that the images has blown past the amount he allocated for it (post). Looking back at my service worker, I noticed that mine has the same shortcoming as well. So what happened? Service workers function in an asynchronous manner. Meaning it can be processing not just one, but many fetch events at the same time. This comes into conflict when there are synchronous instructions such as deleting the first item from the cache which Jeremy describes in his follow up post.

A Solution

Jeremy wrote a function to help trim the cache and asked when it would be appropriate to apply it.


var trimCache = function (cacheName, maxItems) { caches.open(cacheName) .then(function (cache) { cache.keys() .then(function (keys) { if (keys.length > maxItems) { cache.delete(keys[0]) .then(trimCache(cacheName, maxItems)); } }); });
};

And that got me thinking. In what situations is this problem more likely to occur? This particular problem happens when a lot of files are being called asynchronously. This problem doesn’t occur when only one file is being loaded. So when do we load a bunch of files? During page load. During page load, the browser might request css, javascript, images, etc. Which for most websites, is a lot of files. Let’s now move our focus back to the humble script.js. Before, the only role it played with service workers was registering the script. However, if we can get the script to notify the service worker when the page is done loading, then the service worker will know when to trim the cache.


if ('serviceWorker' in navigator) { navigator.serviceWorker.register('https://yourwebsite.com/serviceworker.js', {scope: '/'});
}
window.addEventListener("load", function() { if (navigator.serviceWorker.controller != null) { navigator.serviceWorker.controller.postMessage({"command":"trimCache"}); }
});

Why if (navigator.serviceWorker.controller != null)? Service Workers don’t take control of the page immediately but on subsequent page loads, Jake Archibald explains. When the service worker does have control of the page however, we can use the postMessage api to send a message to the service worker. Inside, I provided a json with a “command” to “trimCache”. Since we send the json to the service worker, we need to make sure that it can receive it.


self.addEventListener("message", function(event) { var data = event.data; if (data.command == "trimCache") { trimCache(version + "pages", 25); trimCache(version + "images", 10); trimCache(version + "assets", 30); }
});

Once it receives the command, it goes on to trim all of the caches.

Conclusion

So whenever you download a bunch of files, make sure to run navigator.serviceWorker.controller.postMessage({"command":"trimCache"}); on the main javascript file to trim the cache. A downside to this method is that since Service Workers don’t take control during the first page load, the cache isn’t trimmed until the second page load. If you can find a way to make it so that this event happens in the first page load tell me about it/write a blog post. 🙂 Update: To get the service worker to take control of the page immediately call self.skipWaiting() after the install event and self.clients.claim() after the activate event. Current code for our humble service worker:


var version = 'v2.0.24:'; var offlineFundamentals = [ '/', '/offline/'
]; //Add core website files to cache during serviceworker installation
var updateStaticCache = function() { return caches.open(version + 'fundamentals').then(function(cache) { return Promise.all(offlineFundamentals.map(function(value) { var request = new Request(value); var url = new URL(request.url); if (url.origin != location.origin) { request = new Request(value, {mode: 'no-cors'}); } return fetch(request).then(function(response) { var cachedCopy = response.clone(); return cache.put(request, cachedCopy); }); })) })
}; //Clear caches with a different version number
var clearOldCaches = function() { return caches.keys().then(function(keys) { return Promise.all( keys .filter(function (key) { return key.indexOf(version) != 0; }) .map(function (key) { return caches.delete(key); }) ); })
} /* trims the cache If cache has more than maxItems then it removes the excess items starting from the beginning
*/
var trimCache = function (cacheName, maxItems) { caches.open(cacheName) .then(function (cache) { cache.keys() .then(function (keys) { if (keys.length > maxItems) { cache.delete(keys[0]) .then(trimCache(cacheName, maxItems)); } }); });
}; //When the service worker is first added to a computer
self.addEventListener("install", function(event) { event.waitUntil(updateStaticCache() .then(function() { return self.skipWaiting(); }) );
}) self.addEventListener("message", function(event) { var data = event.data; //Send this command whenever many files are downloaded (ex: a page load) if (data.command == "trimCache") { trimCache(version + "pages", 25); trimCache(version + "images", 10); trimCache(version + "assets", 30); }
}); //Service worker handles networking
self.addEventListener("fetch", function(event) { //Fetch from network and cache var fetchFromNetwork = function(response) { var cacheCopy = response.clone(); if (event.request.headers.get('Accept').indexOf('text/html') != -1) { caches.open(version + 'pages').then(function(cache) { cache.put(event.request, cacheCopy); }); } else if (event.request.headers.get('Accept').indexOf('image') != -1) { caches.open(version + 'images').then(function(cache) { cache.put(event.request, cacheCopy); }); } else { caches.open(version + 'assets').then(function add(cache) { cache.put(event.request, cacheCopy); }); } return response; } //Fetch from network failed var fallback = function() { if (event.request.headers.get('Accept').indexOf('text/html') != -1) { return caches.match(event.request).then(function (response) { return response || caches.match('/offline/'); }) } else if (event.request.headers.get('Accept').indexOf('image') != -1) { return new Response('Offlineoffline', { headers: { 'Content-Type': 'image/svg+xml' }}); } } //This service worker won't touch non-get requests if (event.request.method != 'GET') { return; } //For HTML requests, look for file in network, then cache if network fails. if (event.request.headers.get('Accept').indexOf('text/html') != -1) { event.respondWith(fetch(event.request).then(fetchFromNetwork, fallback)); return; } //For non-HTML requests, look for file in cache, then network if no cache exists. event.respondWith( caches.match(event.request).then(function(cached) { return cached || fetch(event.request).then(fetchFromNetwork, fallback); }) ) }); //After the install event
self.addEventListener("activate", function(event) { event.waitUntil(clearOldCaches() .then(function() { return self.clients.claim(); }) );
});

if ('serviceWorker' in navigator) { navigator.serviceWorker.register('https://brandonrozek.com/serviceworker.js', {scope: '/'});
}
window.addEventListener("load", function() { if (navigator.serviceWorker.controller != null) { navigator.serviceWorker.controller.postMessage({"command":"trimCache"}); }
});

# Monday, November 30th, 2015 at 12:00am

brandonrozek.com

Summary: I rewrote how cache limiting works to address a few problems listed later in this post. Check out the gist for the updated code. I wrote a function in my previous service worker post to help limit the cache. Here’s a reminder of what it looked like.


var limitCache = function(cache, maxItems) {
 cache.keys().then(function(items) {
 if (items.length > maxItems) {
 cache.delete(items[0]);
 }
 })
}

The Problem

Jeremy Keith updated the service worker on his site and noticed that the images has blown past the amount he allocated for it (post). Looking back at my service worker, I noticed that mine has the same shortcoming as well.

So what happened?

Service workers function in an asynchronous manner. Meaning it can be processing not just one, but many fetch events at the same time. This comes into conflict when there are synchronous instructions such as deleting the first item from the cache which Jeremy describes in his follow up post.

A Solution

Jeremy wrote a function to help trim the cache and asked when it would be appropriate to apply it.


var trimCache = function (cacheName, maxItems) {
 caches.open(cacheName)
 .then(function (cache) {
 cache.keys()
 .then(function (keys) {
 if (keys.length > maxItems) {
 cache.delete(keys[0])
 .then(trimCache(cacheName, maxItems));
 }
 });
 });
};

And that got me thinking. In what situations is this problem more likely to occur? This particular problem happens when a lot of files are being called asynchronously. This problem doesn’t occur when only one file is being loaded.

So when do we load a bunch of files? During page load.

During page load, the browser might request css, javascript, images, etc. Which for most websites, is a lot of files.

Let’s now move our focus back to the humble script.js. Before, the only role it played with service workers was registering the script. However, if we can get the script to notify the service worker when the page is done loading, then the service worker will know when to trim the cache.


if ('serviceWorker' in navigator) {
 navigator.serviceWorker.register('https://yourwebsite.com/serviceworker.js', {scope: '/'});
}
window.addEventListener("load", function() {
 if (navigator.serviceWorker.controller != null) {
 navigator.serviceWorker.controller.postMessage({"command":"trimCache"});
 }
});

Why if (navigator.serviceWorker.controller != null)? Service Workers don’t take control of the page immediately but on subsequent page loads, Jake Archibald explains.

When the service worker does have control of the page however, we can use the postMessage api to send a message to the service worker. Inside, I provided a json with a “command” to “trimCache”.

Since we send the json to the service worker, we need to make sure that it can receive it.


self.addEventListener("message", function(event) {
 var data = event.data;
 
 if (data.command == "trimCache") {
 trimCache(version + "pages", 25);
 trimCache(version + "images", 10);
 trimCache(version + "assets", 30);
 }
});

Once it receives the command, it goes on to trim all of the caches.

Conclusion

So whenever you download a bunch of files, make sure to run navigator.serviceWorker.controller.postMessage({"command":"trimCache"}); on the main javascript file to trim the cache. A downside to this method is that since Service Workers don’t take control during the first page load, the cache isn’t trimmed until the second page load. If you can find a way to make it so that this event happens in the first page load tell me about it/write a blog post. :)

Update: To get the service worker to take control of the page immediately call self.skipWaiting() after the install event and self.clients.claim() after the activate event.

Current code for our humble service worker:


var version = 'v2.0.24:';

var offlineFundamentals = [
 '/',
 '/offline/'
];

//Add core website files to cache during serviceworker installation
var updateStaticCache = function() {
 return caches.open(version + 'fundamentals').then(function(cache) {
 return Promise.all(offlineFundamentals.map(function(value) {
 var request = new Request(value);
 var url = new URL(request.url);
 if (url.origin != location.origin) {
 request = new Request(value, {mode: 'no-cors'});
 }
 return fetch(request).then(function(response) { 
 var cachedCopy = response.clone();
 return cache.put(request, cachedCopy); 
 
 });
 }))
 })
};

//Clear caches with a different version number
var clearOldCaches = function() {
 return caches.keys().then(function(keys) {
 return Promise.all(
 keys
 .filter(function (key) {
 return key.indexOf(version) != 0;
 })
 .map(function (key) {
 return caches.delete(key);
 })
 );
 })
}

/*
    Trims the cache.
    If the cache has more than maxItems, removes the excess items
    starting from the beginning (the oldest entries).
*/
var trimCache = function (cacheName, maxItems) {
    caches.open(cacheName)
        .then(function (cache) {
            cache.keys()
                .then(function (keys) {
                    if (keys.length > maxItems) {
                        cache.delete(keys[0])
                            .then(function () {
                                trimCache(cacheName, maxItems);
                            });
                    }
                });
        });
};


//When the service worker is first added to a computer
self.addEventListener("install", function (event) {
    event.waitUntil(updateStaticCache()
        .then(function () {
            return self.skipWaiting();
        })
    );
});

self.addEventListener("message", function (event) {
    var data = event.data;

    //Send this command whenever many files are downloaded (ex: a page load)
    if (data.command == "trimCache") {
        trimCache(version + "pages", 25);
        trimCache(version + "images", 10);
        trimCache(version + "assets", 30);
    }
});

//Service worker handles networking
self.addEventListener("fetch", function (event) {

    //Fetch from network and cache
    var fetchFromNetwork = function (response) {
        var cacheCopy = response.clone();
        if (event.request.headers.get('Accept').indexOf('text/html') != -1) {
            caches.open(version + 'pages').then(function (cache) {
                cache.put(event.request, cacheCopy);
            });
        } else if (event.request.headers.get('Accept').indexOf('image') != -1) {
            caches.open(version + 'images').then(function (cache) {
                cache.put(event.request, cacheCopy);
            });
        } else {
            caches.open(version + 'assets').then(function (cache) {
                cache.put(event.request, cacheCopy);
            });
        }

        return response;
    };

    //Fetch from network failed
    var fallback = function () {
        if (event.request.headers.get('Accept').indexOf('text/html') != -1) {
            return caches.match(event.request).then(function (response) {
                return response || caches.match('/offline/');
            });
        } else if (event.request.headers.get('Accept').indexOf('image') != -1) {
            //Respond with a placeholder SVG when an image can't be fetched
            return new Response('<svg role="img" aria-labelledby="offline-title" viewBox="0 0 400 300" xmlns="http://www.w3.org/2000/svg"><title id="offline-title">Offline</title><text x="50%" y="50%" text-anchor="middle">offline</text></svg>', { headers: { 'Content-Type': 'image/svg+xml' }});
        }
    };

    //This service worker won't touch non-GET requests
    if (event.request.method != 'GET') {
        return;
    }

    //For HTML requests, look for file in network, then cache if network fails.
    if (event.request.headers.get('Accept').indexOf('text/html') != -1) {
        event.respondWith(fetch(event.request).then(fetchFromNetwork, fallback));
        return;
    }

    //For non-HTML requests, look for file in cache, then network if no cache exists.
    event.respondWith(
        caches.match(event.request).then(function (cached) {
            return cached || fetch(event.request).then(fetchFromNetwork, fallback);
        })
    );
});

//After the install event
self.addEventListener("activate", function (event) {
    event.waitUntil(clearOldCaches()
        .then(function () {
            return self.clients.claim();
        })
    );
});

//In the main JavaScript file (not the service worker):
if ('serviceWorker' in navigator) {
    navigator.serviceWorker.register('https://brandonrozek.com/serviceworker.js', {scope: '/'});
}
window.addEventListener("load", function () {
    if (navigator.serviceWorker.controller != null) {
        navigator.serviceWorker.controller.postMessage({"command": "trimCache"});
    }
});

# Sunday, December 27th, 2015 at 7:01am


Related posts

Am I cached or not?

Complementing my site’s service worker strategy with an extra interface element.

Timing out

A service worker strategy for dealing with lie-fi.

Going Offline—the talk of the book

…of the T-shirt.

The audience for Going Offline

A book about service workers that doesn’t assume any prior knowledge of JavaScript.

My first Service Worker

Enhancing my site with the niftiest new technology.

Related links

Now THAT’S What I Call Service Worker! – A List Apart

This is terrific! Jeremy shows how you can implement a fairly straightforward service worker for performance gains, but then really kicks it up a notch with a recipe for turning a regular website into a speedy single page app without framework bloat.


Service Workers | Go Make Things

Chris Ferdinandi blogs every day about the power of vanilla JavaScript. For over a week now, his daily posts have been about service workers. The cumulative result is this excellent collection of resources.


Offline Page Descriptions | Erik Runyon

Here’s a nice example of showing pages offline. It’s subtly different from what I’m doing on my own site, which goes to show that there’s no one-size-fits-all recipe when it comes to offline strategies.


Distinguishing cached vs. network HTML requests in a Service Worker | Trys Mudford

Less than 24 hours after I put the call out for a solution to this gnarly service worker challenge, Trys has come up with a solution.


Offline fallback page with service worker - Modern Web Development: Tales of a Developer Advocate by Paul Kinlan

Paul describes a fairly straightforward service worker recipe: a custom offline page for failed requests.

