If you have ever visited this site before you may have noticed a few changes over the last week. I have been tinkering with the WordPress API to better understand what is involved with creating a decoupled theme. As with all good projects I started by setting myself a few goals:
- Learn. I work with React most days so I wanted to try something else to better understand modern web development, not just a particular framework
- Speed. I wanted a site that is extremely fast, lightweight and better for the planet
- Resilient. A site that can handle LOTS of traffic and is also available via the P2P web
- Low cost. Less than £15 a month to host and maintain
At the heart of the new site is the Nuxt.js framework. Nuxt is a framework for creating universal Vue.js applications and in many ways it feels like working on a WordPress theme. It has a folder for your pages/templates, a hook API, and support for plugins and components (or template parts if you're thinking WordPress). Nuxt follows a philosophy of convention over configuration: drop Vue components in their respective folders and Nuxt will bind everything together.
This fairly rigid structure is important to the framework as it allows it to operate in three distinct modes:
- Single page application (SPA) – Run as a regular Vue web app
- Server Side Rendered (SSR) – For each request, pre-render the application in a server and send that back to the client
- Static (Generated) – Pre-render the application once during build, and then serve it as an SPA
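The mode is a single setting in `nuxt.config.js`, combined with which build command you run. A minimal sketch (the routes here are placeholders, not this site's real pages):

```javascript
// nuxt.config.js — sketch of how the three modes are selected
module.exports = {
  // 'universal' = pre-render on a server, either per request
  // (`nuxt build` + `nuxt start`) or once at build time (`nuxt generate`);
  // 'spa' = no pre-rendering, run as a regular client-side Vue app
  mode: 'universal',

  generate: {
    // routes to pre-render when running `nuxt generate`
    routes: ['/', '/about', '/blog/hello-world']
  }
}
```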
I was personally interested in the static (generated) and SPA modes, which would allow me to host the interface in isolation from WordPress and without the need for Node.js hosting. This meant I could make use of some of the free static hosting options out there like GitHub Pages or surge.sh.
Out of the box Nuxt supports routing (vue router), a store (vuex), filters (core vue 2) and meta data (vue-meta). I added some additional plugins for progressive web app (PWA) features, animation, lazy loading, keyboard navigation, sitemap generation and API consumption with axios.
After some experimentation I discovered that Nuxt didn’t quite work as I expected it to. A statically generated version of the site is only static for the initial page load; the app is then bootstrapped and runs client side, making API requests back to the WordPress API. This is probably fine in most circumstances, but I wanted to have a version of the site served over Dat (the peer-to-peer web) and would rather that it didn’t need the API. I was looking for a truly static site.
Luckily Nuxt has some hooks which allowed me to cache a local copy of the API responses as JSON during the build process. When deployed, the app will then query itself rather than the WordPress API. You can see some of the logic here and here.
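The general shape of that build-time caching can be sketched with Nuxt's `generate.routes` hook; the API URL, file paths and `per_page` value below are illustrative assumptions, not the site's actual code:

```javascript
// nuxt.config.js — sketch of caching WordPress API responses during `nuxt generate`.
// The domain, paths and query parameters are placeholders.
const axios = require('axios')
const fs = require('fs')

module.exports = {
  generate: {
    async routes () {
      // fetch all posts once from the WordPress REST API at build time
      const { data } = await axios.get(
        'https://api.example.com/wp-json/wp/v2/posts?per_page=100'
      )

      // write each response into static/, so the deployed app can request
      // ./data/<slug>.json from itself instead of hitting WordPress
      fs.mkdirSync('static/data', { recursive: true })
      data.forEach(post => {
        fs.writeFileSync(`static/data/${post.slug}.json`, JSON.stringify(post))
      })

      // tell Nuxt which routes to pre-render
      return data.map(post => `/${post.slug}`)
    }
  }
}
```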
The WordPress side didn’t need a great deal of tinkering. I moved the site to api.scott.ee and set a redirect on that to the main site. I then introduced a couple of filters to support lazy loaded images and enabled Photon in Jetpack so images are served from a CDN. Images are the only assets that are not part of the generated build of the site.
WordPress should see very little traffic (if any) and only ever be hit when I want to write some new content or during the site build/generate process when the API is fetched and cached.
Nuxt takes care of a great deal of the app bundling, minification and optimisation for you. Add the PWA module and you can get a decent Lighthouse score right out of the box. I also recommend installing the Nuxt webpack monitor module, which allows you to inspect the size of your bundles; this is useful when you start to make use of third-party add-ons. Running `yarn nuxt build --webpackmonitor --analyze` will launch an interactive breakdown of how your site is put together.
The next thing to implement was lazy loading, which ensures images do not waste bandwidth by loading unnecessarily. Using Jetpack’s Photon CDN I can request a much smaller version of the original image to act as a loading image while the original, full size image is fetched. I use CSS filters to blur the loading image slightly and make it greyscale. The result is rather neat.
My final trick for images is to detect low bandwidth (2G/3G) visitors and those with the data saver option enabled in their browser. In these scenarios the image quality is reduced to 50% and the resolution divided by four.
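Detecting those visitors can be done with the browser's Network Information API (`navigator.connection`) and reflected in the Photon query string. A sketch, where the helper name, breakpoints and URL shape are illustrative rather than the site's real code:

```javascript
// Sketch: adapt a Photon image URL to the visitor's connection.
// Photon accepts `w` (width) and `quality` query parameters.
function photonUrl (src, width) {
  // navigator.connection is not available in every browser, so guard for it
  const conn = typeof navigator !== 'undefined' && navigator.connection

  // Data Saver enabled, or an effective connection type of 2G/3G
  const slow = conn &&
    (conn.saveData || /(^|-)[23]g$/.test(conn.effectiveType || ''))

  const params = slow
    ? { w: Math.round(width / 4), quality: 50 } // quarter resolution, half quality
    : { w: width }

  const query = Object.keys(params).map(k => `${k}=${params[k]}`).join('&')
  return `${src}?${query}`
}
```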
As you page around the site a request for the next page is made in the background of the current page to help speed things up too.
I had grand plans for this. My aim was to use Cloudflare’s load balancing feature to split the site hosting between Netlify and GitHub pages. That way if either service went down, the site would stay up. Cloudflare would also act as a third backup / cache if things got really messy. I struggled to get the load balancing health checks to work and the load balancing feature came in at a whopping cost of $5 a month so I decided to park that idea for now.
After some speed testing I decided to keep things simple. It turns out that GitHub Pages is one of the fastest places to put your code and Cloudflare only seemed to make it slower (albeit just a little bit):
| Host | Cloudflare | First test | Second test | Third test | Score |
| --- | --- | --- | --- | --- | --- |
| Netlify (Sweden) | No | 2.04s | 1.79s | 1.69s | 88/100 |
| Netlify (Sweden) | Yes | 3.17s | 2.20s | 3.17s | 94/100 |
| GitHub Pages (Sweden) | No | 540ms | 200ms | 236ms | 88/100 |
| GitHub Pages (Sweden) | Yes | 786ms | 312ms | 277ms | 94/100 |
Dat describes itself as:
> A p2p hypermedia protocol. It provides public-key-addressed file archives which can be synced securely and browsed on-demand.
In short, you can host your website using peer to peer technology and it is super exciting. Everyone that visits a site is also potentially a host of the site.
In order to visit the Dat version of my site you can use the Beaker browser. Using Dat DNS allows the site to be found at: dat://scott.ee/. There is currently one mirror/peer that is always online at hashbase.io and one of my next projects will be to set a Raspberry Pi to the task of providing another mirror. I would love to see some other mirrors pop up too if you feel like adding it to your library.
Using Nuxt to statically generate the site makes pushing to Dat a much simpler task. I am not really sure how WordPress will adapt to the peer-to-peer web as things currently stand.
So how did I do with my goals?
Learn. I work with React most days so I wanted to try something else to better understand modern web development, not just a particular framework
Completed. I am fairly comfortable with the basics of Vue now and learnt a great deal about Nuxt as well.
Speed. I wanted a site that is extremely fast, lightweight and better for the planet
Pretty good. The load time hovers around 300ms and the page size at around 130 KB without images. The Lighthouse scores are fairly decent too, with the lowest score being 73 for progressive web app features (this is mostly because I have disabled the service worker while I get up to speed with how they work).
It turns out that a friend and colleague at work has also been experimenting with making WordPress more planet friendly at the same time. I highly recommend reading his post on serving WordPress in just 7 KB! Quite the achievement, and it shows there is probably a great deal of room for improvement here.
Resilient. A site that can handle LOTS of traffic and is redundant via the peer-to-peer web
Moderate. Without the load balancing there is a single point of failure. Having the Dat version ticks the box for the P2P web although I would like to consider seeding the site on other protocols as well.
Low cost. Less than £15 a month to host and maintain
Complete. Cost is £0.
Complete. No need for GDPR compliance 🙂
The site isn’t much to look at right now. I treated this as purely a technical exercise rather than a design one, but it provides the foundations for something much more interesting. My plans include:
- Gutenberg support
- Sustainable hosting. Performance is one part of the battle
- Store preservation between sessions, which may be a better approach than the locally cached API during build
- Service worker support. I would like to get the Lighthouse score to 100 and find some other performance gains
- An automated build and deploy script that only builds the difference in content
If you are interested in the code you can find it here: github.com/scottsweb/scott.ee.