to the head. This line of code instructs the browser to start downloading the font in parallel to the CSS, so by the time it is needed it is hopefully already available.
The DNS/TCP/TLS issue extends beyond fonts. Instead of importing scripts and styles from external CDNs, host them yourself. Every third-party origin costs an extra DNS lookup, TCP connection, and TLS handshake before the first byte arrives; self-hosting avoids all three.
When the browser encounters a script element, it pauses HTML parsing to fetch and execute the JavaScript. This is detrimental to performance. Historically, programmers worked around this by placing their script tags at the bottom of the HTML. Unfortunately, this means the JavaScript won't begin downloading as soon as possible. In modern times, we have two tools at our disposal. The `defer` attribute tells browsers to download the JavaScript in the background, but run it once the HTML is done parsing. The `async` attribute also tells browsers to download JavaScript in the background, but run it as soon as it's ready. The proper attribute depends on the context, but both have positive impacts on performance.
| | Defer | Async | Script on top | Script on bottom |
|---|---|---|---|---|
| Begins download instantly | ✅ | ✅ | ✅ | ❌ |
| Runs | At DOM ready | As soon as downloaded | As soon as downloaded | At DOM ready |
| Doesn't block rendering | ✅ | ✅ | ❌ | ✅ |
| Good for | Scripts depending on the DOM | Background tasks (analytics, etc.) | (formerly) background scripts and libraries | (formerly) scripts depending on the DOM |
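As a quick sketch of the two attributes in practice (the file names here are made up for illustration):

```html
<!-- Hypothetical script names, for illustration only. -->
<!-- async: analytics doesn't touch the DOM, so run it whenever it arrives. -->
<script src="/js/analytics.js" async></script>
<!-- defer: the menu script queries the DOM, so wait until parsing finishes. -->
<script src="/js/menu.js" defer></script>
```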
So this is where the "at all costs" comes in. I'm really, really embarrassed and regretful about this one. My favicon was the second biggest file on my website, which felt really wrong to me. I began by minifying the PNG about 10 times before turning it into an ICO, which unfortunately increased the file size. A couple of things: browsers automatically request the `favicon.ico` path for an icon if there is no `<link>` to one. Naturally, so as not to compromise on icon size (PNG vs ICO) nor HTML size (eww, an extra link tag!), I concluded the best route forward was to use the incorrect extension for my PNG file. I have a PNG file named `favicon.ico`.
If you have an email anywhere on your site and also happen to use Cloudflare, it might helpfully begin obfuscating your email and adding an additional 1.1 kB script named `email-decode.min.js`. It's not particularly advanced. To prove my point, this is a re-implementation of the decode function:
function decode(token: string) {
  // Read the two hex digits starting at offset b as one byte.
  const hexadecimal = (a: string, b: number) => parseInt(a.substr(b, 2), 16);
  // The first byte of the token is the XOR key.
  const hexadecimalToken = hexadecimal(token, 0);
  let decoded = "";
  for (let i = 2; i < token.length; i += 2) {
    decoded += String.fromCharCode(hexadecimal(token, i) ^ hexadecimalToken);
  }
  return decoded;
}
And here it is, minified!
function(t){const e=(t,e)=>parseInt(t.substr(e,2),16),n=e(t,0);let r="";for(let o=2;o<t.length;o+=2)r+=String.fromCharCode(e(t,o)^n);return r}
To save a little bit of bandwidth, you could disable the feature in Cloudflare and inline the decoder into your main script. I have not bothered to write an encoder yet, and you would need one for the decoder to be of any use. I also suspect scrapers are smart enough to decode Cloudflare-obfuscated emails anyway, so you are much better off devising your own scheme.
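For completeness, here is a sketch of what a matching encoder could look like; the function name and default key are my own invention, not anything Cloudflare ships:

```javascript
// Hypothetical encoder matching the decoder above: emit a one-byte XOR key
// as the first two hex digits, then each character code XORed with that key.
function encode(email, key = 0x42) {
  const toHex = (n) => n.toString(16).padStart(2, "0");
  let token = toHex(key);
  for (const ch of email) {
    token += toHex(ch.charCodeAt(0) ^ key);
  }
  return token;
}

// encode("a", 0x42) yields "4223": "42" is the key, and 0x61 ^ 0x42 = 0x23.
```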
I have seen this technique under a large variety of umbrellas, including pjax, client side routing, single page applications, asynchronous navigation, and link prefetching. Really, they all work to achieve similar goals — making client side navigation as fast as possible.
There are certain scripts that fetch routes prematurely based on predictions about user input. Ideally, by the time the user releases their mouse, the next page is already downloaded or close to it. A few popular options:
| | instant.page | InstantClick | Quicklink |
|---|---|---|---|
| Last commit | Recent | 2014 | Recent |
| Method | Hover, mousedown, viewport | Hover | Viewport |
| Client-side routing | ❌ (Full reload) | Hotswap body + title | ❌ (Full reload) |
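The hover-based approach boils down to something like this sketch (greatly simplified from what these libraries actually do): when the cursor enters a link, add a prefetch hint so the page is likely cached before the click lands.

```javascript
// Sketch of hover-triggered prefetching; browser-only code.
document.addEventListener("mouseover", (event) => {
  const link = event.target.closest("a[href]");
  if (!link || link.origin !== location.origin) return;
  if (link.dataset.prefetched) return; // one hint per link is enough
  // Insert <link rel="prefetch"> so the browser fetches the page into cache.
  const hint = document.createElement("link");
  hint.rel = "prefetch";
  hint.href = link.href;
  document.head.appendChild(hint);
  link.dataset.prefetched = "true";
});
```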
The job of the client-side router is to turn MPAs (multi-page apps) into SPAs (single-page apps). If this sounds like gibberish to you, here's the rundown. Whenever you click a link on a normal website, the browser fully refreshes the page. This could mean rerunning all your scripts, parsing all your CSS, and generally running a ton of wasteful operations. Instead, client-side routers hijack click events, run a fetch in the background, and only change the parts of the website that changed. This results in really snappy clicking (Quartz prefetches on hover as well).
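A bare-bones version of that hijack-fetch-swap cycle might look like the sketch below. This is a sketch only: real routers also re-execute scripts, manage scroll position, and handle the back button via popstate.

```javascript
// Sketch of a client-side router: intercept same-origin link clicks,
// fetch the next page, and hotswap only the <title> and <body>.
document.addEventListener("click", async (event) => {
  const link = event.target.closest("a[href]");
  if (!link || link.origin !== location.origin) return;
  event.preventDefault();
  const html = await fetch(link.href).then((r) => r.text());
  const next = new DOMParser().parseFromString(html, "text/html");
  document.title = next.title;
  document.body.replaceWith(next.body);
  history.pushState({}, "", link.href); // keep the address bar in sync
});
```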
For client side routers, you are looking for standalone options. http://microjs.com/#Router could be a good starting place. I’ve seen Turbo and Million Router in the wild. Fireship has published flamethrower. Navigo looks decent.
Ben Holmes has his own implementation, and generally speaking his site has lots of good performance tips as well.
Function Dynamic has the fastest website I've seen. They describe it as "Asynchronous Navigation".
https://quartz.jzhao.xyz is also quite fast, and open source! They use the Million router.