It was time to redesign my website, and I needed to move from a static image website (used to showcase the occasional photograph I took) to one that supports UX- and CRO-focused articles. My journey involved learning the Hugo static site generator to produce a fast and lightweight website and leveraging CSS Grid to create a code-optimized responsive design, all while chasing the ever-elusive Google PageSpeed score of 100%.
There are plenty of reasons to use a static site generator (“SSG”). You can separate content from presentation, as with most CMSs, while eliminating the security and performance risks associated with server scripts and backend systems. When I decided to upgrade my website, I wanted a lightweight site that would achieve the following:
Be a testing ground for website technologies like SSGs and Conversion Rate Optimization (“CRO”) tools.
Be able to handle a growing library of articles related to SEO, CRO, UX, and other website design topics. I’ve spent decades working on websites in one form or another, testing new technologies and strategies, and always learning something new.
Have all articles written in Markdown with minimal HTML added.
Be able to achieve a 100% score on Google PageSpeed. Yes, the score alone isn’t worth much: the metrics are guidelines for possible improvements, not a confirmation of success. The improvements tend to require extra workarounds, or even a complete rewrite of an entire CMS, without any promise of materially improving performance or the end-user experience. At times the payoff is nothing more than the arbitrary grade your site received. So why bother? Well, the guidelines, while idealistic, are founded on good practice, and my goal was to understand the tradeoffs necessary to reach a perfect score.
Follow WCAG 2.1 AA guidelines. Website accessibility recommendations have been around for decades, but we’ve only started seeing rapid adoption in recent years. These days, a website that meets the basic criteria (WCAG 2.1 AA) really is considered table stakes.
Be able to host the site for free using GitHub. Basic website hosting is very affordable thanks to cloud computing and competition, so the choice to host for free with GitHub was mostly academic. It also reinforced the decision to use an SSG, which produces high-performing static pages while still offering templating and content-management benefits.
I had to learn CSS Grid as a modern approach to creating clean, responsive layouts. Picking it up was easy thanks to my previous experience with CSS and a bunch of helpful YouTube videos. (If you’ve never seen Kevin Powell’s CSS videos, you may be missing out on a wealth of information.) The layout options are versatile and flexible, exactly what a modern-day responsive website needs. While CSS Flexbox could achieve similar effects, CSS Grid paired nicely with my Figma design’s column-based layouts.
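To illustrate the kind of column-based layout I mean (this is a simplified sketch with made-up class names, not my actual stylesheet), a 12-column grid that collapses to a single column on small screens takes only a few lines:

```html
<!-- Hypothetical sketch: a 12-column desktop grid that stacks on
     narrow screens. Class names and breakpoints are illustrative. -->
<style>
  .page {
    display: grid;
    grid-template-columns: repeat(12, 1fr); /* 12 equal columns */
    gap: 1.5rem;
  }
  .page > main  { grid-column: 1 / 9; }  /* content spans 8 columns */
  .page > aside { grid-column: 9 / 13; } /* sidebar spans the last 4 */

  @media (max-width: 40em) {
    /* stack everything into one full-width column on small screens */
    .page > main,
    .page > aside { grid-column: 1 / -1; }
  }
</style>
<div class="page">
  <main>Article content…</main>
  <aside>Related links…</aside>
</div>
```

Because the columns are defined once on the container, the Figma column guides map almost one-to-one onto `grid-column` spans.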
I then had to learn an SSG to manage separate template files and produce clean code output. The choice came down to two popular tools: Hugo and 11ty. While 11ty’s JavaScript foundation made it a top contender, Hugo is self-contained (11ty requires npm/Node) and is well supported on GitHub Pages. I had no prior experience with Hugo’s Go-based templating, but there was plenty of online documentation and loads of helpful YouTube videos to speed things along. Using Hugo in the Microsoft Visual Studio Code terminal was fast and convenient. I admit I’m still manually uploading everything to GitHub; automating that workflow will be a lesson for another day.
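To give a flavor of how Hugo separates templates from Markdown content, a minimal list template might look like the sketch below. The file path and variables follow Hugo’s documented conventions (`.Pages`, `.RelPermalink`, `.Date.Format`), but this is an illustrative example, not my production template:

```html
<!-- layouts/_default/list.html — hypothetical Hugo list template.
     Hugo renders each Markdown file under content/ through
     templates like this one. -->
{{ define "main" }}
  <h1>{{ .Title }}</h1>
  <ul>
    {{ range .Pages }}
      <li>
        <a href="{{ .RelPermalink }}">{{ .Title }}</a>
        <time>{{ .Date.Format "Jan 2, 2006" }}</time>
      </li>
    {{ end }}
  </ul>
{{ end }}
```

The articles themselves stay as plain Markdown files; the template decides how every list page is rendered, which is exactly the content/presentation split mentioned above.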
Results & Findings
After rebuilding the templates and setting the foundations for my articles, written in Markdown, I uploaded my site to GitHub for testing.
GTmetrix Scores: 100% performance and 98% structure right off the bat. The only suggested improvements were image optimization and the cache policy. This was hardly a surprise, since I hadn’t yet built in any responsive image sizing.
Google PageSpeed Scores: Not far behind, with scores of 100% and 99% for desktop and mobile respectively. Google also recommended serving images in next-gen formats like WebP and AVIF instead of JPG and PNG to improve download speeds and consume less data. This will be an upgrade for another time, since my largest image was only 641KB, hardly large by today’s desktop standards. I do concede that the mobile sizing should be smaller.
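When I do get around to next-gen formats, the standard `<picture>` element lets the browser choose the best format it supports while older browsers fall back to the JPG. A sketch (the file names are placeholders):

```html
<!-- Hypothetical file names. The browser downloads only the first
     source type it supports; anything else falls back to the JPG. -->
<picture>
  <source srcset="photo.avif" type="image/avif">
  <source srcset="photo.webp" type="image/webp">
  <img src="photo.jpg" alt="Description of the photograph"
       width="1200" height="800">
</picture>
```

The explicit `width`/`height` attributes also reserve layout space before the image loads, which helps avoid the layout shift that PageSpeed penalizes, and the `alt` text keeps things aligned with WCAG 2.1 AA.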
If you are ever curious how these scores are weighted, I strongly recommend consulting this article on Google Lighthouse 8. It’s easy to assume that First Contentful Paint and Time to Interactive are the important ones (weighted 10% each), but did you know that Total Blocking Time is more impactful (weighted 30%)?
I’ve already moved my CSS on-page to minimize layout shift and reduce render-blocking requests. I even put my JS on-page because it was short enough and applied to all pages; I decided that eliminating one extra file lookup was worth adding a bit more code to each page. I now needed to make sure my images were well optimized for the web and that static assets were served with an efficient cache policy.
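Structurally, the change amounts to replacing the external references in the page head with inlined content. A simplified before/after sketch (file names hypothetical, contents abbreviated):

```html
<!-- Before: two extra requests, one of which blocks rendering -->
<link rel="stylesheet" href="/css/site.css">
<script src="/js/site.js" defer></script>

<!-- After: the stylesheet and the short sitewide script are inlined,
     so the page renders with no additional render-blocking requests -->
<style>
  /* …entire site stylesheet inlined here… */
</style>
<script>
  /* …short sitewide script inlined here… */
</script>
```

The tradeoff is that inlined code is re-downloaded with every page instead of being cached once, which is why this only makes sense when the CSS and JS are small, as they are here.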
GitHub currently does not allow you to set image cache policies, so the answer is simply to use a CDN service like Cloudflare. Signing up for a free account was fast and easy; it only required updating my domain’s name server records. Cloudflare’s Quick Start Guide helps improve security by enforcing SSL usage, auto-minifies JS, CSS, and HTML files, and helps speed up page loads with optional Brotli compression. The DNS update did take down my website for a short duration (a few minutes while DNS caches updated). I used WhatsMyDNS to confirm the NS updates went through, and I was pleasantly surprised to see them propagate within minutes, though they can take up to 24 hours. Given all the benefits of the Cloudflare service (SSL enforcement, firewalls, caching, rules, etc.), I really should have made this update years ago.
Caching Rules: Since this site serves static articles that won’t change much, I set the Browser Cache TTL to six months. After a quick refresh of the website, I saw that my image now has a max-age of 16070400 seconds. EpochConverter tells me this is 186 days, or roughly 6.1 months.
With the Cloudflare cache settings in place, I received the targeted 100% score. It is worth noting that I didn’t even have to optimize the individual image after setting up Cloudflare, though it was still recommended that I further optimize my large desktop image to save cellular data on mobile devices.
So there it is. A stable server, a little on-page CSS/JS, and a CDN were all I needed. Now I need traffic. I’ve long wondered why my site is so popular in China, though I suspect my visitors are mostly bots, so I used Cloudflare to serve a geo-based CAPTCHA challenge to help sift out false traffic. Local (US) traffic on my website today is nonexistent, but improving this should be easy. My website has been dormant for years, only recently serving as a means of sharing some of my photography (without context), which isn’t exactly a draw for search engines. By adding relevant and valuable articles to my website, I expect to see traffic increase in the coming months. Once I’ve reached critical mass for inbound traffic, the real fun begins: Conversion Rate Optimization.
Until then, I’d love to hear your thoughts on my journey. Feel free to email me using the form below or @miguelyounger on Twitter.
3/15/2021 Postscript: It is worth noting that the scores above were taken before Google Tag Manager and Google Analytics were added to the site; adding these two services lowered the website score by a few percentage points. While the pursuit of a perfect 100% score has been fun and educational, it is hardly practical for real-world use.