Each entry in the “Gems in the rough” series is a collection of tips, explanations, and/or idle observations which I hope will be at least somewhat useful to those of you with websites built by static site generators (SSGs).
I’ve been spending a lot of time the last week or two playing with Astro, an interesting but still very beta-test-y (alpha-ish, at times) new SSG offering from the folks behind the Snowpack build tool and Skypack software package delivery network. Right now, Astro is kind of like what you’d expect if Next.js and Snowpack had a baby.
Astro’s got a lot of promise, but for now it’s also kinda frustrating in that a lot of the example code in the documentation simply doesn’t work as it should—or, at least, it doesn’t for me. Astro is also pretty opinionated by design, which has its good points (e.g., fewer ways for certain procedures to go wrong) but can also make it difficult to reuse items that work perfectly well on other SSGs.
For example: in the Eleventy and Hugo SSGs, you can insert shortcodes into your Markdown text. This lets you insert bits of code to do cool things, like image processing, that Markdown on its own won’t allow. In Astro, the only way to include code in Markdown is to use an `.astro` file<sup>1</sup> with a proprietary `<Markdown>` component and then intermix your code with that. It works, but this limitation means you’re not likely to move years’ worth of shortcode-laden `.md` files over from another SSG. That’s a big deal. (The Astro team is considering allowing components in actual `.md` files, somewhat like how the `.mdx` file format works, so this particular gotcha may improve soon.)
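To give a rough idea of what that looks like, here’s a minimal sketch of an `.astro` file using the `<Markdown>` component. The import path and export shape have shifted between early Astro releases, so treat this as an approximation rather than gospel:

```astro
---
// Component script: runs at build time.
// Assumption: Markdown is a named export of 'astro/components',
// as in the early betas; your Astro version may differ.
import { Markdown } from 'astro/components';

const greeting = 'Hello from the frontmatter!';
---
<Markdown>
  ## A heading written in Markdown

  Unlike a plain `.md` file, this block can mix ordinary Markdown
  with interpolated expressions such as {greeting}.
</Markdown>
```

The catch, as noted above, is that this only works inside an `.astro` file—your existing `.md` content can’t do it.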
On the bright side, I should note that I was able to convert this site’s `imgc` image-delivery shortcode to an `Imgc.astro` component that works perfectly well when dropped into an `.astro` page, so that process is pretty straightforward.
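For a sense of the shape of such a conversion—this is a hypothetical stand-in, not the actual `Imgc.astro`, whose internals aren’t shown in this post—an Astro component reads its arguments from `Astro.props`, where an Eleventy shortcode would receive them as function parameters:

```astro
---
// Hypothetical sketch of Imgc.astro; the real component's
// image-processing logic isn't reproduced here.
// Astro components receive their arguments via Astro.props.
const { src, alt, width } = Astro.props;
---
<img src={src} alt={alt} width={width} loading="lazy" />
```

A component like this could then be dropped into an `.astro` page as something like `<Imgc src="/images/example.jpg" alt="An example" width={600} />`.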
On the still brighter side, the Astro team is friendly to a fault, seems to listen to any and all feedback, and is quickly building a great and helpful community—including its presence on Discord. This reminds me a lot of the early days of Eleventy (or, at least, what I gather those were like considering that I didn’t get into Eleventy until it was already well over a year old; by contrast, the first public beta for Astro appeared only a few weeks ago).
We’ll see what comes of Astro, which is getting a lot of attention in the SSG world right now. If you have a sadistic interest in following my sometimes stumbling efforts to make it work to my liking, feel free to drop in on my Astro repo.
I have a cloned repo of Zach Leatherman’s Speedlify project for testing the performance of some sites I’ve built on different static site hosting vendors with the same repository. While I typically use a cron job to run the Speedlify test automatically, I sometimes also do a manual run after making a change to the test repo. This gives me a chance to check the relative build times for the hosts. On the most recent push to this repo earlier this week—upgrading a few dependencies but otherwise making no changes—I got the following build times, in minutes and seconds (and, for comparison’s sake, note also the previous push’s build times).
- Vercel: 0:28. Previous: 0:34.
- Render: 1:26. Previous: unknown (Render couldn’t display logs from earlier builds).
- Cloudflare Workers site via GitHub Actions: 1:35. Previous: 1:20.
- Azure Static Web Apps via GitHub Actions: 1:47. Previous: 3:56 (my first build to ASWA).
- Cloudflare Pages: 2:48. Previous: 2:23.
- Digital Ocean Apps Platform: 3:41. Previous: 2:08.
I have no idea why DOAP’s build time was so slow this time, sparing Cloudflare Pages from being the slowest of the six for once; stuff happens. That said, it’s a big black eye for the last two that their build times, without GitHub Actions in the mix, were considerably slower than those of the two hosts above them that did use GitHub Actions.
As for the actual performance numbers I see in Speedlify, you can check my test results at any time (but see my 2021-09-25 update further down). Speedlify stores only the ten most recent results, so the trends you see lack statistical significance; but I can make a couple of general observations about what I’ve been seeing:
- Cloudflare Pages and DOAP tend to be in the top two or three pretty consistently, while the Cloudflare Workers site and ASWA tend to be in the bottom two or three almost as consistently.
- Render and Vercel are somewhat erratic, varying wildly from top three to bottom three. I’m at a loss to know why.
Update, 2021-08-14: Digital Ocean has cut the monthly build limit for App Platform static websites from the original 400 minutes to only 100 minutes.<sup>2</sup> Accordingly, I have taken the test repo off Digital Ocean, so the Speedlify test results mentioned above consist of only five sites as of now.
Update, 2021-09-18: Azure deleted my test ASWA site tonight (early on 2021-09-19 according to UTC) because, apparently, it doesn’t allow free static sites after all, unlike the other vendors mentioned here. Consequently, the Speedlify test results now cover four sites.
Update, 2021-09-25: I’ve turned off the cron jobs that run the test each day, so the results will be frozen going forward. It seemed nonsensical to keep running the tests since the list of sites was down to just four—two of them Cloudflare-based.