We’re proud to announce that the Orlando Devs blog is live! And not only is our blog live, it’s also really fast, and we’d love to give you a rundown of how we built it.
TL;DR (Too long; didn’t read):
We launched the official Orlando Devs blog and it’s open source! Anyone in the community is welcome to add new articles, as we have a fancy way of adding articles to the blog automatically after merging your pull requests :)
The Master Plan
Our idea was to have an open source blog. We wanted developers within our community to contribute to our blog using the whole open source paradigm (forks, pull request reviews, etc).
We thought this would be a good way to get people’s names out there and let them show off their skills in written form, as well as help newer developers get used to contributing to open source projects.
From day one we knew that we wanted to host our site on GitHub Pages. Using Jekyll was also a no-brainer since GitHub Pages has native integration with it. We used a tutorial and had a prototype running in 5 minutes.
There was one problem with GitHub Pages though: it doesn’t support HTTPS on custom domain names. So we went looking for a way to fix this.
We decided to use CloudFlare to add HTTPS support to our custom domain. We also leveraged CloudFlare’s caching capabilities to speed up our site. The hardest part of the whole integration was getting the correct DNS settings in place.
CloudFlare DNS Settings
This is what our DNS settings look like:
* We are taking advantage of CloudFlare’s CNAME Flattening
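The records themselves aren’t reproduced here, but a typical GitHub Pages setup behind CloudFlare looks roughly like this (example.com and orgname.github.io are placeholders for our domain and our GitHub Pages target):

```
CNAME   example.com   orgname.github.io   (flattened to A records at the apex)
CNAME   www           orgname.github.io
```

CNAME Flattening is what lets the first record exist at all: a CNAME at the root of a zone normally isn’t allowed, so CloudFlare resolves it and serves the result as A records.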
CloudFlare Page Rules
We set up two “Page Rules” within CloudFlare:
1. Always Use HTTPS
This rule forces the site to always be served over HTTPS.
2. Custom Caching
Here we forced our site to “Cache everything”. This is important because by default CloudFlare does not cache HTML and other dynamic pages (as somebody kindly pointed out to us on Hacker News). Please note that we are talking about Edge Cache here, which is a form of server-side caching that CloudFlare provides (as do other CDN providers).
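For reference, the two rules boil down to something like this (the URL patterns are illustrative, with example.com standing in for our domain):

```
1. http://*example.com/*   →  Always Use HTTPS
2. https://example.com/*   →  Cache Level: Cache Everything, Edge Cache TTL: 7 days
```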
Once CloudFlare was configured, we ran into only one problem: Edge cache invalidation. Since we set CloudFlare to cache everything server-side for a week, how could we make sure the home page is refreshed instantly when we add new articles?
It turns out that CloudFlare offers an API to purge its cache, so we started investigating how we could call this API every time a pull request is merged.
The solution? GitHub Webhooks & AWS Lambda.
We soon learned that AWS Lambda is a great solution for our cache invalidation problem, as it saves us the headache of setting up a new server. AWS Lambda is also inexpensive, so it turned out to be a win-win solution for us.
Here’s how Lambda’s request billing works: every month, the first 1 million requests are free; beyond that, each additional million requests costs 20 cents. We don’t think we will ever merge anything close to 1 million pull requests a month, so we’re good!
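To put those numbers in perspective, the request-pricing arithmetic is simple enough to sketch (this ignores Lambda’s separate compute charges, and the helper name is our own):

```javascript
// Rough monthly cost of Lambda's request pricing: the first 1 million
// requests are free, then $0.20 per additional million. One merged
// pull request ≈ one invocation; compute charges are ignored here.
function monthlyRequestCost(requests) {
  const FREE_TIER = 1000000;
  const PRICE_PER_MILLION = 0.20;
  const billable = Math.max(0, requests - FREE_TIER);
  return (billable / 1000000) * PRICE_PER_MILLION;
}

console.log(monthlyRequestCost(50));      // 0 — well within the free tier
console.log(monthlyRequestCost(3000000)); // 0.4 — two extra million requests
```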
Here’s how this process went:
1. We signed up for AWS and added a new Lambda function using Node.js.
2. We used the “Upload ZIP File” option to deploy the project. We ran into a little problem while uploading the zip file: we were zipping the parent directory containing our project, which didn’t work. The uploaded zip needs to have the entry file (called app.js in our project) at the root of the compressed folder.
3. We then set up an endpoint for our function using AWS’s API Gateway, and pointed this new API endpoint at the Lambda function.
4. We configured a GitHub Webhook for our blog’s repo using the URL provided by the API Gateway, subscribing only to the single event we needed.
5. Finally, we open sourced the project (be sure to check it out!)
We now have a complete workflow for you to contribute to our open source blog. All tech-related topics are welcome, although new articles are subject to review by moderators. So please, read our guide to start contributing and open a pull request with your brand new article today!