I decided to revive this blog and wanted to host it as a static site in an S3 bucket. I recently sat through the AWS Solutions Architect training and have been spinning up resources in the free tier for some practical experience. This blog was originally a static site on Azure, but my Visual Studio subscription lapsed, so it was time to give it a new home.
Create an S3 Bucket
First thing we need is somewhere to store our site. Hugo is a framework for building static websites from simple Markdown files, perfect for someone like me who fumbles their way through everything. As there's no computational server-side component, we can use object storage to host the website files.
In AWS, create a new S3 bucket. It needs to be named the same as your site's URL; we'll discuss why later. Set the following (if you'd rather use the CLI, there's a sketch after this list):
- Set the bucket name to the same as the URL of your site
- Un-tick Block all public access
- Tick the acknowledgement
- Leave everything else at the default
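
For anyone scripting this, a rough AWS CLI equivalent looks like the following; the bucket name and region here are placeholders:

```bash
# Create the bucket, named after the site's domain (placeholder values).
# Omit --create-bucket-configuration if your region is us-east-1.
aws s3api create-bucket \
  --bucket example.com \
  --region ap-southeast-2 \
  --create-bucket-configuration LocationConstraint=ap-southeast-2

# Un-ticking "Block all public access" in the console maps to setting
# all four public access block flags to false.
aws s3api put-public-access-block \
  --bucket example.com \
  --public-access-block-configuration \
    BlockPublicAcls=false,IgnorePublicAcls=false,BlockPublicPolicy=false,RestrictPublicBuckets=false
```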
Once your bucket is created, navigate to the Permissions tab. Under Bucket policy, click Edit, enter the following, and press Save:
```json
{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Sid": "PublicReadGetObject",
            "Effect": "Allow",
            "Principal": "*",
            "Action": "s3:GetObject",
            "Resource": "arn:aws:s3:::<BUCKET_NAME>/*",
            "Condition": {
                "IpAddress": {
                    "aws:SourceIp": [
                        "2400:cb00::/32",
                        "2606:4700::/32",
                        "2803:f800::/32",
                        "2405:b500::/32",
                        "2405:8100::/32",
                        "2a06:98c0::/29",
                        "2c0f:f248::/32",
                        "173.245.48.0/20",
                        "103.21.244.0/22",
                        "103.22.200.0/22",
                        "103.31.4.0/22",
                        "141.101.64.0/18",
                        "108.162.192.0/18",
                        "190.93.240.0/20",
                        "188.114.96.0/20",
                        "197.234.240.0/22",
                        "198.41.128.0/17",
                        "162.158.0.0/15",
                        "104.16.0.0/13",
                        "104.24.0.0/14",
                        "172.64.0.0/13",
                        "131.0.72.0/22"
                    ]
                }
            }
        }
    ]
}
```
This policy allows the public to read objects in the bucket… sort of. The list of IP addresses restricts access to Cloudflare's published ranges only. As my traffic will be proxied through Cloudflare, this prevents people from navigating directly to the bucket.
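
The same policy can be applied from the CLI, assuming you've saved the JSON above (with your bucket name filled in) as bucket-policy.json:

```bash
# Apply the bucket policy from a local file.
aws s3api put-bucket-policy \
  --bucket example.com \
  --policy file://bucket-policy.json
```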
Now navigate to the Properties tab, scroll down to the Static website hosting option, and click Edit. Set the following:
- Static website hosting: enable
- Hosting type: static website
- Index document: index.html
- Error document: 404.html
Save changes. You should now see a website endpoint for your site. If you comment out the Cloudflare IP addresses in the bucket policy, you should be able to navigate to it; however, at the moment we haven't added any content.
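
This step can also be done from the CLI (bucket name is a placeholder again):

```bash
# Enable static website hosting with the same index and error documents.
aws s3 website s3://example.com/ \
  --index-document index.html \
  --error-document 404.html
```

Note that the exact website endpoint format varies by region: older regions use <BUCKET_NAME>.s3-website-<AWS_REGION>.amazonaws.com, while some newer ones use a dot before the region instead of a dash.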
Set up IAM role
Our bucket policy restricts access to our bucket to our CDN of choice. We now need to create an IAM role that will allow us to add content via a GitHub repo.
In AWS, navigate to IAM and then to Identity providers. Add a new provider for GitHub; you can follow GitHub's instructions on how to do that.
Now navigate to Roles under IAM and create a new role. Select the Web identity trusted entity type, select token.actions.githubusercontent.com, and then enter the organisation (or username) of your GitHub account, the name of the repo you're going to use, and the branch you'll use, e.g. main.
On the next screen we'll skip selecting a permissions policy; we'll add that later. Go to the next screen, give your role a name, and click Create role.
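
Behind the scenes, the wizard attaches a trust policy to the role. If you'd rather create it from the CLI, a sketch like the following should be equivalent; the role name, account ID, org, and repo are placeholders, and this assumes the identity provider from the previous step already exists:

```bash
# Trust policy letting GitHub's OIDC provider assume the role,
# limited to the main branch of a single repo (placeholders throughout).
cat > trust-policy.json <<'EOF'
{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Effect": "Allow",
            "Principal": {
                "Federated": "arn:aws:iam::<ACCOUNT_ID>:oidc-provider/token.actions.githubusercontent.com"
            },
            "Action": "sts:AssumeRoleWithWebIdentity",
            "Condition": {
                "StringEquals": {
                    "token.actions.githubusercontent.com:aud": "sts.amazonaws.com",
                    "token.actions.githubusercontent.com:sub": "repo:<ORG>/<REPO>:ref:refs/heads/main"
                }
            }
        }
    ]
}
EOF

aws iam create-role \
  --role-name hugo-site-deploy \
  --assume-role-policy-document file://trust-policy.json
```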
Our role is going to need permissions on our S3 bucket. Navigate to your new role and, on the Permissions tab under Permissions policies, click Add permissions, then Create inline policy. Switch to the JSON editor and enter the following:
```json
{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Sid": "SyncToBucket",
            "Effect": "Allow",
            "Action": [
                "s3:PutObject",
                "s3:GetObject",
                "s3:ListBucket",
                "s3:DeleteObject"
            ],
            "Resource": [
                "arn:aws:s3:::<BUCKET_NAME>/*",
                "arn:aws:s3:::<BUCKET_NAME>"
            ]
        }
    ]
}
```
On the next screen, give it a name and click Create policy.
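
The CLI equivalent, assuming the JSON above is saved as sync-policy.json and using the hypothetical role name from earlier:

```bash
# Attach the inline policy to the deployment role.
aws iam put-role-policy \
  --role-name hugo-site-deploy \
  --policy-name SyncToBucket \
  --policy-document file://sync-policy.json
```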
Now we should be able to get some data into our bucket via GitHub.
Set up GitHub repo
Jump over to GitHub and create a new repository.
Give it a name consistent with what you entered in the previous step. You can keep it private if you like.
Once created, navigate to the Settings tab and, under Security, open Secrets and variables, then Actions. On the Variables tab we're going to add the following:
| Name | Description |
| --- | --- |
| AWS_REGION | The region where you created your S3 bucket |
| AWS_ROLE_ARN | Copy this from the IAM role you created in the previous section |
| BUCKET_NAME | The name of your S3 bucket, i.e. the URL |
| HUGO_VERSION | The version of Hugo you want to use |
| SITE_BASE_URL | The URL of your website, e.g. https://example.com |
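
If you have a recent version of the GitHub CLI installed, these can also be set from the terminal; the values below are examples only:

```bash
# Set the repository variables used by the workflow (example values).
gh variable set AWS_REGION --body "ap-southeast-2"
gh variable set AWS_ROLE_ARN --body "arn:aws:iam::<ACCOUNT_ID>:role/hugo-site-deploy"
gh variable set BUCKET_NAME --body "example.com"
gh variable set HUGO_VERSION --body "0.128.0"
gh variable set SITE_BASE_URL --body "https://example.com"
```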
Now all we need to do is create a workflow. Navigate to the Actions tab and click New workflow. When prompted to choose a workflow, select set up a workflow yourself.
Enter the following:
```yaml
name: Deploy Hugo site to S3

on:
  push:
    branches: ["main"]
  workflow_dispatch:

permissions:
  contents: read
  id-token: write

concurrency:
  group: "hugo_deploy"
  cancel-in-progress: false

defaults:
  run:
    shell: bash

jobs:
  build:
    runs-on: ubuntu-latest
    env:
      HUGO_VERSION: ${{ vars.HUGO_VERSION }}
    steps:
      - name: Install Hugo CLI
        run: |
          wget -O ${{ runner.temp }}/hugo.deb https://github.com/gohugoio/hugo/releases/download/v${HUGO_VERSION}/hugo_extended_${HUGO_VERSION}_linux-amd64.deb \
          && sudo dpkg -i ${{ runner.temp }}/hugo.deb
      - name: Install Dart Sass
        run: sudo snap install dart-sass
      - name: Checkout
        uses: actions/checkout@v4
        with:
          submodules: recursive
      - name: Build with Hugo
        env:
          HUGO_ENVIRONMENT: production
          HUGO_ENV: production
        run: |
          hugo \
            --minify \
            --baseURL "${{ vars.SITE_BASE_URL }}/"
      - name: Upload a Build Artifact
        uses: actions/upload-artifact@v4
        with:
          name: hugo-site
          path: ./public

  deploy:
    runs-on: ubuntu-latest
    needs: build
    steps:
      - name: Download build artifact
        uses: actions/download-artifact@v4
        with:
          name: hugo-site
          path: ./public
      - name: "Configure AWS Credentials"
        uses: aws-actions/configure-aws-credentials@v4
        with:
          aws-region: ${{ vars.AWS_REGION }}
          role-to-assume: ${{ vars.AWS_ROLE_ARN }}
          role-session-name: GithubActions-deploy-hugo-site
          mask-aws-account-id: true
      - name: Sync to S3
        id: deployment
        run: aws s3 sync ./public/ s3://${{ vars.BUCKET_NAME }} --delete --cache-control max-age=31536000
```
Now all we need to do is commit our Hugo site to the main branch of our repo, and GitHub Actions will trigger the above workflow, deploying our site to S3.
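
If you're starting a site from scratch, that looks roughly like this; the repo path is a placeholder:

```bash
# Create a new Hugo site and push it to the repo created earlier.
# You'll likely want to add a theme (often a git submodule, which the
# workflow's "submodules: recursive" covers) and some content first.
hugo new site myblog && cd myblog
git init -b main
git remote add origin git@github.com:<ORG>/<REPO>.git
git add . && git commit -m "Initial Hugo site"
git push -u origin main
```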
Set up Cloudflare
The very last thing we're going to do is set up the DNS for our site. I use Cloudflare for my DNS, and it's as simple as creating a CNAME record that points to the bucket website endpoint, i.e. <BUCKET_NAME>.s3-website-<AWS_REGION>.amazonaws.com. I have mine set on the root of the domain with the Proxy option enabled, and then a CNAME for www that points to the root.
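
If you'd rather script it, the record can be created through Cloudflare's API; the zone ID and API token are placeholders, and proxied: true corresponds to the orange-cloud Proxy option:

```bash
# Create a proxied CNAME at the root pointing at the S3 website endpoint.
curl -X POST "https://api.cloudflare.com/client/v4/zones/<ZONE_ID>/dns_records" \
  -H "Authorization: Bearer <API_TOKEN>" \
  -H "Content-Type: application/json" \
  --data '{
    "type": "CNAME",
    "name": "example.com",
    "content": "<BUCKET_NAME>.s3-website-<AWS_REGION>.amazonaws.com",
    "proxied": true
  }'
```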
If you still have the bucket policy enabled that restricts access to Cloudflare, try loading the website endpoint directly: you should get a 403 Forbidden error from S3. I originally had my bucket named something different from my domain, but when navigating to the site, S3 returned an error about not being able to find the bucket, referring to the domain name. That's because S3 website hosting uses the Host header of the incoming request to locate the bucket, which is why the bucket must be named the same as the domain.
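
A quick way to confirm both behaviours (placeholders again):

```bash
# Direct requests to the bucket endpoint should be rejected...
curl -I http://<BUCKET_NAME>.s3-website-<AWS_REGION>.amazonaws.com
# HTTP/1.1 403 Forbidden

# ...while requests proxied through Cloudflare should succeed.
curl -I https://example.com
# HTTP/2 200
```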
Summary
And there you have it: the website you're currently reading was written in Markdown, stored in a free GitHub repo, deployed to an S3 bucket under the AWS free tier, and accessed through Cloudflare for free. The only cost is the domain registration. Not bad.