Deploying a Hugo website to Amazon S3 using AWS CodeBuild

A month ago I blogged about using Bitbucket Pipelines to deploy my Hugo website to AWS S3. It was a fully automated setup that deployed a new version of the site every time I pushed a commit to the master branch of the git repo.

Lately I’ve been moving more things to AWS, including my Hugo blog, since having everything in one place makes integration easier. Let me show you how I set up the build process on AWS.

CodeCommit

First, I moved my git repo from the free, public Bitbucket service to AWS CodeCommit. There really is nothing special to say about that: CodeCommit is simply git hosting on AWS (details on pricing).
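
Moving the repo itself is just a matter of pointing the existing clone at the new remote. A minimal sketch, once the IAM user and SSH key described below are set up; REGION and REPONAME are placeholders for your own values:

# swap the old Bitbucket remote for the new CodeCommit one and push
git remote set-url origin ssh://git-codecommit.REGION.amazonaws.com/v1/repos/REPONAME
git push origin master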

The only thing I want to stress, again, is that you should not push code with your admin user; create a new IAM user with limited access so it can push code and nothing more. The CodeCommit page will guide you through that, up to the point of creating SSH keys.

The AWS managed policy AWSCodeCommitFullAccess should be all the access this user needs; there is no need to write your own policy.
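
As an illustration only (the user name and key file are hypothetical), setting that user up from the CLI could look like this:

# create a dedicated push-only user and attach the managed CodeCommit policy
aws iam create-user --user-name hugo-codecommit
aws iam attach-user-policy --user-name hugo-codecommit \
    --policy-arn arn:aws:iam::aws:policy/AWSCodeCommitFullAccess
# register the public half of the SSH key pair generated for this user;
# the SSH key ID this returns becomes the user name in your ~/.ssh/config
aws iam upload-ssh-public-key --user-name hugo-codecommit \
    --ssh-public-key-body file://codecommit_rsa.pub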

CodeBuild

Second, I needed a replacement for Bitbucket Pipelines: AWS CodeBuild. Launched in December 2016, CodeBuild is almost exactly the same kind of build system as Bitbucket Pipelines (and Travis CI, GitLab CI, and so many other Docker-driven build systems), and there is just one thing you need to create yourself: a build specification.

Here’s what I used as buildspec.yml for building and deploying my Hugo blog:

version: 0.1

environment_variables:
  plaintext:
    AWS_DEFAULT_REGION: "YOUR_AWS_REGION_CODE"
    HUGO_VERSION: "0.17"
    HUGO_SHA256: "f1467e204cc469b9ca6f17c0dc4da4a620643b6d9a50cb7dce2508aaf8fbc1ea"

phases:
  install:
    commands:
      # download the pinned Hugo release and verify it against the checksum above
      - curl -Ls https://github.com/spf13/hugo/releases/download/v${HUGO_VERSION}/hugo_${HUGO_VERSION}_Linux-64bit.tar.gz -o /tmp/hugo.tar.gz
      - echo "${HUGO_SHA256}  /tmp/hugo.tar.gz" | sha256sum -c -
      # the tarball contains a versioned binary; install it on the PATH as plain "hugo"
      - tar xf /tmp/hugo.tar.gz -C /tmp
      - mv /tmp/hugo_${HUGO_VERSION}_linux_amd64/hugo_${HUGO_VERSION}_linux_amd64 /usr/bin/hugo
      - rm -rf /tmp/hugo*
  build:
    commands:
      - hugo
  post_build:
    commands:
      # --delete removes files from the bucket that are no longer in the generated site
      - aws s3 sync --delete public s3://BUCKETNAME --cache-control max-age=3600
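
Before committing a buildspec change it can be handy to dry-run the same two steps locally. A quick sanity check, assuming Hugo and the AWS CLI are installed and BUCKETNAME is again a placeholder:

# build the site into public/ and show what a deploy would change,
# without touching the bucket (--dryrun only prints the planned operations)
hugo
aws s3 sync --delete public s3://BUCKETNAME --cache-control max-age=3600 --dryrun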

The Docker image I used is the standard Ubuntu Linux 14.04 one, since I don’t need any custom software during my build.

For more complex jobs you can provide your own Docker image to run the build process in. Make sure it includes glibc, otherwise CodeBuild will not be able to run it. Sadly this excludes most Alpine-based images (they ship musl instead), but for a build image that probably shouldn’t be a big issue; a quick check is sketched below.
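
One rough way to check a candidate image (the image name here is just an example): a glibc-based image will report a GNU libc version, while a musl-based one such as Alpine will not.

# print the libc version the image ships; a GNU libc version string means CodeBuild can run it
docker run --rm my-build-image ldd --version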

Instead of using an IAM user and providing AWS_ACCESS_KEY_ID and AWS_SECRET_ACCESS_KEY in my buildspec, I used the CodeBuild IAM role to grant access to the S3 bucket. CodeBuild generates this role for you when you create a build project; just add this custom IAM policy to that role:

{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Action": [
                "s3:List*",
                "s3:Put*",
                "s3:DeleteObject"
            ],
            "Resource": [
                "arn:aws:s3:::BUCKETNAME",
                "arn:aws:s3:::BUCKETNAME/*"
            ],
            "Effect": "Allow"
        }
    ]
}

Replace BUCKETNAME with the name of your S3 bucket.
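
For reference, attaching that policy to the generated role from the CLI could look like this. The role name, policy name, and file name are examples; the role CodeBuild creates is usually named after your build project:

# attach the S3 deploy policy (saved as s3-deploy-policy.json) to the CodeBuild role
aws iam put-role-policy \
    --role-name codebuild-hugo-blog-service-role \
    --policy-name hugo-s3-deploy \
    --policy-document file://s3-deploy-policy.json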

Some remarks

Right now deployment is a manual action: I log into the AWS CodeBuild console and push the Run build button (or start a build from the CLI, as sketched below). CodeBuild has no easy “Build on new commits” option, but you can of course use AWS Lambda to build that yourself. I will do that soon for my blog, and then I’ll update this post with the Lambda I used.
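
In the meantime the CLI is a small step up from clicking around the console; a one-liner like this (the project name is an example) can be run by hand or from a scheduled job:

# start a build of the CodeBuild project without opening the console
aws codebuild start-build --project-name hugo-blog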

If you are looking for a complete pipeline system like GoCD, AWS CodePipeline is what you need.
