CloudFormation example for AWS CodePipeline - Hugo Deployment

- aws codepipeline cloudformation

I recently blogged about how you can use AWS CodePipeline to automatically deploy your Hugo website to AWS S3 and promised a CloudFormation template, so here we go. You can find the full template in this GitHub repo.

If you create a new stack with the template you will be asked for the following parameters; let's look at them in detail:

AWS CloudFormation

The referenced GitHub repo has to be your repo with the Hugo source files; the buildspec.yml file mentioned in the previous blog post has to be in this repo as well.

Needed parameters

  • GitHub OAuth Token → The Token which will be used to create the webhook in the Repo

  • GitHub Owner → The owner of the GitHub Repo

  • GitHub Repo → The name of the GitHub Repo

  • GitHub Branch → The name of the Branch

  • Artifacts S3 BucketName → The name of the S3 bucket where CodePipeline artifacts will be saved; this bucket will be created!

  • Target S3 Bucket → The name of the S3 bucket where your Hugo website will be deployed; this bucket will be created!

  • S3 Bucket with Lambda Code ZIP → The existing S3 bucket which contains the ZIP file of the python script for the CloudFront invalidation. The file has to be named and can be found here

  • CertificateArn → The ARN of the certificate which should be used on the CloudFront distribution (has to be created in us-east-1!)

    I tried to generate the certificate with the template as well, but unfortunately there is no easy way of doing this. Terraform seems to offer this functionality; I think I will have a look at Terraform soon.

  • HostedZoneId → The Id of the hosted zone on Route53; it will be used to create the following two subdomains/WebsiteNames

  • WebsiteName01 → subdomain1 of the HostedZone

  • WebsiteName02 → subdomain2 of the HostedZone
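If you prefer to create the stack from code instead of the console, the parameters above can be passed to CloudFormation as a key/value list. A minimal sketch in Python — note that the logical parameter names and all values here are placeholders; check the template in the GitHub repo for the exact names:

```python
import json

# Hypothetical logical parameter names and placeholder values --
# look them up in the template's Parameters section before using this.
parameters = [
    {"ParameterKey": "GitHubOAuthToken", "ParameterValue": "<your-token>"},
    {"ParameterKey": "GitHubOwner", "ParameterValue": "<your-github-user>"},
    {"ParameterKey": "GitHubRepo", "ParameterValue": "<your-hugo-repo>"},
    {"ParameterKey": "GitHubBranch", "ParameterValue": "master"},
    {"ParameterKey": "ArtifactsBucketName", "ParameterValue": "<artifacts-bucket>"},
    {"ParameterKey": "TargetS3Bucket", "ParameterValue": "<website-bucket>"},
    {"ParameterKey": "LambdaCodeBucket", "ParameterValue": "<lambda-zip-bucket>"},
    {"ParameterKey": "CertificateArn", "ParameterValue": "<acm-certificate-arn>"},
    {"ParameterKey": "HostedZoneId", "ParameterValue": "<route53-zone-id>"},
    {"ParameterKey": "WebsiteName01", "ParameterValue": "www.example.com"},
    {"ParameterKey": "WebsiteName02", "ParameterValue": "example.com"},
]

# With boto3 the stack could then be created roughly like this (needs IAM
# capabilities because the template creates roles and managed policies):
#
# import boto3
# boto3.client("cloudformation").create_stack(
#     StackName="hugo-pipeline",
#     TemplateBody=open("template.yml").read(),
#     Parameters=parameters,
#     Capabilities=["CAPABILITY_NAMED_IAM"],
# )

print(json.dumps(parameters, indent=2))
```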

Created AWS Resources

If you create a stack out of this template, the following resources will be created automatically:

  • PipelineArtifactsBucket → AWS::S3::Bucket Artifacts S3 BucketName

  • PipelineWebpageBucket → AWS::S3::Bucket Target S3 Bucket

  • BucketPolicy → AWS::S3::BucketPolicy which will be used for the S3 Bucket with the Hugo source files and allows PublicRead access

  • myCloudfrontDist → AWS::CloudFront::Distribution for the following subdomain names

  • domainDNSRecord1 → AWS::Route53::RecordSet WebsiteName01

  • domainDNSRecord2 → AWS::Route53::RecordSet WebsiteName02

  • CodeBuildProject → AWS::CodeBuild::Project, the actual build project which will be used in the CodePipeline

  • CodePipeline → AWS::CodePipeline::Pipeline

  • GithubWebhook → AWS::CodePipeline::Webhook

  • CreateCodePipelinePolicy → AWS::IAM::ManagedPolicy, the managed policy which will be used for the according role/pipeline

  • CodePipelineRole → AWS::IAM::Role with managed policy for CodePipeline

  • CreateCodeBuildPolicy → AWS::IAM::ManagedPolicy the managed policy which will be used for the according role for CodeBuild

  • CodeBuildRole → AWS::IAM::Role with managed policy for CodeBuild

  • CreateLambdaExecutionPolicy → AWS::IAM::ManagedPolicy

  • LambdaExecutedRole → AWS::IAM::Role with managed policy to give Lambda enough rights

  • LambdaCloudfrontInvalidation → AWS::Lambda::Function python function

Code examples

Throughout the template I tried to follow the principle of least privilege. For example, if you look at the CodeBuild policy you can see that CodeBuild is only allowed to work with the created S3 buckets.

      CreateCodeBuildPolicy:
        Type: AWS::IAM::ManagedPolicy
        Properties:
          ManagedPolicyName: CodeBuildAccess_Hugo
          Description: "Policy for access to logs and Hugo S3 Buckets"
          Path: "/"
          PolicyDocument:
            Version: "2012-10-17"
            Statement:
            - Sid: VisualEditor0
              Effect: Allow
              Action: s3:*
              Resource: [
                !Join [ '', ['arn:aws:s3:::', !Ref TargetS3Bucket] ],
                !Join [ '', ['arn:aws:s3:::', !Ref TargetS3Bucket, '/*'] ],
                !Join [ '', ['arn:aws:s3:::', !Ref ArtifactsBucketName] ],
                !Join [ '', ['arn:aws:s3:::', !Ref ArtifactsBucketName, '/*'] ]
              ]
            - Sid: VisualEditor1
              Effect: Allow
              Action: logs:*
              Resource: '*'
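Each bucket needs two ARNs in the Resource list: the bucket-level ARN (for actions like ListBucket) and the object-level '/*' ARN (for GetObject/PutObject). A quick Python sketch of what the !Join expressions resolve to, with made-up bucket names:

```python
def s3_arns(bucket):
    """Mimics the two !Join expressions per bucket in the policy:
    one ARN for the bucket itself, one for all objects inside it."""
    return ["arn:aws:s3:::" + bucket, "arn:aws:s3:::" + bucket + "/*"]

# Hypothetical values for the TargetS3Bucket and ArtifactsBucketName parameters
resources = s3_arns("my-hugo-website") + s3_arns("my-pipeline-artifacts")
print(resources)
```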

The following part creates the CodePipeline with all stages
(Source from GitHub, Build on CodeBuild, Deploy to S3 and a call of the Lambda function):

      CodePipeline:
        Type: AWS::CodePipeline::Pipeline
        Properties:
          Name: PipelineForStaticWebpageWithHugo
          ArtifactStore:
            Type: S3
            Location: !Ref PipelineArtifactsBucket
          RestartExecutionOnUpdate: true
          RoleArn: !GetAtt CodePipelineRole.Arn
          Stages:
          - Name: Source
            Actions:
            - Name: Source
              InputArtifacts: []
              ActionTypeId:
                Category: Source
                Owner: ThirdParty
                Version: 1
                Provider: GitHub
              OutputArtifacts:
              - Name: SourceCode
              Configuration:
                Owner: !Ref GitHubOwner
                Repo: !Ref GitHubRepo
                Branch: !Ref GitHubBranch
                PollForSourceChanges: false
                OAuthToken: !Ref GitHubOAuthToken
              RunOrder: 1
          - Name: Build
            Actions:
            - Name: CodeBuild
              ActionTypeId:
                Category: Build
                Owner: AWS
                Provider: CodeBuild
                Version: '1'
              InputArtifacts:
              - Name: SourceCode
              OutputArtifacts:
              - Name: PublicFiles
              Configuration:
                ProjectName: !Ref CodeBuildProject
              RunOrder: 1
          - Name: Deploy
            Actions:
            - Name: S3Deploy
              ActionTypeId:
                Category: Deploy
                Owner: AWS
                Provider: S3
                Version: '1'
              InputArtifacts:
              - Name: PublicFiles
              Configuration:
                BucketName: !Ref TargetS3Bucket
                Extract: 'true'
              RunOrder: 1
            - Name: LambdaDeploy
              ActionTypeId:
                Category: Invoke
                Owner: AWS
                Provider: Lambda
                Version: '1'
              Configuration:
                FunctionName: invalidateCloudfront
                UserParameters: !Ref myCloudfrontDist
              RunOrder: 2

This is the Lambda function, written in Python, which creates the CloudFront invalidation. I needed quite some time to get the CodePipeline jobId and to get the Id of the CloudFront distribution out of the UserParameters.

import time
import logging
from botocore.exceptions import ClientError
import boto3

LOGGER = logging.getLogger()
LOGGER.setLevel(logging.INFO)

def codepipeline_success(job_id):
    """Puts CodePipeline Success Result"""
    try:
        codepipeline = boto3.client('codepipeline')
        codepipeline.put_job_success_result(jobId=job_id)
        return True
    except ClientError as err:
        LOGGER.error("Failed to PutJobSuccessResult for CodePipeline!\n%s", err)
        return False

def codepipeline_failure(job_id, message):
    """Puts CodePipeline Failure Result"""
    try:
        codepipeline = boto3.client('codepipeline')
        codepipeline.put_job_failure_result(
            jobId=job_id,
            failureDetails={'type': 'JobFailed', 'message': message}
        )
        return True
    except ClientError as err:
        LOGGER.error("Failed to PutJobFailureResult for CodePipeline!\n%s", err)
        return False

def lambda_handler(event, context):
    """Creates the CloudFront invalidation and reports the result back to CodePipeline"""
    try:
        job_id = event['CodePipeline.job']['id']
        # The CloudFront Distribution Id is passed in via the UserParameters
        # of the LambdaDeploy action in the pipeline
        distId = event['CodePipeline.job']['data']['actionConfiguration']['configuration']['UserParameters']
    except KeyError as err:
        LOGGER.error("Could not retrieve CodePipeline Job ID!\n%s", err)
        return False
    try:
        client = boto3.client('cloudfront')
        client.create_invalidation(
            DistributionId=distId,
            InvalidationBatch={
                'Paths': {
                    'Quantity': 1,
                    'Items': ['/*']
                },
                'CallerReference': str(time.time())
            }
        )
        codepipeline_success(job_id)
        return True
    except ClientError as err:
        LOGGER.error("Failed to create CloudFront invalidation!\n%s", err)
        codepipeline_failure(job_id, str(err))
        return False
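The tricky part was the shape of the event a CodePipeline Invoke action sends to Lambda. A minimal sketch of how the handler digs the job id and the distribution id out of it (the ids below are made up):

```python
# Minimal shape of the event a CodePipeline Invoke action sends to Lambda;
# the id values here are made-up examples.
event = {
    "CodePipeline.job": {
        "id": "11111111-2222-3333-4444-555555555555",
        "data": {
            "actionConfiguration": {
                "configuration": {
                    # Value of the UserParameters field of the LambdaDeploy
                    # action, i.e. the CloudFront Distribution Id
                    "UserParameters": "E2EXAMPLE123"
                }
            }
        }
    }
}

job_id = event["CodePipeline.job"]["id"]
dist_id = event["CodePipeline.job"]["data"]["actionConfiguration"]["configuration"]["UserParameters"]
print(job_id, dist_id)
```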

I hope this template helps you build your own CodePipelines via CloudFormation.