Complete CI/CD Pipeline for Static Websites with AWS S3, Cloudflare CDN, and Cache Purging


Hey, fellow developers! Are you tired of the tedious manual steps involved in deploying static websites? Say no more! In this article, we’ll set up an amazing CI/CD pipeline using GitHub Actions to deploy static websites on AWS S3, supercharge them with Cloudflare as a CDN, and automate cache purging like a pro. Plus, you’ll get to try out my brand-new GitHub Action that makes cache purging a breeze. Ready? Let’s dive in!

Prerequisites

Before we embark on this adventure, make sure you have:

AWS Account: With permissions to create IAM roles, identity providers, and S3 buckets.

Cloudflare Account: With a configured zone.

GitHub Repository: For your static website project.

Step 1: Set Up AWS IAM OpenID Connect Provider

Create OpenID Connect Provider

To register GitHub as an identity provider in AWS:

Go to the IAM console.
In the navigation pane, choose Identity providers, and then choose Add provider.
For Provider type, choose OpenID Connect.
For Provider URL, enter https://token.actions.githubusercontent.com.
For Audience, enter sts.amazonaws.com.
For Thumbprints, add the following thumbprints:

6938fd4d98bab03faadb97b34396831e3780aea1
1c58a3a8518e8759bf075b76b750d4f2df264fcd

Choose Add provider.
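If you prefer the CLI to the console, the same provider can be registered with one command. This is a sketch; it assumes the AWS CLI is installed and configured with credentials that can manage IAM in your account:

```shell
# One-time, per AWS account: register GitHub's OIDC provider
aws iam create-open-id-connect-provider \
  --url "https://token.actions.githubusercontent.com" \
  --client-id-list "sts.amazonaws.com" \
  --thumbprint-list "6938fd4d98bab03faadb97b34396831e3780aea1" "1c58a3a8518e8759bf075b76b750d4f2df264fcd"
```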

IAM Role for GitHub Actions

Create IAM Role:

Go to the IAM console.
Create a new role with the following trust policy to allow GitHub runners to assume this role:

{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Principal": {
        "Federated": "arn:aws:iam::YOUR_ACCOUNT_ID:oidc-provider/token.actions.githubusercontent.com"
      },
      "Action": "sts:AssumeRoleWithWebIdentity",
      "Condition": {
        "StringEquals": {
          "token.actions.githubusercontent.com:aud": "sts.amazonaws.com"
        },
        "StringLike": {
          "token.actions.githubusercontent.com:sub": "repo:YOUR_GITHUB_USERNAME/YOUR_REPOSITORY_NAME:*"
        }
      }
    }
  ]
}

Note that the sub claim ends in a wildcard, so it must be matched with the StringLike operator; StringEquals would never match.

Attach Policies:

Attach a policy to the role that allows s3:PutObject and other necessary permissions for your S3 bucket.

{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": ["s3:PutObject"],
      "Resource": "arn:aws:s3:::YOUR_BUCKET_NAME/*"
    }
  ]
}
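If you manage several repositories or buckets, both JSON documents are easy to template. Below is a small Python sketch that fills in the placeholders and prints ready-to-paste policies; the helper names are mine, not part of any AWS SDK, and note that the wildcard in the sub claim has to be matched with StringLike:

```python
import json

def trust_policy(account_id: str, repo: str) -> dict:
    """Trust policy letting GitHub Actions runs of `repo` assume the role."""
    return {
        "Version": "2012-10-17",
        "Statement": [{
            "Effect": "Allow",
            "Principal": {
                "Federated": f"arn:aws:iam::{account_id}:oidc-provider/token.actions.githubusercontent.com"
            },
            "Action": "sts:AssumeRoleWithWebIdentity",
            "Condition": {
                "StringEquals": {"token.actions.githubusercontent.com:aud": "sts.amazonaws.com"},
                # The trailing wildcard requires StringLike, not StringEquals
                "StringLike": {"token.actions.githubusercontent.com:sub": f"repo:{repo}:*"},
            },
        }],
    }

def upload_policy(bucket: str) -> dict:
    """Permissions policy allowing uploads to one bucket."""
    return {
        "Version": "2012-10-17",
        "Statement": [{
            "Effect": "Allow",
            "Action": ["s3:PutObject"],
            "Resource": f"arn:aws:s3:::{bucket}/*",
        }],
    }

# Example account ID, repo, and bucket name are placeholders
print(json.dumps(trust_policy("123456789012", "octocat/my-site"), indent=2))
print(json.dumps(upload_policy("my-site-bucket"), indent=2))
```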

S3 Bucket Configuration

Create an S3 Bucket:

Go to the S3 console and create a new bucket for your static website.

Set Bucket Policy:

Restrict access to Cloudflare IP ranges and the GitHub Actions IAM role.

{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Sid": "AllowGetFromCloudflare",
      "Effect": "Allow",
      "Principal": "*",
      "Action": "s3:GetObject",
      "Resource": "arn:aws:s3:::YOUR_BUCKET_NAME/*",
      "Condition": {
        "IpAddress": {
          "aws:SourceIp": ["CLOUDFLARE_IP_RANGES"]
        }
      }
    },
    {
      "Sid": "AllowPutFromGitHubActions",
      "Effect": "Allow",
      "Principal": {
        "AWS": "arn:aws:iam::YOUR_ACCOUNT_ID:role/GITHUB_ACTIONS_ROLE_NAME"
      },
      "Action": "s3:PutObject",
      "Resource": "arn:aws:s3:::YOUR_BUCKET_NAME/*"
    }
  ]
}

For the current Cloudflare IP ranges, refer to the list Cloudflare publishes at https://www.cloudflare.com/ips/.
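The bucket policy can be templated the same way. In the sketch below, the CIDR blocks are documentation-range stand-ins (TEST-NET addresses), not real Cloudflare ranges, so swap in the published list before applying; the function and role names are my own, illustrative choices:

```python
import json

# Stand-in CIDRs from the TEST-NET documentation ranges; replace these with
# the current list that Cloudflare publishes.
SAMPLE_RANGES = ["198.51.100.0/24", "203.0.113.0/24"]

def bucket_policy(bucket: str, account_id: str, role_name: str, ranges: list[str]) -> dict:
    """Allow reads only from the given CIDRs and writes only from the deploy role."""
    arn = f"arn:aws:s3:::{bucket}/*"
    return {
        "Version": "2012-10-17",
        "Statement": [
            {
                "Sid": "AllowGetFromCloudflare",
                "Effect": "Allow",
                "Principal": "*",
                "Action": "s3:GetObject",
                "Resource": arn,
                "Condition": {"IpAddress": {"aws:SourceIp": ranges}},
            },
            {
                "Sid": "AllowPutFromGitHubActions",
                "Effect": "Allow",
                "Principal": {"AWS": f"arn:aws:iam::{account_id}:role/{role_name}"},
                "Action": "s3:PutObject",
                "Resource": arn,
            },
        ],
    }

print(json.dumps(bucket_policy("my-site-bucket", "123456789012",
                               "github-actions-deploy", SAMPLE_RANGES), indent=2))
```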

Step 2: Configure GitHub Actions Workflow

Secrets and Variables

Add GitHub Secrets:

GITHUB_RUNNERS_AWS_IAM_ROLE_ARN: The ARN of the IAM role created earlier.
CLOUDFLARE_API_KEY: A Cloudflare API token with permission to purge the zone's cache.

Add GitHub Variables:

CLOUDFLARE_ZONE_ID: The zone ID of your Cloudflare site.

S3_BUCKETS: A comma-separated list of S3 bucket names.

GitHub Actions Workflow

Create a .github/workflows/deploy.yml file in your repository with the following content:

name: Build and Deploy

on:
  push:
    branches:
      - main

jobs:
  ci:
    runs-on: ubuntu-latest
    permissions:
      id-token: write
      contents: read
    steps:
      - name: Checkout code
        uses: actions/checkout@v4

      - name: Configure AWS credentials
        uses: aws-actions/configure-aws-credentials@v4
        with:
          aws-region: us-east-1
          role-to-assume: ${{ secrets.GITHUB_RUNNERS_AWS_IAM_ROLE_ARN }}

      - name: Setup Node.js
        uses: actions/setup-node@v4
        with:
          cache: yarn
          cache-dependency-path: yarn.lock

      - name: Install dependencies
        run: yarn

      - name: Run build
        run: yarn build

      - name: Upload build artifacts to S3
        run: |
          IFS=',' read -ra buckets <<< "${{ vars.S3_BUCKETS }}"
          for bucket in "${buckets[@]}"; do
            # Upload the build output to the root of each bucket
            aws s3 cp dist "s3://${bucket}/" --recursive
          done

      - name: Purge Cloudflare cache
        uses: fishmanlabs/cloudflare-purge-cache-action@v1
        with:
          api_token: ${{ secrets.CLOUDFLARE_API_KEY }}
          zone_id: ${{ vars.CLOUDFLARE_ZONE_ID }}
          purge_everything: false
          purge_files: |
            /
            /index.html

Workflow Breakdown

Checkout Code: Checks out the code from your GitHub repository.

Configure AWS Credentials: Configures AWS credentials for the GitHub Actions runner using the IAM role created earlier.

Setup Node.js: Sets up Node.js environment and caches dependencies.

Install Dependencies: Installs project dependencies using yarn.

Run Build: Builds the static website with yarn build.

Upload Build Artifacts to S3: Uploads the built files to the specified S3 buckets.
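The comma-splitting in that upload step is plain Bash. Here is a standalone sketch you can dry-run locally, with the aws s3 cp call swapped for an echo and made-up bucket names:

```shell
#!/usr/bin/env bash
set -u

# What the S3_BUCKETS repository variable might contain
S3_BUCKETS="site-prod,site-staging"

# Split the comma-separated list into a Bash array
IFS=',' read -ra buckets <<< "$S3_BUCKETS"

for bucket in "${buckets[@]}"; do
  # The real workflow runs: aws s3 cp dist "s3://${bucket}/" --recursive
  echo "would sync dist/ -> s3://${bucket}/"
done
```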

Purge Cloudflare Cache: Uses the Cloudflare Purge Cache GitHub Action to purge the Cloudflare cache for the specified files.
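Under the hood, a file-level purge boils down to one POST against Cloudflare's v4 API. The sketch below only assembles the request rather than sending it; the token and zone values are placeholders, and note that the raw API expects absolute URLs in the files list:

```python
import json

def purge_request(zone_id: str, api_token: str, files: list[str]) -> tuple[str, dict, str]:
    """Assemble (url, headers, body) for a Cloudflare cache purge, without sending it."""
    url = f"https://api.cloudflare.com/client/v4/zones/{zone_id}/purge_cache"
    headers = {
        "Authorization": f"Bearer {api_token}",
        "Content-Type": "application/json",
    }
    body = json.dumps({"files": files})
    return url, headers, body

url, headers, body = purge_request(
    "YOUR_ZONE_ID", "YOUR_API_TOKEN",
    ["https://example.com/", "https://example.com/index.html"],
)
print(url)
```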

Conclusion

By following this guide, you can set up a complete CI/CD pipeline for deploying static websites to AWS S3, using Cloudflare as a CDN, and automating cache purging. This approach ensures that your users always receive the latest content without manual intervention, optimizing both performance and user experience.

I’m thrilled to introduce my new GitHub Action, Cloudflare Purge Cache, which makes cache purging effortless. If you find it useful, I would love to see your support by giving it a star on GitHub! Check out the Cloudflare Purge Cache GitHub Action on GitHub Marketplace and integrate it into your project today. For more details and support, visit the repository. Let’s make the web faster together!
