Split CI build, restore Cloudflare purge for datasheets #2153

Merged: 18 commits, Sep 10, 2024
28 changes: 28 additions & 0 deletions .github/actions/cleanup-disk/action.yml
@@ -0,0 +1,28 @@
name: Disk cleanup
description: "Clean up disk space"
runs:
using: composite
steps:
- name: Free up disk space
shell: bash
run: |
echo "Disk space before cleanup..."
df -h /
echo "Removing unnecessary files to free up disk space..."
# https://github.com/actions/runner-images/issues/2840#issuecomment-2272410832
sudo rm -rf \
/opt/hostedtoolcache \
/opt/google/chrome \
/opt/microsoft/msedge \
/opt/microsoft/powershell \
/opt/pipx \
/usr/lib/mono \
/usr/local/julia* \
/usr/local/lib/android \
/usr/local/lib/node_modules \
/usr/local/share/chromium \
/usr/local/share/powershell \
/usr/share/dotnet \
/usr/share/swift
echo "Disk space after cleanup..."
df -h /
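
For reference, a workflow consumes this composite action with a single step after checkout (the repository must be checked out first so the local action path resolves). A minimal sketch, mirroring the deploy workflows below:

    steps:
      - uses: actions/checkout@v4

      - name: Cleanup runner disk
        uses: ./.github/actions/cleanup-disk  # reclaim space before a large build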
3 changes: 0 additions & 3 deletions .github/actions/cloudflare-upload/action.yml
@@ -20,7 +20,6 @@ inputs:
runs:
using: composite
steps:

- name: Find PR Preview Comment
if: github.event_name == 'pull_request'
uses: peter-evans/find-comment@v1
@@ -67,7 +66,6 @@ runs:
🚀 Preview this PR: ${{ steps.deploy-cloudflare.outputs.url }}
📍 Commit SHA: ${{ github.sha }}


- name: Update PR Preview Comment
if: github.event_name == 'pull_request' && steps.deploy-preview-comment.outputs.comment-id != 0
uses: peter-evans/[email protected]
@@ -78,4 +76,3 @@
### ${{ inputs.project-name }}
🚀 Preview this PR: ${{ steps.deploy-cloudflare.outputs.url }}
📍 Commit SHA: ${{ github.sha }}

43 changes: 43 additions & 0 deletions .github/actions/generate-datasheets/action.yml
@@ -0,0 +1,43 @@
name: "Generate Datasheets"
description: "Generate product datasheets from markdown files"
inputs:
datasheets-path:
description: "The datasheets path"
required: true
default: static/resources/datasheets
artifact-name:
description: "The name of the output artifact"
required: true

runs:
using: composite
steps:
- uses: actions/cache@v4
id: cache
with:
path: ${{ inputs.datasheets-path }}
key: ${{ runner.os }}-datasheets-${{ hashFiles('**/*datasheet.md') }}

- uses: actions/setup-node@v4
if: steps.cache.outputs.cache-hit != 'true'
with:
node-version: 18
cache: "npm"
cache-dependency-path: "package-lock.json"

- name: Render Datasheets
if: steps.cache.outputs.cache-hit != 'true'
run: |
cd ${GITHUB_WORKSPACE}/scripts/datasheet-rendering
./render-datasheets.sh
cd $GITHUB_WORKSPACE
mkdir -p ${{ inputs.datasheets-path }}
find ./content/hardware -type f -name "*-datasheet.pdf" -exec cp {} ./${{ inputs.datasheets-path }}/ \;
shell: bash

- name: Export artifact
uses: actions/upload-artifact@v4
with:
name: ${{ inputs.artifact-name }}
path: ${{ inputs.datasheets-path }}
retention-days: 1 # Only needed to pass it to the next job
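
The cache key above hashes every *datasheet.md file, so the Node setup and render steps run only when a datasheet source actually changed; on a cache hit the previously rendered PDFs are uploaded as the artifact directly. A downstream job then pulls the artifact by name. A minimal sketch of the pairing used in the deploy workflows:

    jobs:
      render-datasheets:
        runs-on: ubuntu-latest
        steps:
          - uses: actions/checkout@v4
          - uses: ./.github/actions/generate-datasheets
            with:
              artifact-name: datasheets
              datasheets-path: static/resources/datasheets

      build:
        needs: render-datasheets
        runs-on: ubuntu-latest
        steps:
          - uses: actions/download-artifact@v4
            with:
              name: datasheets
              path: static/resources/datasheets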
30 changes: 30 additions & 0 deletions .github/actions/sync-s3/action.yml
@@ -0,0 +1,30 @@
name: Sync assets to S3
description: "Sync Docs assets to S3"
inputs:
role-to-assume:
description: "The IAM role to assume"
required: true
bucket-name:
description: "The name of the S3 bucket to sync assets to"
required: true
runs:
using: composite
steps:
- name: Configure AWS credentials
uses: aws-actions/configure-aws-credentials@v4
with:
role-to-assume: ${{ inputs.role-to-assume }}
aws-region: us-east-1

- name: Sync all cacheable assets
shell: bash
run: aws s3 sync --cache-control "public, max-age=31536000, immutable" --include "*.css" --include="*.js" --include="*.gif" --include="*.png" --include="*.svg" --exclude "*.html" --exclude="sw.js" --exclude="*.json" --exclude="*.pdf" --delete public/ s3://${{ inputs.bucket-name }}/

- name: Sync all non-cacheable assets
shell: bash
# Don't cache any HTML or JSON files: they should always be up to date
run: aws s3 sync --cache-control "public, max-age=0, must-revalidate" --include "*.html" --include="sw.js" --include="*.json" --include "*.css" --exclude="*.js" --exclude="*.gif" --exclude="*.png" --exclude="*.svg" --exclude="*.pdf" --delete public/ s3://${{ inputs.bucket-name }}/

- name: Sync PDF
shell: bash
run: aws s3 sync --cache-control "public, max-age=86400, must-revalidate" --include "*.pdf" --exclude="*.js" --exclude="*.gif" --exclude="*.png" --exclude="*.svg" --exclude="*.css" --exclude="*.html" --exclude="*.json" --exclude="sw.json" --delete public/ s3://${{ inputs.bucket-name }}/
87 changes: 52 additions & 35 deletions .github/workflows/deploy-production.yml
@@ -10,13 +10,30 @@ concurrency:
group: deploy-production
cancel-in-progress: true

# contents: read for checkout; id-token: write so deploy steps can assume the AWS role via OIDC
permissions:
id-token: write
contents: read

jobs:
# This job renders the datasheets, but only if they have changed.
# It's a separate job so we don't have to clean up the machine afterwards.
render-datasheets:
name: Render Datasheets
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@v4
with:
fetch-depth: 1

- uses: ./.github/actions/generate-datasheets
with:
artifact-name: datasheets
datasheets-path: static/resources/datasheets

build:
if: "github.repository == 'arduino/docs-content'"
name: Build and Deploy
needs: render-datasheets
runs-on: ubuntu-latest
environment: production
env:
@@ -27,20 +44,28 @@
- uses: actions/checkout@v4
with:
fetch-depth: 1

- name: Cleanup runner disk
uses: ./.github/actions/cleanup-disk # Cleanup machine before starting the build

- uses: actions/setup-node@v4
with:
node-version: 18
cache: "npm"
cache-dependency-path: "**/package-lock.json"
cache-dependency-path: "package-lock.json"

- name: Retrieve Datasheets
uses: actions/download-artifact@v4 # Retrieve the datasheets generated in the previous job
with:
name: datasheets
path: static/resources/datasheets

- name: Render Datasheets
run: cd ${GITHUB_WORKSPACE}/scripts/datasheet-rendering;./render-datasheets.sh
- name: Debug datasheet list
run: ls -lah static/resources/datasheets

- name: Copy Static Files
run: |
mkdir -p static/resources/datasheets static/resources/schematics static/resources/pinouts static/resources/models
find ./content/hardware -type f -name "*-schematics.pdf" -exec cp {} ./static/resources/schematics/ \;
find ./content/hardware -type f -name "*-datasheet.pdf" -exec cp {} ./static/resources/datasheets/ \;
mkdir -p static/resources/schematics static/resources/pinouts static/resources/models
find ./content/hardware -type f -name "*-full-pinout.pdf" -exec cp {} ./static/resources/pinouts/ \;
find ./content/hardware -type f -name "*-pinout.png" -exec cp {} ./static/resources/pinouts/ \;
find ./content/hardware -type f -name "*-step.zip" -exec cp {} ./static/resources/models/ \;
@@ -54,7 +79,7 @@
restore-keys: |
${{ runner.os }}-cache-gatsby-main

- name: Gatsby Public Folder
- name: Gatsby Public Folder cache
uses: actions/cache@v4
id: gatsby-public-folder
with:
@@ -64,37 +89,29 @@
${{ runner.os }}-public-gatsby-main

- run: npm install

- run: npm run build

- name: Clean up node_modules
- name: Clean up node_modules # Just to save space
run: rm -rf node_modules

- name: Configure AWS credentials from Production account
uses: aws-actions/configure-aws-credentials@v4
- name: Deploy to S3
uses: ./.github/actions/sync-s3
with:
role-to-assume: ${{ secrets.PRODUCTION_IAM_ROLE }}
aws-region: us-east-1

- name: Sync all cacheable assets
run: aws s3 sync --cache-control "public, max-age=31536000, immutable" --include "*.css" --include="*.js" --include="*.gif" --include="*.png" --include="*.svg" --exclude "*.html" --exclude="sw.js" --exclude="*.json" --exclude="*.pdf" --delete public/ s3://${{ secrets.PRODUCTION_BUCKET_NAME }}/

- name: Sync all non-cacheable assets
# Don't cache any HTML or JSON files: they should always be up to date
run: aws s3 sync --cache-control "public, max-age=0, must-revalidate" --include "*.html" --include="sw.js" --include="*.json" --include "*.css" --exclude="*.js" --exclude="*.gif" --exclude="*.png" --exclude="*.svg" --exclude="*.pdf" --delete public/ s3://${{ secrets.PRODUCTION_BUCKET_NAME }}/

# - name: Sync PDF
# run: aws s3 sync --cache-control "public, max-age=86400, must-revalidate" --include "*.pdf" --exclude="*.js" --exclude="*.gif" --exclude="*.png" --exclude="*.svg" --exclude="*.css" --exclude="*.html" --exclude="*.json" --exclude="sw.json" --delete public/ s3://${{ secrets.PRODUCTION_BUCKET_NAME }}/

# - name: Purge cache on CloudFlare
# run: |
# curl -X POST "https://api.cloudflare.com/client/v4/zones/${{ secrets.CLOUDFLARE_ZONE }}/purge_cache" \
# -H "Authorization: Bearer ${{ secrets.CLOUDFLARE_PURGE_API_TOKEN }}" \
# -H "Content-Type: application/json" \
# --data '{"prefixes":["${{ vars.DATASHEETS_BASE_URL }}"]}'
bucket-name: ${{ secrets.PRODUCTION_BUCKET_NAME }}

- name: Sync all cacheable assets
run: aws s3 sync --cache-control "public, max-age=31536000, immutable" --include "*.css" --include="*.js" --include="*.gif" --include="*.png" --include="*.svg" --exclude "*.html" --exclude="sw.js" --exclude="*.json" --delete public/ s3://${{ secrets.PRODUCTION_BUCKET_NAME }}/

- name: Sync all non-cacheable assets
# Don't cache any HTML or JSON files: they should always be up to date
run: aws s3 sync --cache-control "public, max-age=0, must-revalidate" --include "*.html" --include="sw.js" --include="*.json" --include "*.css" --exclude="*.js" --exclude="*.gif" --exclude="*.png" --exclude="*.svg" --delete public/ s3://${{ secrets.PRODUCTION_BUCKET_NAME }}/
purge-datasheets:
name: Purge Datasheets cache
needs: build
runs-on: ubuntu-latest
environment: production
steps:
- name: Purge Cloudflare Cache
shell: bash
run: |
echo "Purging Cloudflare cache for prefix: ${{ vars.DATASHEETS_BASE_URL }}, zone: ${{ vars.CLOUDFLARE_ZONE }}"
curl -f -X POST "https://api.cloudflare.com/client/v4/zones/${{ vars.CLOUDFLARE_ZONE }}/purge_cache" \
-H "Authorization: Bearer ${{ secrets.CLOUDFLARE_PURGE_API_TOKEN }}" \
-H "Content-Type: application/json" \
--data '{"prefixes":["${{ vars.DATASHEETS_BASE_URL }}"]}'
87 changes: 52 additions & 35 deletions .github/workflows/deploy-staging.yml
@@ -10,12 +10,30 @@ concurrency:
group: deploy-staging
cancel-in-progress: true

# contents: read for checkout; id-token: write so deploy steps can assume the AWS role via OIDC
permissions:
id-token: write
contents: read

jobs:
# This job renders the datasheets, but only if they have changed.
# It's a separate job so we don't have to clean up the machine afterwards.
render-datasheets:
name: Render Datasheets
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@v4
with:
fetch-depth: 1

- uses: ./.github/actions/generate-datasheets
with:
artifact-name: datasheets
datasheets-path: static/resources/datasheets

build:
name: Build and Deploy
needs: render-datasheets
runs-on: ubuntu-latest
environment: staging
env:
@@ -26,20 +44,28 @@
- uses: actions/checkout@v4
with:
fetch-depth: 1

- name: Cleanup runner disk
uses: ./.github/actions/cleanup-disk # Cleanup machine before starting the build

- uses: actions/setup-node@v4
with:
node-version: 18
cache: "npm"
cache-dependency-path: "**/package-lock.json"
cache-dependency-path: "package-lock.json"

- name: Retrieve Datasheets
uses: actions/download-artifact@v4 # Retrieve the datasheets generated in the previous job
with:
name: datasheets
path: static/resources/datasheets

- name: Render Datasheets
run: cd ${GITHUB_WORKSPACE}/scripts/datasheet-rendering;./render-datasheets.sh
- name: Debug datasheet list
run: ls -lah static/resources/datasheets

- name: Copy Static Files
run: |
mkdir -p static/resources/datasheets static/resources/schematics static/resources/pinouts static/resources/models
find ./content/hardware -type f -name "*-schematics.pdf" -exec cp {} ./static/resources/schematics/ \;
find ./content/hardware -type f -name "*-datasheet.pdf" -exec cp {} ./static/resources/datasheets/ \;
mkdir -p static/resources/schematics static/resources/pinouts static/resources/models
find ./content/hardware -type f -name "*-full-pinout.pdf" -exec cp {} ./static/resources/pinouts/ \;
find ./content/hardware -type f -name "*-pinout.png" -exec cp {} ./static/resources/pinouts/ \;
find ./content/hardware -type f -name "*-step.zip" -exec cp {} ./static/resources/models/ \;
@@ -53,7 +79,7 @@
restore-keys: |
${{ runner.os }}-cache-gatsby-main

- name: Gatsby Public Folder
- name: Gatsby Public Folder cache
uses: actions/cache@v4
id: gatsby-public-folder
with:
@@ -63,38 +89,29 @@
${{ runner.os }}-public-gatsby-main

- run: npm install

- run: npm run build

- name: Clean up node_modules
- name: Clean up node_modules # Just to save space
run: rm -rf node_modules

- name: Configure AWS credentials from Staging account
uses: aws-actions/configure-aws-credentials@v4
- name: Deploy to S3
uses: ./.github/actions/sync-s3
with:
role-to-assume: ${{ secrets.STAGING_IAM_ROLE }}
aws-region: us-east-1

# - name: Sync all cacheable assets
# run: aws s3 sync --cache-control "public, max-age=31536000, immutable" --include "*.css" --include="*.js" --include="*.gif" --include="*.png" --include="*.svg" --exclude "*.html" --exclude="sw.js" --exclude="*.json" --exclude="*.pdf" --delete public/ s3://${{ secrets.STAGING_BUCKET_NAME }}/

# - name: Sync all non-cacheable assets
# # Don't cache any HTML or JSON files: they should always be up to date
# run: aws s3 sync --cache-control "public, max-age=0, must-revalidate" --include "*.html" --include="sw.js" --include="*.json" --include "*.css" --exclude="*.js" --exclude="*.gif" --exclude="*.png" --exclude="*.svg" --exclude="*.pdf" --delete public/ s3://${{ secrets.STAGING_BUCKET_NAME }}/

# - name: Sync PDF
# run: aws s3 sync --cache-control "public, max-age=86400, must-revalidate" --include "*.pdf" --exclude="*.js" --exclude="*.gif" --exclude="*.png" --exclude="*.svg" --exclude="*.css" --exclude="*.html" --exclude="*.json" --exclude="sw.json" --delete public/ s3://${{ secrets.STAGING_BUCKET_NAME }}/

# - name: Purge cache on CloudFlare
# run: |
# curl -X POST "https://api.cloudflare.com/client/v4/zones/${{ secrets.CLOUDFLARE_ZONE }}/purge_cache" \
# -H "Authorization: Bearer ${{ secrets.CLOUDFLARE_PURGE_API_TOKEN }}" \
# -H "Content-Type: application/json" \
# --data '{"prefixes":["${{ vars.DATASHEETS_BASE_URL }}"]}'

- name: Sync all cacheable assets
run: aws s3 sync --cache-control "public, max-age=31536000, immutable" --include "*.css" --include="*.js" --include="*.gif" --include="*.png" --include="*.svg" --exclude "*.html" --exclude="sw.js" --exclude="*.json" --delete public/ s3://${{ secrets.STAGING_BUCKET_NAME }}/

- name: Sync all non-cacheable assets
# Don't cache any HTML or JSON files: they should always be up to date
run: aws s3 sync --cache-control "public, max-age=0, must-revalidate" --include "*.html" --include="sw.js" --include="*.json" --include "*.css" --exclude="*.js" --exclude="*.gif" --exclude="*.png" --exclude="*.svg" --delete public/ s3://${{ secrets.STAGING_BUCKET_NAME }}/
bucket-name: ${{ secrets.STAGING_BUCKET_NAME }}

purge-datasheets:
name: Purge Datasheets cache
needs: build
runs-on: ubuntu-latest
environment: staging
steps:
- name: Purge Cloudflare Cache
shell: bash
run: |
echo "Purging Cloudflare cache for prefix: ${{ vars.DATASHEETS_BASE_URL }}, zone: ${{ vars.CLOUDFLARE_ZONE }}"
curl -f -X POST "https://api.cloudflare.com/client/v4/zones/${{ vars.CLOUDFLARE_ZONE }}/purge_cache" \
-H "Authorization: Bearer ${{ secrets.CLOUDFLARE_PURGE_API_TOKEN }}" \
-H "Content-Type: application/json" \
--data '{"prefixes":["${{ vars.DATASHEETS_BASE_URL }}"]}'