CLOUDP-296897: Allow to select tests to run #2240
Conversation
pull_request:
  types: [opened, synchronize, labeled, unlabeled]
comment: Note this will never work for external contributors. For security reasons, the `pull_request` event uses the credentials from the contributor's fork repo, not the AKO repo. To accept external contributions, `pull_request_target` needs to be used instead, but in that case, when the contribution is external, a gating mechanism needs to be in place so that external forks cannot access, peek at, or abuse AKO creds without us even being able to stop it (for instance, "if external, the `safe-to-test` label must also be present").
BTW, one issue I ran into when trying to fix these external workflows is that if you have both `pull_request` and `pull_request_target` set, they will both trigger and your workflows might duplicate. I guess the best way is to either make sure only one can fire at a time (`pull_request` only for owners and `pull_request_target` only for contributors) or use only `pull_request_target` with gating for non-owners.
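A minimal sketch of what such gating could look like, purely for illustration (the `safe-to-test` label name, job name, and checkout step here are assumptions, not something this PR adds):

```yaml
# Hypothetical sketch only: gate privileged runs behind a maintainer-applied label.
on:
  pull_request_target:
    types: [opened, synchronize, labeled, unlabeled]

jobs:
  cloud-tests:
    # Internal PRs run as usual; PRs from forks need the safe-to-test label.
    if: >
      github.event.pull_request.head.repo.full_name == github.repository ||
      contains(github.event.pull_request.labels.*.name, 'safe-to-test')
    runs-on: ubuntu-latest
    steps:
      - name: Check out the PR head instead of the base branch
        uses: actions/checkout@v4
        with:
          ref: ${{ github.event.pull_request.head.sha }}
```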
q: Was it intended to skip external contributions? It's fine, as the CI is still broken in general for that use case.
Ok, I think we can drop the CI runs for PRs from forked repos for now. I'll create a follow-up ticket.
uses: actions/github-script@v7
with:
  script: |
    if (context.eventName === 'pull_request') {
comment: remember this if `pull_request_target` is added in the PR.
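If `pull_request_target` is ever enabled, this check would presumably need to cover both event names; a rough sketch, not part of this PR:

```yaml
- uses: actions/github-script@v7
  with:
    script: |
      // Treat both trigger types the same way when collecting PR labels.
      if (context.eventName === 'pull_request' || context.eventName === 'pull_request_target') {
        // ...existing label-handling logic...
      }
```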
./bin/ginkgo-labels > result.json
echo "Int tests to execute $(cat result.json | jq -c .int)"
echo "E2E tests to execute $(cat result.json | jq -c .e2e)"

echo "int_matrix=$(cat result.json | jq -c .int)" >> $GITHUB_OUTPUT
echo "e2e_matrix=$(cat result.json | jq -c .e2e)" >> $GITHUB_OUTPUT
comment: we'll need to take this into account when we split helm chart tests from e2e
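For reference, a downstream job consumes these outputs through `fromJSON`, so splitting the Helm chart tests would likely mean adding a third output and matrix alongside these two. A rough sketch (the job name, the output mapping, and the step body are assumptions):

```yaml
jobs:
  e2e-tests:
    needs: compute-labels
    strategy:
      fail-fast: false
      matrix:
        # e2e_matrix is the JSON array written to $GITHUB_OUTPUT above,
        # exposed as a job-level output of the compute-labels job.
        test: ${{ fromJSON(needs.compute-labels.outputs.e2e_matrix) }}
    runs-on: ubuntu-latest
    steps:
      - run: echo "Running e2e label ${{ matrix.test }}"
```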
# run: |
#   cd ./test/e2e
#   ginkgo --label-filter="${{ matrix.test }}" -r --fail-fast
q: interesting, this `--fail-fast`. Should we be more explicit about it? It means that when the PR fails there might be more errors to check under the same label that failed in CI.
But I agree we probably want this to make the CI tests less wasteful.
This is fixed by executing the devbox
.PHONY: compute-labels
compute-labels:
	mkdir -p bin
	go build -o bin/ginkgo-labels tools/compute-test-labels/main.go
q: Not for this PR, but a thought in general. Should we group all AKO dev tools in a single tool repo?
We can think about that if there are good reasons for it. So far, nothing stops us from having tools in the `/tools` folder.
func jsonDump(data interface{}) string {
	r, _ := json.Marshal(data)
	return string(r)
}
comment: I use this exact same code a lot when debugging; we might want to have it in a library somewhere in `internal`. It will be a different copy here, though.
Waiting on a few questions, but LGTM overall.
Great job!
USE_JSON: true
run: |
  make compute-labels
  ./bin/ginkgo-labels > result.json
q: why not just `go run ...`?
Also possible, but I decided to go with the `go build` option.
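For comparison, the `go run` variant discussed here would look roughly like this (a sketch; the tool path comes from the Makefile above, the step name is made up):

```yaml
- name: Compute test labels (go run variant)
  env:
    USE_JSON: true
  run: |
    # Compiles and runs the tool in one step, skipping make compute-labels and bin/.
    go run tools/compute-test-labels/main.go > result.json
    echo "int_matrix=$(cat result.json | jq -c .int)" >> $GITHUB_OUTPUT
    echo "e2e_matrix=$(cat result.json | jq -c .e2e)" >> $GITHUB_OUTPUT
```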
LGTM! great job!
Summary
This PR introduces a new workflow that allows selectively running tests based on GitHub labels.
Examples:
- `test/e2e/*` - runs all e2e tests
- `test/e2e/Atlas*` - runs e2e tests whose Ginkgo test labels start with `Atlas`
- `test/e2e/AtlasBackupSchedule` - runs the specific e2e test with Ginkgo label `AtlasBackupSchedule`
- `test/int/*` - runs all integration tests
- `test/int/alert-*` - runs integration tests whose Ginkgo test labels start with `alert-`
- `test/int/alert-config` - runs the specific integration test with Ginkgo label `alert-config`
The workflow is triggered for opened, labeled, and modified PRs, and it is re-triggered on every label change.
This PR doesn't break the existing workflow. The previous workflow can still be triggered by applying the `cloud-tests` label.
Proof of Work
See the workflow results
Checklist