Multiple runners on single host running job with the same id on PR overwrites status
I have two self-hosted runners on one machine: 'runner' and 'runner-resource-free'. I have two labels, 'Run UI tests' and 'Skip UI tests'. The jobs triggered by these labels use the same job id, 'pr_ui_tests_check', and this job is marked as required in the branch protection rules.
pr_check_skip_ui_tests.yaml
```yaml
name: Skip UI tests on pull request
on:
  pull_request:
    types: [ labeled, synchronize ]
jobs:
  pr_ui_tests_check:
    if: contains(github.event.pull_request.labels.*.name, 'Skip UI tests')
    runs-on: runner-resource-free
    timeout-minutes: 10
    concurrency:
      group: ${{ github.event.pull_request.number }}
      cancel-in-progress: true
    steps:
      - name: Skip tests
        run: echo "Tests skipped"
```
pr_check_run_ui_tests.yaml
```yaml
name: Run UI tests on pull request
on:
  pull_request:
    types: [ labeled, synchronize ]
jobs:
  pr_ui_tests_check:
    if: contains(github.event.pull_request.labels.*.name, 'Run UI tests')
    runs-on: runner
    timeout-minutes: 150
    concurrency:
      group: ${{ github.event.pull_request.number }}
      cancel-in-progress: true
    steps:
      - uses: actions/checkout@v3
        with:
          ref: ${{ github.event.pull_request.head.sha }}
      - name: Run UI tests
        uses: ./.github/actions/run-ui-tests
```
When I run UI tests on pull request #1 using the label 'Run UI tests', they start fine. But when I add the label 'Skip UI tests' to pull request #2 and that job finishes, it overwrites the status on pull request #1 and marks 'pr_ui_tests_check' as successful because of the skip in the other pull request. How can I avoid that?
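For background on the `concurrency` blocks above, here is a minimal sketch (hypothetical event payloads, not the Actions runtime): because both workflows key their group on the bare PR number, runs of the two *different* workflows on the same pull request land in one concurrency group, while different PRs get different groups.

```python
# Minimal sketch of how the two workflows' concurrency groups are computed
# (hypothetical event payloads, not the Actions runtime). Both workflows
# evaluate `${{ github.event.pull_request.number }}` to the same string
# for the same PR, so with cancel-in-progress: true they can cancel
# each other's runs.
def concurrency_group(event):
    return str(event["pull_request"]["number"])

skip_run = concurrency_group({"pull_request": {"number": 1}})
ui_run = concurrency_group({"pull_request": {"number": 1}})
print(skip_run == ui_run)  # the two workflows collide on the same PR
```

A commonly suggested way to scope groups per workflow is `group: ${{ github.workflow }}-${{ github.event.pull_request.number }}`; whether that also addresses the cross-PR status overwrite described above is a separate question.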
See also questions close to this topic
- How to remove Github PAT from Gitlab?
I recently imported a project from Github to Gitlab (Uni Owned) and I had to put my Github PAT in when importing. Now every time I select New Project -> Import Project From Github, I'm immediately taken to the list of projects on the Github account corresponding to the PAT I put in earlier.
Is there a way I can remove the Github PAT I initially put in so I'm taken to the "Provide your Github PAT" screen instead?
Thanks!
- Git - Commit and push local changes without losing file diffs?
I'm in VSCode and just made a ton of changes on my branch. I have a rough draft that would be a shame to lose due to not backing up online. So naturally it would be smart to `git add .` my changes, then `git push origin myBranch` to create a remote backup. But doing so will remove all the file diffs that Visual Studio Code's interface shows me in the sidebar (see picture below). Is there a way I can push to a remote branch but still keep the changes unstaged (or bring them back to unstaged), or any way to at least keep those diffs so I can continue working locally and easily see everywhere that I touched?
- How to delete all commits from private repository in github?
I want to add a .gitignore covering two files (node_module and one other) in a private repository provided by GitHub Classroom. But I have already pushed to that repository several times. Now I want to add the .gitignore, but after adding it, the previously committed files (node_module and the other one) are still in that repository. How can I remove all previous commits and run git init again on that private repo?
- Conditional KinesisStreamSpecification in CloudFormation script
I am new to CloudFormation scripts and am trying to set a conditional attribute for an AWS DDB table using YAML files.
I tried the below, but I get an error during stack creation - "Property StreamArn cannot be empty."
It seems AWS::NoValue is not allowed in this case.
Can I set the 'KinesisStreamSpecification' property itself conditionally?
```yaml
KinesisStreamSpecification:
  StreamArn: !If
    - ShouldAttachKinesis
    - !Sub "arn:aws:kinesis:SomeValue"
    - !Ref "AWS::NoValue"
```
- OpenCV cannot read manually edited YAML parameters
I manually added a custom matrix to a YAML file of OpenCV parameters; the problem is that OpenCV cannot read the matrix and returns a None type. I do not know what is happening here, as I tried editing in both Notepad and Visual Studio Code.
```yaml
%YAML:1.0
---
test_matrix: !!opencv-matrix
  rows: 2
  cols: 2
  dt: i
  data: [ 1, 1, 1, 1 ]
```
- In a json embedded YAML file - replace only json values using Python
I have a YAML file as follows:
```yaml
api: v1
hostname: abc
metadata:
  name: test
  annotations: {
    "ip" : "1.1.1.1",
    "login" : "fad-login",
    "vip" : "1.1.1.1",
    "interface" : "port1",
    "port" : "443"
  }
```
I am trying to read this data from a file, replace only the values of `ip` and `vip`, and write it back to the file. What I tried is:

```python
with open("test.yaml", "w") as f:
    yaml.dump(object, f)  # this does not help me, since it converts the entire file to YAML
```
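A stdlib-only sketch of the targeted replacement (assuming the keys stay quoted exactly as in the file above; the sample text and new IP values here are made up):

```python
import re

# Sample content mirroring the question (values are placeholders).
doc = '''api: v1
hostname: abc
metadata:
  name: test
  annotations: {
    "ip" : "1.1.1.1",
    "login" : "fad-login",
    "vip" : "1.1.1.1",
    "interface" : "port1",
    "port" : "443"
  }
'''

def replace_value(text, key, new_value):
    # Rewrite only the quoted value after `"<key>" :`, leaving every
    # other byte of the file untouched.
    pattern = r'("{}"\s*:\s*")[^"]*(")'.format(re.escape(key))
    return re.sub(pattern, r'\g<1>{}\g<2>'.format(new_value), text)

updated = replace_value(doc, "ip", "2.2.2.2")
updated = replace_value(updated, "vip", "3.3.3.3")
print('"login" : "fad-login"' in updated)  # untouched keys survive
```

Because nothing is parsed and re-serialized, the mixed YAML/JSON layout is preserved exactly.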
Also, `json.dump()` does not work either, as it converts the entire file to JSON. The file needs to keep the same format; only the values need to be updated. How can I do so?
- Django Migration from Github actions
Hello, I have a database in Google Cloud Platform and I am trying to figure out how to run a Django migration from GitHub Actions once I have deployed my app to GCP's App Engine.
I have tried using cloud_sql_proxy, but cannot get it to connect to my database. I would whitelist the GitHub Actions IP address, but I am not really sure what the IP addresses are either.
Here is the config I currently have:
```yaml
name: deploy-app-to-gcp
on:
  push:
    branches: [ main ]
    paths:
      - '**'
jobs:
  migrate:
    name: Migrate Database
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v1
      - name: get env file
        run: echo "${{secrets.ENV_FILE}}" | base64 --decode > ./env_variables.yaml
      - name: Set up Python 3.9
        uses: actions/setup-python@v2
        with:
          python-version: 3.9
      - name: Install dependencies
        run: |
          python -m pip install --upgrade pip
          pip install -r requirements.txt
      - name: Get Cloud SQL Proxy
        run: |
          wget https://dl.google.com/cloudsql/cloud_sql_proxy.linux.amd64 -O cloud_sql_proxy
          chmod +x cloud_sql_proxy
      - name: migrate Database
        env:
          DATABASE_CONNECTION_ADDRESS: 127.0.0.1
        run: |
          pip install ruamel.yaml
          set -a; eval $(python -c 'from pathlib import Path;from ruamel.yaml import YAML; print( "".join( [f"{k}={v!r}\n" for k, v in YAML().load(Path("env_variables.yaml"))["env_variables"].items() if not k.__eq__("DATABASE_CONNECTION_ADDRESS")] ) )'); set +a
          ./cloud_sql_proxy -instances=com-cjoshmartin:us-central1:cms-db=tcp:5432 &
          python manage.py migrate
          exit 0;
```
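The inline `python -c` one-liner in the last step is dense; unrolled, it amounts to something like this sketch (standalone, with the YAML loading replaced by a plain dict so the shape is visible; in the workflow, `ruamel.yaml` supplies the real `env_variables` mapping):

```python
# Turn an `env_variables:` mapping into KEY='value' lines that the
# surrounding `set -a; eval ...; set +a` exports into the shell,
# skipping the key that the step overrides via its `env:` block.
def to_env_lines(env_vars, skip=("DATABASE_CONNECTION_ADDRESS",)):
    return "".join(
        "{}={!r}\n".format(k, v) for k, v in env_vars.items() if k not in skip
    )

print(to_env_lines({"DJANGO_SETTINGS_MODULE": "app.settings",
                    "DATABASE_CONNECTION_ADDRESS": "10.0.0.5"}))
```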
- How do I create dynamic pipelines with multiple parameters (some optional) in GitHub workflows?
I've been trialling GitLab to build dynamic pipelines based on changes in a monorepo, but I have hit a bug that makes that solution essentially unworkable, so I am looking to try GitHub instead.
Basic workflow I had was:
- Compare merge request branch to target branch and identify all changed files.
- Identify which (visual studio) solutions need to be rebuilt based on dependencies of changed files.
- Identify whether the solution includes NUnit or GTest tests.
- Dynamically generate .yaml script containing jobs with build (and optional test instructions).
- Pass the dynamic script back to the CI such that it launches all the build[and optional test] jobs.
I can see there's matrices that could potentially pass the required solution path but:
- All the examples have very simple looking text strings or numbers - can you pass dictionaries of options that way?
- Is there a cleaner way to dynamically generate jobs with more than one parameter for a single job?
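One pattern that may cover both bullets (a sketch with made-up solution names, not a tested workflow): have a generator job emit a JSON matrix, since `fromJSON(...)` accepts `include` entries that carry several keys per job — effectively a dictionary of options per matrix leg.

```python
import json

# Hypothetical output of the "identify changed solutions" step.
solutions = [
    {"solution": "src/App/App.sln", "run_tests": True,  "framework": "NUnit"},
    {"solution": "src/Lib/Lib.sln", "run_tests": False, "framework": ""},
]

# A generator job would print this and publish it as a step output, e.g.
#   echo "matrix=$(python gen_matrix.py)" >> "$GITHUB_OUTPUT"
# for a downstream job to consume via
#   strategy: { matrix: ${{ fromJSON(needs.gen.outputs.matrix) }} }
matrix = {"include": solutions}
print(json.dumps(matrix))
```

Each downstream job then reads its own keys (`matrix.solution`, `matrix.run_tests`, ...), and optional steps can be gated with an `if:` on those values.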
- Github Action ubuntu-latest to Heroku auth failed
I'm seeing this error as of today; it was working yesterday and prior to that as well. I can't see that anything has changed in the Heroku documentation that might cause this breaking change.
Error:
```
Switched to a new branch 'deploy'
remote: ! WARNING:
remote: ! Do not authenticate with username and password using git.
remote: ! Run `heroku login` to update your credentials, then retry the git command.
remote: ! See documentation for details: https://devcenter.heroku.com/articles/git#http-git-authentication
fatal: Authentication failed for 'https://git.heroku.com/snapnhd-staging.git/'
```
main.yml
```yaml
server-deploy:
  needs: server-check
  runs-on: ubuntu-latest
  steps:
    - uses: actions/checkout@master
    - uses: actions/setup-ruby@v1
      with:
        ruby-version: '2.6.x'
    - name: Determine Heroku App
      id: heroku
      uses: ./.github/actions/heroku-app
    - name: Deploy
      env:
        HEROKU_API_KEY: ${{ secrets.HEROKU_API_KEY }}
        HEROKU_APP: ${{ steps.heroku.outputs.app }}
      run: |
        git remote add heroku \
          https://heroku:$HEROKU_API_KEY@git.heroku.com/$HEROKU_APP.git
        git fetch --unshallow origin
        git checkout -b deploy
        git push heroku deploy:master -f
```
- Celery with Server MongoDB
I am working on getting Celery set up with MongoDB as the result_backend. Following the configuration guidelines set out in the official docs, my celeryconfig.py is as follows:
```python
class CeleryConfig:
    # Celery Config
    # Available fields listed here:
    # https://docs.celeryq.dev/en/stable/userguide/configuration.html
    imports = imports
    broker_url = broker_url
    result_backend = "mongodb+srv://admin:admin@testing.company.mongodb.net/"
    accept_content = ["json"]
    result_serializer = "json"
    task_serializer = "json"
    mongodb_backend_settings = {
        "database": db_name,
        "taskmeta_collection": db_name,
    }
```
pyproject.toml file:

```toml
[tool.poetry.dependencies]
python = "^3.9"
boto3 = "^1.21.17"
celery = "^5.2.3"
mongoengine = "^0.24.1"
```

Initialization of the Celery app:

```python
app = Celery(backend=CELERY_BACKEND_URL)
app.config_from_object(CeleryConfig)
```
The Celery worker log also states that the URL is changed from `"mongodb+srv://admin:admin@testing.company.mongodb.net/"` to `[config] results: mongodb://admin:admin@testing.company.mongodb.net/`.

The error I get when executing any task:

```
pymongo.errors.ServerSelectionTimeoutError: testing.company.mongodb.net:27017: [Errno -2] Name or service not known, Timeout: 30s, Topology Description: <TopologyDescription id: 62750b2e5b92d32d7cf08a22, topology_type: Unknown, servers: [<ServerDescription ('testing.company.mongodb.net', 27017) server_type: Unknown, rtt: None, error=AutoReconnect('testing.company.mongodb.net:27017: [Errno -2] Name or service not known')>]>
```

For some reason it pushes the port into my URL, and a `mongodb+srv://` URL cannot have a port number.
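The symptom is consistent with the scheme being downgraded. A quick stdlib check (a sketch, reusing the placeholder credentials from the config above) shows that a `mongodb+srv://` URL carries no port at all — the driver is expected to resolve `_mongodb._tcp.<host>` SRV records instead — so once the backend rewrites it to plain `mongodb://`, the default port 27017 gets applied to a hostname that only exists as an SRV record:

```python
from urllib.parse import urlparse

# SRV-style connection string: note there is no explicit port.
srv = urlparse("mongodb+srv://admin:admin@testing.company.mongodb.net/")
print(srv.scheme, srv.hostname, srv.port)
```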
- How to execute a remote script in a reusable github workflow
I have this workflow in a repo called `terraform-do-database`, and I'm trying to use a reusable workflow coming from the public repo `foo/git-workflows/.github/workflows/tag_validation.yaml@master`:
```yaml
name: Tag Validation
on:
  pull_request:
    branches: [master]
  push:
    branches:
      - '*'    # matches every branch that doesn't contain a '/'
      - '*/*'  # matches every branch containing a single '/'
      - '**'   # matches every branch
      - '!master'  # excludes master
  # Allows you to run this workflow manually from the Actions tab
  workflow_dispatch:
jobs:
  tag_check:
    uses: foo/git-workflows/.github/workflows/tag_validation.yaml@master
```
And this is the reusable workflow file from the public `git-workflows` repo that has the script that should run on it. What is happening is that the workflow is trying to use a script inside the repo `terraform-do-database`:
```yaml
name: Tag Validation
on:
  pull_request:
    branches: [master]
  workflow_call:
jobs:
  tag_check:
    # The type of runner that the job will run on
    runs-on: ubuntu-latest
    # Steps represent a sequence of tasks that will be executed as part of the job
    steps:
      # Checks-out your repository under $GITHUB_WORKSPACE, so your job can access it
      - uses: actions/checkout@v3
      # Runs a single command using the runners shell
      - name: Verify the tag value
        run: ./scripts/tag_verify.sh
```
So the question: how can I make the workflow use the script stored in the `git-workflows` repo instead of `terraform-do-database`? I want to have a single repo where I can call the workflow and the scripts; I don't want to have everything duplicated inside all my repos.
- GIT merge develop into feature branch -- after the same branch was merged and reverted
I am in a pretty sticky situation with a merge, I hope someone can help me.
A feature branch was merged into develop some time ago, all good. Because that feature was not required for a release, the merge was reverted. In the meantime some more feature branches were merged into develop.
Now it is time to re-merge the same feature branch into develop. As expected, when I try to do that, git notices that my changes were deleted, so it automatically deletes those files/changes that were part of the original merge in my local feature branch.
Is there a way to fix this: merge/rebase develop into my feature branch, but keeping all my changes?
- How can I create a pull request between branches with entirely different commit histories
I have two branches with entirely different commit histories, and I need to create a pull request - NOT MERGE - from one of them to the other. When I tried it gave me this:
I tried to push a mutual commit, but this is not working. What is the best practice for it? I'm a beginner and this is my first time dealing with it.
What should I do to create this pull request with all the files contained in the `pull` branch?
- Invoke-RestMethod : {"errors":[{"msg":"Either \u0027analysisId\u0027, \u0027projectId\u0027 or \u0027projectKey\u0027 must be provided"}]}
I am trying to get the quality gate status from the Sonar server via its API by passing the pull request ID, but I am getting an error:
```powershell
$QualityGateResult = Invoke-RestMethod -Method Get -Uri "$ServerUrl/api/qualitygates/project_status?pullRequest=$pullrequest_key?api-version=6.0" -Headers $Headers
$QualityGateResult | ConvertTo-Json | Write-Host
if ($QualityGateResult.projectStatus.status -eq "OK") {
    Write-Host "Quality Gate Succeeded"
}
else {
    throw "Quality gate failed. Please check and fix the issues by reviewing the same."
}
```
Error msg: Invoke-RestMethod : {"errors":[{"msg":"Either \u0027analysisId\u0027, \u0027projectId\u0027 or \u0027projectKey\u0027 must be provided"}]}
Please help me understand where I am making a mistake.
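One thing that stands out in the URI above (apart from the server's complaint) is the second `?` — `...?pullRequest=...?api-version=6.0` — which makes `api-version` part of the `pullRequest` value instead of a separate parameter. A small sketch of building the query string safely (placeholder values standing in for `$ServerUrl` and `$pullrequest_key`; `urlencode` is the Python-side analogue of what the PowerShell interpolation should produce with `&`):

```python
from urllib.parse import urlencode

# Hypothetical values standing in for $ServerUrl and $pullrequest_key.
server_url = "https://sonar.example.com"
params = urlencode({"pullRequest": "123", "api-version": "6.0"})
uri = "{}/api/qualitygates/project_status?{}".format(server_url, params)
print(uri)
```

Note also that the server error itself says an `analysisId`, `projectId`, or `projectKey` parameter must be provided as well.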
- create pull request from branch to Empty branch