# Compare commits

`conflicts...mergemodes`: 314 commits. The compare view lists the following commit SHAs (the Author and Date columns of the table did not survive capture):

db70452cbc, 53eebb75f7, 9f9a4660e9, cb2f17a287, 3125df4d6a, 06600813b4, 3b510182d6, d1bcc222ce,
b632952f62, 1b90299d94, 708add1017, 712349d638, ba5a42dd29, 53cf2c8bad, 868c28cd5a, 962c4b2562,
57cb251dbc, 75c4fada50, 7d13e586ac, 7729b845b0, c662b2fdbf, 4cedb37da4, fe519628c8, ff18828692,
6337ef7e50, e9992d2e99, aac218fc6d, 139f40fce3, c44d34fdbe, 23be3df1fb, 68b67c6975, 478a3a140a,
df4da87bfd, b19d301d95, 9532aa897c, f942909ac7, 7f98298b89, c6ee055cb4, 58e5547a91, c2709e1894,
7790e5f301, 2620aa3ddd, 59a47cd542, a0c51657d4, f0b053ca07, 844ec8a87b, 6ee8fcc597, 1220799e57,
86a176a785, bb9e9a08e5, edd8c67fc9, 877e93c9bf, 51403713be, cc69a9348c, 5b5bb9a5bc, 2f39fc9836,
f959684540, 18f7ed658a, c05fa236d1, c866303696, e806d6ad0d, abf8aa58fc, 4f132ec154, 86a7fd072e,
5f5e7d98b5, e8738c9585, 2f18adaa67, b7f5c97de1, 09001ce01b, 37c9cc7a57, 362e481a09, 38f4c44fd0,
605d3dee06, 6f26bcdccc, fffdf4fad3, f6d2239f4d, 913fb7c046, 79318dc169, 377ed1c37f, 51b0487b29,
49e32c0ab1, 01e4f5f59e, 19d9fc5f1e, c4e184140a, 56c492ccdf, 3a6009a5a3, 2c4d25a5eb, 052ab37412,
925f546272, 71fd32a707, 581131bdc8, 495ed349ea, 350a255d6e, e3087e46c2, ae6b638df6, 2c73cc683a,
32adfb1111, fe8fcbae96, 5756f7ceea, 2be0f808d2, 7a0f651eaf, 2e47104b17, 76bfa612c5, 71aa0813ad,
cc675c1b24, 44e4941120, 86acfa6871, 7f09b2d2d3, f3a37f1158, 9d6db86318, e11993c81f, 4bd259a2a0,
162ae11cdd, 8431b47322, 3ed5ecc3f0, d08ab3efd6, a4f6628e52, 25073dd619, 4293181b4e, 551a4ef577,
6afb18fc58, f310220261, ef7c0c1cea, 27230fa03b, c52d40b760, d3ba579a8b, 9ef8209622, ba66dd868e,
17755fa2b5, f94d3a8942, 20e1109602, c25d3be44e, 8db558891a, 0e06ba5993, 736769d630, 93c970d0dd,
5544a65947, 918723d57b, a418b48809, 55846562c1, 95c7770cad, 1b900e3202, d083acfd1c, 244160e20e,
ed2847a2c6, 1457caa64b, b9a38c1724, 74edad5d3e, e5cad365ee, 53851ba10f, 056e5208c8, af142fdb15,
5ce92beb52, ae379ec408, 458837b007, a3feab6f7e, fa647ab2d8, 19902813b5, 23a7f310c5, 58d1f2de91,
d623844411, 04825b552e, ca7966f3e0, 0c47ca4d32, 7bad8eb5a9, c2c60b77e5, 76b5a5dc0d, 58da491049,
626bead304, 30bac996f4, 9adc718b6f, 070f45bc25, d061f29699, f6fd96881d, 2be785676a, 1b9ee2d46a,
b7bbafacf8, 240896f101, a7b326fceb, 76ed03f86f, 1af2f53755, 0de9071f92, 855faea659, dbd581ffef,
1390225614, a03491f75c, 2092fc4f42, d2973f4792, 58022c6edc, 994e6b3ca2, 6414336ee6, 1104581eb6,
6ad110e5d3, e39ce302b8, 3f216dc275, 8af7e58534, 043673d9ac, 73737be16a, 1d3ed81ac5, 49c4784e70,
be15c86973, 72857db561, faf53aaae2, 9e058101f0, 4ae45d9913, 56cf8293ed, fd5b3598bf, 9dd5a57b81,
1cd385e227, 3c20eb567b, ff7df44d37, 1a19873f77, 6a09bf021e, f2089f99fc, 10ea3a8f8f, 9faa6ead49,
29cce5741a, 804e542c3f, 72899162b0, 168a419bbe, 6a71641295, 5addde0a71, 90ea1c9463, a4fb3e6151,
e2abbfcc63, f6cb35acca, f4386c3d12, f8594af8c6, b8ef69a5a7, c980b9f84d, 4651440457, 7d58882ed8,
e90ba95869, 1015e79026, 833cb8b430, a882ae283f, 305e90b254, c80683182d, 51cd4da97b, cf71fe49d6,
85a9a81804, 72b979b587, bb4350519b, 62658e23a7, 6a1f92af12, 24ed21ce7d, 46a187a60e, e0c7ea44ea,
f013180c4b, b96b784b38, 6864e95404, 0ba4652595, 8d0047649a, 2f180c264e, 7b87c4fd73, 7d2233dd4a,
c30ae5750b, ea2134c6e9, b22f418595, c4c9a16e7f, 5b1e6941c2, 923bcd89db, e96f4d343b, bcb63fe1e9,
f4e78e53d3, 082db173f3, 7e055c3169, 7e59e527d8, 518845b3d8, b091e0e98d, cedb7c0e76, 7209f9f519,
bd5482d54e, bc95d50378, fff996b497, 2b67e6d80e, 5a875c19a0, 538698373a, 84b8ca65ce, a02358e641,
33c9bffc2e, 4894c0d90a, 090c291f8a, 42cedb6267, f7229dfaf9, 933ca9a3db, 390cb89702, 6cbeaef6f2,
d146fb8c4e, 7e78ee83c1, 17e925bfd7, 878df15e58, c84af6286d, d2cbb8fd34, 8436a49c5d, 106e36d6bf,
0ec4986163, fb7f6adc98, 231f29b065, 3f3645a453, 42e2713cd8, 3d24dce5c0, 0cefb45d8a, ddbb824006,
69dac4ec31, b7e03ab465, 76aec3aabb, 19fb7e277b, 51261f1bc1, 949810709d, c012570e89, 44a3b15a7d,
e5d07f0ce6, df9478a920
## `.gitea/workflows/go-generate-check.yaml` (new file, +33)

```yaml
name: go-generate-check
on:
  push:
    branches: ['main']
    paths:
      - '**.go'
      - '**.mod'
      - '**.sum'
  pull_request:
    paths:
      - '**.go'
      - '**.mod'
      - '**.sum'
  workflow_dispatch:
jobs:
  go-generate-check:
    name: go-generate-check
    container:
      image: registry.opensuse.org/devel/factory/git-workflow/containers/opensuse/bci/golang-extended:latest
    steps:
      - run: git clone --no-checkout --depth 1 ${{ gitea.server_url }}/${{ gitea.repository }} .
      - run: git fetch origin ${{ gitea.ref }}
      - run: git checkout FETCH_HEAD
      - run: go generate -C common
      - run: go generate -C workflow-pr
      - run: git add -N .; git diff
      - run: |
          status=$(git status --short)
          if [[ -n "$status" ]]; then
            echo -e "$status"
            echo "Please commit the differences from running: go generate"
            false
          fi
```
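The final workflow step fails the run whenever `go generate` left the tree dirty. A minimal local sketch of the same gate, run against a throwaway repository (the temp dir and `out.txt` file are illustrative, not part of the repo):

```shell
set -e
tmp=$(mktemp -d)
cd "$tmp"
git init -q .
echo generated > out.txt   # stand-in for a file `go generate` would rewrite
status=$(git status --short)
if [[ -n "$status" ]]; then
  echo "$status"
  echo "Please commit the differences from running: go generate"
  exit_code=1
else
  exit_code=0
fi
echo "exit_code=$exit_code"
```

On a clean tree `git status --short` prints nothing and the gate passes; here the untracked file makes it fail, mirroring the workflow's `false` step.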
## `.gitea/workflows/go-generate-push.yaml` (new file, +24)

```yaml
name: go-generate-push
on:
  workflow_dispatch:
jobs:
  go-generate-push:
    name: go-generate-push
    container:
      image: registry.opensuse.org/devel/factory/git-workflow/containers/opensuse/bci/golang-extended:latest
    steps:
      - run: git clone --no-checkout --depth 1 ${{ gitea.server_url }}/${{ gitea.repository }} .
      - run: git fetch origin ${{ gitea.ref }}
      - run: git checkout FETCH_HEAD
      - run: go generate -C common
      - run: go generate -C workflow-pr
      - run: |
          host=${{ gitea.server_url }}
          host=${host#https://}
          echo $host
          git remote set-url origin "https://x-access-token:${{ secrets.GITEA_TOKEN }}@$host/${{ gitea.repository }}"
          git config user.name "Gitea Actions"
          git config user.email "gitea_noreply@opensuse.org"
      - run: 'git status --short; git status --porcelain=2|grep --quiet -v . || ( git add .;git commit -m "CI run result of: go generate"; git push origin HEAD:${{ gitea.ref }} )'
      - run: git log -p FETCH_HEAD...HEAD
      - run: git log --numstat FETCH_HEAD...HEAD
```
## `.gitea/workflows/go-vendor-check.yaml` (new file, +33)

```yaml
name: go-vendor-check
on:
  push:
    branches: ['main']
    paths:
      - '**.mod'
      - '**.sum'
  pull_request:
    paths:
      - '**.mod'
      - '**.sum'
  workflow_dispatch:
jobs:
  go-generate-check:
    name: go-vendor-check
    container:
      image: registry.opensuse.org/devel/factory/git-workflow/containers/opensuse/bci/golang-extended:latest
    steps:
      - run: git clone --no-checkout --depth 1 ${{ gitea.server_url }}/${{ gitea.repository }} .
      - run: git fetch origin ${{ gitea.ref }}
      - run: git checkout FETCH_HEAD
      - run: go mod download
      - run: go mod vendor
      - run: go mod verify
      - run: git add -N .; git diff
      - run: go mod tidy -diff || true
      - run: |
          status=$(git status --short)
          if [[ -n "$status" ]]; then
            echo -e "$status"
            echo "Please commit the differences from running: go generate"
            false
          fi
```
## `.gitea/workflows/go-vendor-push.yaml` (new file, +26)

```yaml
name: go-generate-push
on:
  workflow_dispatch:
jobs:
  go-generate-push:
    name: go-generate-push
    container:
      image: registry.opensuse.org/devel/factory/git-workflow/containers/opensuse/bci/golang-extended:latest
    steps:
      - run: git clone --no-checkout --depth 1 ${{ gitea.server_url }}/${{ gitea.repository }} .
      - run: git fetch origin ${{ gitea.ref }}
      - run: git checkout FETCH_HEAD
      - run: go mod download
      - run: go mod vendor
      - run: go mod verify
      - run: |
          host=${{ gitea.server_url }}
          host=${host#https://}
          echo $host
          git remote set-url origin "https://x-access-token:${{ secrets.GITEA_TOKEN }}@$host/${{ gitea.repository }}"
          git config user.name "Gitea Actions"
          git config user.email "gitea_noreply@opensuse.org"
      - run: 'git status --short; git status --porcelain=2|grep --quiet -v . || ( git add .;git commit -m "CI run result of: go mod vendor"; git push origin HEAD:${{ gitea.ref }} )'
      - run: go mod tidy -diff || true
      - run: git log -p FETCH_HEAD...HEAD
      - run: git log --numstat FETCH_HEAD...HEAD
```
## `.gitignore` (vendored, 10 changed lines)

```diff
@@ -1,5 +1,7 @@
-mock
-node_modules
-*.obscpio
-autogits-tmp.tar.zst
 *.osc
+*.conf
+/integration/gitea-data
+/integration/gitea-logs
+/integration/rabbitmq-data
+/integration/workflow-pr-repos
+__pycache__/
```
## `Makefile` (new file, +4)

```make
MODULES := devel-importer utils/hujson utils/maintainer-update gitea-events-rabbitmq-publisher gitea_status_proxy group-review obs-forward-bot obs-staging-bot obs-status-service workflow-direct workflow-pr

build:
	for m in $(MODULES); do go build -C $$m -buildmode=pie || exit 1 ; done
```
## `README.md` (19 changed lines)

```diff
@@ -5,11 +5,15 @@ The bots that drive Git Workflow for package management
 
 * devel-importer -- helper to import an OBS devel project into a Gitea organization
 * gitea-events-rabbitmq-publisher -- takes all events from a Gitea organization (webhook) and publishes it on a RabbitMQ instance
+* gitea-status-proxy -- allows bots without code owner permission to set Gitea's commit status
+* group-review -- group review proxy
+* hujson -- translates JWCC (json with commas and comments) to Standard JSON
+* obs-forward-bot -- forwards PR as OBS sr (TODO)
 * obs-staging-bot -- build bot for a PR
 * obs-status-service -- report build status of an OBS project as an SVG
 * workflow-pr -- keeps PR to _ObsPrj consistent with a PR to a package update
 * workflow-direct -- update _ObsPrj based on direct pushes and repo creations/removals from organization
-* staging-utils -- review tooling for PR
+* staging-utils -- review tooling for PR (TODO)
   - list PR
   - merge PR
   - split PR
@@ -19,7 +23,18 @@ The bots that drive Git Workflow for package management
 Bugs
 ----
 
-Report bugs to issue tracker at https://src.opensuse.org/adamm/autogits
+Report bugs to issue tracker at https://src.opensuse.org/git-workflow/autogits
 
 
+Build Status
+------------
+
+Devel project build status (`main` branch):
+
+
+
+`staging` branch build status:
+
+
```
## `_service` (deleted, -15)

```xml
<services>
  <!-- workaround, go_modules needs a tar and obs_scm doesn't take file://. -->
  <service name="roast" mode="manual">
    <param name="target">.</param>
    <param name="reproducible">true</param>
    <param name="outfile">autogits-tmp.tar.zst</param>
    <param name="exclude">autogits-tmp.tar.zst</param>
  </service>
  <service name="go_modules" mode="manual">
    <param name="basename">./</param>
    <param name="compression">zst</param>
    <param name="vendorname">vendor</param>
  </service>
</services>
```
## `autogits.spec` (263 changed lines)

```diff
@@ -17,15 +17,14 @@
 
 
 Name: autogits
-Version: 0
+Version: 1
 Release: 0
 Summary: GitWorkflow utilities
 License: GPL-2.0-or-later
 URL: https://src.opensuse.org/adamm/autogits
-Source1: vendor.tar.zst
-BuildRequires: golang-packaging
+BuildRequires: git
 BuildRequires: systemd-rpm-macros
-BuildRequires: zstd
+BuildRequires: go
 %{?systemd_ordering}
 
 %description
@@ -33,160 +32,288 @@ Git Workflow tooling and utilities enabling automated handing of OBS projects
 as git repositories
 
 
-%package -n gitea-events-rabbitmq-publisher
+%package devel-importer
+Summary: Imports devel projects from obs to git
+
+%description -n autogits-devel-importer
+Command-line tool to import devel projects from obs to git
+
+
+%package doc
+Summary: Common documentation files
+BuildArch: noarch
+
+%description -n autogits-doc
+Common documentation files
+
+
+%package gitea-events-rabbitmq-publisher
 Summary: Publishes Gitea webhook data via RabbitMQ
 
-%description -n gitea-events-rabbitmq-publisher
+%description gitea-events-rabbitmq-publisher
 Listens on an HTTP socket and publishes Gitea events on a RabbitMQ instance
 with a topic
 <scope>.src.$organization.$webhook_type.[$webhook_action_type]
 
 
-%package -n doc
-Summary: Common documentation files
+%package gitea-status-proxy
+Summary: Proxy for setting commit status in Gitea
 
-%description -n doc
-Common documentation files
+%description gitea-status-proxy
+Setting commit status requires code write access token. This proxy
+is middleware that delegates status setting without access to other APIs
 
-%package -n devel-importer
-Summary: Imports devel projects from obs to git
-
-%description -n devel-importer
-Command-line tool to import devel projects from obs to git
-
-
-%package -n group-review
+%package group-review
 Summary: Reviews of groups defined in ProjectGit
 
-%description -n group-review
+%description group-review
 Is used to handle reviews associated with groups defined in the
 ProjectGit.
 
 
-%package -n obs-staging-bot
+%package obs-forward-bot
+Summary: obs-forward-bot
+
+%description obs-forward-bot
+
+
+%package obs-staging-bot
 Summary: Build a PR against a ProjectGit, if review is requested
 
-%description -n obs-staging-bot
+%description obs-staging-bot
 Build a PR against a ProjectGit, if review is requested.
 
 
-%package -n obs-status-service
+%package obs-status-service
 Summary: Reports build status of OBS service as an easily to produce SVG
 
-%description -n obs-status-service
+%description obs-status-service
 Reports build status of OBS service as an easily to produce SVG
 
 
-%package -n workflow-direct
-Summary: Keep ProjectGit in sync for a devel project
+%package utils
+Summary: HuJSON to JSON parser
+Provides: hujson
+Provides: /usr/bin/hujson
 
-%description -n workflow-direct
+%description utils
+HuJSON to JSON parser, using stdin -> stdout pipe
+
+
+%package workflow-direct
+Summary: Keep ProjectGit in sync for a devel project
+Requires: openssh-clients
+Requires: git-core
+
+%description workflow-direct
 Keep ProjectGit in sync with packages in the organization of a devel project
 
 
-%package -n workflow-pr
+%package workflow-pr
 Summary: Keeps ProjectGit PR in-sync with a PackageGit PR
+Requires: openssh-clients
+Requires: git-core
 
-%description -n workflow-pr
+%description workflow-pr
 Keeps ProjectGit PR in-sync with a PackageGit PR
 
 
 
 %prep
 cp -r /home/abuild/rpmbuild/SOURCES/* ./
-tar x --zstd -f %{SOURCE1}
 
 %build
 go build \
-    -C gitea-events-rabbitmq-publisher \
-    -mod=vendor \
+    -C devel-importer \
     -buildmode=pie
 go build \
-    -C devel-importer \
-    -mod=vendor \
+    -C utils/hujson \
+    -buildmode=pie
+go build \
+    -C utils/maintainer-update \
+    -buildmode=pie
+go build \
+    -C gitea-events-rabbitmq-publisher \
+    -buildmode=pie
+go build \
+    -C gitea_status_proxy \
     -buildmode=pie
 go build \
     -C group-review \
-    -mod=vendor \
+    -buildmode=pie
+go build \
+    -C obs-forward-bot \
     -buildmode=pie
 go build \
     -C obs-staging-bot \
-    -mod=vendor \
     -buildmode=pie
 go build \
     -C obs-status-service \
-    -mod=vendor \
     -buildmode=pie
-#go build \
-#    -C workflow-direct \
-#    -mod=vendor \
-#    -buildmode=pie
-#go build \
-#    -C workflow-pr \
-#    -mod=vendor \
-#    -buildmode=pie
+go build \
+    -C workflow-direct \
+    -buildmode=pie
+go build \
+    -C workflow-pr \
+    -buildmode=pie
+
+%check
+go test -C common -v
+go test -C group-review -v
+go test -C obs-staging-bot -v
+go test -C obs-status-service -v
+go test -C workflow-direct -v
+go test -C utils/maintainer-update
+# TODO build fails
+#go test -C workflow-pr -v
 
 %install
+install -D -m0755 devel-importer/devel-importer %{buildroot}%{_bindir}/devel-importer
 install -D -m0755 gitea-events-rabbitmq-publisher/gitea-events-rabbitmq-publisher %{buildroot}%{_bindir}/gitea-events-rabbitmq-publisher
 install -D -m0644 systemd/gitea-events-rabbitmq-publisher.service %{buildroot}%{_unitdir}/gitea-events-rabbitmq-publisher.service
-install -D -m0755 devel-importer/devel-importer %{buildroot}%{_bindir}/devel-importer
+install -D -m0755 gitea_status_proxy/gitea_status_proxy %{buildroot}%{_bindir}/gitea_status_proxy
 install -D -m0755 group-review/group-review %{buildroot}%{_bindir}/group-review
+install -D -m0644 systemd/group-review@.service %{buildroot}%{_unitdir}/group-review@.service
+install -D -m0755 obs-forward-bot/obs-forward-bot %{buildroot}%{_bindir}/obs-forward-bot
 install -D -m0755 obs-staging-bot/obs-staging-bot %{buildroot}%{_bindir}/obs-staging-bot
+install -D -m0644 systemd/obs-staging-bot.service %{buildroot}%{_unitdir}/obs-staging-bot.service
 install -D -m0755 obs-status-service/obs-status-service %{buildroot}%{_bindir}/obs-status-service
-#install -D -m0755 workflow-direct/workflow-direct %{buildroot}%{_bindir}/workflow-direct
-#install -D -m0755 workflow-pr/workflow-pr %{buildroot}%{_bindir}/workflow-pr
+install -D -m0644 systemd/obs-status-service.service %{buildroot}%{_unitdir}/obs-status-service.service
+install -D -m0755 workflow-direct/workflow-direct %{buildroot}%{_bindir}/workflow-direct
+install -D -m0644 systemd/workflow-direct@.service %{buildroot}%{_unitdir}/workflow-direct@.service
+install -D -m0755 workflow-pr/workflow-pr %{buildroot}%{_bindir}/workflow-pr
+install -D -m0644 systemd/workflow-pr@.service %{buildroot}%{_unitdir}/workflow-pr@.service
+install -D -m0755 utils/hujson/hujson %{buildroot}%{_bindir}/hujson
+install -D -m0755 utils/maintainer-update/maintainer-update %{buildroot}%{_bindir}/maintainer-update
 
-%pre -n gitea-events-rabbitmq-publisher
+%pre gitea-events-rabbitmq-publisher
 %service_add_pre gitea-events-rabbitmq-publisher.service
 
-%post -n gitea-events-rabbitmq-publisher
+%post gitea-events-rabbitmq-publisher
 %service_add_post gitea-events-rabbitmq-publisher.service
 
-%preun -n gitea-events-rabbitmq-publisher
+%preun gitea-events-rabbitmq-publisher
 %service_del_preun gitea-events-rabbitmq-publisher.service
 
-%postun -n gitea-events-rabbitmq-publisher
+%postun gitea-events-rabbitmq-publisher
 %service_del_postun gitea-events-rabbitmq-publisher.service
 
-%files -n gitea-events-rabbitmq-publisher
+%pre group-review
+%service_add_pre group-review@.service
+
+%post group-review
+%service_add_post group-review@.service
+
+%preun group-review
+%service_del_preun group-review@.service
+
+%postun group-review
+%service_del_postun group-review@.service
+
+%pre obs-staging-bot
+%service_add_pre obs-staging-bot.service
+
+%post obs-staging-bot
+%service_add_post obs-staging-bot.service
+
+%preun obs-staging-bot
+%service_del_preun obs-staging-bot.service
+
+%postun obs-staging-bot
+%service_del_postun obs-staging-bot.service
+
+%pre obs-status-service
+%service_add_pre obs-status-service.service
+
+%post obs-status-service
+%service_add_post obs-status-service.service
+
+%preun obs-status-service
+%service_del_preun obs-status-service.service
+
+%postun obs-status-service
+%service_del_postun obs-status-service.service
+
+%pre workflow-direct
+%service_add_pre workflow-direct.service
+
+%post workflow-direct
+%service_add_post workflow-direct.service
+
+%preun workflow-direct
+%service_del_preun workflow-direct.service
+
+%postun workflow-direct
+%service_del_postun workflow-direct.service
+
+%pre workflow-pr
+%service_add_pre workflow-pr.service
+
+%post workflow-pr
+%service_add_post workflow-pr.service
+
+%preun workflow-pr
+%service_del_preun workflow-pr.service
+
+%postun workflow-pr
+%service_del_postun workflow-pr.service
+
+%files devel-importer
+%license COPYING
+%doc devel-importer/README.md
+%{_bindir}/devel-importer
+
+%files doc
+%license COPYING
+%doc doc/README.md
+%doc doc/workflows.md
+
+%files gitea-events-rabbitmq-publisher
 %license COPYING
 %doc gitea-events-rabbitmq-publisher/README.md
 %{_bindir}/gitea-events-rabbitmq-publisher
 %{_unitdir}/gitea-events-rabbitmq-publisher.service
 
-%files -n doc
+%files gitea-status-proxy
 %license COPYING
-%doc doc/README.md
-%doc doc/workflows.md
+%{_bindir}/gitea_status_proxy
 
-%files -n devel-importer
-%license COPYING
-%doc devel-importer/README.md
-%{_bindir}/devel-importer
-
-%files -n group-review
+%files group-review
 %license COPYING
 %doc group-review/README.md
 %{_bindir}/group-review
+%{_unitdir}/group-review@.service
 
-%files -n obs-staging-bot
+%files obs-forward-bot
+%license COPYING
+%{_bindir}/obs-forward-bot
+
+%files obs-staging-bot
 %license COPYING
 %doc obs-staging-bot/README.md
 %{_bindir}/obs-staging-bot
+%{_unitdir}/obs-staging-bot.service
 
-%files -n obs-status-service
+%files obs-status-service
 %license COPYING
 %doc obs-status-service/README.md
 %{_bindir}/obs-status-service
+%{_unitdir}/obs-status-service.service
 
-%files -n workflow-direct
+%files utils
+%license COPYING
+%{_bindir}/hujson
+%{_bindir}/maintainer-update
+
+%files workflow-direct
 %license COPYING
 %doc workflow-direct/README.md
-#%{_bindir}/workflow-direct
+%{_bindir}/workflow-direct
+%{_unitdir}/workflow-direct@.service
 
-%files -n workflow-pr
+%files workflow-pr
 %license COPYING
 %doc workflow-pr/README.md
-#%{_bindir}/workflow-pr
+%{_bindir}/workflow-pr
+%{_unitdir}/workflow-pr@.service
 
 
```
The PR-reference format changes from `org/repo#num` to `org/repo!num`, keeping `#` as a legacy fallback when parsing:

```diff
@@ -11,7 +11,7 @@ import (
 	"strings"
 )
 
-const PrPattern = "PR: %s/%s#%d"
+const PrPattern = "PR: %s/%s!%d"
 
 type BasicPR struct {
 	Org, Repo string
@@ -36,10 +36,14 @@ func parsePrLine(line string) (BasicPR, error) {
 		return ret, errors.New("missing / separator")
 	}
 
-	repo := strings.SplitN(org[1], "#", 2)
+	repo := strings.SplitN(org[1], "!", 2)
 	ret.Repo = repo[0]
 	if len(repo) != 2 {
-		return ret, errors.New("Missing # separator")
+		repo = strings.SplitN(org[1], "#", 2)
+		ret.Repo = repo[0]
+	}
+	if len(repo) != 2 {
+		return ret, errors.New("Missing ! or # separator")
 	}
 
 	// Gitea requires that each org and repo be [A-Za-z0-9_-]+
```
The tests are updated so scanning accepts both separators while new descriptions are written with `!`:

```diff
@@ -34,7 +34,7 @@ func TestAssociatedPRScanner(t *testing.T) {
 		},
 		{
 			"Multiple PRs",
-			"Some header of the issue\n\nFollowed by some description\nPR: test/foo#4\n\nPR: test/goo#5\n",
+			"Some header of the issue\n\nFollowed by some description\nPR: test/foo#4\n\nPR: test/goo!5\n",
 			[]common.BasicPR{
 				{Org: "test", Repo: "foo", Num: 4},
 				{Org: "test", Repo: "goo", Num: 5},
@@ -107,7 +107,7 @@ func TestAppendingPRsToDescription(t *testing.T) {
 			[]common.BasicPR{
 				{Org: "a", Repo: "b", Num: 100},
 			},
-			"something\n\nPR: a/b#100",
+			"something\n\nPR: a/b!100",
 		},
 		{
 			"Append multiple PR to end of description",
@@ -119,7 +119,7 @@ func TestAppendingPRsToDescription(t *testing.T) {
 				{Org: "b", Repo: "b", Num: 100},
 				{Org: "c", Repo: "b", Num: 100},
 			},
-			"something\n\nPR: a1/b#100\nPR: a1/c#100\nPR: a1/c#101\nPR: b/b#100\nPR: c/b#100",
+			"something\n\nPR: a1/b!100\nPR: a1/c!100\nPR: a1/c!101\nPR: b/b!100\nPR: c/b!100",
 		},
 		{
 			"Append multiple sorted PR to end of description and remove dups",
@@ -133,7 +133,7 @@ func TestAppendingPRsToDescription(t *testing.T) {
 				{Org: "a1", Repo: "c", Num: 101},
 				{Org: "a1", Repo: "b", Num: 100},
 			},
-			"something\n\nPR: a1/b#100\nPR: a1/c#100\nPR: a1/c#101\nPR: b/b#100\nPR: c/b#100",
+			"something\n\nPR: a1/b!100\nPR: a1/c!100\nPR: a1/c!101\nPR: b/b!100\nPR: c/b!100",
 		},
 	}
 
```
|
common/config.go (108 changes)
@@ -25,6 +25,7 @@ import (
 	"io"
 	"log"
 	"os"
+	"slices"
 	"strings"
 
 	"github.com/tailscale/hujson"
@@ -35,6 +36,13 @@ import (
 const (
 	ProjectConfigFile = "workflow.config"
 	StagingConfigFile = "staging.config"
+
+	Permission_ForceMerge = "force-merge"
+	Permission_Group      = "release-engineering"
+
+	MergeModeFF      = "ff-only"
+	MergeModeReplace = "replace"
+	MergeModeDevel   = "devel"
 )
 
 type ConfigFile struct {
@@ -43,25 +51,54 @@ type ConfigFile struct {
 
 type ReviewGroup struct {
 	Name      string
+	Silent    bool // will not request reviews from group members
 	Reviewers []string
 }
 
 type QAConfig struct {
 	Name   string
 	Origin string
+	Label  string // requires this gitea lable to be set or skipped
+	BuildDisableRepos []string // which repos to build disable in the new project
+}
+
+type Permissions struct {
+	Permission string
+	Members    []string
+}
+
+const (
+	Label_StagingAuto   = "staging/Auto"
+	Label_ReviewPending = "review/Pending"
+	Label_ReviewDone    = "review/Done"
+)
+
+func LabelKey(tag_value string) string {
+	// capitalize first letter and remove /
+	if len(tag_value) == 0 {
+		return ""
+	}
+	return strings.ToUpper(tag_value[0:1]) + strings.ReplaceAll(tag_value[1:], "/", "")
 }
 
 type AutogitConfig struct {
 	Workflows      []string // [pr, direct, test]
 	Organization   string
 	GitProjectName string // Organization/GitProjectName.git is PrjGit
 	Branch         string // branch name of PkgGit that aligns with PrjGit submodules
 	Reviewers      []string // only used by `pr` workflow
-	ReviewGroups   []ReviewGroup
+	Permissions    []*Permissions // only used by `pr` workflow
+	ReviewGroups   []*ReviewGroup
 	Committers     []string // group in addition to Reviewers and Maintainers that can order the bot around, mostly as helper for factory-maintainers
+	Subdirs        []string // list of directories to sort submodules into. Needed b/c _manifest cannot list non-existent directories
+
+	Labels    map[string]string // list of tags, if not default, to apply
+	MergeMode string            // project merge mode
+
+	NoProjectGitPR     bool // do not automatically create project git PRs, just assign reviewers and assume somethign else creates the ProjectGit PR
 	ManualMergeOnly    bool // only merge with "Merge OK" comment by Project Maintainers and/or Package Maintainers and/or reviewers
 	ManualMergeProject bool // require merge of ProjectGit PRs with "Merge OK" by ProjectMaintainers and/or reviewers
+	ReviewRequired     bool // always require a maintainer review, even if maintainer submits it. Only ignored if no other package or project reviewers
 }
 
 type AutogitConfigs []*AutogitConfig
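The `LabelKey` helper added above normalizes a label such as `review/Pending` into a map key by capitalizing the first letter and dropping every `/`. A self-contained sketch of the same transformation (reimplemented here for illustration, outside the `common` package):

```go
package main

import (
	"fmt"
	"strings"
)

// LabelKey mirrors the helper added in the hunk above: capitalize the
// first letter and remove all "/" characters, so "review/Pending"
// becomes "ReviewPending" and "foo/bar" becomes "Foobar".
func LabelKey(tagValue string) string {
	if len(tagValue) == 0 {
		return ""
	}
	return strings.ToUpper(tagValue[0:1]) + strings.ReplaceAll(tagValue[1:], "/", "")
}

func main() {
	fmt.Println(LabelKey("foo/bar"))     // Foobar
	fmt.Println(LabelKey("review/Done")) // ReviewDone
}
```

These keys are what the new `Labels map[string]string` field in `AutogitConfig` is indexed by, which is why a config can override `PathString` to remap the `path/String` label.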
@@ -151,6 +188,17 @@ func ReadWorkflowConfig(gitea GiteaFileContentAndRepoFetcher, git_project string
 		}
 	}
 	config.GitProjectName = config.GitProjectName + "#" + branch
 
+	// verify merge modes
+	switch config.MergeMode {
+	case MergeModeFF, MergeModeDevel, MergeModeReplace:
+		break // good results
+	case "":
+		config.MergeMode = MergeModeFF
+	default:
+		return nil, fmt.Errorf("Unsupported merge mode in %s: %s", git_project, config.MergeMode)
+	}
+
 	return config, nil
 }
 
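The validation added to `ReadWorkflowConfig` above normalizes and checks the configured merge mode: empty defaults to `ff-only`, the three known modes pass through, and anything else is rejected. A standalone sketch of that switch (hypothetical `validateMergeMode` helper extracted for illustration):

```go
package main

import (
	"fmt"
)

// Merge-mode constants as introduced in the const hunk above.
const (
	MergeModeFF      = "ff-only"
	MergeModeReplace = "replace"
	MergeModeDevel   = "devel"
)

// validateMergeMode mirrors the switch added to ReadWorkflowConfig:
// empty input defaults to "ff-only", known modes pass, others error.
func validateMergeMode(mode string) (string, error) {
	switch mode {
	case MergeModeFF, MergeModeDevel, MergeModeReplace:
		return mode, nil
	case "":
		return MergeModeFF, nil
	default:
		return "", fmt.Errorf("unsupported merge mode: %s", mode)
	}
}

func main() {
	m, _ := validateMergeMode("")
	fmt.Println(m) // ff-only
	_, err := validateMergeMode("invalid")
	fmt.Println(err != nil) // true
}
```

This matches the behavior exercised by the `TestConfigMergeModeParser` cases added further down: `{}` yields `ff-only`, while `{"MergeMode": "invalid"}` is an error.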
@@ -175,6 +223,8 @@ func (configs AutogitConfigs) GetPrjGitConfig(org, repo, branch string) *Autogit
 		if c.GitProjectName == prjgit {
 			return c
 		}
+	}
+	for _, c := range configs {
 		if c.Organization == org && c.Branch == branch {
 			return c
 		}
@@ -183,6 +233,27 @@ func (configs AutogitConfigs) GetPrjGitConfig(org, repo, branch string) *Autogit
 	return nil
 }
 
+func (config *AutogitConfig) HasPermission(user, permission string) bool {
+	if config == nil {
+		return false
+	}
+
+	for _, p := range config.Permissions {
+		if p.Permission == permission {
+			if slices.Contains(p.Members, user) {
+				return true
+			}
+
+			for _, m := range p.Members {
+				if members, err := config.GetReviewGroupMembers(m); err == nil && slices.Contains(members, user) {
+					return true
+				}
+			}
+		}
+	}
+	return false
+}
+
 func (config *AutogitConfig) GetReviewGroupMembers(reviewer string) ([]string, error) {
 	for _, g := range config.ReviewGroups {
 		if g.Name == reviewer {
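The new `HasPermission` above resolves a user either directly from a permission's member list or indirectly through a review group named in that list. A self-contained sketch of the lookup (hypothetical free function `hasPermission` with local copies of the types, standing in for the methods on `AutogitConfig`):

```go
package main

import (
	"fmt"
	"slices"
)

// Local stand-ins for the config types added in the hunks above.
type Permissions struct {
	Permission string
	Members    []string
}

type ReviewGroup struct {
	Name      string
	Reviewers []string
}

// hasPermission mirrors AutogitConfig.HasPermission: a user matches
// either directly as a member, or via a review group listed as a member.
func hasPermission(perms []*Permissions, groups []*ReviewGroup, user, permission string) bool {
	for _, p := range perms {
		if p.Permission != permission {
			continue
		}
		if slices.Contains(p.Members, user) {
			return true
		}
		// expand group names in Members into their reviewers
		for _, m := range p.Members {
			for _, g := range groups {
				if g.Name == m && slices.Contains(g.Reviewers, user) {
					return true
				}
			}
		}
	}
	return false
}

func main() {
	perms := []*Permissions{{Permission: "force-merge", Members: []string{"group"}}}
	groups := []*ReviewGroup{{Name: "group", Reviewers: []string{"user"}}}
	fmt.Println(hasPermission(perms, groups, "user", "force-merge"))   // true
	fmt.Println(hasPermission(perms, groups, "nobody", "force-merge")) // false
}
```

This is the same shape exercised by the `TestConfigPermissions` cases added below: "user" granted directly, and "user" granted via membership in "group".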
@@ -193,10 +264,19 @@ func (config *AutogitConfig) GetReviewGroupMembers(reviewer string) ([]string, e
 	return nil, errors.New("User " + reviewer + " not found as group reviewer for " + config.GitProjectName)
 }
 
+func (config *AutogitConfig) GetReviewGroup(reviewer string) (*ReviewGroup, error) {
+	for _, g := range config.ReviewGroups {
+		if g.Name == reviewer {
+			return g, nil
+		}
+	}
+	return nil, errors.New("User " + reviewer + " not found as group reviewer for " + config.GitProjectName)
+}
+
 func (config *AutogitConfig) GetPrjGit() (string, string, string) {
 	org := config.Organization
 	repo := DefaultGitPrj
-	branch := "master"
+	branch := ""
 
 	a := strings.Split(config.GitProjectName, "/")
 	if len(a[0]) > 0 {
@@ -220,6 +300,9 @@ func (config *AutogitConfig) GetPrjGit() (string, string, string) {
 		}
 	}
 
+	if len(branch) == 0 {
+		panic("branch for project is undefined. Should not happend." + org + "/" + repo)
+	}
 	return org, repo, branch
 }
 
@@ -227,6 +310,14 @@ func (config *AutogitConfig) GetRemoteBranch() string {
 	return "origin_" + config.Branch
 }
 
+func (config *AutogitConfig) Label(label string) string {
+	if t, found := config.Labels[LabelKey(label)]; found {
+		return t
+	}
+
+	return label
+}
+
 type StagingConfig struct {
 	ObsProject string
 	RebuildAll bool
@@ -239,6 +330,9 @@ type StagingConfig struct {
 
 func ParseStagingConfig(data []byte) (*StagingConfig, error) {
 	var staging StagingConfig
+	if len(data) == 0 {
+		return nil, errors.New("non-existent config file.")
+	}
 	data, err := hujson.Standardize(data)
 	if err != nil {
 		return nil, err
@@ -10,6 +10,67 @@ import (
 	mock_common "src.opensuse.org/autogits/common/mock"
 )
 
+func TestLabelKey(t *testing.T) {
+	tests := map[string]string{
+		"":        "",
+		"foo":     "Foo",
+		"foo/bar": "Foobar",
+		"foo/Bar": "FooBar",
+	}
+
+	for k, v := range tests {
+		if c := common.LabelKey(k); c != v {
+			t.Error("expected", v, "got", c, "input", k)
+		}
+	}
+}
+
+func TestConfigLabelParser(t *testing.T) {
+	tests := []struct {
+		name        string
+		json        string
+		label_value string
+	}{
+		{
+			name:        "empty",
+			json:        "{}",
+			label_value: "path/String",
+		},
+		{
+			name:        "defined",
+			json:        `{"Labels": {"foo": "bar", "PathString": "moo/Label"}}`,
+			label_value: "moo/Label",
+		},
+		{
+			name:        "undefined",
+			json:        `{"Labels": {"foo": "bar", "NotPathString": "moo/Label"}}`,
+			label_value: "path/String",
+		},
+	}
+
+	for _, test := range tests {
+		t.Run(test.name, func(t *testing.T) {
+			repo := models.Repository{
+				DefaultBranch: "master",
+			}
+
+			ctl := gomock.NewController(t)
+			gitea := mock_common.NewMockGiteaFileContentAndRepoFetcher(ctl)
+			gitea.EXPECT().GetRepositoryFileContent("foo", "bar", "", "workflow.config").Return([]byte(test.json), "abc", nil)
+			gitea.EXPECT().GetRepository("foo", "bar").Return(&repo, nil)
+
+			config, err := common.ReadWorkflowConfig(gitea, "foo/bar")
+			if err != nil || config == nil {
+				t.Fatal(err)
+			}
+
+			if l := config.Label("path/String"); l != test.label_value {
+				t.Error("Expecting", test.label_value, "got", l)
+			}
+		})
+	}
+}
+
 func TestProjectConfigMatcher(t *testing.T) {
 	configs := common.AutogitConfigs{
 		{
@@ -21,6 +82,15 @@ func TestProjectConfigMatcher(t *testing.T) {
 			Branch:         "main",
 			GitProjectName: "test/prjgit#main",
 		},
+		{
+			Organization:   "test",
+			Branch:         "main",
+			GitProjectName: "test/bar#never_match",
+		},
+		{
+			Organization:   "test",
+			GitProjectName: "test/bar#main",
+		},
 	}
 
 	tests := []struct {
@@ -50,6 +120,20 @@ func TestProjectConfigMatcher(t *testing.T) {
 			branch: "main",
 			config: 1,
 		},
+		{
+			name:   "prjgit only match",
+			org:    "test",
+			repo:   "bar",
+			branch: "main",
+			config: 3,
+		},
+		{
+			name:   "non-default branch match",
+			org:    "test",
+			repo:   "bar",
+			branch: "something_main",
+			config: -1,
+		},
 	}
 
 	for _, test := range tests {
@@ -105,10 +189,15 @@ func TestConfigWorkflowParser(t *testing.T) {
 			if config.ManualMergeOnly != false {
 				t.Fatal("This should be false")
 			}
+
+			if config.Label("foobar") != "foobar" {
+				t.Fatal("undefined label should return default value")
+			}
 		})
 	}
 }
 
+// FIXME: should test ReadWorkflowConfig as it will always set prjgit completely
 func TestProjectGitParser(t *testing.T) {
 	tests := []struct {
 		name string
@@ -119,20 +208,21 @@ func TestProjectGitParser(t *testing.T) {
 	}{
 		{
 			name:   "repo only",
-			prjgit: "repo.git",
+			prjgit: "repo.git#master",
 			org:    "org",
 			branch: "br",
 			res:    [3]string{"org", "repo.git", "master"},
 		},
 		{
 			name:   "default",
 			org:    "org",
-			res:    [3]string{"org", common.DefaultGitPrj, "master"},
+			prjgit: "org/_ObsPrj#master",
+			res:    [3]string{"org", common.DefaultGitPrj, "master"},
 		},
 		{
 			name:   "repo with branch",
 			org:    "org2",
-			prjgit: "repo.git#somebranch",
+			prjgit: "org2/repo.git#somebranch",
 			res:    [3]string{"org2", "repo.git", "somebranch"},
 		},
 		{
@@ -149,25 +239,25 @@ func TestProjectGitParser(t *testing.T) {
 		{
 			name:   "repo org and empty branch",
 			org:    "org3",
-			prjgit: "oorg/foo.bar#",
+			prjgit: "oorg/foo.bar#master",
 			res:    [3]string{"oorg", "foo.bar", "master"},
 		},
 		{
 			name:   "only branch defined",
 			org:    "org3",
-			prjgit: "#mybranch",
+			prjgit: "org3/_ObsPrj#mybranch",
 			res:    [3]string{"org3", "_ObsPrj", "mybranch"},
 		},
 		{
 			name:   "only org and branch defined",
 			org:    "org3",
-			prjgit: "org1/#mybranch",
+			prjgit: "org1/_ObsPrj#mybranch",
 			res:    [3]string{"org1", "_ObsPrj", "mybranch"},
 		},
 		{
 			name:   "empty org and repo",
 			org:    "org3",
-			prjgit: "/repo#",
+			prjgit: "org3/repo#master",
 			res:    [3]string{"org3", "repo", "master"},
 		},
 	}
@@ -188,3 +278,131 @@ func TestProjectGitParser(t *testing.T) {
 		})
 	}
 }
+
+func TestConfigPermissions(t *testing.T) {
+	tests := []struct {
+		name       string
+		permission string
+		user       string
+		config     *common.AutogitConfig
+		result     bool
+	}{
+		{
+			name:       "NoPermissions",
+			permission: common.Permission_ForceMerge,
+		},
+		{
+			name:       "NoPermissions",
+			permission: common.Permission_Group,
+		},
+		{
+			name:       "Regular permission ForcePush",
+			permission: common.Permission_ForceMerge,
+			result:     true,
+			user:       "user",
+			config: &common.AutogitConfig{
+				Permissions: []*common.Permissions{
+					&common.Permissions{
+						Permission: common.Permission_ForceMerge,
+						Members:    []string{"user"},
+					},
+				},
+			},
+		},
+		{
+			name:       "User is part of a group",
+			permission: common.Permission_ForceMerge,
+			result:     true,
+			user:       "user",
+			config: &common.AutogitConfig{
+				Permissions: []*common.Permissions{
+					&common.Permissions{
+						Permission: common.Permission_ForceMerge,
+						Members:    []string{"group"},
+					},
+				},
+				ReviewGroups: []*common.ReviewGroup{
+					&common.ReviewGroup{
+						Name:      "group",
+						Reviewers: []string{"some", "members", "including", "user"},
+					},
+				},
+			},
+		},
+	}
+
+	for _, test := range tests {
+		t.Run(test.name, func(t *testing.T) {
+			if r := test.config.HasPermission(test.user, test.permission); r != test.result {
+				t.Error("Expecting", test.result, "but got opposite")
+			}
+			if r := test.config.HasPermission(test.user+test.user, test.permission); r {
+				t.Error("Expecting false for fake user, but got opposite")
+			}
+		})
+	}
+}
+
+func TestConfigMergeModeParser(t *testing.T) {
+	tests := []struct {
+		name      string
+		json      string
+		mergeMode string
+		wantErr   bool
+	}{
+		{
+			name:      "empty",
+			json:      "{}",
+			mergeMode: common.MergeModeFF,
+		},
+		{
+			name:      "ff-only",
+			json:      `{"MergeMode": "ff-only"}`,
+			mergeMode: common.MergeModeFF,
+		},
+		{
+			name:      "replace",
+			json:      `{"MergeMode": "replace"}`,
+			mergeMode: common.MergeModeReplace,
+		},
+		{
+			name:      "devel",
+			json:      `{"MergeMode": "devel"}`,
+			mergeMode: common.MergeModeDevel,
+		},
+		{
+			name:    "unsupported",
+			json:    `{"MergeMode": "invalid"}`,
+			wantErr: true,
+		},
+	}
+
+	for _, test := range tests {
+		t.Run(test.name, func(t *testing.T) {
+			repo := models.Repository{
+				DefaultBranch: "master",
+			}
+
+			ctl := gomock.NewController(t)
+			gitea := mock_common.NewMockGiteaFileContentAndRepoFetcher(ctl)
+			gitea.EXPECT().GetRepositoryFileContent("foo", "bar", "", "workflow.config").Return([]byte(test.json), "abc", nil)
+			gitea.EXPECT().GetRepository("foo", "bar").Return(&repo, nil)
+
+			config, err := common.ReadWorkflowConfig(gitea, "foo/bar")
+			if test.wantErr {
+				if err == nil {
+					t.Fatal("Expected error, got nil")
+				}
+				return
+			}
+
+			if err != nil {
+				t.Fatal(err)
+			}
+
+			if config.MergeMode != test.mergeMode {
+				t.Errorf("Expected MergeMode %s, got %s", test.mergeMode, config.MergeMode)
+			}
+		})
+	}
+}
@@ -20,10 +20,13 @@ package common
 
 const (
 	GiteaTokenEnv = "GITEA_TOKEN"
+	GiteaHostEnv  = "GITEA_HOST"
 	ObsUserEnv       = "OBS_USER"
 	ObsPasswordEnv   = "OBS_PASSWORD"
 	ObsSshkeyEnv     = "OBS_SSHKEY"
 	ObsSshkeyFileEnv = "OBS_SSHKEYFILE"
+	ObsApiEnv        = "OBS_API"
+	ObsWebEnv        = "OBS_WEB"
 
 	DefaultGitPrj = "_ObsPrj"
 	PrjLinksFile  = "links.json"
|
|||||||
@@ -1731,3 +1731,246 @@ const requestedReviewJSON = `{
|
|||||||
"commit_id": "",
|
"commit_id": "",
|
||||||
"review": null
|
"review": null
|
||||||
}`
|
}`
|
||||||
|
|
||||||
|
const requestStatusJSON=`{
|
||||||
|
"commit": {
|
||||||
|
"id": "e637d86cbbdd438edbf60148e28f9d75a74d51b27b01f75610f247cd18394c8e",
|
||||||
|
"message": "Update nodejs-common.changes\n",
|
||||||
|
"url": "https://src.opensuse.org/autogits/nodejs-common/commit/e637d86cbbdd438edbf60148e28f9d75a74d51b27b01f75610f247cd18394c8e",
|
||||||
|
"author": {
|
||||||
|
"name": "Adam Majer",
|
||||||
|
"email": "adamm@noreply.src.opensuse.org",
|
||||||
|
"username": "adamm"
|
||||||
|
},
|
||||||
|
"committer": {
|
||||||
|
"name": "Adam Majer",
|
||||||
|
"email": "adamm@noreply.src.opensuse.org",
|
||||||
|
"username": "adamm"
|
||||||
|
},
|
||||||
|
"verification": null,
|
||||||
|
"timestamp": "2025-09-16T12:41:02+02:00",
|
||||||
|
"added": [],
|
||||||
|
"removed": [],
|
||||||
|
"modified": [
|
||||||
|
"nodejs-common.changes"
|
||||||
|
]
|
||||||
|
},
|
||||||
|
"context": "test",
|
||||||
|
"created_at": "2025-09-16T10:50:32Z",
|
||||||
|
"description": "",
|
||||||
|
"id": 21663,
|
||||||
|
"repository": {
|
||||||
|
"id": 90520,
|
||||||
|
"owner": {
|
||||||
|
"id": 983,
|
||||||
|
"login": "autogits",
|
||||||
|
"login_name": "",
|
||||||
|
"source_id": 0,
|
||||||
|
"full_name": "",
|
||||||
|
"email": "",
|
||||||
|
"avatar_url": "https://src.opensuse.org/avatars/80a61ef3a14c3c22f0b8b1885d1a75d4",
|
||||||
|
"html_url": "https://src.opensuse.org/autogits",
|
||||||
|
"language": "",
|
||||||
|
"is_admin": false,
|
||||||
|
"last_login": "0001-01-01T00:00:00Z",
|
||||||
|
"created": "2024-06-20T09:46:37+02:00",
|
||||||
|
"restricted": false,
|
||||||
|
"active": false,
|
||||||
|
"prohibit_login": false,
|
||||||
|
"location": "",
|
||||||
|
"website": "",
|
||||||
|
"description": "",
|
||||||
|
"visibility": "public",
|
||||||
|
"followers_count": 0,
|
||||||
|
"following_count": 0,
|
||||||
|
"starred_repos_count": 0,
|
||||||
|
"username": "autogits"
|
||||||
|
},
|
||||||
|
"name": "nodejs-common",
|
||||||
|
"full_name": "autogits/nodejs-common",
|
||||||
|
"description": "",
|
||||||
|
"empty": false,
|
||||||
|
"private": false,
|
||||||
|
"fork": true,
|
||||||
|
"template": false,
|
||||||
|
"parent": {
|
||||||
|
"id": 62649,
|
||||||
|
"owner": {
|
||||||
|
"id": 64,
|
||||||
|
"login": "pool",
|
||||||
|
"login_name": "",
|
||||||
|
"source_id": 0,
|
||||||
|
"full_name": "",
|
||||||
|
"email": "",
|
||||||
|
"avatar_url": "https://src.opensuse.org/avatars/b10a8c0bede9eb4ea771b04db3149f28",
|
||||||
|
"html_url": "https://src.opensuse.org/pool",
|
||||||
|
"language": "",
|
||||||
|
"is_admin": false,
|
||||||
|
"last_login": "0001-01-01T00:00:00Z",
|
||||||
|
"created": "2023-03-01T14:41:17+01:00",
|
||||||
|
"restricted": false,
|
||||||
|
"active": false,
|
||||||
|
"prohibit_login": false,
|
||||||
|
"location": "",
|
||||||
|
"website": "",
|
||||||
|
"description": "",
|
||||||
|
"visibility": "public",
|
||||||
|
"followers_count": 2,
|
||||||
|
"following_count": 0,
|
||||||
|
"starred_repos_count": 0,
|
||||||
|
"username": "pool"
|
||||||
|
},
|
||||||
|
"name": "nodejs-common",
|
||||||
|
"full_name": "pool/nodejs-common",
|
||||||
|
"description": "",
|
||||||
|
"empty": false,
|
||||||
|
"private": false,
|
||||||
|
"fork": false,
|
||||||
|
"template": false,
|
||||||
|
"mirror": false,
|
||||||
|
"size": 134,
|
||||||
|
"language": "",
|
||||||
|
"languages_url": "https://src.opensuse.org/api/v1/repos/pool/nodejs-common/languages",
|
||||||
|
"html_url": "https://src.opensuse.org/pool/nodejs-common",
|
||||||
|
"url": "https://src.opensuse.org/api/v1/repos/pool/nodejs-common",
|
||||||
|
"link": "",
|
||||||
|
"ssh_url": "gitea@src.opensuse.org:pool/nodejs-common.git",
|
||||||
|
"clone_url": "https://src.opensuse.org/pool/nodejs-common.git",
|
||||||
|
"original_url": "",
|
||||||
|
"website": "",
|
||||||
|
"stars_count": 0,
|
||||||
|
"forks_count": 3,
|
||||||
|
"watchers_count": 12,
|
||||||
|
"open_issues_count": 0,
|
||||||
|
"open_pr_counter": 0,
|
||||||
|
"release_counter": 0,
|
||||||
|
"default_branch": "factory",
|
||||||
|
"archived": false,
|
||||||
|
"created_at": "2024-06-17T17:08:45+02:00",
|
||||||
|
"updated_at": "2025-08-21T21:58:31+02:00",
|
||||||
|
"archived_at": "1970-01-01T01:00:00+01:00",
|
||||||
|
"permissions": {
|
||||||
|
"admin": true,
|
||||||
|
"push": true,
|
||||||
|
"pull": true
|
||||||
|
},
|
||||||
|
"has_issues": true,
|
||||||
|
"internal_tracker": {
|
||||||
|
"enable_time_tracker": false,
|
||||||
|
"allow_only_contributors_to_track_time": true,
|
||||||
|
"enable_issue_dependencies": true
|
||||||
|
},
|
||||||
|
"has_wiki": false,
|
||||||
|
"has_pull_requests": true,
|
||||||
|
"has_projects": false,
|
||||||
|
"projects_mode": "all",
|
||||||
|
"has_releases": false,
|
||||||
|
"has_packages": false,
|
||||||
|
"has_actions": false,
|
||||||
|
"ignore_whitespace_conflicts": false,
|
||||||
|
"allow_merge_commits": true,
|
||||||
|
"allow_rebase": true,
|
||||||
|
"allow_rebase_explicit": true,
|
||||||
|
"allow_squash_merge": true,
|
||||||
|
"allow_fast_forward_only_merge": true,
|
||||||
|
"allow_rebase_update": true,
|
||||||
|
"allow_manual_merge": true,
|
||||||
|
"autodetect_manual_merge": true,
|
||||||
|
"default_delete_branch_after_merge": false,
|
||||||
|
"default_merge_style": "merge",
|
||||||
|
"default_allow_maintainer_edit": false,
|
||||||
|
"avatar_url": "",
|
||||||
|
"internal": false,
|
||||||
|
"mirror_interval": "",
|
||||||
|
"object_format_name": "sha256",
|
||||||
|
"mirror_updated": "0001-01-01T00:00:00Z",
|
||||||
|
"topics": [],
|
||||||
|
"licenses": []
|
||||||
|
},
|
||||||
|
"mirror": false,
|
||||||
|
"size": 143,
|
||||||
|
"language": "",
|
||||||
|
"languages_url": "https://src.opensuse.org/api/v1/repos/autogits/nodejs-common/languages",
|
||||||
|
"html_url": "https://src.opensuse.org/autogits/nodejs-common",
|
||||||
|
"url": "https://src.opensuse.org/api/v1/repos/autogits/nodejs-common",
|
||||||
|
"link": "",
|
||||||
|
"ssh_url": "gitea@src.opensuse.org:autogits/nodejs-common.git",
|
||||||
|
"clone_url": "https://src.opensuse.org/autogits/nodejs-common.git",
|
||||||
|
"original_url": "",
|
||||||
|
"website": "",
|
||||||
|
"stars_count": 0,
|
||||||
|
"forks_count": 1,
|
||||||
|
"watchers_count": 4,
|
||||||
|
"open_issues_count": 0,
|
||||||
|
"open_pr_counter": 1,
|
||||||
|
"release_counter": 0,
|
||||||
|
"default_branch": "factory",
|
||||||
|
"archived": false,
|
||||||
|
"created_at": "2024-07-01T13:29:03+02:00",
|
||||||
|
"updated_at": "2025-09-16T12:41:03+02:00",
|
||||||
|
"archived_at": "1970-01-01T01:00:00+01:00",
|
||||||
|
"permissions": {
|
||||||
|
"admin": true,
|
||||||
|
"push": true,
|
||||||
|
"pull": true
|
||||||
|
},
|
||||||
|
"has_issues": false,
|
||||||
|
"has_wiki": false,
|
||||||
|
"has_pull_requests": true,
|
||||||
|
"has_projects": false,
|
||||||
|
"projects_mode": "all",
|
||||||
|
"has_releases": false,
|
||||||
|
"has_packages": false,
|
||||||
|
"has_actions": false,
|
||||||
|
"ignore_whitespace_conflicts": false,
|
||||||
|
"allow_merge_commits": true,
|
||||||
|
"allow_rebase": true,
|
||||||
|
"allow_rebase_explicit": true,
|
||||||
|
"allow_squash_merge": true,
|
||||||
|
"allow_fast_forward_only_merge": true,
|
||||||
|
"allow_rebase_update": true,
|
||||||
|
"allow_manual_merge": true,
|
||||||
|
"autodetect_manual_merge": true,
|
||||||
|
"default_delete_branch_after_merge": false,
|
||||||
|
"default_merge_style": "merge",
|
||||||
|
"default_allow_maintainer_edit": false,
|
||||||
|
"avatar_url": "",
|
||||||
|
"internal": false,
|
||||||
|
"mirror_interval": "",
|
||||||
|
"object_format_name": "sha256",
|
||||||
|
"mirror_updated": "0001-01-01T00:00:00Z",
|
||||||
|
"topics": [],
|
||||||
|
"licenses": [
|
||||||
|
"MIT"
|
||||||
|
]
|
||||||
|
},
|
||||||
|
"sender": {
|
||||||
|
"id": 129,
|
||||||
|
"login": "adamm",
|
||||||
|
"login_name": "",
|
||||||
|
"source_id": 0,
|
||||||
|
"full_name": "Adam Majer",
|
||||||
|
"email": "adamm@noreply.src.opensuse.org",
|
||||||
|
"avatar_url": "https://src.opensuse.org/avatars/3e8917bfbf04293f7c20c28cacd83dae2ba9b78a6c6a9a1bedf14c683d8a3763",
|
||||||
|
"html_url": "https://src.opensuse.org/adamm",
|
||||||
|
"language": "",
|
||||||
|
"is_admin": false,
|
||||||
|
"last_login": "0001-01-01T00:00:00Z",
|
||||||
|
"created": "2023-07-21T16:43:48+02:00",
|
||||||
|
"restricted": false,
|
||||||
|
"active": false,
|
||||||
|
"prohibit_login": false,
|
||||||
|
"location": "",
|
||||||
|
"website": "",
|
||||||
|
"description": "",
|
||||||
|
"visibility": "public",
|
||||||
|
"followers_count": 1,
|
||||||
|
"following_count": 0,
|
||||||
|
"starred_repos_count": 0,
|
||||||
|
"username": "adamm"
|
||||||
|
},
|
||||||
|
"sha": "e637d86cbbdd438edbf60148e28f9d75a74d51b27b01f75610f247cd18394c8e",
|
||||||
|
"state": "pending",
|
||||||
|
"target_url": "https://src.opensuse.org/",
|
||||||
|
"updated_at": "2025-09-16T10:50:32Z"
|
||||||
|
}`
|
||||||
|
@@ -40,6 +40,10 @@ type GitSubmoduleLister interface {
 	GitSubmoduleCommitId(cwd, packageName, commitId string) (subCommitId string, valid bool)
 }
 
+type GitDirectoryLister interface {
+	GitDirectoryList(gitPath, commitId string) (dirlist map[string]string, err error)
+}
+
 type GitStatusLister interface {
 	GitStatus(cwd string) ([]GitStatusData, error)
 }
@@ -61,12 +65,14 @@ type Git interface {
 	io.Closer
 
 	GitSubmoduleLister
+	GitDirectoryLister
 	GitStatusLister
 
 	GitExecWithOutputOrPanic(cwd string, params ...string) string
 	GitExecOrPanic(cwd string, params ...string)
 	GitExec(cwd string, params ...string) error
 	GitExecWithOutput(cwd string, params ...string) (string, error)
+	GitExecQuietOrPanic(cwd string, params ...string)
 
 	GitDiffLister
 }
@@ -76,7 +82,8 @@ type GitHandlerImpl struct {
 	GitCommiter string
 	GitEmail    string
 
 	lock *sync.Mutex
+	quiet bool
 }
 
 func (s *GitHandlerImpl) GetPath() string {
@@ -211,7 +218,7 @@ func (e *GitHandlerImpl) GitClone(repo, branch, remoteUrl string) (string, error
 		return "", fmt.Errorf("Cannot parse remote URL: %w", err)
 	}
 	remoteBranch := "HEAD"
-	if len(branch) == 0 && remoteUrlComp != nil {
+	if len(branch) == 0 && remoteUrlComp != nil && remoteUrlComp.Commit != "HEAD" {
 		branch = remoteUrlComp.Commit
 		remoteBranch = branch
 	} else if len(branch) > 0 {
@@ -240,46 +247,51 @@ func (e *GitHandlerImpl) GitClone(repo, branch, remoteUrl string) (string, error
 
 		// check if we have submodule to deinit
 		if list, _ := e.GitSubmoduleList(repo, "HEAD"); len(list) > 0 {
-			e.GitExecOrPanic(repo, "submodule", "deinit", "--all", "--force")
+			e.GitExecQuietOrPanic(repo, "submodule", "deinit", "--all", "--force")
 		}
 
 		e.GitExecOrPanic(repo, "fetch", "--prune", remoteName, remoteBranch)
 	}
-	refsBytes, err := os.ReadFile(path.Join(e.GitPath, repo, ".git/refs/remotes", remoteName, "HEAD"))
-	if err != nil {
-		LogError("Cannot read HEAD of remote", remoteName)
-		return remoteName, fmt.Errorf("Cannot read HEAD of remote %s", remoteName)
-	}
-
-	refs := string(refsBytes)
-
-	if refs[0:5] != "ref: " {
-		LogError("Unexpected format of remote HEAD ref:", refs)
-		return remoteName, fmt.Errorf("Unexpected format of remote HEAD ref: %s", refs)
-	}
-
-	if len(branch) == 0 || branch == "HEAD" {
-		remoteRef = strings.TrimSpace(refs[5:])
-		branch = remoteRef[strings.LastIndex(remoteRef, "/")+1:]
-		LogDebug("remoteRef", remoteRef)
-		LogDebug("branch", branch)
-	}
+	/*
+		refsBytes, err := os.ReadFile(path.Join(e.GitPath, repo, ".git/refs/remotes", remoteName, "HEAD"))
+		if err != nil {
+			LogError("Cannot read HEAD of remote", remoteName)
+			return remoteName, fmt.Errorf("Cannot read HEAD of remote %s", remoteName)
+		}
+
+		refs := string(refsBytes)
+
+		if refs[0:5] != "ref: " {
+			LogError("Unexpected format of remote HEAD ref:", refs)
+			return remoteName, fmt.Errorf("Unexpected format of remote HEAD ref: %s", refs)
+		}
+
+		if len(branch) == 0 || branch == "HEAD" {
+			remoteRef = strings.TrimSpace(refs[5:])
+			branch = remoteRef[strings.LastIndex(remoteRef, "/")+1:]
+			LogDebug("remoteRef", remoteRef)
+			LogDebug("branch", branch)
+		}
+	*/
 	args := []string{"fetch", "--prune", remoteName, branch}
 	if strings.TrimSpace(e.GitExecWithOutputOrPanic(repo, "rev-parse", "--is-shallow-repository")) == "true" {
 		args = slices.Insert(args, 1, "--unshallow")
 	}
 	e.GitExecOrPanic(repo, args...)
-	return remoteName, e.GitExec(repo, "checkout", "--track", "-B", branch, remoteRef)
+	return remoteName, e.GitExec(repo, "checkout", "-f", "--track", "-B", branch, remoteRef)
 }
 
 func (e *GitHandlerImpl) GitBranchHead(gitDir, branchName string) (string, error) {
-	id, err := e.GitExecWithOutput(gitDir, "show-ref", "--hash", "--verify", "refs/heads/"+branchName)
+	id, err := e.GitExecWithOutput(gitDir, "show-ref", "--heads", "--hash", branchName)
 	if err != nil {
 		return "", fmt.Errorf("Can't find default branch: %s", branchName)
 	}
 
-	return strings.TrimSpace(id), nil
+	id = strings.TrimSpace(SplitLines(id)[0])
+	if len(id) < 10 {
+		return "", fmt.Errorf("Can't find branch: %s", branchName)
+	}
+
+	return id, nil
 }
 
 func (e *GitHandlerImpl) GitRemoteHead(gitDir, remote, branchName string) (string, error) {
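The hunk above comments out the manual parsing of the remote's symbolic HEAD ref in favor of other branch resolution. The commented-out parsing logic can be sketched standalone; `resolveDefaultBranch` is an illustrative name, not part of the codebase:

```go
package main

import (
	"fmt"
	"strings"
)

// resolveDefaultBranch mirrors the commented-out logic above: given the
// contents of .git/refs/remotes/<remote>/HEAD (a symbolic ref such as
// "ref: refs/remotes/origin/main"), extract the full remote ref and the
// short branch name. Standalone sketch with simplified error handling.
func resolveDefaultBranch(refs string) (remoteRef, branch string, err error) {
	if len(refs) < 5 || refs[0:5] != "ref: " {
		return "", "", fmt.Errorf("Unexpected format of remote HEAD ref: %s", refs)
	}
	remoteRef = strings.TrimSpace(refs[5:])
	branch = remoteRef[strings.LastIndex(remoteRef, "/")+1:]
	return remoteRef, branch, nil
}

func main() {
	ref, branch, err := resolveDefaultBranch("ref: refs/remotes/origin/main\n")
	fmt.Println(ref, branch, err)
}
```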
@@ -338,6 +350,10 @@ var ExtraGitParams []string
 
 func (e *GitHandlerImpl) GitExecWithOutput(cwd string, params ...string) (string, error) {
 	cmd := exec.Command("/usr/bin/git", params...)
+	var identityFile string
+	if i := os.Getenv("AUTOGITS_IDENTITY_FILE"); len(i) > 0 {
+		identityFile = " -i " + i
+	}
 	cmd.Env = []string{
 		"GIT_CEILING_DIRECTORIES=" + e.GitPath,
 		"GIT_CONFIG_GLOBAL=/dev/null",
@@ -345,7 +361,8 @@ func (e *GitHandlerImpl) GitExecWithOutput(cwd string, params ...string) (string
 		"GIT_COMMITTER_NAME=" + e.GitCommiter,
 		"EMAIL=not@exist@src.opensuse.org",
 		"GIT_LFS_SKIP_SMUDGE=1",
-		"GIT_SSH_COMMAND=/usr/bin/ssh -o StrictHostKeyChecking=yes",
+		"GIT_LFS_SKIP_PUSH=1",
+		"GIT_SSH_COMMAND=/usr/bin/ssh -o StrictHostKeyChecking=yes" + identityFile,
 	}
 	if len(ExtraGitParams) > 0 {
 		cmd.Env = append(cmd.Env, ExtraGitParams...)
@@ -355,7 +372,9 @@ func (e *GitHandlerImpl) GitExecWithOutput(cwd string, params ...string) (string
 
 	LogDebug("git execute @", cwd, ":", cmd.Args)
 	out, err := cmd.CombinedOutput()
-	LogDebug(string(out))
+	if !e.quiet {
+		LogDebug(string(out))
+	}
 	if err != nil {
 		LogError("git", cmd.Args, " error:", err)
 		return "", fmt.Errorf("error executing: git %#v \n%s\n err: %w", cmd.Args, out, err)
@@ -364,6 +383,13 @@ func (e *GitHandlerImpl) GitExecWithOutput(cwd string, params ...string) (string
 	return string(out), nil
 }
 
+func (e *GitHandlerImpl) GitExecQuietOrPanic(cwd string, params ...string) {
+	e.quiet = true
+	e.GitExecOrPanic(cwd, params...)
+	e.quiet = false
+	return
+}
+
 type ChanIO struct {
 	ch chan byte
 }
@@ -761,6 +787,80 @@ func (e *GitHandlerImpl) GitCatFile(cwd, commitId, filename string) (data []byte
 	return
 }
 
+// return (directory) -> (hash) map for all directories
+func (e *GitHandlerImpl) GitDirectoryList(gitPath, commitId string) (directoryList map[string]string, err error) {
+	var done sync.Mutex
+	directoryList = make(map[string]string)
+
+	done.Lock()
+	data_in, data_out := ChanIO{make(chan byte)}, ChanIO{make(chan byte)}
+
+	LogDebug("Getting directory for:", commitId)
+
+	go func() {
+		defer done.Unlock()
+		defer close(data_out.ch)
+
+		data_out.Write([]byte(commitId))
+		data_out.ch <- '\x00'
+		var c GitCommit
+		c, err = parseGitCommit(data_in.ch)
+		if err != nil {
+			err = fmt.Errorf("Error parsing git commit. Err: %w", err)
+			return
+		}
+
+		trees := make(map[string]string)
+		trees[""] = c.Tree
+
+		for len(trees) > 0 {
+			for p, tree := range trees {
+				delete(trees, p)
+
+				data_out.Write([]byte(tree))
+				data_out.ch <- '\x00'
+				var tree GitTree
+				tree, err = parseGitTree(data_in.ch)
+
+				if err != nil {
+					err = fmt.Errorf("Error parsing git tree: %w", err)
+					return
+				}
+
+				for _, te := range tree.items {
+					if te.isTree() {
+						directoryList[p+te.name] = te.hash
+					}
+				}
+			}
+		}
+	}()
+
+	cmd := exec.Command("/usr/bin/git", "cat-file", "--batch", "-Z")
+	cmd.Env = []string{
+		"GIT_CEILING_DIRECTORIES=" + e.GitPath,
+		"GIT_LFS_SKIP_SMUDGE=1",
+		"GIT_CONFIG_GLOBAL=/dev/null",
+	}
+	cmd.Dir = filepath.Join(e.GitPath, gitPath)
+	cmd.Stdout = &data_in
+	cmd.Stdin = &data_out
+	cmd.Stderr = writeFunc(func(data []byte) (int, error) {
+		LogError(string(data))
+		return len(data), nil
+	})
+	LogDebug("command run:", cmd.Args)
+	if e := cmd.Run(); e != nil {
+		LogError(e)
+		close(data_in.ch)
+		close(data_out.ch)
+		return directoryList, e
+	}
+
+	done.Lock()
+	return directoryList, err
+}
+
 // return (filename) -> (hash) map for all submodules
 func (e *GitHandlerImpl) GitSubmoduleList(gitPath, commitId string) (submoduleList map[string]string, err error) {
 	var done sync.Mutex
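The new `GitDirectoryList` drives `git cat-file --batch -Z` through a pipe; the traversal itself is a worklist walk over tree objects. A minimal standalone sketch of that walk, using an in-memory map of hypothetical tree objects in place of the cat-file pipe (and, unlike the hunk above, explicitly descending into subtrees):

```go
package main

import "fmt"

// treeEntry is a simplified stand-in for the GitTree items used above.
type treeEntry struct {
	name   string
	hash   string
	isTree bool
}

// collectDirectories walks tree objects starting from rootHash, recording
// every subtree as path -> hash. "trees" is a hypothetical object store
// (hash -> entries) standing in for the cat-file pipe.
func collectDirectories(trees map[string][]treeEntry, rootHash string) map[string]string {
	dirs := map[string]string{}
	worklist := map[string]string{"": rootHash} // path prefix -> tree hash
	for len(worklist) > 0 {
		for prefix, hash := range worklist {
			delete(worklist, prefix)
			for _, te := range trees[hash] {
				if te.isTree {
					dirs[prefix+te.name] = te.hash
					// queue the subtree for its own walk
					worklist[prefix+te.name+"/"] = te.hash
				}
			}
		}
	}
	return dirs
}

func main() {
	trees := map[string][]treeEntry{
		"t0": {{name: "pkg", hash: "t1", isTree: true}, {name: "README", hash: "b0"}},
		"t1": {{name: "common", hash: "t2", isTree: true}},
		"t2": {},
	}
	fmt.Println(collectDirectories(trees, "t0"))
}
```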
@@ -392,6 +392,7 @@ func TestCommitTreeParsing(t *testing.T) {
 		commitId = commitId + strings.TrimSpace(string(data))
 		return len(data), nil
 	})
+	cmd.Stderr = os.Stderr
 	if err := cmd.Run(); err != nil {
 		t.Fatal(err.Error())
 	}
@@ -29,6 +29,7 @@ import (
 	"path"
 	"path/filepath"
 	"slices"
+	"sync"
 	"time"
 
 	transport "github.com/go-openapi/runtime/client"
@@ -66,7 +67,16 @@ const (
 	ReviewStateUnknown models.ReviewStateType = ""
 )
 
+type GiteaLabelGetter interface {
+	GetLabels(org, repo string, idx int64) ([]*models.Label, error)
+}
+
+type GiteaLabelSettter interface {
+	SetLabels(org, repo string, idx int64, labels []string) ([]*models.Label, error)
+}
+
 type GiteaTimelineFetcher interface {
+	ResetTimelineCache(org, repo string, idx int64)
 	GetTimeline(org, repo string, idx int64) ([]*models.TimelineComment, error)
 }
 
@@ -91,9 +101,10 @@ type GiteaPRUpdater interface {
 	UpdatePullRequest(org, repo string, num int64, options *models.EditPullRequestOption) (*models.PullRequest, error)
 }
 
-type GiteaPRTimelineFetcher interface {
+type GiteaPRTimelineReviewFetcher interface {
 	GiteaPRFetcher
 	GiteaTimelineFetcher
+	GiteaReviewFetcher
 }
 
 type GiteaCommitFetcher interface {
@@ -119,10 +130,16 @@ type GiteaPRChecker interface {
 	GiteaMaintainershipReader
 }
 
-type GiteaReviewFetcherAndRequester interface {
+type GiteaReviewFetcherAndRequesterAndUnrequester interface {
 	GiteaReviewTimelineFetcher
 	GiteaCommentFetcher
 	GiteaReviewRequester
+	GiteaReviewUnrequester
+}
+
+type GiteaUnreviewTimelineFetcher interface {
+	GiteaTimelineFetcher
+	GiteaReviewUnrequester
 }
 
 type GiteaReviewRequester interface {
@@ -160,6 +177,10 @@ type GiteaCommitStatusGetter interface {
 	GetCommitStatus(org, repo, hash string) ([]*models.CommitStatus, error)
 }
 
+type GiteaMerger interface {
+	ManualMergePR(org, repo string, id int64, commitid string, delBranch bool) error
+}
+
 type Gitea interface {
 	GiteaComment
 	GiteaRepoFetcher
@@ -168,6 +189,7 @@ type Gitea interface {
 	GiteaReviewer
 	GiteaPRFetcher
 	GiteaPRUpdater
+	GiteaMerger
 	GiteaCommitFetcher
 	GiteaReviewFetcher
 	GiteaCommentFetcher
@@ -177,7 +199,8 @@ type Gitea interface {
 	GiteaCommitStatusGetter
 	GiteaCommitStatusSetter
 	GiteaSetRepoOptions
-	GiteaTimelineFetcher
+	GiteaLabelGetter
+	GiteaLabelSettter
 
 	GetNotifications(Type string, since *time.Time) ([]*models.NotificationThread, error)
 	GetDoneNotifications(Type string, page int64) ([]*models.NotificationThread, error)
@@ -185,7 +208,7 @@ type Gitea interface {
 	GetOrganization(orgName string) (*models.Organization, error)
 	GetOrganizationRepositories(orgName string) ([]*models.Repository, error)
 	CreateRepositoryIfNotExist(git Git, org, repoName string) (*models.Repository, error)
-	CreatePullRequestIfNotExist(repo *models.Repository, srcId, targetId, title, body string) (*models.PullRequest, error)
+	CreatePullRequestIfNotExist(repo *models.Repository, srcId, targetId, title, body string) (*models.PullRequest, error, bool)
 	GetPullRequestFileContent(pr *models.PullRequest, path string) ([]byte, string, error)
 	GetRecentPullRequests(org, repo, branch string) ([]*models.PullRequest, error)
 	GetRecentCommits(org, repo, branch string, commitNo int64) ([]*models.Commit, error)
@@ -233,6 +256,11 @@ func (gitea *GiteaTransport) GetPullRequest(org, project string, num int64) (*mo
 		gitea.transport.DefaultAuthentication,
 	)
 
+	if err != nil {
+		LogError(err)
+		return nil, err
+	}
+
 	return pr.Payload, err
 }
 
@@ -246,9 +274,36 @@ func (gitea *GiteaTransport) UpdatePullRequest(org, repo string, num int64, opti
 		gitea.transport.DefaultAuthentication,
 	)
 
+	if err != nil {
+		LogError(err)
+		return nil, err
+	}
+
 	return pr.Payload, err
 }
 
+func (gitea *GiteaTransport) ManualMergePR(org, repo string, num int64, commitid string, delBranch bool) error {
+	manual_merge := "manually-merged"
+	_, err := gitea.client.Repository.RepoMergePullRequest(
+		repository.NewRepoMergePullRequestParams().
+			WithOwner(org).
+			WithRepo(repo).
+			WithIndex(num).
+			WithBody(&models.MergePullRequestForm{
+				Do: &manual_merge,
+				DeleteBranchAfterMerge: delBranch,
+				HeadCommitID: commitid,
+			}), gitea.transport.DefaultAuthentication,
+	)
+
+	if err != nil {
+		LogError(err)
+		return err
+	}
+
+	return nil
+}
+
 func (gitea *GiteaTransport) GetPullRequests(org, repo string) ([]*models.PullRequest, error) {
 	var page, limit int64
 
@@ -273,6 +328,9 @@ func (gitea *GiteaTransport) GetPullRequests(org, repo string) ([]*models.PullRe
 			return nil, fmt.Errorf("cannot fetch PR list for %s / %s : %w", org, repo, err)
 		}
 
+		if len(req.Payload) == 0 {
+			break
+		}
 		prs = slices.Concat(prs, req.Payload)
 		if len(req.Payload) < int(limit) {
 			break
@@ -295,11 +353,11 @@ func (gitea *GiteaTransport) GetCommitStatus(org, repo, hash string) ([]*models.
 		if err != nil {
 			return res, err
 		}
-		res = append(res, r.Payload...)
-		if len(r.Payload) < int(limit) {
+		if len(r.Payload) == 0 {
 			break
 		}
+		res = append(res, r.Payload...)
+		page++
 	}
 
 	return res, nil
@@ -360,10 +418,10 @@ func (gitea *GiteaTransport) GetPullRequestReviews(org, project string, PRnum in
 			return nil, err
 		}
 
-		allReviews = slices.Concat(allReviews, reviews.Payload)
-		if len(reviews.Payload) < int(limit) {
+		if len(reviews.Payload) == 0 {
 			break
 		}
+		allReviews = slices.Concat(allReviews, reviews.Payload)
 		page++
 	}
 
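The same pagination fix recurs across several hunks here: bail out when the server returns an empty page, append the page, then stop early on a short page. That loop shape can be sketched generically (names are illustrative, not the bot's API):

```go
package main

import "fmt"

// fetchAll pages through fetchPage until an empty or short page signals the
// end. The empty-page guard prevents an endless loop when a server keeps
// answering with empty pages; limit is the page size requested.
func fetchAll(limit int, fetchPage func(page int) []int) []int {
	var all []int
	for page := 1; ; page++ {
		payload := fetchPage(page)
		if len(payload) == 0 {
			break // empty page: nothing more to append
		}
		all = append(all, payload...)
		if len(payload) < limit {
			break // short page: this was the last one
		}
	}
	return all
}

func main() {
	data := [][]int{{1, 2, 3}, {4, 5}}
	fetch := func(page int) []int {
		if page > len(data) {
			return nil
		}
		return data[page-1]
	}
	fmt.Println(fetchAll(3, fetch))
}
```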
@@ -427,6 +485,30 @@ func (gitea *GiteaTransport) SetRepoOptions(owner, repo string, manual_merge boo
 	return ok.Payload, err
 }
 
+func (gitea *GiteaTransport) GetLabels(owner, repo string, idx int64) ([]*models.Label, error) {
+	ret, err := gitea.client.Issue.IssueGetLabels(issue.NewIssueGetLabelsParams().WithOwner(owner).WithRepo(repo).WithIndex(idx), gitea.transport.DefaultAuthentication)
+	if err != nil {
+		return nil, err
+	}
+	return ret.Payload, err
+}
+
+func (gitea *GiteaTransport) SetLabels(owner, repo string, idx int64, labels []string) ([]*models.Label, error) {
+	interfaceLabels := make([]interface{}, len(labels))
+	for i, l := range labels {
+		interfaceLabels[i] = l
+	}
+
+	ret, err := gitea.client.Issue.IssueAddLabel(issue.NewIssueAddLabelParams().WithOwner(owner).WithRepo(repo).WithIndex(idx).WithBody(&models.IssueLabelsOption{Labels: interfaceLabels}),
+		gitea.transport.DefaultAuthentication)
+
+	if err != nil {
+		return nil, err
+	}
+
+	return ret.Payload, nil
+}
+
 const (
 	GiteaNotificationType_Pull = "Pull"
 )
@@ -453,6 +535,9 @@ func (gitea *GiteaTransport) GetNotifications(Type string, since *time.Time) ([]
 			return nil, err
 		}
 
+		if len(list.Payload) == 0 {
+			break
+		}
 		ret = slices.Concat(ret, list.Payload)
 		if len(list.Payload) < int(bigLimit) {
 			break
@@ -601,7 +686,7 @@ func (gitea *GiteaTransport) CreateRepositoryIfNotExist(git Git, org, repoName s
 	return repo.Payload, nil
 }
 
-func (gitea *GiteaTransport) CreatePullRequestIfNotExist(repo *models.Repository, srcId, targetId, title, body string) (*models.PullRequest, error) {
+func (gitea *GiteaTransport) CreatePullRequestIfNotExist(repo *models.Repository, srcId, targetId, title, body string) (*models.PullRequest, error, bool) {
 	prOptions := models.CreatePullRequestOption{
 		Base: targetId,
 		Head: srcId,
@@ -610,10 +695,14 @@ func (gitea *GiteaTransport) CreatePullRequestIfNotExist(repo *models.Repository
 	}
 
 	if pr, err := gitea.client.Repository.RepoGetPullRequestByBaseHead(
-		repository.NewRepoGetPullRequestByBaseHeadParams().WithOwner(repo.Owner.UserName).WithRepo(repo.Name).WithBase(targetId).WithHead(srcId),
+		repository.NewRepoGetPullRequestByBaseHeadParams().
+			WithOwner(repo.Owner.UserName).
+			WithRepo(repo.Name).
+			WithBase(targetId).
+			WithHead(srcId),
 		gitea.transport.DefaultAuthentication,
-	); err == nil {
-		return pr.Payload, nil
+	); err == nil && pr.Payload.State == "open" {
+		return pr.Payload, nil, false
 	}
 
 	pr, err := gitea.client.Repository.RepoCreatePullRequest(
@@ -627,10 +716,10 @@ func (gitea *GiteaTransport) CreatePullRequestIfNotExist(repo *models.Repository
 	)
 
 	if err != nil {
-		return nil, fmt.Errorf("Cannot create pull request. %w", err)
+		return nil, fmt.Errorf("Cannot create pull request. %w", err), true
 	}
 
-	return pr.GetPayload(), nil
+	return pr.GetPayload(), nil, true
 }
 
 func (gitea *GiteaTransport) RequestReviews(pr *models.PullRequest, reviewers ...string) ([]*models.PullReview, error) {
@@ -717,39 +806,91 @@ func (gitea *GiteaTransport) AddComment(pr *models.PullRequest, comment string)
 	return nil
 }
 
+type TimelineCacheData struct {
+	data []*models.TimelineComment
+	lastCheck time.Time
+}
+
+var giteaTimelineCache map[string]TimelineCacheData = make(map[string]TimelineCacheData)
+var giteaTimelineCacheMutex sync.RWMutex
+
+func (gitea *GiteaTransport) ResetTimelineCache(org, repo string, idx int64) {
+	giteaTimelineCacheMutex.Lock()
+	defer giteaTimelineCacheMutex.Unlock()
+
+	prID := fmt.Sprintf("%s/%s!%d", org, repo, idx)
+	Cache, IsCached := giteaTimelineCache[prID]
+	if IsCached {
+		Cache.lastCheck = Cache.lastCheck.Add(-time.Hour)
+		giteaTimelineCache[prID] = Cache
+	}
+}
+
+// returns timeline in reverse chronological create order
 func (gitea *GiteaTransport) GetTimeline(org, repo string, idx int64) ([]*models.TimelineComment, error) {
 	page := int64(1)
 	resCount := 1
 
-	retData := []*models.TimelineComment{}
+	prID := fmt.Sprintf("%s/%s!%d", org, repo, idx)
+	giteaTimelineCacheMutex.RLock()
+	TimelineCache, IsCached := giteaTimelineCache[prID]
+	var LastCachedTime strfmt.DateTime
+	if IsCached {
+		l := len(TimelineCache.data)
+		if l > 0 {
+			LastCachedTime = TimelineCache.data[0].Updated
+		}
+
+		// cache data for 5 seconds
+		if TimelineCache.lastCheck.Add(time.Second*5).Compare(time.Now()) > 0 {
+			giteaTimelineCacheMutex.RUnlock()
+			return TimelineCache.data, nil
+		}
+	}
+	giteaTimelineCacheMutex.RUnlock()
+
+	giteaTimelineCacheMutex.Lock()
+	defer giteaTimelineCacheMutex.Unlock()
+
 	for resCount > 0 {
-		res, err := gitea.client.Issue.IssueGetCommentsAndTimeline(
-			issue.NewIssueGetCommentsAndTimelineParams().
-				WithOwner(org).
-				WithRepo(repo).
-				WithIndex(idx).
-				WithPage(&page),
-			gitea.transport.DefaultAuthentication,
-		)
+		opts := issue.NewIssueGetCommentsAndTimelineParams().WithOwner(org).WithRepo(repo).WithIndex(idx).WithPage(&page)
+		if !LastCachedTime.IsZero() {
+			opts = opts.WithSince(&LastCachedTime)
+		}
+		res, err := gitea.client.Issue.IssueGetCommentsAndTimeline(opts, gitea.transport.DefaultAuthentication)
 
 		if err != nil {
 			return nil, err
 		}
 
-		resCount = len(res.Payload)
-		LogDebug("page:", page, "len:", resCount)
+		if resCount = len(res.Payload); resCount == 0 {
+			break
+		}
+
+		for _, d := range res.Payload {
+			if d != nil {
+				if time.Time(d.Created).Compare(time.Time(LastCachedTime)) > 0 {
+					// created after last check, so we append here
+					TimelineCache.data = append(TimelineCache.data, d)
+				} else {
+					// we need something updated in the timeline, maybe
+				}
+			}
+		}
+
+		if resCount < 10 {
+			break
+		}
 		page++
-		retData = append(retData, res.Payload...)
 	}
-	LogDebug("total results:", len(retData))
-
-	slices.SortFunc(retData, func(a, b *models.TimelineComment) int {
+	LogDebug("timeline", prID, "# timeline:", len(TimelineCache.data))
+	slices.SortFunc(TimelineCache.data, func(a, b *models.TimelineComment) int {
 		return time.Time(b.Created).Compare(time.Time(a.Created))
 	})
 
-	return retData, nil
+	TimelineCache.lastCheck = time.Now()
+	giteaTimelineCache[prID] = TimelineCache
+
+	return TimelineCache.data, nil
 }
 
 func (gitea *GiteaTransport) GetRepositoryFileContent(org, repo, hash, path string) ([]byte, string, error) {
324	common/listen.go
@@ -1,324 +0,0 @@
-package common
-
-/*
- * This file is part of Autogits.
- *
- * Copyright © 2024 SUSE LLC
- *
- * Autogits is free software: you can redistribute it and/or modify it under
- * the terms of the GNU General Public License as published by the Free Software
- * Foundation, either version 2 of the License, or (at your option) any later
- * version.
- *
- * Autogits is distributed in the hope that it will be useful, but WITHOUT ANY
- * WARRANTY; without even the implied warranty of MERCHANTABILITY or FITNESS FOR A
- * PARTICULAR PURPOSE. See the GNU General Public License for more details.
- *
- * You should have received a copy of the GNU General Public License along with
- * Foobar. If not, see <https://www.gnu.org/licenses/>.
- */
-
-import (
-	"crypto/tls"
-	"fmt"
-	"net/url"
-	"runtime/debug"
-	"slices"
-	"strings"
-	"time"
-
-	rabbitmq "github.com/rabbitmq/amqp091-go"
-)
-
-const RequestType_CreateBrachTag = "create"
-const RequestType_DeleteBranchTag = "delete"
-const RequestType_Fork = "fork"
-const RequestType_Issue = "issues"
-const RequestType_IssueAssign = "issue_assign"
-const RequestType_IssueComment = "issue_comment"
-const RequestType_IssueLabel = "issue_label"
-const RequestType_IssueMilestone = "issue_milestone"
-const RequestType_Push = "push"
-const RequestType_Repository = "repository"
-const RequestType_Release = "release"
-const RequestType_PR = "pull_request"
-const RequestType_PRAssign = "pull_request_assign"
-const RequestType_PRLabel = "pull_request_label"
-const RequestType_PRComment = "pull_request_comment"
-const RequestType_PRMilestone = "pull_request_milestone"
-const RequestType_PRSync = "pull_request_sync"
-const RequestType_PRReviewAccepted = "pull_request_review_approved"
-const RequestType_PRReviewRejected = "pull_request_review_rejected"
-const RequestType_PRReviewRequest = "pull_request_review_request"
-const RequestType_PRReviewComment = "pull_request_review_comment"
-const RequestType_Wiki = "wiki"
-
-type RequestProcessor interface {
-	ProcessFunc(*Request) error
-}
-
-type ListenDefinitions struct {
-	RabbitURL *url.URL // amqps://user:password@host/queue
-
-	GitAuthor string
-	Handlers map[string]RequestProcessor
-	Orgs []string
-
-	topics []string
-	topicSubChanges chan string // +topic = subscribe, -topic = unsubscribe
-}
-
-type RabbitMessage rabbitmq.Delivery
-
-func (l *ListenDefinitions) processTopicChanges(ch *rabbitmq.Channel, queueName string) {
-	for {
-		topic, ok := <-l.topicSubChanges
-		if !ok {
-			return
-		}
-
-		LogDebug(" topic change:", topic)
-		switch topic[0] {
-		case '+':
-			if err := ch.QueueBind(queueName, topic[1:], "pubsub", false, nil); err != nil {
-				LogError(err)
-			}
-		case '-':
-			if err := ch.QueueUnbind(queueName, topic[1:], "pubsub", nil); err != nil {
-				LogError(err)
-			}
-		default:
-			LogInfo("Ignoring unknown topic change:", topic)
-		}
-	}
-}
-
-func (l *ListenDefinitions) processRabbitMQ(msgCh chan<- RabbitMessage) error {
-	queueName := l.RabbitURL.Path
-	l.RabbitURL.Path = ""
-
if len(queueName) > 0 && queueName[0] == '/' {
|
|
||||||
queueName = queueName[1:]
|
|
||||||
}
|
|
||||||
|
|
||||||
connection, err := rabbitmq.DialTLS(l.RabbitURL.String(), &tls.Config{
|
|
||||||
ServerName: l.RabbitURL.Hostname(),
|
|
||||||
})
|
|
||||||
if err != nil {
|
|
||||||
return fmt.Errorf("Cannot connect to %s . Err: %w", l.RabbitURL.Hostname(), err)
|
|
||||||
}
|
|
||||||
defer connection.Close()
|
|
||||||
|
|
||||||
ch, err := connection.Channel()
|
|
||||||
if err != nil {
|
|
||||||
return fmt.Errorf("Cannot create a channel. Err: %w", err)
|
|
||||||
}
|
|
||||||
defer ch.Close()
|
|
||||||
|
|
||||||
if err = ch.ExchangeDeclarePassive("pubsub", "topic", true, false, false, false, nil); err != nil {
|
|
||||||
return fmt.Errorf("Cannot find pubsub exchange? Err: %w", err)
|
|
||||||
}
|
|
||||||
|
|
||||||
var q rabbitmq.Queue
|
|
||||||
if len(queueName) == 0 {
|
|
||||||
q, err = ch.QueueDeclare("", false, true, true, false, nil)
|
|
||||||
} else {
|
|
||||||
q, err = ch.QueueDeclarePassive(queueName, true, false, true, false, nil)
|
|
||||||
if err != nil {
|
|
||||||
LogInfo("queue not found .. trying to create it:", err)
|
|
||||||
if ch.IsClosed() {
|
|
||||||
ch, err = connection.Channel()
|
|
||||||
if err != nil {
|
|
||||||
return fmt.Errorf("Channel cannot be re-opened. Err: %w", err)
|
|
||||||
}
|
|
||||||
}
|
|
||||||
q, err = ch.QueueDeclare(queueName, true, false, true, false, nil)
|
|
||||||
|
|
||||||
if err != nil {
|
|
||||||
LogInfo("can't create persistent queue ... falling back to temporaty queue:", err)
|
|
||||||
if ch.IsClosed() {
|
|
||||||
ch, err = connection.Channel()
|
|
||||||
return fmt.Errorf("Channel cannot be re-opened. Err: %w", err)
|
|
||||||
}
|
|
||||||
q, err = ch.QueueDeclare("", false, true, true, false, nil)
|
|
||||||
}
|
|
||||||
}
|
|
||||||
}
|
|
||||||
if err != nil {
|
|
||||||
return fmt.Errorf("Cannot declare queue. Err: %w", err)
|
|
||||||
}
|
|
||||||
// log.Printf("queue: %s:%d", q.Name, q.Consumers)
|
|
||||||
|
|
||||||
LogDebug(" -- listening to topics:")
|
|
||||||
l.topicSubChanges = make(chan string)
|
|
||||||
defer close(l.topicSubChanges)
|
|
||||||
go l.processTopicChanges(ch, q.Name)
|
|
||||||
|
|
||||||
for _, topic := range l.topics {
|
|
||||||
l.topicSubChanges <- "+" + topic
|
|
||||||
}
|
|
||||||
|
|
||||||
msgs, err := ch.Consume(q.Name, "", true, true, false, false, nil)
|
|
||||||
if err != nil {
|
|
||||||
return fmt.Errorf("Cannot start consumer. Err: %w", err)
|
|
||||||
}
|
|
||||||
// log.Printf("queue: %s:%d", q.Name, q.Consumers)
|
|
||||||
|
|
||||||
for {
|
|
||||||
msg, ok := <-msgs
|
|
||||||
if !ok {
|
|
||||||
return fmt.Errorf("channel/connection closed?\n")
|
|
||||||
}
|
|
||||||
|
|
||||||
msgCh <- RabbitMessage(msg)
|
|
||||||
}
|
|
||||||
}
|
|
||||||
|
|
||||||
func (l *ListenDefinitions) connectAndProcessRabbitMQ(ch chan<- RabbitMessage) {
|
|
||||||
defer func() {
|
|
||||||
if r := recover(); r != nil {
|
|
||||||
LogError(r)
|
|
||||||
LogError("'crash' RabbitMQ worker. Recovering... reconnecting...")
|
|
||||||
time.Sleep(5 * time.Second)
|
|
||||||
go l.connectAndProcessRabbitMQ(ch)
|
|
||||||
}
|
|
||||||
}()
|
|
||||||
|
|
||||||
for {
|
|
||||||
err := l.processRabbitMQ(ch)
|
|
||||||
if err != nil {
|
|
||||||
LogError("Error in RabbitMQ connection. %#v", err)
|
|
||||||
LogInfo("Reconnecting in 2 seconds...")
|
|
||||||
time.Sleep(2 * time.Second)
|
|
||||||
}
|
|
||||||
}
|
|
||||||
}
|
|
||||||
|
|
||||||
func (l *ListenDefinitions) connectToRabbitMQ() chan RabbitMessage {
|
|
||||||
ch := make(chan RabbitMessage, 100)
|
|
||||||
go l.connectAndProcessRabbitMQ(ch)
|
|
||||||
|
|
||||||
return ch
|
|
||||||
}
|
|
||||||
|
|
||||||
func ProcessEvent(f RequestProcessor, request *Request) {
|
|
||||||
defer func() {
|
|
||||||
if r := recover(); r != nil {
|
|
||||||
LogError("panic caught")
|
|
||||||
if err, ok := r.(error); !ok {
|
|
||||||
LogError(err)
|
|
||||||
}
|
|
||||||
LogError(string(debug.Stack()))
|
|
||||||
}
|
|
||||||
}()
|
|
||||||
|
|
||||||
if err := f.ProcessFunc(request); err != nil {
|
|
||||||
LogError(err)
|
|
||||||
}
|
|
||||||
|
|
||||||
}
|
|
||||||
|
|
||||||
func (l *ListenDefinitions) generateTopics() []string {
|
|
||||||
topics := make([]string, 0, len(l.Handlers)*len(l.Orgs))
|
|
||||||
scope := "suse"
|
|
||||||
if l.RabbitURL.Hostname() == "rabbit.opensuse.org" {
|
|
||||||
scope = "opensuse"
|
|
||||||
}
|
|
||||||
|
|
||||||
for _, org := range l.Orgs {
|
|
||||||
for requestType, _ := range l.Handlers {
|
|
||||||
topics = append(topics, fmt.Sprintf("%s.src.%s.%s.#", scope, org, requestType))
|
|
||||||
}
|
|
||||||
}
|
|
||||||
|
|
||||||
slices.Sort(topics)
|
|
||||||
return slices.Compact(topics)
|
|
||||||
}
|
|
||||||
|
|
||||||
func (l *ListenDefinitions) UpdateTopics() {
|
|
||||||
newTopics := l.generateTopics()
|
|
||||||
|
|
||||||
j := 0
|
|
||||||
next_new_topic:
|
|
||||||
for i := 0; i < len(newTopics); i++ {
|
|
||||||
topic := newTopics[i]
|
|
||||||
|
|
||||||
for j < len(l.topics) {
|
|
||||||
cmp := strings.Compare(topic, l.topics[j])
|
|
||||||
|
|
||||||
if cmp == 0 {
|
|
||||||
j++
|
|
||||||
continue next_new_topic
|
|
||||||
}
|
|
||||||
|
|
||||||
if cmp < 0 {
|
|
||||||
l.topicSubChanges <- "+" + topic
|
|
||||||
break
|
|
||||||
}
|
|
||||||
|
|
||||||
l.topicSubChanges <- "-" + l.topics[j]
|
|
||||||
j++
|
|
||||||
}
|
|
||||||
|
|
||||||
if j == len(l.topics) {
|
|
||||||
l.topicSubChanges <- "+" + topic
|
|
||||||
}
|
|
||||||
}
|
|
||||||
|
|
||||||
for j < len(l.topics) {
|
|
||||||
l.topicSubChanges <- "-" + l.topics[j]
|
|
||||||
j++
|
|
||||||
}
|
|
||||||
|
|
||||||
l.topics = newTopics
|
|
||||||
}
|
|
||||||
|
|
||||||
func (l *ListenDefinitions) ProcessRabbitMQEvents() error {
|
|
||||||
LogInfo("RabbitMQ connection:", l.RabbitURL.String())
|
|
||||||
LogDebug("# Handlers:", len(l.Handlers))
|
|
||||||
LogDebug("# Orgs:", len(l.Orgs))
|
|
||||||
|
|
||||||
l.RabbitURL.User = url.UserPassword(rabbitUser, rabbitPassword)
|
|
||||||
l.topics = l.generateTopics()
|
|
||||||
ch := l.connectToRabbitMQ()
|
|
||||||
|
|
||||||
for {
|
|
||||||
msg, ok := <-ch
|
|
||||||
if !ok {
|
|
||||||
return nil
|
|
||||||
}
|
|
||||||
|
|
||||||
LogDebug("event:", msg.RoutingKey)
|
|
||||||
|
|
||||||
route := strings.Split(msg.RoutingKey, ".")
|
|
||||||
if len(route) > 3 {
|
|
||||||
reqType := route[3]
|
|
||||||
org := route[2]
|
|
||||||
|
|
||||||
if !slices.Contains(l.Orgs, org) {
|
|
||||||
LogInfo("Got event for unhandeled org:", org)
|
|
||||||
continue
|
|
||||||
}
|
|
||||||
|
|
||||||
LogDebug("org:", org, "type:", reqType)
|
|
||||||
if handler, found := l.Handlers[reqType]; found {
|
|
||||||
/* h, err := CreateRequestHandler()
|
|
||||||
if err != nil {
|
|
||||||
log.Println("Cannot create request handler", err)
|
|
||||||
continue
|
|
||||||
}
|
|
||||||
*/
|
|
||||||
req, err := ParseRequestJSON(reqType, msg.Body)
|
|
||||||
if err != nil {
|
|
||||||
LogError("Error parsing request JSON:", err)
|
|
||||||
continue
|
|
||||||
} else {
|
|
||||||
LogDebug("processing req", req.Type)
|
|
||||||
// h.Request = req
|
|
||||||
ProcessEvent(handler, req)
|
|
||||||
|
|
||||||
}
|
|
||||||
}
|
|
||||||
}
|
|
||||||
}
|
|
||||||
}
|
|
||||||
@@ -63,6 +63,10 @@ func SetLoggingLevel(ll LogLevel) {
 	logLevel = ll
 }
 
+func GetLoggingLevel() LogLevel {
+	return logLevel
+}
+
 func SetLoggingLevelFromString(ll string) error {
 	switch ll {
 	case "info":
@@ -1,10 +1,12 @@
 package common
 
 import (
+	"bytes"
 	"encoding/json"
 	"fmt"
 	"io"
 	"slices"
+	"strings"
 
 	"src.opensuse.org/autogits/common/gitea-generated/client/repository"
 	"src.opensuse.org/autogits/common/gitea-generated/models"
@@ -13,10 +15,10 @@ import (
 //go:generate mockgen -source=maintainership.go -destination=mock/maintainership.go -typed
 
 type MaintainershipData interface {
-	ListProjectMaintainers() []string
-	ListPackageMaintainers(pkg string) []string
+	ListProjectMaintainers(OptionalGroupExpansion []*ReviewGroup) []string
+	ListPackageMaintainers(Pkg string, OptionalGroupExpasion []*ReviewGroup) []string
 
-	IsApproved(pkg string, reviews []*models.PullReview, submitter string) bool
+	IsApproved(Pkg string, Reviews []*models.PullReview, Submitter string, ReviewGroups []*ReviewGroup) bool
 }
 
 const ProjectKey = ""
@@ -25,12 +27,15 @@ const ProjectFileKey = "_project"
 type MaintainershipMap struct {
 	Data  map[string][]string
 	IsDir bool
+	Config *AutogitConfig
 	FetchPackage func(string) ([]byte, error)
+	Raw []byte
 }
 
-func parseMaintainershipData(data []byte) (*MaintainershipMap, error) {
+func ParseMaintainershipData(data []byte) (*MaintainershipMap, error) {
 	maintainers := &MaintainershipMap{
 		Data: make(map[string][]string),
+		Raw:  data,
 	}
 	if err := json.Unmarshal(data, &maintainers.Data); err != nil {
 		return nil, err
@@ -39,7 +44,9 @@ func parseMaintainershipData(data []byte) (*MaintainershipMap, error) {
 	return maintainers, nil
 }
 
-func FetchProjectMaintainershipData(gitea GiteaMaintainershipReader, org, prjGit, branch string) (*MaintainershipMap, error) {
+func FetchProjectMaintainershipData(gitea GiteaMaintainershipReader, config *AutogitConfig) (*MaintainershipMap, error) {
+	org, prjGit, branch := config.GetPrjGit()
+
 	data, _, err := gitea.FetchMaintainershipDirFile(org, prjGit, branch, ProjectFileKey)
 	dir := true
 	if err != nil || data == nil {
@@ -59,8 +66,9 @@ func FetchProjectMaintainershipData(gitea GiteaMaintainershipReader, org, prjGit
 		}
 	}
 
-	m, err := parseMaintainershipData(data)
+	m, err := ParseMaintainershipData(data)
 	if m != nil {
+		m.Config = config
 		m.IsDir = dir
 		m.FetchPackage = func(pkg string) ([]byte, error) {
 			data, _, err := gitea.FetchMaintainershipDirFile(org, prjGit, branch, pkg)
@@ -70,7 +78,7 @@ func FetchProjectMaintainershipData(gitea GiteaMaintainershipReader, org, prjGit
 	return m, err
 }
 
-func (data *MaintainershipMap) ListProjectMaintainers() []string {
+func (data *MaintainershipMap) ListProjectMaintainers(groups []*ReviewGroup) []string {
 	if data == nil {
 		return nil
 	}
@@ -80,6 +88,13 @@ func (data *MaintainershipMap) ListProjectMaintainers() []string {
 		return nil
 	}
 
+	m = slices.Clone(m)
+
+	// expands groups
+	for _, g := range groups {
+		m = g.ExpandMaintainers(m)
+	}
+
 	return m
 }
@@ -96,7 +111,7 @@ func parsePkgDirData(pkg string, data []byte) []string {
 	return pkgMaintainers
 }
 
-func (data *MaintainershipMap) ListPackageMaintainers(pkg string) []string {
+func (data *MaintainershipMap) ListPackageMaintainers(pkg string, groups []*ReviewGroup) []string {
 	if data == nil {
 		return nil
 	}
@@ -111,7 +126,8 @@ func (data *MaintainershipMap) ListPackageMaintainers(pkg string) []string {
 			}
 		}
 	}
-	prjMaintainers := data.ListProjectMaintainers()
+	pkgMaintainers = slices.Clone(pkgMaintainers)
+	prjMaintainers := data.ListProjectMaintainers(nil)
 
 prjMaintainer:
 	for _, prjm := range prjMaintainers {
@@ -123,15 +139,20 @@ prjMaintainer:
 		pkgMaintainers = append(pkgMaintainers, prjm)
 	}
 
+	// expands groups
+	for _, g := range groups {
+		pkgMaintainers = g.ExpandMaintainers(pkgMaintainers)
+	}
+
 	return pkgMaintainers
 }
 
-func (data *MaintainershipMap) IsApproved(pkg string, reviews []*models.PullReview, submitter string) bool {
+func (data *MaintainershipMap) IsApproved(pkg string, reviews []*models.PullReview, submitter string, groups []*ReviewGroup) bool {
 	var reviewers []string
 	if pkg != ProjectKey {
-		reviewers = data.ListPackageMaintainers(pkg)
+		reviewers = data.ListPackageMaintainers(pkg, groups)
 	} else {
-		reviewers = data.ListProjectMaintainers()
+		reviewers = data.ListProjectMaintainers(groups)
 	}
 
 	if len(reviewers) == 0 {
@@ -139,7 +160,10 @@ func (data *MaintainershipMap) IsApproved(pkg string, reviews []*models.PullRevi
 	}
 
 	LogDebug("Looking for review by:", reviewers)
-	if slices.Contains(reviewers, submitter) {
+	slices.Sort(reviewers)
+	reviewers = slices.Compact(reviewers)
+	SubmitterIdxInReviewers := slices.Index(reviewers, submitter)
+	if SubmitterIdxInReviewers > -1 && (!data.Config.ReviewRequired || len(reviewers) == 1) {
 		LogDebug("Submitter is maintainer. Approving.")
 		return true
 	}
@@ -154,13 +178,135 @@ func (data *MaintainershipMap) IsApproved(pkg string, reviews []*models.PullRevi
 	return false
 }
 
+func (data *MaintainershipMap) modifyInplace(writer io.StringWriter) error {
+	var original map[string][]string
+	if err := json.Unmarshal(data.Raw, &original); err != nil {
+		return err
+	}
+
+	dec := json.NewDecoder(bytes.NewReader(data.Raw))
+	_, err := dec.Token()
+	if err != nil {
+		return err
+	}
+
+	output := ""
+	lastPos := 0
+	modified := false
+
+	type entry struct {
+		key      string
+		valStart int
+		valEnd   int
+	}
+	var entries []entry
+
+	for dec.More() {
+		kToken, _ := dec.Token()
+		key := kToken.(string)
+		var raw json.RawMessage
+		dec.Decode(&raw)
+		valEnd := int(dec.InputOffset())
+		valStart := valEnd - len(raw)
+		entries = append(entries, entry{key, valStart, valEnd})
+	}
+
+	changed := make(map[string]bool)
+	for k, v := range data.Data {
+		if ov, ok := original[k]; !ok || !slices.Equal(v, ov) {
+			changed[k] = true
+		}
+	}
+	for k := range original {
+		if _, ok := data.Data[k]; !ok {
+			changed[k] = true
+		}
+	}
+
+	if len(changed) == 0 {
+		_, err = writer.WriteString(string(data.Raw))
+		return err
+	}
+
+	for _, e := range entries {
+		if v, ok := data.Data[e.key]; ok {
+			prefix := string(data.Raw[lastPos:e.valStart])
+			if modified && strings.TrimSpace(output) == "{" {
+				if commaIdx := strings.Index(prefix, ","); commaIdx != -1 {
+					if quoteIdx := strings.Index(prefix, "\""); quoteIdx == -1 || commaIdx < quoteIdx {
+						prefix = prefix[:commaIdx] + prefix[commaIdx+1:]
+					}
+				}
+			}
+			output += prefix
+			if changed[e.key] {
+				slices.Sort(v)
+				newVal, _ := json.Marshal(v)
+				output += string(newVal)
+				modified = true
+			} else {
+				output += string(data.Raw[e.valStart:e.valEnd])
+			}
+		} else {
+			// Deleted
+			modified = true
+		}
+		lastPos = e.valEnd
+	}
+	output += string(data.Raw[lastPos:])
+
+	// Handle additions (simplistic: at the end)
+	for k, v := range data.Data {
+		if _, ok := original[k]; !ok {
+			slices.Sort(v)
+			newVal, _ := json.Marshal(v)
+			keyStr, _ := json.Marshal(k)
+
+			// Insert before closing brace
+			if idx := strings.LastIndex(output, "}"); idx != -1 {
+				prefix := output[:idx]
+				suffix := output[idx:]
+
+				trimmedPrefix := strings.TrimRight(prefix, " \n\r\t")
+				if !strings.HasSuffix(trimmedPrefix, "{") && !strings.HasSuffix(trimmedPrefix, ",") {
+					// find the actual position of the last non-whitespace character in prefix
+					lastCharIdx := strings.LastIndexAny(prefix, "]}0123456789\"")
+					if lastCharIdx != -1 {
+						prefix = prefix[:lastCharIdx+1] + "," + prefix[lastCharIdx+1:]
+					}
+				}
+
+				insertion := fmt.Sprintf(" %s: %s", string(keyStr), string(newVal))
+				if !strings.HasSuffix(prefix, "\n") {
+					insertion = "\n" + insertion
+				}
+				output = prefix + insertion + "\n" + suffix
+				modified = true
+			}
+		}
+	}
+
+	if modified {
+		_, err := writer.WriteString(output)
+		return err
+	}
+	_, err = writer.WriteString(string(data.Raw))
+	return err
+}
+
 func (data *MaintainershipMap) WriteMaintainershipFile(writer io.StringWriter) error {
 	if data.IsDir {
 		return fmt.Errorf("Not implemented")
 	}
 
-	writer.WriteString("{\n")
+	if len(data.Raw) > 0 {
+		if err := data.modifyInplace(writer); err == nil {
+			return nil
+		}
+	}
+
+	// Fallback to full write
+	writer.WriteString("{\n")
 	if d, ok := data.Data[""]; ok {
 		eol := ","
 		if len(data.Data) == 1 {
@@ -171,17 +317,12 @@ func (data *MaintainershipMap) WriteMaintainershipFile(writer io.StringWriter) e
 		writer.WriteString(fmt.Sprintf(" \"\": %s%s\n", string(str), eol))
 	}
 
-	keys := make([]string, len(data.Data))
-	i := 0
+	keys := make([]string, 0, len(data.Data))
 	for pkg := range data.Data {
 		if pkg == "" {
 			continue
 		}
-		keys[i] = pkg
-		i++
+		keys = append(keys, pkg)
 	}
-	if len(keys) >= i {
-		keys = slices.Delete(keys, i, len(keys))
-	}
 	slices.Sort(keys)
 	for i, pkg := range keys {
@@ -13,10 +13,10 @@ import (
 )
 
 func TestMaintainership(t *testing.T) {
-	config := common.AutogitConfig{
+	config := &common.AutogitConfig{
 		Branch:       "bar",
 		Organization: "foo",
-		GitProjectName: common.DefaultGitPrj,
+		GitProjectName: common.DefaultGitPrj + "#bar",
 	}
 
 	packageTests := []struct {
@@ -28,6 +28,8 @@ func TestMaintainership(t *testing.T) {
 		maintainersFile    []byte
 		maintainersFileErr error
 
+		groups []*common.ReviewGroup
+
 		maintainersDir map[string][]byte
 	}{
 		/* PACKAGE MAINTAINERS */
@@ -51,6 +53,22 @@ func TestMaintainership(t *testing.T) {
 			maintainers: []string{"user1", "user2", "user3"},
 			packageName: "pkg",
 		},
+		{
+			name:            "Multiple package maintainers and groups",
+			maintainersFile: []byte(`{"pkg": ["user1", "user2", "g2"], "": ["g2", "user1", "user3"]}`),
+			maintainersDir: map[string][]byte{
+				"_project": []byte(`{"": ["user1", "user3", "g2"]}`),
+				"pkg":      []byte(`{"pkg": ["user1", "g2", "user2"]}`),
+			},
+			maintainers: []string{"user1", "user2", "user3", "user5"},
+			packageName: "pkg",
+			groups: []*common.ReviewGroup{
+				{
+					Name:      "g2",
+					Reviewers: []string{"user1", "user5"},
+				},
+			},
+		},
 		{
 			name:            "No package maintainers and only project maintainer",
 			maintainersFile: []byte(`{"pkg2": ["user1", "user2"], "": ["user1", "user3"]}`),
@@ -123,7 +141,7 @@ func TestMaintainership(t *testing.T) {
 	notFoundError := repository.NewRepoGetContentsNotFound()
 	for _, test := range packageTests {
 		runTests := func(t *testing.T, mi common.GiteaMaintainershipReader) {
-			maintainers, err := common.FetchProjectMaintainershipData(mi, config.Organization, config.GitProjectName, config.Branch)
+			maintainers, err := common.FetchProjectMaintainershipData(mi, config)
 			if err != nil && !test.otherError {
 				if test.maintainersFileErr == nil {
 					t.Fatal("Unexpected error recieved", err)
@@ -138,9 +156,9 @@ func TestMaintainership(t *testing.T) {
 
 			var m []string
 			if len(test.packageName) > 0 {
-				m = maintainers.ListPackageMaintainers(test.packageName)
+				m = maintainers.ListPackageMaintainers(test.packageName, test.groups)
 			} else {
-				m = maintainers.ListProjectMaintainers()
+				m = maintainers.ListProjectMaintainers(test.groups)
 			}
 
 			if len(m) != len(test.maintainers) {
@@ -190,6 +208,7 @@ func TestMaintainershipFileWrite(t *testing.T) {
 		name            string
 		is_dir          bool
 		maintainers     map[string][]string
+		raw             []byte
 		expected_output string
 		expected_error  error
 	}{
@@ -207,12 +226,49 @@ func TestMaintainershipFileWrite(t *testing.T) {
 		{
 			name: "2 project maintainers and 2 single package maintainers",
 			maintainers: map[string][]string{
 				"":     {"two", "one"},
 				"pkg1": {},
 				"foo":  {"four", "byte"},
 			},
 			expected_output: "{\n \"\": [\"one\",\"two\"],\n \"foo\": [\"byte\",\"four\"],\n \"pkg1\": []\n}\n",
 		},
+		{
+			name: "surgical modification",
+			maintainers: map[string][]string{
+				"":     {"one", "two"},
+				"foo":  {"byte", "four", "newone"},
+				"pkg1": {},
+			},
+			raw:             []byte("{\n \"\": [\"one\",\"two\"],\n \"foo\": [\"byte\",\"four\"],\n \"pkg1\": []\n}\n"),
+			expected_output: "{\n \"\": [\"one\",\"two\"],\n \"foo\": [\"byte\",\"four\",\"newone\"],\n \"pkg1\": []\n}\n",
+		},
+		{
+			name: "no change",
+			maintainers: map[string][]string{
+				"":     {"one", "two"},
+				"foo":  {"byte", "four"},
+				"pkg1": {},
+			},
+			raw:             []byte("{\n \"\": [\"one\",\"two\"],\n \"foo\": [\"byte\",\"four\"],\n \"pkg1\": []\n}\n"),
+			expected_output: "{\n \"\": [\"one\",\"two\"],\n \"foo\": [\"byte\",\"four\"],\n \"pkg1\": []\n}\n",
+		},
+		{
+			name: "surgical addition",
+			maintainers: map[string][]string{
+				"":    {"one"},
+				"new": {"user"},
+			},
+			raw:             []byte("{\n \"\": [ \"one\" ]\n}\n"),
+			expected_output: "{\n \"\": [ \"one\" ],\n \"new\": [\"user\"]\n}\n",
+		},
+		{
+			name: "surgical deletion",
+			maintainers: map[string][]string{
+				"": {"one"},
+			},
+			raw:             []byte("{\n \"\": [\"one\"],\n \"old\": [\"user\"]\n}\n"),
+			expected_output: "{\n \"\": [\"one\"]\n}\n",
+		},
 	}
 
 	for _, test := range tests {
@@ -221,6 +277,7 @@ func TestMaintainershipFileWrite(t *testing.T) {
 			data := common.MaintainershipMap{
 				Data:  test.maintainers,
 				IsDir: test.is_dir,
+				Raw:   test.raw,
 			}
 
 			if err := data.WriteMaintainershipFile(&b); err != test.expected_error {
@@ -230,8 +287,134 @@ func TestMaintainershipFileWrite(t *testing.T) {
|
|||||||
output := b.String()
|
output := b.String()
|
||||||
|
|
||||||
if test.expected_output != output {
|
if test.expected_output != output {
|
||||||
t.Fatal("unexpected output:", output, "Expecting:", test.expected_output)
|
t.Fatalf("unexpected output:\n%q\nExpecting:\n%q", output, test.expected_output)
|
||||||
}
|
}
|
||||||
})
|
})
|
||||||
}
|
}
|
||||||
}
|
}
|
||||||
|
|
||||||
|
func TestReviewRequired(t *testing.T) {
|
||||||
|
tests := []struct {
|
||||||
|
name string
|
||||||
|
maintainers []string
|
||||||
|
config *common.AutogitConfig
|
||||||
|
is_approved bool
|
||||||
|
}{
|
||||||
|
{
|
||||||
|
name: "ReviewRequired=false",
|
||||||
|
maintainers: []string{"maintainer1", "maintainer2"},
|
||||||
|
config: &common.AutogitConfig{ReviewRequired: false},
|
||||||
|
is_approved: true,
|
||||||
|
},
|
||||||
|
{
|
||||||
|
name: "ReviewRequired=true",
|
||||||
|
maintainers: []string{"maintainer1", "maintainer2"},
|
||||||
|
config: &common.AutogitConfig{ReviewRequired: true},
|
||||||
|
is_approved: false,
|
||||||
|
},
|
||||||
|
{
|
||||||
|
name: "ReviewRequired=true",
|
||||||
|
maintainers: []string{"maintainer1"},
|
||||||
|
config: &common.AutogitConfig{ReviewRequired: true},
|
||||||
|
is_approved: true,
|
||||||
|
},
|
||||||
|
}
|
||||||
|
|
||||||
|
for _, test := range tests {
|
||||||
|
t.Run(test.name, func(t *testing.T) {
|
||||||
|
m := &common.MaintainershipMap{
|
||||||
|
Data: map[string][]string{"": test.maintainers},
|
||||||
|
}
|
||||||
|
m.Config = test.config
|
||||||
|
if approved := m.IsApproved("", nil, "maintainer1", nil); approved != test.is_approved {
|
||||||
|
t.Error("Expected m.IsApproved()->", test.is_approved, "but didn't get it")
|
||||||
|
}
|
||||||
|
})
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
|
func TestMaintainershipDataCorruption_PackageAppend(t *testing.T) {
	// Test corruption when the project maintainers are appended to the
	// package maintainers: if the backing array has spare capacity,
	// append() writes into it instead of allocating a new array.

	// Construct a slice with capacity > length to simulate this common scenario.
	backingArray := make([]string, 1, 10)
	backingArray[0] = "@g1"

	initialData := map[string][]string{
		"pkg": backingArray, // len 1, cap 10
		"":    {"prjUser"},
	}

	m := &common.MaintainershipMap{
		Data: initialData,
	}

	groups := []*common.ReviewGroup{
		{
			Name:      "@g1",
			Reviewers: []string{"u1"},
		},
	}

	// ListPackageMaintainers("pkg", groups):
	//  1. reads ["@g1"] (cap 10)
	//  2. appends "prjUser" -> ["@g1", "prjUser"] (written into the backing array)
	//  3. expands "@g1" -> "u1", replacing in place: ["u1", "prjUser"]
	//  4. sorts: ["prjUser", "u1"]
	//
	// The backing array is now ["prjUser", "u1", ...], but the map entry
	// "pkg" still has len 1, so it sees ["prjUser"].
	list1 := m.ListPackageMaintainers("pkg", groups)
	t.Logf("List1: %v", list1)

	// ListPackageMaintainers("pkg", nil) should return ["@g1", "prjUser"]
	// (prjUser is appended from the project maintainers). But with the
	// corrupted backing array it sees ["prjUser"] in the map and appends
	// "prjUser" again -> ["prjUser", "prjUser"].
	list2 := m.ListPackageMaintainers("pkg", nil)
	t.Logf("List2: %v", list2)

	if !slices.Contains(list2, "@g1") {
		t.Errorf("Corruption: '@g1' is missing from second call. Got %v", list2)
	}
}
func TestMaintainershipDataCorruption_ProjectInPlace(t *testing.T) {
	// Test corruption in ListProjectMaintainers when the replacement fits
	// in place, e.g. replacing 1 group with 1 user.

	initialData := map[string][]string{
		"": {"@g1"},
	}

	m := &common.MaintainershipMap{
		Data: initialData,
	}

	groups := []*common.ReviewGroup{
		{
			Name:      "@g1",
			Reviewers: []string{"u1"},
		},
	}

	// First call with expansion: replaces "@g1" with "u1". The length
	// stays 1, so the backing array is modified in place.
	list1 := m.ListProjectMaintainers(groups)
	t.Logf("List1: %v", list1)

	// Second call without expansion: should return ["@g1"].
	list2 := m.ListProjectMaintainers(nil)
	t.Logf("List2: %v", list2)

	if !slices.Contains(list2, "@g1") {
		t.Errorf("Corruption: '@g1' is missing from second call (Project). Got %v", list2)
	}
}
common/manifest.go | 56 (new file)
@@ -0,0 +1,56 @@
package common

import (
	"os"
	"path"
	"strings"

	"gopkg.in/yaml.v3"
)

type Manifest struct {
	Subdirectories []string
}

func (m *Manifest) SubdirForPackage(pkg string) string {
	if m == nil {
		return pkg
	}

	idx := -1
	matchLen := 0
	basePkg := path.Base(pkg)
	lowercasePkg := strings.ToLower(basePkg)

	for i, sub := range m.Subdirectories {
		basename := strings.ToLower(path.Base(sub))
		if strings.HasPrefix(lowercasePkg, basename) && matchLen < len(basename) {
			idx = i
			matchLen = len(basename)
		}
	}

	if idx > -1 {
		return path.Join(m.Subdirectories[idx], basePkg)
	}
	return pkg
}

func ReadManifestFile(filename string) (*Manifest, error) {
	data, err := os.ReadFile(filename)
	if err != nil {
		return nil, err
	}

	return ParseManifestFile(data)
}

func ParseManifestFile(data []byte) (*Manifest, error) {
	ret := &Manifest{}
	err := yaml.Unmarshal(data, ret)
	if err != nil {
		return nil, err
	}

	return ret, nil
}
common/manifest_test.go | 56 (new file)
@@ -0,0 +1,56 @@
package common_test

import (
	"testing"

	"src.opensuse.org/autogits/common"
)

func TestManifestSubdirAssignments(t *testing.T) {
	tests := []struct {
		Name              string
		ManifestContent   string
		Packages          []string
		ManifestLocations []string
	}{
		{
			Name:              "empty manifest",
			Packages:          []string{"atom", "blarg", "Foobar", "X-Ray", "boost", "NodeJS"},
			ManifestLocations: []string{"atom", "blarg", "Foobar", "X-Ray", "boost", "NodeJS"},
		},
		{
			Name:              "only few subdirs manifest",
			ManifestContent:   "subdirectories:\n - a\n - b",
			Packages:          []string{"atom", "blarg", "Foobar", "X-Ray", "Boost", "NodeJS"},
			ManifestLocations: []string{"a/atom", "b/blarg", "Foobar", "X-Ray", "b/Boost", "NodeJS"},
		},
		{
			Name:              "multilayer subdirs manifest",
			ManifestContent:   "subdirectories:\n - a\n - b\n - libs/boo",
			Packages:          []string{"atom", "blarg", "Foobar", "X-Ray", "Boost", "NodeJS"},
			ManifestLocations: []string{"a/atom", "b/blarg", "Foobar", "X-Ray", "libs/boo/Boost", "NodeJS"},
		},
		{
			Name:              "multilayer subdirs manifest with trailing /",
			ManifestContent:   "subdirectories:\n - a\n - b\n - libs/boo/\n - somedir/Node/",
			Packages:          []string{"atom", "blarg", "Foobar", "X-Ray", "Boost", "NodeJS", "foobar/node2"},
			ManifestLocations: []string{"a/atom", "b/blarg", "Foobar", "X-Ray", "libs/boo/Boost", "somedir/Node/NodeJS", "somedir/Node/node2"},
		},
	}

	for _, test := range tests {
		t.Run(test.Name, func(t *testing.T) {
			m, err := common.ParseManifestFile([]byte(test.ManifestContent))
			if err != nil {
				t.Fatal(err)
			}

			for i, pkg := range test.Packages {
				expected := test.ManifestLocations[i]
				if l := m.SubdirForPackage(pkg); l != expected {
					t.Error("Expected:", expected, "but got:", l)
				}
			}
		})
	}
}
common/mock/config.go | 120 (new file)
@@ -0,0 +1,120 @@
// Code generated by MockGen. DO NOT EDIT.
// Source: config.go
//
// Generated by this command:
//
//	mockgen -source=config.go -destination=mock/config.go -typed
//

// Package mock_common is a generated GoMock package.
package mock_common

import (
	reflect "reflect"

	gomock "go.uber.org/mock/gomock"
	models "src.opensuse.org/autogits/common/gitea-generated/models"
)

// MockGiteaFileContentAndRepoFetcher is a mock of GiteaFileContentAndRepoFetcher interface.
type MockGiteaFileContentAndRepoFetcher struct {
	ctrl     *gomock.Controller
	recorder *MockGiteaFileContentAndRepoFetcherMockRecorder
	isgomock struct{}
}

// MockGiteaFileContentAndRepoFetcherMockRecorder is the mock recorder for MockGiteaFileContentAndRepoFetcher.
type MockGiteaFileContentAndRepoFetcherMockRecorder struct {
	mock *MockGiteaFileContentAndRepoFetcher
}

// NewMockGiteaFileContentAndRepoFetcher creates a new mock instance.
func NewMockGiteaFileContentAndRepoFetcher(ctrl *gomock.Controller) *MockGiteaFileContentAndRepoFetcher {
	mock := &MockGiteaFileContentAndRepoFetcher{ctrl: ctrl}
	mock.recorder = &MockGiteaFileContentAndRepoFetcherMockRecorder{mock}
	return mock
}

// EXPECT returns an object that allows the caller to indicate expected use.
func (m *MockGiteaFileContentAndRepoFetcher) EXPECT() *MockGiteaFileContentAndRepoFetcherMockRecorder {
	return m.recorder
}

// GetRepository mocks base method.
func (m *MockGiteaFileContentAndRepoFetcher) GetRepository(org, repo string) (*models.Repository, error) {
	m.ctrl.T.Helper()
	ret := m.ctrl.Call(m, "GetRepository", org, repo)
	ret0, _ := ret[0].(*models.Repository)
	ret1, _ := ret[1].(error)
	return ret0, ret1
}

// GetRepository indicates an expected call of GetRepository.
func (mr *MockGiteaFileContentAndRepoFetcherMockRecorder) GetRepository(org, repo any) *MockGiteaFileContentAndRepoFetcherGetRepositoryCall {
	mr.mock.ctrl.T.Helper()
	call := mr.mock.ctrl.RecordCallWithMethodType(mr.mock, "GetRepository", reflect.TypeOf((*MockGiteaFileContentAndRepoFetcher)(nil).GetRepository), org, repo)
	return &MockGiteaFileContentAndRepoFetcherGetRepositoryCall{Call: call}
}

// MockGiteaFileContentAndRepoFetcherGetRepositoryCall wrap *gomock.Call
type MockGiteaFileContentAndRepoFetcherGetRepositoryCall struct {
	*gomock.Call
}

// Return rewrite *gomock.Call.Return
func (c *MockGiteaFileContentAndRepoFetcherGetRepositoryCall) Return(arg0 *models.Repository, arg1 error) *MockGiteaFileContentAndRepoFetcherGetRepositoryCall {
	c.Call = c.Call.Return(arg0, arg1)
	return c
}

// Do rewrite *gomock.Call.Do
func (c *MockGiteaFileContentAndRepoFetcherGetRepositoryCall) Do(f func(string, string) (*models.Repository, error)) *MockGiteaFileContentAndRepoFetcherGetRepositoryCall {
	c.Call = c.Call.Do(f)
	return c
}

// DoAndReturn rewrite *gomock.Call.DoAndReturn
func (c *MockGiteaFileContentAndRepoFetcherGetRepositoryCall) DoAndReturn(f func(string, string) (*models.Repository, error)) *MockGiteaFileContentAndRepoFetcherGetRepositoryCall {
	c.Call = c.Call.DoAndReturn(f)
	return c
}

// GetRepositoryFileContent mocks base method.
func (m *MockGiteaFileContentAndRepoFetcher) GetRepositoryFileContent(org, repo, hash, path string) ([]byte, string, error) {
	m.ctrl.T.Helper()
	ret := m.ctrl.Call(m, "GetRepositoryFileContent", org, repo, hash, path)
	ret0, _ := ret[0].([]byte)
	ret1, _ := ret[1].(string)
	ret2, _ := ret[2].(error)
	return ret0, ret1, ret2
}

// GetRepositoryFileContent indicates an expected call of GetRepositoryFileContent.
func (mr *MockGiteaFileContentAndRepoFetcherMockRecorder) GetRepositoryFileContent(org, repo, hash, path any) *MockGiteaFileContentAndRepoFetcherGetRepositoryFileContentCall {
	mr.mock.ctrl.T.Helper()
	call := mr.mock.ctrl.RecordCallWithMethodType(mr.mock, "GetRepositoryFileContent", reflect.TypeOf((*MockGiteaFileContentAndRepoFetcher)(nil).GetRepositoryFileContent), org, repo, hash, path)
	return &MockGiteaFileContentAndRepoFetcherGetRepositoryFileContentCall{Call: call}
}

// MockGiteaFileContentAndRepoFetcherGetRepositoryFileContentCall wrap *gomock.Call
type MockGiteaFileContentAndRepoFetcherGetRepositoryFileContentCall struct {
	*gomock.Call
}

// Return rewrite *gomock.Call.Return
func (c *MockGiteaFileContentAndRepoFetcherGetRepositoryFileContentCall) Return(arg0 []byte, arg1 string, arg2 error) *MockGiteaFileContentAndRepoFetcherGetRepositoryFileContentCall {
	c.Call = c.Call.Return(arg0, arg1, arg2)
	return c
}

// Do rewrite *gomock.Call.Do
func (c *MockGiteaFileContentAndRepoFetcherGetRepositoryFileContentCall) Do(f func(string, string, string, string) ([]byte, string, error)) *MockGiteaFileContentAndRepoFetcherGetRepositoryFileContentCall {
	c.Call = c.Call.Do(f)
	return c
}

// DoAndReturn rewrite *gomock.Call.DoAndReturn
func (c *MockGiteaFileContentAndRepoFetcherGetRepositoryFileContentCall) DoAndReturn(f func(string, string, string, string) ([]byte, string, error)) *MockGiteaFileContentAndRepoFetcherGetRepositoryFileContentCall {
	c.Call = c.Call.DoAndReturn(f)
	return c
}
common/mock/git_utils.go | 1148 (new file; diff suppressed, too large)
common/mock/gitea_utils.go | 3598 (new file; diff suppressed, too large)
common/mock/maintainership.go | 156 (new file)
@@ -0,0 +1,156 @@
// Code generated by MockGen. DO NOT EDIT.
// Source: maintainership.go
//
// Generated by this command:
//
//	mockgen -source=maintainership.go -destination=mock/maintainership.go -typed
//

// Package mock_common is a generated GoMock package.
package mock_common

import (
	reflect "reflect"

	gomock "go.uber.org/mock/gomock"
	common "src.opensuse.org/autogits/common"
	models "src.opensuse.org/autogits/common/gitea-generated/models"
)

// MockMaintainershipData is a mock of MaintainershipData interface.
type MockMaintainershipData struct {
	ctrl     *gomock.Controller
	recorder *MockMaintainershipDataMockRecorder
	isgomock struct{}
}

// MockMaintainershipDataMockRecorder is the mock recorder for MockMaintainershipData.
type MockMaintainershipDataMockRecorder struct {
	mock *MockMaintainershipData
}

// NewMockMaintainershipData creates a new mock instance.
func NewMockMaintainershipData(ctrl *gomock.Controller) *MockMaintainershipData {
	mock := &MockMaintainershipData{ctrl: ctrl}
	mock.recorder = &MockMaintainershipDataMockRecorder{mock}
	return mock
}

// EXPECT returns an object that allows the caller to indicate expected use.
func (m *MockMaintainershipData) EXPECT() *MockMaintainershipDataMockRecorder {
	return m.recorder
}

// IsApproved mocks base method.
func (m *MockMaintainershipData) IsApproved(Pkg string, Reviews []*models.PullReview, Submitter string, ReviewGroups []*common.ReviewGroup) bool {
	m.ctrl.T.Helper()
	ret := m.ctrl.Call(m, "IsApproved", Pkg, Reviews, Submitter, ReviewGroups)
	ret0, _ := ret[0].(bool)
	return ret0
}

// IsApproved indicates an expected call of IsApproved.
func (mr *MockMaintainershipDataMockRecorder) IsApproved(Pkg, Reviews, Submitter, ReviewGroups any) *MockMaintainershipDataIsApprovedCall {
	mr.mock.ctrl.T.Helper()
	call := mr.mock.ctrl.RecordCallWithMethodType(mr.mock, "IsApproved", reflect.TypeOf((*MockMaintainershipData)(nil).IsApproved), Pkg, Reviews, Submitter, ReviewGroups)
	return &MockMaintainershipDataIsApprovedCall{Call: call}
}

// MockMaintainershipDataIsApprovedCall wrap *gomock.Call
type MockMaintainershipDataIsApprovedCall struct {
	*gomock.Call
}

// Return rewrite *gomock.Call.Return
func (c *MockMaintainershipDataIsApprovedCall) Return(arg0 bool) *MockMaintainershipDataIsApprovedCall {
	c.Call = c.Call.Return(arg0)
	return c
}

// Do rewrite *gomock.Call.Do
func (c *MockMaintainershipDataIsApprovedCall) Do(f func(string, []*models.PullReview, string, []*common.ReviewGroup) bool) *MockMaintainershipDataIsApprovedCall {
	c.Call = c.Call.Do(f)
	return c
}

// DoAndReturn rewrite *gomock.Call.DoAndReturn
func (c *MockMaintainershipDataIsApprovedCall) DoAndReturn(f func(string, []*models.PullReview, string, []*common.ReviewGroup) bool) *MockMaintainershipDataIsApprovedCall {
	c.Call = c.Call.DoAndReturn(f)
	return c
}

// ListPackageMaintainers mocks base method.
func (m *MockMaintainershipData) ListPackageMaintainers(Pkg string, OptionalGroupExpasion []*common.ReviewGroup) []string {
	m.ctrl.T.Helper()
	ret := m.ctrl.Call(m, "ListPackageMaintainers", Pkg, OptionalGroupExpasion)
	ret0, _ := ret[0].([]string)
	return ret0
}

// ListPackageMaintainers indicates an expected call of ListPackageMaintainers.
func (mr *MockMaintainershipDataMockRecorder) ListPackageMaintainers(Pkg, OptionalGroupExpasion any) *MockMaintainershipDataListPackageMaintainersCall {
	mr.mock.ctrl.T.Helper()
	call := mr.mock.ctrl.RecordCallWithMethodType(mr.mock, "ListPackageMaintainers", reflect.TypeOf((*MockMaintainershipData)(nil).ListPackageMaintainers), Pkg, OptionalGroupExpasion)
	return &MockMaintainershipDataListPackageMaintainersCall{Call: call}
}

// MockMaintainershipDataListPackageMaintainersCall wrap *gomock.Call
type MockMaintainershipDataListPackageMaintainersCall struct {
	*gomock.Call
}

// Return rewrite *gomock.Call.Return
func (c *MockMaintainershipDataListPackageMaintainersCall) Return(arg0 []string) *MockMaintainershipDataListPackageMaintainersCall {
	c.Call = c.Call.Return(arg0)
	return c
}

// Do rewrite *gomock.Call.Do
func (c *MockMaintainershipDataListPackageMaintainersCall) Do(f func(string, []*common.ReviewGroup) []string) *MockMaintainershipDataListPackageMaintainersCall {
	c.Call = c.Call.Do(f)
	return c
}

// DoAndReturn rewrite *gomock.Call.DoAndReturn
func (c *MockMaintainershipDataListPackageMaintainersCall) DoAndReturn(f func(string, []*common.ReviewGroup) []string) *MockMaintainershipDataListPackageMaintainersCall {
	c.Call = c.Call.DoAndReturn(f)
	return c
}

// ListProjectMaintainers mocks base method.
func (m *MockMaintainershipData) ListProjectMaintainers(OptionalGroupExpansion []*common.ReviewGroup) []string {
	m.ctrl.T.Helper()
	ret := m.ctrl.Call(m, "ListProjectMaintainers", OptionalGroupExpansion)
	ret0, _ := ret[0].([]string)
	return ret0
}

// ListProjectMaintainers indicates an expected call of ListProjectMaintainers.
func (mr *MockMaintainershipDataMockRecorder) ListProjectMaintainers(OptionalGroupExpansion any) *MockMaintainershipDataListProjectMaintainersCall {
	mr.mock.ctrl.T.Helper()
	call := mr.mock.ctrl.RecordCallWithMethodType(mr.mock, "ListProjectMaintainers", reflect.TypeOf((*MockMaintainershipData)(nil).ListProjectMaintainers), OptionalGroupExpansion)
	return &MockMaintainershipDataListProjectMaintainersCall{Call: call}
}

// MockMaintainershipDataListProjectMaintainersCall wrap *gomock.Call
type MockMaintainershipDataListProjectMaintainersCall struct {
	*gomock.Call
}

// Return rewrite *gomock.Call.Return
func (c *MockMaintainershipDataListProjectMaintainersCall) Return(arg0 []string) *MockMaintainershipDataListProjectMaintainersCall {
	c.Call = c.Call.Return(arg0)
	return c
}

// Do rewrite *gomock.Call.Do
func (c *MockMaintainershipDataListProjectMaintainersCall) Do(f func([]*common.ReviewGroup) []string) *MockMaintainershipDataListProjectMaintainersCall {
	c.Call = c.Call.Do(f)
	return c
}

// DoAndReturn rewrite *gomock.Call.DoAndReturn
func (c *MockMaintainershipDataListProjectMaintainersCall) DoAndReturn(f func([]*common.ReviewGroup) []string) *MockMaintainershipDataListProjectMaintainersCall {
	c.Call = c.Call.DoAndReturn(f)
	return c
}
common/mock/obs_utils.go | 85 (new file)
@@ -0,0 +1,85 @@
// Code generated by MockGen. DO NOT EDIT.
// Source: obs_utils.go
//
// Generated by this command:
//
//	mockgen -source=obs_utils.go -destination=mock/obs_utils.go -typed
//

// Package mock_common is a generated GoMock package.
package mock_common

import (
	reflect "reflect"

	gomock "go.uber.org/mock/gomock"
	common "src.opensuse.org/autogits/common"
)

// MockObsStatusFetcherWithState is a mock of ObsStatusFetcherWithState interface.
type MockObsStatusFetcherWithState struct {
	ctrl     *gomock.Controller
	recorder *MockObsStatusFetcherWithStateMockRecorder
	isgomock struct{}
}

// MockObsStatusFetcherWithStateMockRecorder is the mock recorder for MockObsStatusFetcherWithState.
type MockObsStatusFetcherWithStateMockRecorder struct {
	mock *MockObsStatusFetcherWithState
}

// NewMockObsStatusFetcherWithState creates a new mock instance.
func NewMockObsStatusFetcherWithState(ctrl *gomock.Controller) *MockObsStatusFetcherWithState {
	mock := &MockObsStatusFetcherWithState{ctrl: ctrl}
	mock.recorder = &MockObsStatusFetcherWithStateMockRecorder{mock}
	return mock
}

// EXPECT returns an object that allows the caller to indicate expected use.
func (m *MockObsStatusFetcherWithState) EXPECT() *MockObsStatusFetcherWithStateMockRecorder {
	return m.recorder
}

// BuildStatusWithState mocks base method.
func (m *MockObsStatusFetcherWithState) BuildStatusWithState(project string, opts *common.BuildResultOptions, packages ...string) (*common.BuildResultList, error) {
	m.ctrl.T.Helper()
	varargs := []any{project, opts}
	for _, a := range packages {
		varargs = append(varargs, a)
	}
	ret := m.ctrl.Call(m, "BuildStatusWithState", varargs...)
	ret0, _ := ret[0].(*common.BuildResultList)
	ret1, _ := ret[1].(error)
	return ret0, ret1
}

// BuildStatusWithState indicates an expected call of BuildStatusWithState.
func (mr *MockObsStatusFetcherWithStateMockRecorder) BuildStatusWithState(project, opts any, packages ...any) *MockObsStatusFetcherWithStateBuildStatusWithStateCall {
	mr.mock.ctrl.T.Helper()
	varargs := append([]any{project, opts}, packages...)
	call := mr.mock.ctrl.RecordCallWithMethodType(mr.mock, "BuildStatusWithState", reflect.TypeOf((*MockObsStatusFetcherWithState)(nil).BuildStatusWithState), varargs...)
	return &MockObsStatusFetcherWithStateBuildStatusWithStateCall{Call: call}
}

// MockObsStatusFetcherWithStateBuildStatusWithStateCall wrap *gomock.Call
type MockObsStatusFetcherWithStateBuildStatusWithStateCall struct {
	*gomock.Call
}

// Return rewrite *gomock.Call.Return
func (c *MockObsStatusFetcherWithStateBuildStatusWithStateCall) Return(arg0 *common.BuildResultList, arg1 error) *MockObsStatusFetcherWithStateBuildStatusWithStateCall {
	c.Call = c.Call.Return(arg0, arg1)
	return c
}

// Do rewrite *gomock.Call.Do
func (c *MockObsStatusFetcherWithStateBuildStatusWithStateCall) Do(f func(string, *common.BuildResultOptions, ...string) (*common.BuildResultList, error)) *MockObsStatusFetcherWithStateBuildStatusWithStateCall {
	c.Call = c.Call.Do(f)
	return c
}

// DoAndReturn rewrite *gomock.Call.DoAndReturn
func (c *MockObsStatusFetcherWithStateBuildStatusWithStateCall) DoAndReturn(f func(string, *common.BuildResultOptions, ...string) (*common.BuildResultList, error)) *MockObsStatusFetcherWithStateBuildStatusWithStateCall {
	c.Call = c.Call.DoAndReturn(f)
	return c
}
@@ -116,30 +116,43 @@ type Flags struct {
 	Contents string `xml:",innerxml"`
 }
 
+type ProjectLinkMeta struct {
+	Project string `xml:"project,attr"`
+}
+
 type ProjectMeta struct {
 	XMLName xml.Name `xml:"project"`
 	Name string `xml:"name,attr"`
 	Title string `xml:"title"`
 	Description string `xml:"description"`
 	Url string `xml:"url,omitempty"`
-	ScmSync string `xml:"scmsync"`
+	ScmSync string `xml:"scmsync,omitempty"`
+	Link []ProjectLinkMeta `xml:"link"`
 	Persons []PersonRepoMeta `xml:"person"`
 	Groups []GroupRepoMeta `xml:"group"`
 	Repositories []RepositoryMeta `xml:"repository"`
 
 	BuildFlags Flags `xml:"build"`
 	PublicFlags Flags `xml:"publish"`
 	DebugFlags Flags `xml:"debuginfo"`
 	UseForBuild Flags `xml:"useforbuild"`
+	Access Flags `xml:"access"`
+	SourceAccess Flags `xml:"sourceaccess"`
 }
 
 type PackageMeta struct {
 	XMLName xml.Name `xml:"package"`
 	Name string `xml:"name,attr"`
-	Project string `xml:"project,attr"`
-	ScmSync string `xml:"scmsync"`
+	Project string `xml:"project,attr,omitempty"`
+	ScmSync string `xml:"scmsync,omitempty"`
 	Persons []PersonRepoMeta `xml:"person"`
 	Groups []GroupRepoMeta `xml:"group"`
 
+	BuildFlags Flags `xml:"build"`
+	PublicFlags Flags `xml:"publish"`
+	DebugFlags Flags `xml:"debuginfo"`
+	UseForBuild Flags `xml:"useforbuild"`
+	SourceAccess Flags `xml:"sourceaccess"`
 }
 
 type UserMeta struct {
@@ -562,25 +575,58 @@ func (c *ObsClient) DeleteProject(project string) error {
 	}
 
 	return nil
+}
+
+func (c *ObsClient) BuildLog(prj, pkg, repo, arch string) (io.ReadCloser, error) {
+	url := c.baseUrl.JoinPath("build", prj, repo, arch, pkg, "_log")
+	query := url.Query()
+	query.Add("nostream", "1")
+	query.Add("start", "0")
+	url.RawQuery = query.Encode()
+	res, err := c.ObsRequestRaw("GET", url.String(), nil)
+
+	if err != nil {
+		return nil, err
+	}
+
+	return res.Body, nil
 }
 
 type PackageBuildStatus struct {
 	Package string `xml:"package,attr"`
 	Code string `xml:"code,attr"`
 	Details string `xml:"details"`
+
+	LastUpdate time.Time
+}
+
+func PackageBuildStatusComp(A, B *PackageBuildStatus) int {
+	return strings.Compare(A.Package, B.Package)
 }
 
 type BuildResult struct {
-	Project string `xml:"project,attr"`
-	Repository string `xml:"repository,attr"`
-	Arch string `xml:"arch,attr"`
-	Code string `xml:"code,attr"`
-	Dirty bool `xml:"dirty,attr"`
-	ScmSync string `xml:"scmsync"`
-	ScmInfo string `xml:"scminfo"`
-	Status []PackageBuildStatus `xml:"status"`
-	Binaries []BinaryList `xml:"binarylist"`
+	XMLName xml.Name `xml:"result" json:"xml,omitempty"`
+	Project string `xml:"project,attr"`
+	Repository string `xml:"repository,attr"`
+	Arch string `xml:"arch,attr"`
+	Code string `xml:"code,attr"`
+	Dirty bool `xml:"dirty,attr,omitempty"`
+	ScmSync string `xml:"scmsync,omitempty"`
+	ScmInfo string `xml:"scminfo,omitempty"`
+	Status []*PackageBuildStatus `xml:"status"`
+	Binaries []BinaryList `xml:"binarylist,omitempty"`
+
+	LastUpdate time.Time
+}
+
+func BuildResultComp(A, B *BuildResult) int {
+	if cmp := strings.Compare(A.Project, B.Project); cmp != 0 {
+		return cmp
+	}
+	if cmp := strings.Compare(A.Repository, B.Repository); cmp != 0 {
+		return cmp
+	}
+	return strings.Compare(A.Arch, B.Arch)
 }
 
 type Binary struct {
@@ -595,9 +641,9 @@ type BinaryList struct {
 }
 
 type BuildResultList struct {
 	XMLName xml.Name `xml:"resultlist"`
 	State string `xml:"state,attr"`
-	Result []BuildResult `xml:"result"`
+	Result []*BuildResult `xml:"result"`
 
 	isLastBuild bool
 }
common/pr.go | 476
@@ -9,6 +9,7 @@ import (
 	"slices"
 	"strings"
 
+	"src.opensuse.org/autogits/common/gitea-generated/client/repository"
 	"src.opensuse.org/autogits/common/gitea-generated/models"
 )
 
@@ -22,7 +23,50 @@ type PRSet struct {
 	PRs    []*PRInfo
 	Config *AutogitConfig
 
 	BotUser string
+	HasAutoStaging bool
+}
+
+func (prinfo *PRInfo) PRComponents() (org string, repo string, idx int64) {
+	org = prinfo.PR.Base.Repo.Owner.UserName
+	repo = prinfo.PR.Base.Repo.Name
+	idx = prinfo.PR.Index
+	return
+}
+
+func (prinfo *PRInfo) RemoveReviewers(gitea GiteaUnreviewTimelineFetcher, Reviewers []string, BotUser string) {
+	org, repo, idx := prinfo.PRComponents()
+	tl, err := gitea.GetTimeline(org, repo, idx)
+	if err != nil {
+		LogError("Failed to fetch timeline for", PRtoString(prinfo.PR), err)
+	}
+
+	// find review request for each reviewer
+	ReviewersToUnrequest := Reviewers
+	ReviewersAlreadyChecked := []string{}
+
+	for _, tlc := range tl {
+		if tlc.Type == TimelineCommentType_ReviewRequested && tlc.Assignee != nil {
+			user := tlc.Assignee.UserName
+
+			if idx := slices.Index(ReviewersToUnrequest, user); idx >= 0 && !slices.Contains(ReviewersAlreadyChecked, user) {
+				if tlc.User != nil && tlc.User.UserName == BotUser {
+					ReviewersAlreadyChecked = append(ReviewersAlreadyChecked, user)
+					continue
+				}
+				ReviewersToUnrequest = slices.Delete(ReviewersToUnrequest, idx, idx+1)
+				if len(Reviewers) == 0 {
+					break
+				}
+			}
+		}
+	}
+
+	LogDebug("Unrequesting reviewes for", PRtoString(prinfo.PR), ReviewersToUnrequest)
+	err = gitea.UnrequestReview(org, repo, idx, ReviewersToUnrequest...)
+	if err != nil {
+		LogError("Failed to unrequest reviewers for", PRtoString(prinfo.PR), err)
+	}
 }
 
 func readPRData(gitea GiteaPRFetcher, pr *models.PullRequest, currentSet []*PRInfo, config *AutogitConfig) ([]*PRInfo, error) {
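The `RemoveReviewers` hunk above walks the PR timeline and drops a reviewer from the unrequest list when the most recent review request for that user was created by someone other than the bot (human-made requests are left alone; the bot only retracts its own). A standalone sketch of that filtering, where `timelineEvent` and its fields are illustrative stand-ins for the Gitea timeline model, not the repository's actual types:

```go
package main

import (
	"fmt"
	"slices"
)

// timelineEvent is a hypothetical, trimmed-down timeline entry: who was
// asked to review, and who created the review request.
type timelineEvent struct {
	RequestedReviewer string
	RequestedBy       string
}

// reviewersToUnrequest keeps a reviewer in the unrequest list only if the
// bot itself created the request; requests made by anyone else are dropped
// from the list so they are not touched.
func reviewersToUnrequest(reviewers []string, timeline []timelineEvent, botUser string) []string {
	out := slices.Clone(reviewers)
	checked := []string{}
	for _, ev := range timeline {
		user := ev.RequestedReviewer
		if idx := slices.Index(out, user); idx >= 0 && !slices.Contains(checked, user) {
			if ev.RequestedBy == botUser {
				// bot-created request: keep it in the unrequest list
				checked = append(checked, user)
				continue
			}
			// human-created request: do not unrequest this reviewer
			out = slices.Delete(out, idx, idx+1)
		}
	}
	return out
}

func main() {
	tl := []timelineEvent{
		{RequestedReviewer: "alice", RequestedBy: "bot"},
		{RequestedReviewer: "bob", RequestedBy: "carol"},
	}
	fmt.Println(reviewersToUnrequest([]string{"alice", "bob"}, tl, "bot")) // [alice]
}
```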
@@ -55,14 +99,15 @@ func readPRData(gitea GiteaPRFetcher, pr *models.PullRequest, currentSet []*PRIn
 
 var Timeline_RefIssueNotFound error = errors.New("RefIssue not found on the timeline")
 
-func LastPrjGitRefOnTimeline(gitea GiteaPRTimelineFetcher, org, repo string, num int64, prjGitOrg, prjGitRepo string) (*models.PullRequest, error) {
-	prRefLine := fmt.Sprintf(PrPattern, org, repo, num)
+func LastPrjGitRefOnTimeline(botUser string, gitea GiteaPRTimelineReviewFetcher, org, repo string, num int64, config *AutogitConfig) (*models.PullRequest, error) {
 	timeline, err := gitea.GetTimeline(org, repo, num)
 	if err != nil {
 		LogError("Failed to fetch timeline for", org, repo, "#", num, err)
 		return nil, err
 	}
 
+	prjGitOrg, prjGitRepo, prjGitBranch := config.GetPrjGit()
+
 	for idx := len(timeline) - 1; idx >= 0; idx-- {
 		item := timeline[idx]
 		issue := item.RefIssue
@@ -72,9 +117,32 @@ func LastPrjGitRefOnTimeline(gitea GiteaPRTimelineFetcher, org, repo string, num
 			issue.Repository.Owner == prjGitOrg &&
 			issue.Repository.Name == prjGitRepo {
 
-			lines := SplitLines(item.RefIssue.Body)
-			for _, line := range lines {
-				if strings.TrimSpace(line) == prRefLine {
+			if !config.NoProjectGitPR {
+				if issue.User.UserName != botUser {
+					continue
+				}
+			}
+
+			pr, err := gitea.GetPullRequest(prjGitOrg, prjGitRepo, issue.Index)
+			if err != nil {
+				switch err.(type) {
+				case *repository.RepoGetPullRequestNotFound: // deleted?
+					continue
+				default:
+					LogDebug("PrjGit RefIssue fetch error from timeline", issue.Index, err)
+					continue
+				}
+			}
+
+			LogDebug("found ref PR on timeline:", PRtoString(pr))
+			if pr.Base.Name != prjGitBranch {
+				LogDebug(" -> not matching:", pr.Base.Name, prjGitBranch)
+				continue
+			}
+
+			_, prs := ExtractDescriptionAndPRs(bufio.NewScanner(strings.NewReader(item.RefIssue.Body)))
+			for _, pr := range prs {
+				if pr.Org == org && pr.Repo == repo && pr.Num == num {
 					LogDebug("Found PrjGit PR in Timeline:", issue.Index)
 
 					// found prjgit PR in timeline. Return it
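`LastPrjGitRefOnTimeline` above scans the timeline from the newest entry backwards (`for idx := len(timeline) - 1; idx >= 0; idx--`) and takes the first referencing PR that matches the project-git branch, i.e. the most recent match. A minimal sketch of that reverse-scan idiom, with a hypothetical `refEvent` type standing in for the timeline entries:

```go
package main

import "fmt"

// refEvent is an illustrative stand-in for a timeline entry: the PR index
// it references and that PR's base branch.
type refEvent struct {
	Index  int64
	Branch string
}

// lastRefMatching walks the slice newest-to-oldest (the timeline is ordered
// oldest-first) and returns the most recent entry on wantBranch.
func lastRefMatching(timeline []refEvent, wantBranch string) (refEvent, bool) {
	for idx := len(timeline) - 1; idx >= 0; idx-- {
		if timeline[idx].Branch == wantBranch {
			return timeline[idx], true
		}
	}
	return refEvent{}, false
}

func main() {
	tl := []refEvent{{10, "main"}, {11, "devel"}, {12, "main"}}
	ev, ok := lastRefMatching(tl, "main")
	fmt.Println(ev.Index, ok) // 12 true -- the most recent "main" reference wins
}
```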
@@ -88,17 +156,19 @@
 	return nil, Timeline_RefIssueNotFound
 }
 
-func FetchPRSet(user string, gitea GiteaPRTimelineFetcher, org, repo string, num int64, config *AutogitConfig) (*PRSet, error) {
+func FetchPRSet(user string, gitea GiteaPRTimelineReviewFetcher, org, repo string, num int64, config *AutogitConfig) (*PRSet, error) {
 	var pr *models.PullRequest
 	var err error
 
+	gitea.ResetTimelineCache(org, repo, num)
+
 	prjGitOrg, prjGitRepo, _ := config.GetPrjGit()
 	if prjGitOrg == org && prjGitRepo == repo {
 		if pr, err = gitea.GetPullRequest(org, repo, num); err != nil {
 			return nil, err
 		}
 	} else {
-		if pr, err = LastPrjGitRefOnTimeline(gitea, org, repo, num, prjGitOrg, prjGitRepo); err != nil && err != Timeline_RefIssueNotFound {
+		if pr, err = LastPrjGitRefOnTimeline(user, gitea, org, repo, num, config); err != nil && err != Timeline_RefIssueNotFound {
 			return nil, err
 		}
 
@@ -114,6 +184,16 @@ func FetchPRSet(user string, gitea GiteaPRTimelineFetcher, org, repo string, num
 		return nil, err
 	}
 
+	for _, pr := range prs {
+		org, repo, idx := pr.PRComponents()
+		gitea.ResetTimelineCache(org, repo, idx)
+		reviews, err := FetchGiteaReviews(gitea, org, repo, idx)
+		if err != nil {
+			LogError("Error fetching reviews for", PRtoString(pr.PR), ":", err)
+		}
+		pr.Reviews = reviews
+	}
+
 	return &PRSet{
 		PRs:    prs,
 		Config: config,
@@ -121,6 +201,12 @@ func FetchPRSet(user string, gitea GiteaPRTimelineFetcher, org, repo string, num
 	}, nil
 }
 
+func (prset *PRSet) RemoveReviewers(gitea GiteaUnreviewTimelineFetcher, reviewers []string) {
+	for _, prinfo := range prset.PRs {
+		prinfo.RemoveReviewers(gitea, reviewers, prset.BotUser)
+	}
+}
+
 func (rs *PRSet) Find(pr *models.PullRequest) (*PRInfo, bool) {
 	for _, p := range rs.PRs {
 		if p.PR.Base.RepoID == pr.Base.RepoID &&
@@ -206,67 +292,150 @@ next_rs:
 		}
 
 		for _, pr := range prjpr_set {
-			if prinfo.PR.Base.Repo.Owner.UserName == pr.Org && prinfo.PR.Base.Repo.Name == pr.Repo && prinfo.PR.Index == pr.Num {
+			if strings.EqualFold(prinfo.PR.Base.Repo.Owner.UserName, pr.Org) && strings.EqualFold(prinfo.PR.Base.Repo.Name, pr.Repo) && prinfo.PR.Index == pr.Num {
 				continue next_rs
 			}
 		}
+		LogDebug(" PR: ", PRtoString(prinfo.PR), "not found in project git PRSet")
 		return false
 	}
 	return true
 }
 
-func (rs *PRSet) AssignReviewers(gitea GiteaReviewFetcherAndRequester, maintainers MaintainershipData) error {
+func (rs *PRSet) FindMissingAndExtraReviewers(maintainers MaintainershipData, idx int) (missing, extra []string) {
 	configReviewers := ParseReviewers(rs.Config.Reviewers)
 
-	for _, pr := range rs.PRs {
-		reviewers := []string{}
+	// remove reviewers that were already requested and are not stale
+	prjMaintainers := maintainers.ListProjectMaintainers(nil)
+	LogDebug("project maintainers:", prjMaintainers)
 
-		if rs.IsPrjGitPR(pr.PR) {
-			reviewers = slices.Concat(configReviewers.Prj, configReviewers.PrjOptional)
-			LogDebug("PrjGit submitter:", pr.PR.User.UserName)
-			if len(rs.PRs) == 1 {
-				reviewers = slices.Concat(reviewers, maintainers.ListProjectMaintainers())
+	pr := rs.PRs[idx]
+	if rs.IsPrjGitPR(pr.PR) {
+		missing = slices.Concat(configReviewers.Prj, configReviewers.PrjOptional)
+		if rs.HasAutoStaging {
+			missing = append(missing, Bot_BuildReview)
 		}
+		LogDebug("PrjGit submitter:", pr.PR.User.UserName)
+		// only need project maintainer reviews if:
+		//  * not created by a bot and has other PRs, or
+		//  * not created by maintainer
+		noReviewPRCreators := []string{}
+		if !rs.Config.ReviewRequired {
+			noReviewPRCreators = prjMaintainers
+		}
+		if len(rs.PRs) > 1 {
+			noReviewPRCreators = append(noReviewPRCreators, rs.BotUser)
+		}
+		if slices.Contains(noReviewPRCreators, pr.PR.User.UserName) || pr.Reviews.IsReviewedByOneOf(prjMaintainers...) {
+			LogDebug("Project already reviewed by a project maintainer, remove rest")
+			// do not remove reviewers if they are also maintainers
+			prjMaintainers = slices.DeleteFunc(prjMaintainers, func(m string) bool { return slices.Contains(missing, m) })
+			extra = slices.Concat(prjMaintainers, []string{rs.BotUser})
 		} else {
-			pkg := pr.PR.Base.Repo.Name
-			reviewers = slices.Concat(configReviewers.Pkg, maintainers.ListProjectMaintainers(), maintainers.ListPackageMaintainers(pkg), configReviewers.PkgOptional)
-		}
-
-		slices.Sort(reviewers)
-		reviewers = slices.Compact(reviewers)
-
-		// submitters do not need to review their own work
-		if idx := slices.Index(reviewers, pr.PR.User.UserName); idx != -1 {
-			reviewers = slices.Delete(reviewers, idx, idx+1)
-		}
-
-		LogDebug("PR: ", pr.PR.Base.Repo.Name, pr.PR.Index)
-		LogDebug("reviewers for PR:", reviewers)
-
-		// remove reviewers that were already requested and are not stale
-		reviews, err := FetchGiteaReviews(gitea, reviewers, pr.PR.Base.Repo.Owner.UserName, pr.PR.Base.Repo.Name, pr.PR.Index)
-		if err != nil {
-			LogError("Error fetching reviews:", err)
-			return err
-		}
-
-		for idx := 0; idx < len(reviewers); {
-			user := reviewers[idx]
-			if reviews.HasPendingReviewBy(user) || reviews.IsReviewedBy(user) {
-				reviewers = slices.Delete(reviewers, idx, idx+1)
-				LogDebug("removing reviewer:", user)
+			// if bot not created PrjGit or prj maintainer, we need to add project reviewers here
+			if slices.Contains(noReviewPRCreators, pr.PR.User.UserName) {
+				LogDebug("No need for project maintainers")
+				extra = slices.Concat(prjMaintainers, []string{rs.BotUser})
 			} else {
-				idx++
+				LogDebug("Adding prjMaintainers to PrjGit")
+				missing = append(missing, prjMaintainers...)
 			}
 		}
+	} else {
+		pkg := pr.PR.Base.Repo.Name
+		pkgMaintainers := maintainers.ListPackageMaintainers(pkg, nil)
+		Maintainers := slices.Concat(prjMaintainers, pkgMaintainers)
+		noReviewPkgPRCreators := []string{}
+		if !rs.Config.ReviewRequired {
+			noReviewPkgPRCreators = pkgMaintainers
+		}
 
-		// get maintainers associated with the PR too
-		if len(reviewers) > 0 {
-			LogDebug("Requesting reviews from:", reviewers)
+		LogDebug("packakge maintainers:", Maintainers)
+
+		missing = slices.Concat(configReviewers.Pkg, configReviewers.PkgOptional)
+		if slices.Contains(noReviewPkgPRCreators, pr.PR.User.UserName) || pr.Reviews.IsReviewedByOneOf(Maintainers...) {
+			// submitter is maintainer or already reviewed
+			LogDebug("Package reviewed by maintainer (or subitter is maintainer), remove the rest of them")
+			// do not remove reviewers if they are also maintainers
+			Maintainers = slices.DeleteFunc(Maintainers, func(m string) bool { return slices.Contains(missing, m) })
+			extra = slices.Concat(Maintainers, []string{rs.BotUser})
+		} else {
+			// maintainer review is missing
+			LogDebug("Adding package maintainers to package git")
+			missing = append(missing, pkgMaintainers...)
+		}
+	}
+
+	slices.Sort(missing)
+	missing = slices.Compact(missing)
+
+	slices.Sort(extra)
+	extra = slices.Compact(extra)
+
+	// submitters cannot review their own work
+	if idx := slices.Index(missing, pr.PR.User.UserName); idx != -1 {
+		missing = slices.Delete(missing, idx, idx+1)
+	}
+
+	LogDebug("PR: ", PRtoString(pr.PR))
+	LogDebug(" preliminary add reviewers for PR:", missing)
+	LogDebug(" preliminary rm reviewers for PR:", extra)
+
+	// remove missing reviewers that are already done or already pending
+	for idx := 0; idx < len(missing); {
+		user := missing[idx]
+		if pr.Reviews.HasPendingReviewBy(user) || pr.Reviews.IsReviewedBy(user) {
+			missing = slices.Delete(missing, idx, idx+1)
+			LogDebug(" removing done/pending reviewer:", user)
+		} else {
+			idx++
+		}
+	}
+
+	// remove extra reviews that are actually only pending, and only pending by us
+	for idx := 0; idx < len(extra); {
+		user := extra[idx]
+		rr := pr.Reviews.FindReviewRequester(user)
+		if rr != nil && rr.User.UserName == rs.BotUser && pr.Reviews.HasPendingReviewBy(user) {
+			// good to remove this review
+			idx++
+		} else {
+			// this review should not be considered as extra by us
+			LogDebug(" - cannot find? to remove", user)
+			if rr != nil {
+				LogDebug("   ", rr.User.UserName, "vs.", rs.BotUser, pr.Reviews.HasPendingReviewBy(user))
+			}
+			extra = slices.Delete(extra, idx, idx+1)
+		}
+	}
+
+	LogDebug(" add reviewers for PR:", missing)
+	LogDebug(" rm reviewers for PR:", extra)
+
+	return missing, extra
+}
+
+func (rs *PRSet) AssignReviewers(gitea GiteaReviewFetcherAndRequesterAndUnrequester, maintainers MaintainershipData) error {
+	for idx, pr := range rs.PRs {
+		missingReviewers, extraReviewers := rs.FindMissingAndExtraReviewers(maintainers, idx)
+
+		if len(missingReviewers) > 0 {
+			LogDebug(" Requesting reviews from:", missingReviewers)
 			if !IsDryRun {
-				for _, r := range reviewers {
+				for _, r := range missingReviewers {
 					if _, err := gitea.RequestReviews(pr.PR, r); err != nil {
-						LogError("Cannot create reviews on", fmt.Sprintf("%s/%s#%d for [%s]", pr.PR.Base.Repo.Owner.UserName, pr.PR.Base.Repo.Name, pr.PR.Index, strings.Join(reviewers, ", ")), err)
+						LogError("Cannot create reviews on", PRtoString(pr.PR), "for user:", r, err)
+					}
+				}
+			}
+		}
+		if len(extraReviewers) > 0 {
+			LogDebug(" UnRequesting reviews from:", extraReviewers)
+			if !IsDryRun {
+				for _, r := range extraReviewers {
+					org, repo, idx := pr.PRComponents()
+					if err := gitea.UnrequestReview(org, repo, idx, r); err != nil {
+						LogError("Cannot unrequest reviews on", PRtoString(pr.PR), "for user:", r, err)
 					}
 				}
 			}
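The tail of `FindMissingAndExtraReviewers` above normalizes the reviewer list the same way the old `AssignReviewers` did: sort, deduplicate with `slices.Compact` (which only collapses adjacent duplicates, hence the sort first), and drop the submitter, since submitters cannot review their own work. The idea in isolation:

```go
package main

import (
	"fmt"
	"slices"
)

// normalizeReviewers mirrors the sort/compact/drop-submitter step:
// duplicates are removed and the PR author never reviews their own PR.
func normalizeReviewers(reviewers []string, submitter string) []string {
	out := slices.Clone(reviewers)
	slices.Sort(out) // Compact only removes *adjacent* duplicates
	out = slices.Compact(out)
	if idx := slices.Index(out, submitter); idx != -1 {
		out = slices.Delete(out, idx, idx+1)
	}
	return out
}

func main() {
	fmt.Println(normalizeReviewers([]string{"carol", "alice", "bob", "alice"}, "bob")) // [alice carol]
}
```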
@@ -275,21 +444,29 @@ func (rs *PRSet) AssignReviewers(gitea GiteaReviewFetcherAndRequester, maintaine
 	return nil
 }
 
+func (rs *PRSet) RemoveClosedPRs() {
+	rs.PRs = slices.DeleteFunc(rs.PRs, func(pr *PRInfo) bool {
+		return pr.PR.State != "open"
+	})
+}
+
 func (rs *PRSet) IsApproved(gitea GiteaPRChecker, maintainers MaintainershipData) bool {
 	configReviewers := ParseReviewers(rs.Config.Reviewers)
 
 	is_manually_reviewed_ok := false
 
 	if need_manual_review := rs.Config.ManualMergeOnly || rs.Config.ManualMergeProject; need_manual_review {
+		// Groups are expanded here because any group member can issue "merge ok" to the BotUser
+		groups := rs.Config.ReviewGroups
 		prjgit, err := rs.GetPrjGitPR()
 		if err == nil && prjgit != nil {
-			reviewers := slices.Concat(configReviewers.Prj, maintainers.ListProjectMaintainers())
+			reviewers := slices.Concat(configReviewers.Prj, maintainers.ListProjectMaintainers(groups))
 			LogDebug("Fetching reviews for", prjgit.PR.Base.Repo.Owner.UserName, prjgit.PR.Base.Repo.Name, prjgit.PR.Index)
-			r, err := FetchGiteaReviews(gitea, reviewers, prjgit.PR.Base.Repo.Owner.UserName, prjgit.PR.Base.Repo.Name, prjgit.PR.Index)
+			r, err := FetchGiteaReviews(gitea, prjgit.PR.Base.Repo.Owner.UserName, prjgit.PR.Base.Repo.Name, prjgit.PR.Index)
 			if err != nil {
 				LogError("Cannot fetch gita reaviews for PR:", err)
 				return false
 			}
+			r.RequestedReviewers = reviewers
 			prjgit.Reviews = r
 			if prjgit.Reviews.IsManualMergeOK() {
 				is_manually_reviewed_ok = true
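The new `RemoveClosedPRs` above is a one-liner thanks to `slices.DeleteFunc`, which removes every element for which the predicate returns true. A self-contained sketch of the same filter, with a minimal `pr` type standing in for `PRInfo`:

```go
package main

import (
	"fmt"
	"slices"
)

// pr is an illustrative stand-in for PRInfo with just the fields needed here.
type pr struct {
	ID    int
	State string
}

// removeClosed keeps only PRs whose state is still "open".
func removeClosed(prs []pr) []pr {
	return slices.DeleteFunc(prs, func(p pr) bool {
		return p.State != "open"
	})
}

func main() {
	prs := []pr{{1, "open"}, {2, "closed"}, {3, "open"}}
	for _, p := range removeClosed(prs) {
		fmt.Println(p.ID) // prints 1 then 3
	}
}
```

Note that `slices.DeleteFunc` mutates the backing array of its argument, which is fine in `RemoveClosedPRs` because the result is reassigned to `rs.PRs` immediately.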
@@ -303,13 +480,14 @@ func (rs *PRSet) IsApproved(gitea GiteaPRChecker, maintainers MaintainershipData
 			}
 
 			pkg := pr.PR.Base.Repo.Name
-			reviewers := slices.Concat(configReviewers.Pkg, maintainers.ListPackageMaintainers(pkg))
+			reviewers := slices.Concat(configReviewers.Pkg, maintainers.ListPackageMaintainers(pkg, groups))
 			LogDebug("Fetching reviews for", pr.PR.Base.Repo.Owner.UserName, pr.PR.Base.Repo.Name, pr.PR.Index)
-			r, err := FetchGiteaReviews(gitea, reviewers, pr.PR.Base.Repo.Owner.UserName, pr.PR.Base.Repo.Name, pr.PR.Index)
+			r, err := FetchGiteaReviews(gitea, pr.PR.Base.Repo.Owner.UserName, pr.PR.Base.Repo.Name, pr.PR.Index)
 			if err != nil {
 				LogError("Cannot fetch gita reaviews for PR:", err)
 				return false
 			}
+			r.RequestedReviewers = reviewers
 			pr.Reviews = r
 			if !pr.Reviews.IsManualMergeOK() {
 				LogInfo("Not approved manual merge. PR:", pr.PR.URL)
@@ -331,6 +509,9 @@ func (rs *PRSet) IsApproved(gitea GiteaPRChecker, maintainers MaintainershipData
 		var pkg string
 		if rs.IsPrjGitPR(pr.PR) {
 			reviewers = configReviewers.Prj
+			if rs.HasAutoStaging {
+				reviewers = append(reviewers, Bot_BuildReview)
+			}
 			pkg = ""
 		} else {
 			reviewers = configReviewers.Pkg
@@ -342,20 +523,25 @@ func (rs *PRSet) IsApproved(gitea GiteaPRChecker, maintainers MaintainershipData
 			return false
 		}
 
-		r, err := FetchGiteaReviews(gitea, reviewers, pr.PR.Base.Repo.Owner.UserName, pr.PR.Base.Repo.Name, pr.PR.Index)
+		r, err := FetchGiteaReviews(gitea, pr.PR.Base.Repo.Owner.UserName, pr.PR.Base.Repo.Name, pr.PR.Index)
 		if err != nil {
-			LogError("Cannot fetch gita reaviews for PR:", err)
+			LogError("Cannot fetch gitea reaviews for PR:", err)
 			return false
 		}
+		r.RequestedReviewers = reviewers
+
 		is_manually_reviewed_ok = r.IsApproved()
-		LogDebug(pr.PR.Base.Repo.Name, is_manually_reviewed_ok)
+		LogDebug("PR to", pr.PR.Base.Repo.Name, "reviewed?", is_manually_reviewed_ok)
 		if !is_manually_reviewed_ok {
+			if GetLoggingLevel() > LogLevelInfo {
+				LogDebug("missing reviewers:", r.MissingReviews())
+			}
 			return false
 		}
 
 		if need_maintainer_review := !rs.IsPrjGitPR(pr.PR) || pr.PR.User.UserName != rs.BotUser; need_maintainer_review {
-			if is_manually_reviewed_ok = maintainers.IsApproved(pkg, r.reviews, pr.PR.User.UserName); !is_manually_reviewed_ok {
+			// Do not expand groups here, as the group-review-bot will ACK if group has reviewed.
+			if is_manually_reviewed_ok = maintainers.IsApproved(pkg, r.Reviews, pr.PR.User.UserName, nil); !is_manually_reviewed_ok {
 				LogDebug(" not approved?", pkg)
 				return false
 			}
@@ -366,6 +552,145 @@ func (rs *PRSet) IsApproved(gitea GiteaPRChecker, maintainers MaintainershipData
 	return is_manually_reviewed_ok
 }
 
+func (rs *PRSet) AddMergeCommit(git Git, remote string, pr int) bool {
+	prinfo := rs.PRs[pr]
+
+	LogDebug("Adding merge commit for %s", PRtoString(prinfo.PR))
+	if !prinfo.PR.AllowMaintainerEdit {
+		LogError(" PR is not editable by maintainer")
+		return false
+	}
+
+	repo := prinfo.PR.Base.Repo
+	head := prinfo.PR.Head
+	br := rs.Config.Branch
+	if len(br) == 0 {
+		br = prinfo.PR.Base.Name
+	}
+
+	msg := fmt.Sprintf("Merge branch '%s' into %s", br, head.Name)
+	if err := git.GitExec(repo.Name, "merge", "--no-ff", "--no-commit", "-X", "theirs", head.Sha); err != nil {
+		if err := git.GitExec(repo.Name, "merge", "--no-ff", "--no-commit", "--allow-unrelated-histories", "-X", "theirs", head.Sha); err != nil {
+			return false
+		}
+		LogError("WARNING: Merging unrelated histories")
+	}
+
+	// ensure only files that are in head.Sha are kept
+	git.GitExecOrPanic(repo.Name, "read-tree", "-m", head.Sha)
+	git.GitExecOrPanic(repo.Name, "commit", "-m", msg)
+	git.GitExecOrPanic(repo.Name, "clean", "-fxd")
+
+	if !IsDryRun {
+		git.GitExecOrPanic(repo.Name, "push", remote, "HEAD:"+head.Name)
+		prinfo.PR.Head.Sha = strings.TrimSpace(git.GitExecWithOutputOrPanic(repo.Name, "rev-list", "-1", "HEAD")) // need to update as it's pushed but pr not refetched
+	}
+
+	return true
+}
+
+func (rs *PRSet) HasMerge(git Git, pr int) bool {
+	prinfo := rs.PRs[pr]
+
+	repo := prinfo.PR.Base.Repo
+	head := prinfo.PR.Head
+	br := rs.Config.Branch
+	if len(br) == 0 {
+		br = prinfo.PR.Base.Name
+	}
+
+	parents, err := git.GitExecWithOutput(repo.Name, "show", "-s", "--format=%P", head.Sha)
+	if err == nil {
+		p := strings.Fields(strings.TrimSpace(parents))
+		if len(p) == 2 {
+			targetHead, _ := git.GitExecWithOutput(repo.Name, "rev-parse", "HEAD")
+			targetHead = strings.TrimSpace(targetHead)
+			if p[0] == targetHead || p[1] == targetHead {
+				return true
+			}
+		}
+	}
+	return false
+}
+
+func (rs *PRSet) PrepareForMerge(git Git) bool {
+	// verify that package can merge here. Checkout current target branch of each PRSet, make a temporary branch
+	// PR_#_mergetest and perform the merge based
+
+	if rs.Config.MergeMode == MergeModeDevel {
+		return true // always can merge as we set branch here, not merge anything
+	} else {
+		// make sure that all the package PRs are in mergeable state
+		for idx, prinfo := range rs.PRs {
+			if rs.IsPrjGitPR(prinfo.PR) {
+				continue
+			}
+
+			repo := prinfo.PR.Base.Repo
+			head := prinfo.PR.Head
+			br := rs.Config.Branch
+			if len(br) == 0 {
+				br = prinfo.PR.Base.Name
+			}
+
+			remote, err := git.GitClone(repo.Name, br, repo.SSHURL)
+			if err != nil {
+				return false
+			}
+			git.GitExecOrPanic(repo.Name, "fetch", remote, head.Sha)
+			switch rs.Config.MergeMode {
+			case MergeModeFF:
+				if err := git.GitExec(repo.Name, "merge-base", "--is-ancestor", "HEAD", head.Sha); err != nil {
+					return false
+				}
+			case MergeModeReplace:
+			Verify:
+				if err := git.GitExec(repo.Name, "merge-base", "--is-ancestor", "HEAD", head.Sha); err != nil {
+					if !rs.HasMerge(git, idx) {
+						forkRemote, err := git.GitClone(repo.Name, head.Name, head.Repo.SSHURL)
+						if err != nil {
+							LogError("Failed to clone head repo:", head.Name, head.Repo.SSHURL)
+							return false
+						}
+						LogDebug("Merge commit is missing and this is not FF merge possibility")
+						git.GitExecOrPanic(repo.Name, "checkout", remote+"/"+br)
+						if !rs.AddMergeCommit(git, forkRemote, idx) {
+							return false
+						}
+						if !IsDryRun {
+							goto Verify
+						}
+					}
+				}
+			}
+		}
+	}
+
+	// now we check project git if mergeable
+	prjgit_info, err := rs.GetPrjGitPR()
+	if err != nil {
+		return false
+	}
+	prjgit := prjgit_info.PR
+
+	_, _, prjgitBranch := rs.Config.GetPrjGit()
+	remote, err := git.GitClone(DefaultGitPrj, prjgitBranch, prjgit.Base.Repo.SSHURL)
+	if err != nil {
+		return false
+	}
+
+	testBranch := fmt.Sprintf("PR_%d_mergetest", prjgit.Index)
+	git.GitExecOrPanic(DefaultGitPrj, "fetch", remote, prjgit.Head.Sha)
+	if err := git.GitExec(DefaultGitPrj, "checkout", "-B", testBranch, prjgit.Base.Sha); err != nil {
+		return false
+	}
+	if err := git.GitExec(DefaultGitPrj, "merge", "--no-ff", "--no-commit", prjgit.Head.Sha); err != nil {
+		return false
+	}
+
+	return true
+}
+
 func (rs *PRSet) Merge(gitea GiteaReviewUnrequester, git Git) error {
 	prjgit_info, err := rs.GetPrjGitPR()
 	if err != nil {
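`HasMerge` above decides whether the PR head is already a merge of the target by running `git show -s --format=%P <sha>`, which prints the parent hashes separated by spaces: a merge commit has exactly two parents, and one of them must equal the current target head. The parsing step can be tested on its own, without a real repository:

```go
package main

import (
	"fmt"
	"strings"
)

// isMergeOf mirrors the parent check in HasMerge: parentsOutput is the raw
// output of `git show -s --format=%P <sha>`; the commit counts as a merge of
// targetHead when it has exactly two parents and one of them is targetHead.
func isMergeOf(parentsOutput, targetHead string) bool {
	p := strings.Fields(strings.TrimSpace(parentsOutput))
	if len(p) == 2 {
		return p[0] == targetHead || p[1] == targetHead
	}
	return false
}

func main() {
	fmt.Println(isMergeOf("abc123 def456\n", "def456")) // true
	fmt.Println(isMergeOf("abc123\n", "abc123"))        // false: single parent, not a merge commit
}
```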
@@ -373,7 +698,8 @@ func (rs *PRSet) Merge(gitea GiteaReviewUnrequester, git Git) error {
 	}
 	prjgit := prjgit_info.PR
 
-	remote, err := git.GitClone(DefaultGitPrj, rs.Config.Branch, prjgit.Base.Repo.SSHURL)
+	_, _, prjgitBranch := rs.Config.GetPrjGit()
+	remote, err := git.GitClone(DefaultGitPrj, prjgitBranch, prjgit.Base.Repo.SSHURL)
 	PanicOnError(err)
 	git.GitExecOrPanic(DefaultGitPrj, "fetch", remote, prjgit.Head.Sha)
 
@@ -390,7 +716,7 @@ func (rs *PRSet) Merge(gitea GiteaReviewUnrequester, git Git) error {
 		panic("FIXME")
 	}
 	*/
-	msg := fmt.Sprintf("Merging\n\nPR: %s/%s#%d", prjgit.Base.Repo.Owner.UserName, prjgit.Base.Repo.Name, prjgit.Index)
+	msg := fmt.Sprintf("Merging\n\nPR: %s/%s!%d", prjgit.Base.Repo.Owner.UserName, prjgit.Base.Repo.Name, prjgit.Index)
 
 	err = git.GitExec(DefaultGitPrj, "merge", "--no-ff", "-m", msg, prjgit.Head.Sha)
 	if err != nil {
@@ -402,6 +728,7 @@ func (rs *PRSet) Merge(gitea GiteaReviewUnrequester, git Git) error {
 		// we can only resolve conflicts with .gitmodules
 		for _, s := range status {
 			if s.Status == GitStatus_Unmerged {
+				panic("Can't handle conflicts yet")
 				if s.Path != ".gitmodules" {
 					return err
 				}
@@ -492,10 +819,22 @@ func (rs *PRSet) Merge(gitea GiteaReviewUnrequester, git Git) error {
 		if rs.IsPrjGitPR(prinfo.PR) {
 			continue
 		}
-		prinfo.RemoteName, err = git.GitClone(repo.Name, rs.Config.Branch, repo.SSHURL)
+		br := rs.Config.Branch
+		if len(br) == 0 {
+			// if branch is unspecified, take it from the PR as it
+			// matches default branch already
+			br = prinfo.PR.Base.Name
+		} else if br != prinfo.PR.Base.Name {
+			panic(prinfo.PR.Base.Name + " is expected to match " + br)
+		}
+		prinfo.RemoteName, err = git.GitClone(repo.Name, br, repo.SSHURL)
 		PanicOnError(err)
-		git.GitExecOrPanic(repo.Name, "fetch", prinfo.RemoteName, head.Sha)
-		git.GitExecOrPanic(repo.Name, "merge", "--ff", head.Sha)
+		if rs.Config.MergeMode == MergeModeDevel {
+			git.GitExecOrPanic(repo.Name, "checkout", "-B", br, head.Sha)
+		} else {
+			git.GitExecOrPanic(repo.Name, "fetch", prinfo.RemoteName, head.Sha)
+			git.GitExecOrPanic(repo.Name, "merge", "--ff", head.Sha)
+		}
 	}
 
@@ -512,7 +851,12 @@ func (rs *PRSet) Merge(gitea GiteaReviewUnrequester, git Git) error {
 		repo := prinfo.PR.Base.Repo
 
 		if !IsDryRun {
-			git.GitExecOrPanic(repo.Name, "push", prinfo.RemoteName)
+			params := []string{"push"}
+			if rs.Config.MergeMode == MergeModeDevel {
+				params = append(params, "-f")
+			}
+			params = append(params, prinfo.RemoteName)
+			git.GitExecOrPanic(repo.Name, params...)
 		} else {
 			LogInfo("*** WOULD push", repo.Name, "to", prinfo.RemoteName)
 		}
@@ -1,48 +0,0 @@
-package common
-
-import (
-	"errors"
-	"strings"
-)
-
-var UnknownParser error = errors.New("Cannot parse path")
-
-type PRConflictResolver interface {
-	/*
-		stage_content -> { merge_base (stage1), head (stage2), merge_head (stage3) }
-	*/
-	Resolve(path string, stage_contents [3]string) error
-}
-
-var resolvers []PRConflictResolver = []PRConflictResolver{
-	&submodule_conflict_resolver{},
-}
-
-func ResolveMergeConflict(path string, file_contents [3]string) error {
-	for _, r := range resolvers {
-		if err := r.Resolve(path, file_contents); err != UnknownParser {
-			return err
-		}
-	}
-
-	return UnknownParser
-}
-
-type submodule_conflict_resolver struct{}
-
-func (*submodule_conflict_resolver) Resolve(path string, stage [3]string) error {
-	if path != ".gitmodules" {
-		return UnknownParser
-	}
-	return UnknownParser
-}
-
-type changes_file_resolver struct{}
-
-func (*changes_file_resolver) Resolve(path string, stage [3]string) error {
-	if !strings.HasSuffix(path, ".changes") {
-		return UnknownParser
-	}
-
-	return UnknownParser
-}
@@ -1,10 +0,0 @@
-package common_test
-
-import "testing"
-
-func ResolveSubmoduleConflicts(t *testing.T) {
-}
-
-func ResolveChangesFileConflict(t *testing.T) {
-}
-
1151  common/pr_test.go
File diff suppressed because it is too large
238  common/rabbitmq.go  Normal file
@@ -0,0 +1,238 @@
+package common
+
+/*
+ * This file is part of Autogits.
+ *
+ * Copyright © 2024 SUSE LLC
+ *
+ * Autogits is free software: you can redistribute it and/or modify it under
+ * the terms of the GNU General Public License as published by the Free Software
+ * Foundation, either version 2 of the License, or (at your option) any later
+ * version.
+ *
+ * Autogits is distributed in the hope that it will be useful, but WITHOUT ANY
+ * WARRANTY; without even the implied warranty of MERCHANTABILITY or FITNESS FOR A
+ * PARTICULAR PURPOSE. See the GNU General Public License for more details.
+ *
+ * You should have received a copy of the GNU General Public License along with
+ * Foobar. If not, see <https://www.gnu.org/licenses/>.
+ */
+
+import (
+	"crypto/tls"
+	"fmt"
+	"net/url"
+	"strings"
+	"time"
+
+	rabbitmq "github.com/rabbitmq/amqp091-go"
+)
+
+type RabbitConnection struct {
+	RabbitURL *url.URL // amqps://user:password@host/queue
+
+	queueName string
+	ch        *rabbitmq.Channel
+
+	topics          []string
+	topicSubChanges chan string // +topic = subscribe, -topic = unsubscribe
+}
+
+type RabbitProcessor interface {
+	GenerateTopics() []string
+
+	Connection() *RabbitConnection
+	ProcessRabbitMessage(msg RabbitMessage) error
+}
+
+type RabbitMessage rabbitmq.Delivery
+
+func (l *RabbitConnection) ProcessTopicChanges() {
+	for {
+		topic, ok := <-l.topicSubChanges
+		if !ok {
+			return
+		}
+
+		LogDebug(" topic change:", topic)
+		switch topic[0] {
+		case '+':
+			if err := l.ch.QueueBind(l.queueName, topic[1:], "pubsub", false, nil); err != nil {
+				LogError(err)
+			}
+		case '-':
+			if err := l.ch.QueueUnbind(l.queueName, topic[1:], "pubsub", nil); err != nil {
+				LogError(err)
+			}
+		default:
+			LogInfo("Ignoring unknown topic change:", topic)
+		}
+	}
+}
+
+func (l *RabbitConnection) ProcessRabbitMQ(msgCh chan<- RabbitMessage) error {
+	queueName := l.RabbitURL.Path
+	l.RabbitURL.Path = ""
+
+	if len(queueName) > 0 && queueName[0] == '/' {
+		queueName = queueName[1:]
+	}
+
+	connection, err := rabbitmq.DialTLS(l.RabbitURL.String(), &tls.Config{
+		ServerName: l.RabbitURL.Hostname(),
+	})
+	if err != nil {
+		return fmt.Errorf("Cannot connect to %s . Err: %w", l.RabbitURL.Hostname(), err)
+	}
+	defer connection.Close()
+
+	l.ch, err = connection.Channel()
+	if err != nil {
+		return fmt.Errorf("Cannot create a channel. Err: %w", err)
+	}
+	defer l.ch.Close()
+
+	if err = l.ch.ExchangeDeclarePassive("pubsub", "topic", true, false, false, false, nil); err != nil {
+		return fmt.Errorf("Cannot find pubsub exchange? Err: %w", err)
+	}
+
+	var q rabbitmq.Queue
+	if len(queueName) == 0 {
+		q, err = l.ch.QueueDeclare("", false, true, true, false, nil)
+	} else {
+		q, err = l.ch.QueueDeclarePassive(queueName, true, false, true, false, nil)
+		if err != nil {
+			LogInfo("queue not found .. trying to create it:", err)
+			if l.ch.IsClosed() {
+				l.ch, err = connection.Channel()
+				if err != nil {
+					return fmt.Errorf("Channel cannot be re-opened. Err: %w", err)
+				}
+			}
+			q, err = l.ch.QueueDeclare(queueName, true, false, true, false, nil)
+
+			if err != nil {
+				LogInfo("can't create persistent queue ... falling back to temporaty queue:", err)
+				if l.ch.IsClosed() {
+					l.ch, err = connection.Channel()
+					return fmt.Errorf("Channel cannot be re-opened. Err: %w", err)
+				}
+				q, err = l.ch.QueueDeclare("", false, true, true, false, nil)
+			}
+		}
+	}
+	if err != nil {
+		return fmt.Errorf("Cannot declare queue. Err: %w", err)
+	}
+	// log.Printf("queue: %s:%d", q.Name, q.Consumers)
+
+	LogDebug(" -- listening to topics:")
+	l.topicSubChanges = make(chan string)
+	defer close(l.topicSubChanges)
+	go l.ProcessTopicChanges()
+
+	for _, topic := range l.topics {
+		l.topicSubChanges <- "+" + topic
+	}
+
+	msgs, err := l.ch.Consume(q.Name, "", true, true, false, false, nil)
+	if err != nil {
+		return fmt.Errorf("Cannot start consumer. Err: %w", err)
+	}
+	// log.Printf("queue: %s:%d", q.Name, q.Consumers)
+
+	for {
+		msg, ok := <-msgs
+		if !ok {
+			return fmt.Errorf("channel/connection closed?\n")
+		}
+
+		msgCh <- RabbitMessage(msg)
+	}
+}
+
+func (l *RabbitConnection) ConnectAndProcessRabbitMQ(ch chan<- RabbitMessage) {
+	defer func() {
+		if r := recover(); r != nil {
+			LogError(r)
+			LogError("'crash' RabbitMQ worker. Recovering... reconnecting...")
+			time.Sleep(5 * time.Second)
+			go l.ConnectAndProcessRabbitMQ(ch)
+		}
+	}()
+
+	for {
+		err := l.ProcessRabbitMQ(ch)
+		if err != nil {
+			LogError("Error in RabbitMQ connection:", err)
+			LogInfo("Reconnecting in 2 seconds...")
+			time.Sleep(2 * time.Second)
+		}
+	}
+}
+
+func (l *RabbitConnection) ConnectToRabbitMQ(processor RabbitProcessor) <-chan RabbitMessage {
+	LogInfo("RabbitMQ connection:", l.RabbitURL.String())
+
+	l.RabbitURL.User = url.UserPassword(rabbitUser, rabbitPassword)
+	l.topics = processor.GenerateTopics()
+
+	ch := make(chan RabbitMessage, 100)
+	go l.ConnectAndProcessRabbitMQ(ch)
+
+	return ch
+}
+
+func (l *RabbitConnection) UpdateTopics(processor RabbitProcessor) {
+	newTopics := processor.GenerateTopics()
+
+	j := 0
+next_new_topic:
+	for i := 0; i < len(newTopics); i++ {
+		topic := newTopics[i]
+
+		for j < len(l.topics) {
+			cmp := strings.Compare(topic, l.topics[j])
+
+			if cmp == 0 {
+				j++
+				continue next_new_topic
+			}
+
+			if cmp < 0 {
+				l.topicSubChanges <- "+" + topic
+				break
+			}
+
+			l.topicSubChanges <- "-" + l.topics[j]
+			j++
+		}
+
+		if j == len(l.topics) {
+			l.topicSubChanges <- "+" + topic
+		}
+	}
+
+	for j < len(l.topics) {
+		l.topicSubChanges <- "-" + l.topics[j]
+		j++
+	}
+
+	l.topics = newTopics
+}
+
+func ProcessRabbitMQEvents(processor RabbitProcessor) error {
+	ch := processor.Connection().ConnectToRabbitMQ(processor)
+
+	for {
+		msg, ok := <-ch
+		if !ok {
+			return nil
+		}
+
+		LogDebug("event:", msg.RoutingKey)
+		if err := processor.ProcessRabbitMessage(msg); err != nil {
+			LogError("Error processing", msg.RoutingKey, err)
+		}
+	}
+}
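The heart of `UpdateTopics` above is a two-pointer walk over two sorted topic lists, emitting `+topic` to subscribe and `-topic` to unsubscribe so the broker bindings converge on the new set without rebinding everything. The same diff can be sketched as a standalone function that returns the change list instead of writing to a channel (the name `topicDelta` is mine, not from the file):

```go
package main

import (
	"fmt"
	"strings"
)

// topicDelta compares two SORTED topic lists and returns the minimal set
// of "+topic" (subscribe) and "-topic" (unsubscribe) operations that turn
// `old` into `want`, mirroring what UpdateTopics feeds topicSubChanges.
func topicDelta(old, want []string) []string {
	changes := []string{}
	i, j := 0, 0
	for i < len(want) && j < len(old) {
		switch cmp := strings.Compare(want[i], old[j]); {
		case cmp == 0: // topic kept: no operation
			i++
			j++
		case cmp < 0: // only in the new list: subscribe
			changes = append(changes, "+"+want[i])
			i++
		default: // only in the old list: unsubscribe
			changes = append(changes, "-"+old[j])
			j++
		}
	}
	for ; i < len(want); i++ {
		changes = append(changes, "+"+want[i])
	}
	for ; j < len(old); j++ {
		changes = append(changes, "-"+old[j])
	}
	return changes
}

func main() {
	fmt.Println(topicDelta([]string{"a", "b"}, []string{"b", "c"}))
}
```

Both lists must already be sorted and de-duplicated, which is why the processors' GenerateTopics implementations end with slices.Sort and slices.Compact.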
129  common/rabbitmq_gitea.go  Normal file
@@ -0,0 +1,129 @@
+package common
+
+/*
+ * This file is part of Autogits.
+ *
+ * Copyright © 2024 SUSE LLC
+ *
+ * Autogits is free software: you can redistribute it and/or modify it under
+ * the terms of the GNU General Public License as published by the Free Software
+ * Foundation, either version 2 of the License, or (at your option) any later
+ * version.
+ *
+ * Autogits is distributed in the hope that it will be useful, but WITHOUT ANY
+ * WARRANTY; without even the implied warranty of MERCHANTABILITY or FITNESS FOR A
+ * PARTICULAR PURPOSE. See the GNU General Public License for more details.
+ *
+ * You should have received a copy of the GNU General Public License along with
+ * Foobar. If not, see <https://www.gnu.org/licenses/>.
+ */
+
+import (
+	"fmt"
+	"runtime/debug"
+	"slices"
+	"strings"
+)
+
+const RequestType_CreateBrachTag = "create"
+const RequestType_DeleteBranchTag = "delete"
+const RequestType_Fork = "fork"
+const RequestType_Issue = "issues"
+const RequestType_IssueAssign = "issue_assign"
+const RequestType_IssueComment = "issue_comment"
+const RequestType_IssueLabel = "issue_label"
+const RequestType_IssueMilestone = "issue_milestone"
+const RequestType_Push = "push"
+const RequestType_Repository = "repository"
+const RequestType_Release = "release"
+const RequestType_PR = "pull_request"
+const RequestType_PRAssign = "pull_request_assign"
+const RequestType_PRLabel = "pull_request_label"
+const RequestType_PRComment = "pull_request_comment"
+const RequestType_PRMilestone = "pull_request_milestone"
+const RequestType_PRSync = "pull_request_sync"
+const RequestType_PRReviewAccepted = "pull_request_review_approved"
+const RequestType_PRReviewRejected = "pull_request_review_rejected"
+const RequestType_PRReviewRequest = "pull_request_review_request"
+const RequestType_PRReviewComment = "pull_request_review_comment"
+const RequestType_Status = "status"
+const RequestType_Wiki = "wiki"
+
+type RequestProcessor interface {
+	ProcessFunc(*Request) error
+}
+
+type RabbitMQGiteaEventsProcessor struct {
+	Handlers map[string]RequestProcessor
+	Orgs     []string
+
+	c *RabbitConnection
+}
+
+func (gitea *RabbitMQGiteaEventsProcessor) Connection() *RabbitConnection {
+	if gitea.c == nil {
+		gitea.c = &RabbitConnection{}
+	}
+	return gitea.c
+}
+
+func (gitea *RabbitMQGiteaEventsProcessor) GenerateTopics() []string {
+	topics := make([]string, 0, len(gitea.Handlers)*len(gitea.Orgs))
+	scope := "suse"
+	if gitea.c.RabbitURL.Hostname() == "rabbit.opensuse.org" {
+		scope = "opensuse"
+	}
+
+	for _, org := range gitea.Orgs {
+		for requestType, _ := range gitea.Handlers {
+			topics = append(topics, fmt.Sprintf("%s.src.%s.%s.#", scope, org, requestType))
+		}
+	}
+
+	slices.Sort(topics)
+	return slices.Compact(topics)
+}
+
+func (gitea *RabbitMQGiteaEventsProcessor) ProcessRabbitMessage(msg RabbitMessage) error {
+	route := strings.Split(msg.RoutingKey, ".")
+	if len(route) > 3 {
+		reqType := route[3]
+		org := route[2]
+
+		if !slices.Contains(gitea.Orgs, org) {
+			LogInfo("Got event for unhandeled org:", org)
+			return nil
+		}
+
+		LogDebug("org:", org, "type:", reqType)
+		if handler, found := gitea.Handlers[reqType]; found {
+			req, err := ParseRequestJSON(reqType, msg.Body)
+			if err != nil {
+				LogError("Error parsing request JSON:", err)
+			} else {
+				LogDebug("processing req", req.Type)
+				// h.Request = req
+				ProcessEvent(handler, req)
+			}
+			return nil
+		}
+	}
+
+	return fmt.Errorf("Invalid routing key: %s", route)
+}
+
+func ProcessEvent(f RequestProcessor, request *Request) {
+	defer func() {
+		if r := recover(); r != nil {
+			LogError("panic caught")
+			if err, ok := r.(error); !ok {
+				LogError(err)
+			}
+			LogError(string(debug.Stack()))
+		}
+	}()
+
+	if err := f.ProcessFunc(request); err != nil {
+		LogError(err)
+	}
+}
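`GenerateTopics` above builds one AMQP routing-key pattern per (org, request type) pair in the form `scope.src.org.type.#`, then sorts and de-duplicates the list. A trimmed, self-contained sketch of that shape (the scope/org/type values below are example inputs, not a real configuration):

```go
package main

import (
	"fmt"
	"slices"
)

// generateTopics mirrors the topic construction in the diff above:
// a "scope.src.<org>.<type>.#" binding pattern per org/request-type pair,
// returned sorted and de-duplicated.
func generateTopics(scope string, orgs, types []string) []string {
	topics := make([]string, 0, len(orgs)*len(types))
	for _, org := range orgs {
		for _, t := range types {
			topics = append(topics, fmt.Sprintf("%s.src.%s.%s.#", scope, org, t))
		}
	}
	slices.Sort(topics) // Compact only removes *adjacent* duplicates
	return slices.Compact(topics)
}

func main() {
	fmt.Println(generateTopics("opensuse", []string{"autogits"}, []string{"push", "pull_request"}))
}
```

Sorting before `slices.Compact` matters because `Compact` only drops adjacent duplicates; it also gives `UpdateTopics` the sorted input its merge walk requires.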
22  common/rabbitmq_obs.go  Normal file
@@ -0,0 +1,22 @@
+package common
+
+type RabbitMQObsBuildStatusProcessor struct {
+	c *RabbitConnection
+}
+
+func (o *RabbitMQObsBuildStatusProcessor) GenerateTopics() []string {
+	return []string{}
+}
+
+func (o *RabbitMQObsBuildStatusProcessor) Connection() *RabbitConnection {
+	if o.c == nil {
+		o.c = &RabbitConnection{}
+	}
+
+	return o.c
+}
+
+func (o *RabbitMQObsBuildStatusProcessor) ProcessRabbitMessage(msg RabbitMessage) error {
+	return nil
+}
+
@@ -50,11 +50,13 @@ func TestListenDefinitionsTopicUpdate(t *testing.T) {
 	u, _ := url.Parse("amqps://rabbit.example.com")
 	for _, test := range tests {
 		t.Run(test.name, func(t *testing.T) {
-			l := ListenDefinitions{
+			l := &RabbitMQGiteaEventsProcessor{
 				Orgs:     test.orgs1,
 				Handlers: make(map[string]RequestProcessor),
-				topicSubChanges: make(chan string, len(test.topicDelta)*10),
+				c: &RabbitConnection{
 					RabbitURL: u,
+					topicSubChanges: make(chan string, len(test.topicDelta)*10),
+				},
 			}

 			slices.Sort(test.topicDelta)
@@ -64,11 +66,11 @@ func TestListenDefinitionsTopicUpdate(t *testing.T) {
 			}

 			changes := []string{}
-			l.UpdateTopics()
+			l.c.UpdateTopics(l)
 		a:
 			for {
 				select {
-				case c := <-l.topicSubChanges:
+				case c := <-l.c.topicSubChanges:
 					changes = append(changes, c)
 				default:
 					changes = []string{}
@@ -78,13 +80,13 @@ func TestListenDefinitionsTopicUpdate(t *testing.T) {

 			l.Orgs = test.orgs2

-			l.UpdateTopics()
+			l.c.UpdateTopics(l)
 			changes = []string{}

 		b:
 			for {
 				select {
-				case c := <-l.topicSubChanges:
+				case c := <-l.c.topicSubChanges:
 					changes = append(changes, c)
 				default:
 					slices.Sort(changes)
62  common/request_status.go  Normal file
@@ -0,0 +1,62 @@
+package common
+
+/*
+ * This file is part of Autogits.
+ *
+ * Copyright © 2024 SUSE LLC
+ *
+ * Autogits is free software: you can redistribute it and/or modify it under
+ * the terms of the GNU General Public License as published by the Free Software
+ * Foundation, either version 2 of the License, or (at your option) any later
+ * version.
+ *
+ * Autogits is distributed in the hope that it will be useful, but WITHOUT ANY
+ * WARRANTY; without even the implied warranty of MERCHANTABILITY or FITNESS FOR A
+ * PARTICULAR PURPOSE. See the GNU General Public License for more details.
+ *
+ * You should have received a copy of the GNU General Public License along with
+ * Foobar. If not, see <https://www.gnu.org/licenses/>.
+ */
+
+import (
+	"encoding/json"
+	"fmt"
+	"io"
+)
+
+type Status struct {
+}
+
+type StatusWebhookEvent struct {
+	Id          uint64
+	Context     string
+	Description string
+	Sha         string
+	State       string
+	TargetUrl   string
+
+	Commit     Commit
+	Repository Repository
+	Sender     *User
+}
+
+func (s *StatusWebhookEvent) GetAction() string {
+	return s.State
+}
+
+func (h *RequestHandler) ParseStatusRequest(data io.Reader) (*StatusWebhookEvent, error) {
+	action := new(StatusWebhookEvent)
+	err := json.NewDecoder(data).Decode(&action)
+
+	if err != nil {
+		return nil, fmt.Errorf("Got error while parsing: %w", err)
+	}
+
+	h.StdLogger.Printf("Request status for repo: %s#%s\n", action.Repository.Full_Name, action.Sha)
+	h.Request = &Request{
+		Type: RequestType_Status,
+		Data: action,
+	}
+
+	return action, nil
+}
40  common/request_status_test.go  Normal file
@@ -0,0 +1,40 @@
+package common_test
+
+import (
+	"os"
+	"strings"
+	"testing"
+
+	"src.opensuse.org/autogits/common"
+)
+
+func TestStatusRequestParsing(t *testing.T) {
+	t.Run("parsing repo creation message", func(t *testing.T) {
+		var h common.RequestHandler
+
+		h.StdLogger, h.ErrLogger = common.CreateStdoutLogger(os.Stdout, os.Stdout)
+		json, err := h.ParseStatusRequest(strings.NewReader(requestStatusJSON))
+		if err != nil {
+			t.Fatalf("Can't parse struct: %s", err)
+		}
+
+		if json.GetAction() != "pending" {
+			t.Fatalf("json.action is '%#v'", json)
+		}
+
+		if json.Repository.Full_Name != "autogits/nodejs-common" ||
+			json.Repository.Parent == nil ||
+			json.Repository.Parent.Parent != nil ||
+			len(json.Repository.Ssh_Url) < 10 ||
+			json.Repository.Default_Branch != "factory" ||
+			json.Repository.Object_Format_Name != "sha256" {
+
+			t.Fatalf("invalid repository parse: %#v", json.Repository)
+		}
+
+		if json.Sha != "e637d86cbbdd438edbf60148e28f9d75a74d51b27b01f75610f247cd18394c8e" {
+			t.Fatal("Invalid SHA:", json.Sha)
+		}
+	})
+}
+
17  common/review_group.go  Normal file
@@ -0,0 +1,17 @@
+package common
+
+import (
+	"slices"
+)
+
+func (group *ReviewGroup) ExpandMaintainers(maintainers []string) []string {
+	idx := slices.Index(maintainers, group.Name)
+	if idx == -1 {
+		return maintainers
+	}
+
+	expandedMaintainers := slices.Replace(maintainers, idx, idx+1, group.Reviewers...)
+	slices.Sort(expandedMaintainers)
+	return slices.Compact(expandedMaintainers)
+}
+
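`ExpandMaintainers` above splices a group's member list into a maintainer list wherever the group name appears, then sorts and de-duplicates so a member who was also listed directly appears only once. A standalone sketch of the same steps (a free function instead of the `ReviewGroup` method, plus a defensive `slices.Clone` that the original does not do):

```go
package main

import (
	"fmt"
	"slices"
)

// expandMaintainers replaces `group` in the maintainer list with its
// members, then sorts and de-duplicates the result, mirroring
// ReviewGroup.ExpandMaintainers in the diff above.
func expandMaintainers(maintainers []string, group string, members []string) []string {
	idx := slices.Index(maintainers, group)
	if idx == -1 {
		return maintainers // group is not a maintainer: nothing to expand
	}
	// Clone to avoid mutating the caller's slice (an extra safety step
	// versus the original, which splices in place).
	expanded := slices.Replace(slices.Clone(maintainers), idx, idx+1, members...)
	slices.Sort(expanded)
	return slices.Compact(expanded)
}

func main() {
	fmt.Println(expandMaintainers([]string{"my_group", "g2", "b"}, "my_group", []string{"g1", "g2"}))
}
```

This reproduces the "group maintainer dedup" case in the test table below: the direct `g2` entry and the group-provided `g2` collapse to one.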
62  common/review_group_test.go  Normal file
@@ -0,0 +1,62 @@
+package common_test
+
+import (
+	"slices"
+	"testing"
+
+	"src.opensuse.org/autogits/common"
+)
+
+func TestMaintainerGroupReplacer(t *testing.T) {
+	GroupName := "my_group"
+
+	tests := []struct {
+		name          string
+		reviewers     []string
+		group_members []string
+
+		output []string
+	}{
+		{
+			name: "empty",
+		},
+		{
+			name:          "group not maintainer",
+			reviewers:     []string{"a", "b"},
+			group_members: []string{"g1", "g2"},
+			output:        []string{"a", "b"},
+		},
+		{
+			name:          "group maintainer",
+			reviewers:     []string{"b", "my_group"},
+			group_members: []string{"g1", "g2"},
+			output:        []string{"b", "g1", "g2"},
+		},
+		{
+			name:          "sorted group maintainer",
+			reviewers:     []string{"my_group", "b"},
+			group_members: []string{"g1", "g2"},
+			output:        []string{"b", "g1", "g2"},
+		},
+		{
+			name:          "group maintainer dedup",
+			reviewers:     []string{"my_group", "g2", "b"},
+			group_members: []string{"g1", "g2"},
+			output:        []string{"b", "g1", "g2"},
+		},
+	}
+
+	for _, test := range tests {
+		t.Run(test.name, func(t *testing.T) {
+			g := &common.ReviewGroup{
+				Name:      GroupName,
+				Reviewers: test.group_members,
+			}
+
+			expandedList := g.ExpandMaintainers(test.reviewers)
+			if slices.Compare(expandedList, test.output) != 0 {
+				t.Error("Expected:", test.output, "but have", expandedList)
+			}
+		})
+	}
+}
@@ -1,9 +1,5 @@
 package common

-import (
-	"slices"
-)
-
 type Reviewers struct {
 	Prj []string
 	Pkg []string
@@ -36,10 +32,5 @@ func ParseReviewers(input []string) *Reviewers {
 			*pkg = append(*pkg, reviewer)
 		}
 	}
-
-	if !slices.Contains(r.Prj, Bot_BuildReview) {
-		r.Prj = append(r.Prj, Bot_BuildReview)
-	}
-
 	return r
 }
@@ -21,14 +21,14 @@ func TestReviewers(t *testing.T) {
 			name:  "project and package reviewers",
 			input: []string{"1", "2", "3", "*5", "+6", "-7"},

-			prj: []string{"5", "7", common.Bot_BuildReview},
+			prj: []string{"5", "7"},
 			pkg: []string{"1", "2", "3", "5", "6"},
 		},
 		{
 			name:  "optional project and package reviewers",
 			input: []string{"~1", "2", "3", "~*5", "+6", "-7"},

-			prj: []string{"7", common.Bot_BuildReview},
+			prj: []string{"7"},
 			pkg: []string{"2", "3", "6"},
 			prj_optional: []string{"5"},
 			pkg_optional: []string{"1", "5"},
@@ -9,12 +9,14 @@ import (
 )

 type PRReviews struct {
-	reviews   []*models.PullReview
-	reviewers []string
-	comments  []*models.TimelineComment
+	Reviews            []*models.PullReview
+	RequestedReviewers []string
+	Comments           []*models.TimelineComment
+
+	FullTimeline []*models.TimelineComment
 }

-func FetchGiteaReviews(rf GiteaReviewTimelineFetcher, reviewers []string, org, repo string, no int64) (*PRReviews, error) {
+func FetchGiteaReviews(rf GiteaReviewTimelineFetcher, org, repo string, no int64) (*PRReviews, error) {
 	timeline, err := rf.GetTimeline(org, repo, no)
 	if err != nil {
 		return nil, err
@@ -25,10 +27,14 @@ func FetchGiteaReviews(rf GiteaReviewTimelineFetcher, reviewers []string, org, r
 		return nil, err
 	}

-	reviews := make([]*models.PullReview, 0, len(reviewers))
+	reviews := make([]*models.PullReview, 0, 10)
+	needNewReviews := []string{}
 	var comments []*models.TimelineComment

 	alreadyHaveUserReview := func(user string) bool {
+		if slices.Contains(needNewReviews, user) {
+			return true
+		}
 		for _, r := range reviews {
 			if r.User != nil && r.User.UserName == user {
 				return true
@@ -37,32 +43,40 @@ func FetchGiteaReviews(rf GiteaReviewTimelineFetcher, reviewers []string, org, r
 		return false
 	}

+	LogDebug("FetchingGiteaReviews for", org, repo, no)
+	LogDebug("Number of reviews:", len(rawReviews))
+	LogDebug("Number of items in timeline:", len(timeline))
+
+	cutOffIdx := len(timeline)
 	for idx, item := range timeline {
-		if item.Type == TimelineCommentType_Review {
+		if item.Type == TimelineCommentType_Review || item.Type == TimelineCommentType_ReviewRequested {
 			for _, r := range rawReviews {
 				if r.ID == item.ReviewID {
 					if !alreadyHaveUserReview(r.User.UserName) {
-						reviews = append(reviews, r)
+						if item.Type == TimelineCommentType_Review && idx > cutOffIdx {
+							needNewReviews = append(needNewReviews, r.User.UserName)
+						} else {
+							reviews = append(reviews, r)
+						}
 					}
 					break
 				}
 			}
-		} else if item.Type == TimelineCommentType_Comment {
+		} else if item.Type == TimelineCommentType_Comment && cutOffIdx > idx {
 			comments = append(comments, item)
-		} else if item.Type == TimelineCommentType_PushPull {
-			LogDebug("cut-off", item.Created)
-			timeline = timeline[0:idx]
-			break
+		} else if item.Type == TimelineCommentType_PushPull && cutOffIdx == len(timeline) {
+			LogDebug("cut-off", item.Created, "@", idx)
+			cutOffIdx = idx
 		} else {
 			LogDebug("Unhandled timeline type:", item.Type)
 		}
 	}
-	LogDebug("num comments:", len(comments), "reviews:", len(reviews), len(timeline))
+	LogDebug("num comments:", len(comments), "timeline:", len(reviews))

 	return &PRReviews{
-		reviews:   reviews,
-		reviewers: reviewers,
-		comments:  comments,
+		Reviews:      reviews,
+		Comments:     comments,
+		FullTimeline: timeline,
 	}, nil
 }

@@ -81,23 +95,27 @@ func bodyCommandManualMergeOK(body string) bool {
 }

 func (r *PRReviews) IsManualMergeOK() bool {
-	for _, c := range r.comments {
+	if r == nil {
+		return false
+	}
+
+	for _, c := range r.Comments {
 		if c.Updated != c.Created {
 			continue
 		}
 		LogDebug("comment:", c.User.UserName, c.Body)
-		if slices.Contains(r.reviewers, c.User.UserName) {
+		if slices.Contains(r.RequestedReviewers, c.User.UserName) {
 			if bodyCommandManualMergeOK(c.Body) {
 				return true
 			}
 		}
 	}

-	for _, c := range r.reviews {
+	for _, c := range r.Reviews {
 		if c.Updated != c.Submitted {
 			continue
 		}
-		if slices.Contains(r.reviewers, c.User.UserName) {
+		if slices.Contains(r.RequestedReviewers, c.User.UserName) {
 			if bodyCommandManualMergeOK(c.Body) {
 				return true
 			}
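The `if r == nil { return false }` guards added to these methods rely on a Go property worth spelling out: calling a pointer-receiver method on a nil pointer is legal, and the method body receives `r == nil`, so the guard lets callers chain calls like `reviews.IsApproved()` without their own nil checks. A minimal illustration (the `reviews` type here is a stand-in, not the real `PRReviews`):

```go
package main

import "fmt"

// reviews is a toy stand-in for PRReviews to show the nil-receiver guard.
type reviews struct{ approved bool }

// IsApproved is safe to call on a nil *reviews: Go dispatches the method
// with r == nil, and the early return turns that into a defined answer.
func (r *reviews) IsApproved() bool {
	if r == nil {
		return false
	}
	return r.approved
}

func main() {
	var r *reviews // nil pointer, method call is still valid
	fmt.Println(r.IsApproved())
}
```

Without the guard, the `r.approved` field access on a nil receiver would panic, which is exactly the failure mode the diff is defending against when review fetching returns no result.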
@@ -108,11 +126,14 @@ func (r *PRReviews) IsManualMergeOK() bool {
 }

 func (r *PRReviews) IsApproved() bool {
+	if r == nil {
+		return false
+	}
 	goodReview := true

-	for _, reviewer := range r.reviewers {
+	for _, reviewer := range r.RequestedReviewers {
 		goodReview = false
-		for _, review := range r.reviews {
+		for _, review := range r.Reviews {
 			if review.User.UserName == reviewer && review.State == ReviewStateApproved && !review.Stale && !review.Dismissed {
 				LogDebug(" -- found review: ", review.User.UserName)
 				goodReview = true
@@ -128,45 +149,78 @@ func (r *PRReviews) IsApproved() bool {
|
|||||||
return goodReview
|
return goodReview
|
||||||
}
|
}
|
||||||
|
|
||||||
func (r *PRReviews) HasPendingReviewBy(reviewer string) bool {
|
func (r *PRReviews) MissingReviews() []string {
|
||||||
if !slices.Contains(r.reviewers, reviewer) {
|
missing := []string{}
|
||||||
return false
|
if r == nil {
|
||||||
|
return missing
|
||||||
}
|
}
|
||||||
|
|
||||||
isPending := false
|
for _, reviewer := range r.RequestedReviewers {
|
||||||
for _, r := range r.reviews {
|
if !r.IsReviewedBy(reviewer) {
|
||||||
if r.User.UserName == reviewer && !r.Stale {
|
missing = append(missing, reviewer)
|
||||||
switch r.State {
|
}
|
||||||
case ReviewStateApproved:
|
}
|
||||||
fallthrough
|
return missing
|
||||||
case ReviewStateRequestChanges:
|
}
|
||||||
return false
|
|
||||||
case ReviewStateRequestReview:
|
func (r *PRReviews) FindReviewRequester(reviewer string) *models.TimelineComment {
|
||||||
fallthrough
|
if r == nil {
|
||||||
case ReviewStatePending:
|
return nil
|
||||||
isPending = true
|
}
|
||||||
}
|
|
||||||
|
for _, r := range r.FullTimeline {
|
||||||
|
if r.Type == TimelineCommentType_ReviewRequested && r.Assignee.UserName == reviewer {
|
||||||
|
return r
|
||||||
}
|
}
|
||||||
}
|
}
|
||||||
|
|
||||||
return isPending
|
return nil
|
||||||
}
|
}
|
||||||
|
|
||||||
func (r *PRReviews) IsReviewedBy(reviewer string) bool {
|
func (r *PRReviews) HasPendingReviewBy(reviewer string) bool {
|
||||||
if !slices.Contains(r.reviewers, reviewer) {
|
if r == nil {
|
||||||
return false
|
return false
|
||||||
}
|
}
|
||||||
|
|
||||||
for _, r := range r.reviews {
|
for _, r := range r.Reviews {
|
||||||
if r.User.UserName == reviewer && !r.Stale {
|
if r.User.UserName == reviewer {
|
||||||
switch r.State {
|
switch r.State {
|
||||||
case ReviewStateApproved:
|
case ReviewStateRequestReview, ReviewStatePending:
|
||||||
return true
|
|
||||||
case ReviewStateRequestChanges:
|
|
||||||
return true
|
return true
|
||||||
|
default:
|
||||||
|
return false
|
||||||
}
|
}
|
||||||
}
|
}
|
||||||
}
|
}
|
||||||
|
|
||||||
return false
|
return false
|
||||||
}
|
}
|
||||||
|
|
||||||
|
func (r *PRReviews) IsReviewedBy(reviewer string) bool {
|
||||||
|
if r == nil {
|
||||||
|
return false
|
||||||
|
}
|
||||||
|
|
||||||
|
for _, r := range r.Reviews {
|
||||||
|
if r.User.UserName == reviewer && !r.Stale {
|
||||||
|
switch r.State {
|
||||||
|
case ReviewStateApproved, ReviewStateRequestChanges:
|
||||||
|
return true
|
||||||
|
default:
|
||||||
|
return false
|
||||||
|
}
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
|
return false
|
||||||
|
}
|
||||||
|
|
||||||
|
func (r *PRReviews) IsReviewedByOneOf(reviewers ...string) bool {
|
||||||
|
for _, reviewer := range reviewers {
|
||||||
|
if r.IsReviewedBy(reviewer) {
|
||||||
|
return true
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
|
return false
|
||||||
|
}
|
||||||
|
|||||||
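Not part of the diff: a self-contained sketch of the review-state rules the rewritten helpers encode. A review counts as "reviewed" when it is approved or requests changes and is not stale; request-review and pending states do not. The types and names below are illustrative stand-ins, not the `PRReviews` struct from the `common` package.

```go
package main

import "fmt"

// review is a minimal stand-in for a Gitea pull-request review.
type review struct {
	user  string
	state string // "approved", "request_changes", "request_review", "pending"
	stale bool
}

// isReviewedBy mirrors the IsReviewedBy logic above: the first
// non-stale review by the reviewer decides the outcome.
func isReviewedBy(reviews []review, reviewer string) bool {
	for _, r := range reviews {
		if r.user == reviewer && !r.stale {
			switch r.state {
			case "approved", "request_changes":
				return true
			default:
				return false
			}
		}
	}
	return false
}

// missingReviews mirrors MissingReviews: requested reviewers that
// have not completed a review yet.
func missingReviews(reviews []review, requested []string) []string {
	missing := []string{}
	for _, reviewer := range requested {
		if !isReviewedBy(reviews, reviewer) {
			missing = append(missing, reviewer)
		}
	}
	return missing
}

func main() {
	rs := []review{
		{user: "user1", state: "request_review"},
		{user: "user2", state: "approved"},
	}
	fmt.Println(missingReviews(rs, []string{"user1", "user2"})) // prints [user1]
}
```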
@@ -62,11 +62,23 @@ func TestReviews(t *testing.T) {
 		{
 			name: "Two reviewer, one stale and pending",
 			reviews: []*models.PullReview{
-				&models.PullReview{State: common.ReviewStateRequestReview, User: &models.User{UserName: "user1"}, Stale: true},
+				{State: common.ReviewStateRequestReview, User: &models.User{UserName: "user1"}, Stale: true},
 			},
 			reviewers:        []string{"user1", "user2"},
 			isApproved:       false,
-			isPendingByTest1: false,
+			isPendingByTest1: true,
+			isReviewedByTest1: false,
+		},
+		{
+			name: "Two reviewer, one stale and pending, other done",
+			reviews: []*models.PullReview{
+				{State: common.ReviewStateRequestReview, User: &models.User{UserName: "user1"}},
+				{State: common.ReviewStateRequestChanges, User: &models.User{UserName: "user1"}},
+				{State: common.ReviewStateApproved, User: &models.User{UserName: "user2"}},
+			},
+			reviewers:        []string{"user1", "user2"},
+			isApproved:       false,
+			isPendingByTest1: true,
 			isReviewedByTest1: false,
 		},
 		{
@@ -139,7 +151,7 @@ func TestReviews(t *testing.T) {
 			rf.EXPECT().GetTimeline("test", "pr", int64(1)).Return(test.timeline, nil)
 			rf.EXPECT().GetPullRequestReviews("test", "pr", int64(1)).Return(test.reviews, test.fetchErr)
 
-			reviews, err := common.FetchGiteaReviews(rf, test.reviewers, "test", "pr", 1)
+			reviews, err := common.FetchGiteaReviews(rf, "test", "pr", 1)
 
 			if test.fetchErr != nil {
 				if err != test.fetchErr {
@@ -147,6 +159,7 @@ func TestReviews(t *testing.T) {
 				}
 				return
 			}
+			reviews.RequestedReviewers = test.reviewers
 
 			if r := reviews.IsApproved(); r != test.isApproved {
 				t.Fatal("Unexpected IsReviewed():", r, "vs. expected", test.isApproved)
@@ -113,6 +113,10 @@ func (s *Submodule) parseKeyValue(line string) error {
 	return nil
 }
 
+func (s *Submodule) ManifestSubmodulePath(manifest *Manifest) string {
+	return manifest.SubdirForPackage(s.Path)
+}
+
 func ParseSubmodulesFile(reader io.Reader) ([]Submodule, error) {
 	data, err := io.ReadAll(reader)
 	if err != nil {
@@ -1,6 +1,10 @@
 #!/usr/bin/bash
 
 git init -q --bare --object-format=sha256
+git config user.email test@example.com
+git config user.name Test
+export GIT_AUTHOR_DATE=2025-10-27T14:20:07+01:00
+export GIT_COMMITTER_DATE=2025-10-27T14:20:07+01:00
 
 # 81aba862107f1e2f5312e165453955485f424612f313d6c2fb1b31fef9f82a14
 blobA=$(echo "help" | git hash-object --stdin -w)
116 common/utils.go
@@ -27,10 +27,87 @@ import (
 	"regexp"
 	"slices"
 	"strings"
+	"unicode"
 
 	"src.opensuse.org/autogits/common/gitea-generated/models"
 )
 
+type NewRepos struct {
+	Repos []struct {
+		Organization, Repository, Branch string
+		PackageName                      string
+	}
+	IsMaintainer bool
+}
+
+const maintainership_line = "MAINTAINER"
+
+var true_lines []string = []string{"1", "TRUE", "YES", "OK", "T"}
+
+func HasSpace(s string) bool {
+	return strings.IndexFunc(s, unicode.IsSpace) >= 0
+}
+
+func FindNewReposInIssueBody(body string) *NewRepos {
+	Issues := &NewRepos{}
+	for _, line := range strings.Split(body, "\n") {
+		line = strings.TrimSpace(line)
+		if ul := strings.ToUpper(line); strings.HasPrefix(ul, "MAINTAINER") {
+			value := ""
+			if idx := strings.IndexRune(ul, ':'); idx > 0 && len(ul) > idx+2 {
+				value = ul[idx+1:]
+			} else if idx := strings.IndexRune(ul, ' '); idx > 0 && len(ul) > idx+2 {
+				value = ul[idx+1:]
+			}
+
+			if slices.Contains(true_lines, strings.TrimSpace(value)) {
+				Issues.IsMaintainer = true
+			}
+		}
+		// line = strings.TrimSpace(line)
+		issue := struct{ Organization, Repository, Branch, PackageName string }{}
+
+		branch := strings.Split(line, "#")
+		repo := strings.Split(branch[0], "/")
+
+		if len(branch) == 2 {
+			issue.Branch = strings.TrimSpace(branch[1])
+		}
+		if len(repo) == 2 {
+			issue.Organization = strings.TrimSpace(repo[0])
+			issue.Repository = strings.TrimSpace(repo[1])
+			issue.PackageName = issue.Repository
+
+			if idx := strings.Index(strings.ToUpper(issue.Branch), " AS "); idx > 0 && len(issue.Branch) > idx+5 {
+				issue.PackageName = strings.TrimSpace(issue.Branch[idx+3:])
+				issue.Branch = strings.TrimSpace(issue.Branch[0:idx])
+			}
+
+			if HasSpace(issue.Organization) || HasSpace(issue.Repository) || HasSpace(issue.PackageName) || HasSpace(issue.Branch) {
+				continue
+			}
+		} else {
+			continue
+		}
+		Issues.Repos = append(Issues.Repos, issue)
+		//PackageNameIdx := strings.Index(strings.ToUpper(line), " AS ")
+		//words := strings.Split(line)
+	}
+
+	if len(Issues.Repos) == 0 {
+		return nil
+	}
+	return Issues
+}
+
+func IssueToString(issue *models.Issue) string {
+	if issue == nil {
+		return "(nil)"
+	}
+
+	return fmt.Sprintf("%s/%s#%d", issue.Repository.Owner, issue.Repository.Name, issue.Index)
+}
+
 func SplitLines(str string) []string {
 	return SplitStringNoEmpty(str, "\n")
 }
@@ -54,6 +131,10 @@ func TranslateHttpsToSshUrl(url string) (string, error) {
 		url2_len = len(url2)
 	)
 
+	if len(url) > 10 && (url[0:10] == "gitea@src." || url[0:10] == "ssh://gite") {
+		return url, nil
+	}
+
 	if len(url) > url1_len && url[0:url1_len] == url1 {
 		return "ssh://gitea@src.opensuse.org/" + url[url1_len:], nil
 	}
@@ -132,7 +213,7 @@ func PRtoString(pr *models.PullRequest) string {
 		return "(null)"
 	}
 
-	return fmt.Sprintf("%s/%s#%d", pr.Base.Repo.Owner.UserName, pr.Base.Repo.Name, pr.Index)
+	return fmt.Sprintf("%s/%s!%d", pr.Base.Repo.Owner.UserName, pr.Base.Repo.Name, pr.Index)
 }
 
 type DevelProject struct {
@@ -164,9 +245,10 @@ func FetchDevelProjects() (DevelProjects, error) {
 }
 
 var DevelProjectNotFound = errors.New("Devel project not found")
+
 func (d DevelProjects) GetDevelProject(pkg string) (string, error) {
 	for _, item := range d {
 		if item.Package == pkg {
 			return item.Project, nil
 		}
 	}
@@ -174,3 +256,33 @@ func (d DevelProjects) GetDevelProject(pkg string) (string, error) {
 	return "", DevelProjectNotFound
 }
+
+var removedBranchNameSuffixes []string = []string{
+	"-rm",
+	"-removed",
+	"-deleted",
+}
+
+func findRemovedBranchSuffix(branchName string) string {
+	branchName = strings.ToLower(branchName)
+
+	for _, suffix := range removedBranchNameSuffixes {
+		if len(suffix) < len(branchName) && strings.HasSuffix(branchName, suffix) {
+			return suffix
+		}
+	}
+
+	return ""
+}
+
+func IsRemovedBranch(branchName string) bool {
+	return len(findRemovedBranchSuffix(branchName)) > 0
+}
+
+func TrimRemovedBranchSuffix(branchName string) string {
+	suffix := findRemovedBranchSuffix(branchName)
+	if len(suffix) > 0 {
+		return branchName[0 : len(branchName)-len(suffix)]
+	}
+
+	return branchName
+}
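Not part of the diff: a standalone sketch of the `org/repo#branch AS package` line format that `FindNewReposInIssueBody` above extracts from issue bodies. The `repoLine` type and `parseRepoLine` name are mine, for illustration only; the real code accumulates results into `NewRepos` and also rejects fields containing whitespace.

```go
package main

import (
	"fmt"
	"strings"
)

// repoLine holds one parsed "org/repo#branch AS package" request.
type repoLine struct {
	Org, Repo, Branch, Pkg string
}

// parseRepoLine splits a single line into its parts. The package
// name defaults to the repository name unless an " AS <name>"
// suffix overrides it (matched case-insensitively).
func parseRepoLine(line string) (repoLine, bool) {
	var r repoLine
	branch := strings.Split(line, "#")
	repo := strings.Split(branch[0], "/")
	if len(repo) != 2 {
		return r, false
	}
	r.Org = strings.TrimSpace(repo[0])
	r.Repo = strings.TrimSpace(repo[1])
	r.Pkg = r.Repo
	if len(branch) == 2 {
		r.Branch = strings.TrimSpace(branch[1])
	}
	if idx := strings.Index(strings.ToUpper(r.Branch), " AS "); idx > 0 {
		r.Pkg = strings.TrimSpace(r.Branch[idx+4:])
		r.Branch = strings.TrimSpace(r.Branch[:idx])
	}
	return r, true
}

func main() {
	r, ok := parseRepoLine("org/repo#branch as pkg")
	fmt.Println(ok, r.Org, r.Repo, r.Branch, r.Pkg) // prints: true org repo branch pkg
}
```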
@@ -1,6 +1,7 @@
 package common_test
 
 import (
+	"reflect"
 	"testing"
 
 	"src.opensuse.org/autogits/common"
@@ -165,3 +166,142 @@ func TestRemoteName(t *testing.T) {
 		})
 	}
 }
+
+func TestRemovedBranchName(t *testing.T) {
+	tests := []struct {
+		name        string
+		branchName  string
+		isRemoved   bool
+		regularName string
+	}{
+		{
+			name: "Empty branch",
+		},
+		{
+			name:        "Removed suffix only",
+			branchName:  "-rm",
+			isRemoved:   false,
+			regularName: "-rm",
+		},
+		{
+			name:        "Capital suffix",
+			branchName:  "Foo-Rm",
+			isRemoved:   true,
+			regularName: "Foo",
+		},
+		{
+			name:        "Other suffixes",
+			isRemoved:   true,
+			branchName:  "Goo-Rm-DeleteD",
+			regularName: "Goo-Rm",
+		},
+		{
+			name:        "Other suffixes",
+			isRemoved:   true,
+			branchName:  "main-REMOVED",
+			regularName: "main",
+		},
+		{
+			name:        "Not removed separator",
+			isRemoved:   false,
+			branchName:  "main;REMOVED",
+			regularName: "main;REMOVED",
+		},
+	}
+
+	for _, test := range tests {
+		t.Run(test.name, func(t *testing.T) {
+			if r := common.IsRemovedBranch(test.branchName); r != test.isRemoved {
+				t.Error("Expecting isRemoved:", test.isRemoved, "but received", r)
+			}
+
+			if tn := common.TrimRemovedBranchSuffix(test.branchName); tn != test.regularName {
+				t.Error("Expected stripped branch name to be:", test.regularName, "but have:", tn)
+			}
+		})
+	}
+}
+
+func TestNewPackageIssueParsing(t *testing.T) {
+	tests := []struct {
+		name   string
+		input  string
+		issues *common.NewRepos
+	}{
+		{
+			name: "Nothing",
+		},
+		{
+			name:  "Basic repo",
+			input: "org/repo#branch",
+			issues: &common.NewRepos{
+				Repos: []struct{ Organization, Repository, Branch, PackageName string }{
+					{Organization: "org", Repository: "repo", Branch: "branch", PackageName: "repo"},
+				},
+			},
+		},
+		{
+			name:  "Default branch and junk lines and approval for maintainership",
+			input: "\n\nsome comments\n\norg1/repo2\n\nmaintainership: yes",
+			issues: &common.NewRepos{
+				Repos: []struct{ Organization, Repository, Branch, PackageName string }{
+					{Organization: "org1", Repository: "repo2", Branch: "", PackageName: "repo2"},
+				},
+				IsMaintainer: true,
+			},
+		},
+		{
+			name:  "Default branch and junk lines and no maintainership",
+			input: "\n\nsome comments\n\norg1/repo2\n\nmaintainership: NEVER",
+			issues: &common.NewRepos{
+				Repos: []struct{ Organization, Repository, Branch, PackageName string }{
+					{Organization: "org1", Repository: "repo2", Branch: "", PackageName: "repo2"},
+				},
+			},
+		},
+		{
+			name:  "3 repos with comments and maintainership",
+			input: "\n\nsome comments for org1/repo2 are here and more\n\norg1/repo2#master\n org2/repo3#master\n some/repo3#m\nMaintainer ok",
+			issues: &common.NewRepos{
+				Repos: []struct{ Organization, Repository, Branch, PackageName string }{
+					{Organization: "org1", Repository: "repo2", Branch: "master", PackageName: "repo2"},
+					{Organization: "org2", Repository: "repo3", Branch: "master", PackageName: "repo3"},
+					{Organization: "some", Repository: "repo3", Branch: "m", PackageName: "repo3"},
+				},
+				IsMaintainer: true,
+			},
+		},
+		{
+			name:  "Invalid repos with spaces",
+			input: "or g/repo#branch\norg/r epo#branch\norg/repo#br anch\norg/repo#branch As foo ++",
+		},
+		{
+			name:  "Valid repos with spaces",
+			input: " org / repo # branch",
+			issues: &common.NewRepos{
+				Repos: []struct{ Organization, Repository, Branch, PackageName string }{
+					{Organization: "org", Repository: "repo", Branch: "branch", PackageName: "repo"},
+				},
+			},
+		},
+		{
+			name:  "Package name is not repo name",
+			input: " org / repo # branch as repo++ \nmaintainer true",
+			issues: &common.NewRepos{
+				Repos: []struct{ Organization, Repository, Branch, PackageName string }{
+					{Organization: "org", Repository: "repo", Branch: "branch", PackageName: "repo++"},
+				},
+				IsMaintainer: true,
+			},
+		},
+	}
+
+	for _, test := range tests {
+		t.Run(test.name, func(t *testing.T) {
+			issue := common.FindNewReposInIssueBody(test.input)
+			if !reflect.DeepEqual(test.issues, issue) {
+				t.Error("Expected", test.issues, "but have", issue)
+			}
+		})
+	}
+}
8 containers/Makefile Normal file
@@ -0,0 +1,8 @@
+all: ../workflow-direct/workflow-direct
+	cp ../workflow-direct/workflow-direct workflow-direct
+	podman build --pull=always -t workflow-direct workflow-direct
+
+pr:
+	cp ../workflow-pr/workflow-pr workflow-pr
+	podman build --pull=always -t workflow-pr workflow-pr
+
1 containers/workflow-direct/.gitignore vendored Normal file
@@ -0,0 +1 @@
+workflow-direct
14 containers/workflow-direct/Containerfile Normal file
@@ -0,0 +1,14 @@
+FROM registry.suse.com/bci/bci-base
+RUN zypper install -y openssh-clients git-core
+RUN mkdir /root/.ssh
+RUN mkdir /repos
+RUN ln -s /data/workflow-direct.key /root/.ssh/id_ed25519
+RUN ln -s /data/workflow-direct.key.pub /root/.ssh/id_ed25519.pub
+ADD known_hosts /root/.ssh/known_hosts
+ADD workflow-direct /srv/workflow-direct
+ENV AMQP_USERNAME=opensuse
+ENV AMQP_PASSWORD=opensuse
+VOLUME /data
+VOLUME /repos
+ENTRYPOINT /srv/workflow-direct -config /data/config.json -repo-path /repos -debug -check-on-start
+
4 containers/workflow-direct/known_hosts Normal file
@@ -0,0 +1,4 @@
+src.opensuse.org,195.135.223.224 ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBDJ8V51MVIFUkQqQOdHwC3SP9NPqp1ZWYoEbcjvZ7HhSFi2XF8ALo/h1Mk+q8kT2O75/goeTsKFbcU8zrYFeOh0=
+src.opensuse.org,195.135.223.224 ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQCkVeXePin0haffC085V2L0jvILfwbB2Mt1fpVe21QAOcWNM+/jOC5RwtWweV/LigHImB39/KvkuPa9yLoDf+eLhdZQckSSauRfDjxtlKeFLPrfJKSA0XeVJT3kJcOvDT/3ANFhYeBbAUBTAeQt5bi2hHC1twMPbaaEdJ2jiMaIBztFf6aE9K58uoS+7Y2tTv87Mv/7lqoBW6BFMoDmjQFWgjik6ZMCvIM/7bj7AgqHk/rjmr5zKS4ag5wtHtYLm1L3LBmHdj7d0VFsOpPQexIOEnnjzKqlwmAxT6eYJ/t3qgBlT8KRfshBFgEuUZ5GJOC7TOne4PfB0bboPMZzIRo3WE9dPGRR8kAIme8XqhFbmjdJ+WsTjg0Lj+415tIbyRQoNkLtawrJxozvevs6wFEFcA/YG6o03Z577tiLT3WxOguCcD5vrALH48SyZb8jDUtcVgTWMW0to/n63S8JGUNyF7Bkw9HQWUx+GO1cv2GNzKpk22KS5dlNUVGE9E/7Ydc=
+src.opensuse.org,195.135.223.224 ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIFKNThLRPznU5Io1KrAYHmYpaoLQEMGM9nwpKyYQCkPx
+
1 containers/workflow-pr/.gitignore vendored Normal file
@@ -0,0 +1 @@
+workflow-pr
14 containers/workflow-pr/Containerfile Normal file
@@ -0,0 +1,14 @@
+FROM registry.suse.com/bci/bci-base
+RUN zypper install -y openssh-clients git-core
+RUN mkdir /root/.ssh
+RUN mkdir /repos
+RUN ln -s /data/workflow-pr.key /root/.ssh/id_ed25519
+RUN ln -s /data/workflow-pr.key.pub /root/.ssh/id_ed25519.pub
+ADD known_hosts /root/.ssh/known_hosts
+ADD workflow-pr /srv/workflow-pr
+ENV AMQP_USERNAME=opensuse
+ENV AMQP_PASSWORD=opensuse
+VOLUME /data
+VOLUME /repos
+ENTRYPOINT /srv/workflow-pr -config /data/config.json -repo-path /repos -debug -check-on-start
+
4 containers/workflow-pr/known_hosts Normal file
@@ -0,0 +1,4 @@
+src.opensuse.org,195.135.223.224 ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBDJ8V51MVIFUkQqQOdHwC3SP9NPqp1ZWYoEbcjvZ7HhSFi2XF8ALo/h1Mk+q8kT2O75/goeTsKFbcU8zrYFeOh0=
+src.opensuse.org,195.135.223.224 ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQCkVeXePin0haffC085V2L0jvILfwbB2Mt1fpVe21QAOcWNM+/jOC5RwtWweV/LigHImB39/KvkuPa9yLoDf+eLhdZQckSSauRfDjxtlKeFLPrfJKSA0XeVJT3kJcOvDT/3ANFhYeBbAUBTAeQt5bi2hHC1twMPbaaEdJ2jiMaIBztFf6aE9K58uoS+7Y2tTv87Mv/7lqoBW6BFMoDmjQFWgjik6ZMCvIM/7bj7AgqHk/rjmr5zKS4ag5wtHtYLm1L3LBmHdj7d0VFsOpPQexIOEnnjzKqlwmAxT6eYJ/t3qgBlT8KRfshBFgEuUZ5GJOC7TOne4PfB0bboPMZzIRo3WE9dPGRR8kAIme8XqhFbmjdJ+WsTjg0Lj+415tIbyRQoNkLtawrJxozvevs6wFEFcA/YG6o03Z577tiLT3WxOguCcD5vrALH48SyZb8jDUtcVgTWMW0to/n63S8JGUNyF7Bkw9HQWUx+GO1cv2GNzKpk22KS5dlNUVGE9E/7Ydc=
+src.opensuse.org,195.135.223.224 ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIFKNThLRPznU5Io1KrAYHmYpaoLQEMGM9nwpKyYQCkPx
+
1 devel-importer/.gitignore vendored
@@ -2,3 +2,4 @@ devel-importer
 Factory
 git
 git-migrated
+git-importer
239 devel-importer/find_factory_commit.pl Executable file
@@ -0,0 +1,239 @@
+#!/usr/bin/perl
+use strict;
+use warnings;
+use IPC::Open2;
+use JSON;
+
+sub FindFactoryCommit {
+    my ($package) = @_;
+
+    # Execute osc cat and capture output
+    my $osc_cmd = "osc cat openSUSE:Factory $package $package.changes";
+    open( my $osc_fh, "$osc_cmd |" ) or die "Failed to run osc: $!";
+    my $data = do { local $/; <$osc_fh> };
+    close($osc_fh);
+
+    # Calculate size
+    my $size = length($data);
+
+    # Create blob header
+    my $blob = "blob $size\0$data";
+
+    # Open a pipe to openssl to compute the hash
+    my ( $reader, $writer );
+    my $pid = open2( $reader, $writer, "openssl sha256" );
+
+    # Send blob data
+    print $writer $blob;
+    close $writer;
+
+    # Read the hash result and extract it
+    my $hash_line = <$reader>;
+    waitpid( $pid, 0 );
+    my ($hash) = $hash_line =~ /([a-fA-F0-9]{64})/;
+
+    # Run git search command with the hash
+    print("looking for hash: $hash\n");
+    my @hashes;
+    my $git_cmd =
+      "git -C $package rev-list --all pool/HEAD | while read commit; do git -C $package ls-tree \"\$commit\" | grep -q '^100644 blob $hash' && echo \"\$commit\"; done";
+    open( my $git_fh, "$git_cmd |" ) or die "Failed to run git search: $!";
+    while ( my $commit = <$git_fh> ) {
+        chomp $commit;
+        print "Found commit $commit\n";
+        push( @hashes, $commit );
+    }
+    close($git_fh);
+    return @hashes;
+}
+
+sub ListPackages {
+    my ($project) = @_;
+    open( my $osc_fh,
+        "curl -s https://src.opensuse.org/openSUSE/Factory/raw/branch/main/pkgs/_meta/devel_packages | awk '{ if ( \$2 == \"$project\" ) print \$1 }' |" )
+      or die "Failed to run curl: $!";
+    my @packages = <$osc_fh>;
+    chomp @packages;
+    close($osc_fh);
+    return @packages;
+}
+
+sub FactoryMd5 {
+    my ($package) = @_;
+    my $out = "";
+
+    if (system("osc ls openSUSE:Factory $package | grep -q build.specials.obscpio") == 0) {
+        system("mkdir _extract") == 0 || die "_extract exists or can't make it. Aborting.";
+        chdir("_extract") || die;
+        system("osc cat openSUSE:Factory $package build.specials.obscpio | cpio -dium 2> /dev/null") == 0 || die;
+        system("rm .* 2> /dev/null");
+        open( my $fh, "find -type f -exec /usr/bin/basename {} \\; | xargs md5sum | awk '{print \$1 FS \$2}' | grep -v d41d8cd98f00b204e9800998ecf8427e |") or die;
+        while ( my $l = <$fh>) {
+            $out = $out.$l;
+        }
+        close($fh);
+        chdir("..") && system("rm -rf _extract") == 0 || die;
+    }
+    open( my $fh, "osc ls -v openSUSE:Factory $package | awk '{print \$1 FS \$7}' | grep -v -F '_scmsync.obsinfo\nbuild.specials.obscpio' |") or die;
+    while (my $l = <$fh>) {
+        $out = $out.$l;
+    }
+    close($fh);
+    return $out;
+}
+
+# Read project from first argument
+sub Usage {
+    die "Usage: $0 <OBS Project> [org [package]]";
+}
+
+my $project = shift or Usage();
+my $org = shift;
+
+if (not defined($org)) {
+    $org = `osc meta prj $project | grep scmsync | sed -e 's,^.*src.opensuse.org/\\(.*\\)/_ObsPrj.*,\\1,'`;
+    chomp($org);
+}
+
+my @packages = ListPackages($project);
+my $pkg = shift;
+@packages = ($pkg) if defined $pkg;
+
+my @tomove;
+my @toremove;
+
+if ( ! -e $org ) {
+    mkdir($org);
+}
+chdir($org);
+print "Verify packages in /pool for $org package in $project\n";
+
+my $super_user = $ENV{SUPER};
+if (defined($super_user)) {
+    $super_user = "-G $super_user";
+} else {
+    $super_user = "";
+}
+
+my @missing;
+
+# verify that packages in devel project is a fork from pool.
+for my $pkg ( sort(@packages) ) {
+    my $data = `git obs api /repos/$org/$pkg 2> /dev/null`;
+    if ( length($data) == 0 ) {
+        print "***** Repo missing in $org: $pkg\n";
+        push(@missing, $pkg);
+        next;
+    }
+    else {
+        my $repo = decode_json($data);
+        if ( !$repo->{parent}
+            || $repo->{parent}->{owner}->{username} ne "pool" )
+        {
+            if ( system("git obs api /repos/pool/$pkg > /dev/null 2> /dev/null") == 0 ) {
+                print "=== $pkg NOT A FORK of exiting package\n";
+                push( @toremove, $pkg );
+            }
+            else {
+                print "$pkg NEEDS transfer\n";
+                push( @tomove, $pkg );
+            }
+        }
+    }
+}
+
+if ( scalar @missing > 0 ) {
+    for my $pkg (@missing) {
+        my $index = 0;
+        $index++ until $packages[$index] eq $pkg;
+        splice(@packages, $index, 1);
+    }
+}
+
+if ( scalar @toremove > 0 ) {
+    print "ABORTING. Need repos removed.\n";
+    print "@toremove\n";
+    exit(1);
+}
+
+if ( scalar @tomove > 0 ) {
+    for my $pkg (@tomove) {
+        system("git obs $super_user api -X POST --data '{\"reparent\": true, \"organization\": \"pool\"}' /repos/$org/$pkg/forks") == 0 and
+        system("git clone gitea\@src.opensuse.org:pool/$pkg") == 0 and
+        system("git -C $pkg checkout -B factory HEAD") == 0 and
+        system("git -C $pkg push origin factory") == 0 and
+        system("git obs $super_user api -X PATCH --data '{\"default_branch\": \"factory\"}' /repos/pool/$pkg") == 0
+          or die "Error in creating a pool repo";
+        system("for i in \$(git -C $pkg for-each-ref --format='%(refname:lstrip=3)' refs/remotes/origin/ | grep -v '\\(^HEAD\$\\|^factory\$\\)'); do git -C $pkg push origin :\$i; done") == 0 or die "failed to cull branches";
+    }
+}
+
+print "Verify complete.\n";
+
+for my $package ( sort(@packages) ) {
+    print " ----- PROCESSING $package\n";
+    my $url = "https://src.opensuse.org/$org/$package.git";
+    my $push_url = "gitea\@src.opensuse.org:pool/$package.git";
+    if ( not -e $package ) {
+        print("cloning...\n");
+        system("git clone --origin pool $url") == 0
+          or die "Can't clone $org/$package";
+    }
+    else {
+        print("adding remote...\n");
+        system("git -C $package remote rm pool > /dev/null");
+        system("git -C $package remote add pool $url") == 0
+          or die "Can't add pool for $package";
+    }
+    system("git -C $package remote set-url pool --push $push_url") == 0
+      or die "Can't add push remote for $package";
+    print("fetching remote...\n");
+    system("git -C $package fetch pool") == 0
+      or ( push( @tomove, $package ) and die "Can't fetch pool for $package" );
+
+    my @commits = FindFactoryCommit($package);
+    my $Md5Hashes = FactoryMd5($package);
+    my $c;
+    my $match = 0;
+    for my $commit (@commits) {
+        if ( length($commit) != 64 ) {
+            print("Failed to find factory commit. Aborting.");
+            exit(1);
+        }
+
+        if (
+            system("git -C $package lfs fetch pool $commit") == 0
+            and system("git -C $package checkout -B factory $commit") == 0
+            and system("git -C $package lfs checkout") == 0
+            and chdir($package)) {
+
+            open(my $fh, "|-", "md5sum -c --quiet") or die $!;
+            print $fh $Md5Hashes;
+            close $fh;
+            if ($? >> 8 != 0) {
+                chdir("..") || die;
+                next;
+            }
+            open($fh, "|-", "awk '{print \$2}' | sort | bash -c \"diff <(ls -1 | sort) -\"") or die $!;
+            print $fh $Md5Hashes;
+            close $fh;
+            my $ec = $? >> 8;
+            chdir("..") || die;
+
+            if ($ec == 0) {
+                $c = $commit;
+                $match = 1;
+                last;
+            }
+
+        }
+    }
+
+    if ( !$match ) {
+        die "Match not found. Aborting.";
+    }
+
+    system ("git -C $package push -f pool factory");
+    print "$package: $c\n";
+}
+
File diff suppressed because it is too large.
@@ -1,15 +1,25 @@
```
Java:packages
Kernel:firmware
Kernel:kdump
devel:gcc
devel:languages:clojure
devel:languages:erlang
devel:languages:erlang:Factory
devel:languages:hare
devel:languages:javascript
devel:languages:lua
devel:languages:nodejs
devel:languages:perl
devel:languages:python:Factory
devel:languages:python:mailman
devel:languages:python:pytest
devel:openSUSE:Factory
network:chromium
network:dhcp
network:im:whatsapp
network:messaging:xmpp
science:HPC
server:dns
systemsmanagement:cockpit
systemsmanagement:wbem
X11:lxde
```
@@ -298,6 +298,22 @@ func parseRequestJSONOrg(reqType string, data []byte) (org *common.Organization,
```go
		org = pr.Repository.Owner
		extraAction = ""

	case common.RequestType_Status:
		status := common.StatusWebhookEvent{}
		if err = json.Unmarshal(data, &status); err != nil {
			return
		}
		switch status.State {
		case "pending", "success", "error", "failure":
			break
		default:
			err = fmt.Errorf("Unknown Status' state: %s", status.State)
			return
		}

		org = status.Repository.Owner
		extraAction = status.State

	case common.RequestType_Wiki:
		wiki := common.WikiWebhookEvent{}
		if err = json.Unmarshal(data, &wiki); err != nil {
```
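The added `RequestType_Status` case accepts only Gitea's four commit-status states. That validation can be exercised in isolation; this is a sketch with an illustrative function name, not code from the repository:

```go
package main

import "fmt"

// validState mirrors the switch in parseRequestJSONOrg: a Gitea commit
// status state may only be pending, success, error, or failure.
func validState(state string) error {
	switch state {
	case "pending", "success", "error", "failure":
		return nil
	default:
		return fmt.Errorf("Unknown Status' state: %s", state)
	}
}

func main() {
	fmt.Println(validState("success"))
	fmt.Println(validState("bogus"))
}
```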
gitea_status_proxy/config.go (new file, 46 lines):
@@ -0,0 +1,46 @@
```go
package main

import (
	"encoding/json"
	"fmt"
	"io"
	"os"

	"github.com/tailscale/hujson"
)

type Config struct {
	ForgeEndpoint string   `json:"forge_url"`
	Keys          []string `json:"keys"`
}

type contextKey string

const configKey contextKey = "config"

func ReadConfig(reader io.Reader) (*Config, error) {
	data, err := io.ReadAll(reader)
	if err != nil {
		return nil, fmt.Errorf("error reading config data: %w", err)
	}
	config := Config{}
	data, err = hujson.Standardize(data)
	if err != nil {
		return nil, fmt.Errorf("failed to parse json: %w", err)
	}
	if err := json.Unmarshal(data, &config); err != nil {
		return nil, fmt.Errorf("error parsing json to api keys and target url: %w", err)
	}

	return &config, nil
}

func ReadConfigFile(filename string) (*Config, error) {
	file, err := os.Open(filename)
	if err != nil {
		return nil, fmt.Errorf("cannot open config file for reading. err: %w", err)
	}
	defer file.Close()

	return ReadConfig(file)
}
```
gitea_status_proxy/handlers.go (new file, 15 lines):
@@ -0,0 +1,15 @@
```go
package main

import (
	"context"
	"net/http"
)

func ConfigMiddleWare(cfg *Config) func(http.Handler) http.Handler {
	return func(next http.Handler) http.Handler {
		return http.HandlerFunc(func(w http.ResponseWriter, r *http.Request) {
			ctx := context.WithValue(r.Context(), configKey, cfg)
			next.ServeHTTP(w, r.WithContext(ctx))
		})
	}
}
```
gitea_status_proxy/main.go (new file, 164 lines):
@@ -0,0 +1,164 @@
```go
package main

import (
	"bytes"
	"encoding/json"
	"flag"
	"fmt"
	"io"
	"net/http"
	"os"
	"slices"
	"strings"

	"src.opensuse.org/autogits/common"
)

type StatusInput struct {
	Description string `json:"description"`
	Context     string `json:"context"`
	State       string `json:"state"`
	TargetUrl   string `json:"target_url"`
}

func main() {
	configFile := flag.String("config", "", "status proxy config file")
	flag.Parse()

	if *configFile == "" {
		common.LogError("missing required argument config")
		return
	}

	config, err := ReadConfigFile(*configFile)
	if err != nil {
		common.LogError("Failed to read config file", err)
		return
	}

	mux := http.NewServeMux()
	mux.Handle("/repos/{owner}/{repo}/statuses/{sha}", ConfigMiddleWare(config)(http.HandlerFunc(StatusProxy)))

	common.LogInfo("server up and listening on :3000")
	err = http.ListenAndServe(":3000", mux)
	if err != nil {
		common.LogError("Server failed to start up", err)
	}
}

func StatusProxy(w http.ResponseWriter, r *http.Request) {
	if r.Method == http.MethodPost {
		config, ok := r.Context().Value(configKey).(*Config)
		if !ok {
			common.LogDebug("Config missing from context")
			http.Error(w, http.StatusText(http.StatusInternalServerError), http.StatusInternalServerError)
			return
		}

		header := r.Header.Get("Authorization")
		if header == "" {
			common.LogDebug("Authorization header not found")
			http.Error(w, http.StatusText(http.StatusUnauthorized), http.StatusUnauthorized)
			return
		}
		token_arr := strings.Split(header, " ")
		if len(token_arr) != 2 {
			common.LogDebug("Authorization header malformed")
			http.Error(w, http.StatusText(http.StatusUnauthorized), http.StatusUnauthorized)
			return
		}

		if !strings.EqualFold(token_arr[0], "token") {
			common.LogDebug("Token not found in Authorization header")
			http.Error(w, http.StatusText(http.StatusUnauthorized), http.StatusUnauthorized)
			return
		}

		token := token_arr[1]

		if !slices.Contains(config.Keys, token) {
			common.LogDebug("Provided token is not known")
			http.Error(w, http.StatusText(http.StatusUnauthorized), http.StatusUnauthorized)
			return
		}

		owner := r.PathValue("owner")
		repo := r.PathValue("repo")
		sha := r.PathValue("sha")

		posturl := fmt.Sprintf("%s/repos/%s/%s/statuses/%s", config.ForgeEndpoint, owner, repo, sha)
		decoder := json.NewDecoder(r.Body)
		var statusinput StatusInput
		err := decoder.Decode(&statusinput)
		if err != nil {
			http.Error(w, http.StatusText(http.StatusBadRequest), http.StatusBadRequest)
			return
		}

		status_payload, err := json.Marshal(statusinput)
		if err != nil {
			http.Error(w, http.StatusText(http.StatusBadRequest), http.StatusBadRequest)
			return
		}
		client := &http.Client{}
		req, err := http.NewRequest("POST", posturl, bytes.NewBuffer(status_payload))
		if err != nil {
			http.Error(w, http.StatusText(http.StatusBadRequest), http.StatusBadRequest)
			return
		}
		ForgeToken := os.Getenv("GITEA_TOKEN")
		if ForgeToken == "" {
			http.Error(w, http.StatusText(http.StatusInternalServerError), http.StatusInternalServerError)
			common.LogError("GITEA_TOKEN was not set, all requests will fail")
			return
		}

		req.Header.Add("Content-Type", "application/json")
		req.Header.Add("Authorization", fmt.Sprintf("token %s", ForgeToken))

		resp, err := client.Do(req)
		if err != nil {
			common.LogError(fmt.Sprintf("Request to forge endpoint failed: %v", err))
			http.Error(w, http.StatusText(http.StatusBadGateway), http.StatusBadGateway)
			return
		}
		defer resp.Body.Close()

		w.Header().Set("Content-Type", "application/json")
		w.WriteHeader(resp.StatusCode)

		/*
			the commented-out section sets every key/value
			from the headers; unsure if this leaks
			information from gitea

			for k, v := range resp.Header {
				for _, vv := range v {
					w.Header().Add(k, vv)
				}
			}
		*/

		_, err = io.Copy(w, resp.Body)
		if err != nil {
			common.LogError("Error copying response body: %v", err)
		}
	} else {
		http.Error(w, http.StatusText(http.StatusMethodNotAllowed), http.StatusMethodNotAllowed)
		return
	}
}
```
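The proxy authenticates callers with an `Authorization: token <key>` header before forwarding anything to Gitea. The three header checks in `StatusProxy` can be sketched as one standalone function; `parseProxyToken` is an illustrative name, not part of the proxy:

```go
package main

import (
	"fmt"
	"slices"
	"strings"
)

// parseProxyToken applies the same checks StatusProxy performs on the
// Authorization header: exactly two fields, a case-insensitive "token"
// scheme, and the key must appear in the configured key list.
func parseProxyToken(header string, keys []string) (string, bool) {
	parts := strings.Split(header, " ")
	if len(parts) != 2 || !strings.EqualFold(parts[0], "token") {
		return "", false
	}
	if !slices.Contains(keys, parts[1]) {
		return "", false
	}
	return parts[1], true
}

func main() {
	keys := []string{"secret123"}
	fmt.Println(parseProxyToken("token secret123", keys))
	fmt.Println(parseProxyToken("Bearer secret123", keys))
}
```

Any failure maps to a plain 401 in the handler, so callers cannot distinguish an unknown key from a malformed header.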
gitea_status_proxy/readme.md (new file, 48 lines):

# gitea_status_proxy

Allows bots without code owner permission to set Gitea's commit status.

## Basic usage

To begin, you need a JSON config and a Gitea token with permissions for the repository you want to write to.

Keys should be randomly generated, e.g. by using openssl: `openssl rand -base64 48`

Generate a JSON config file with the key generated from the command above, and save it as example.json:

```
{
    "forge_url": "https://src.opensuse.org/api/v1",
    "keys": ["$YOUR_TOKEN_GOES_HERE"]
}
```

### start the proxy:

```
GITEA_TOKEN=YOURTOKEN ./gitea_status_proxy -config example.json
2025/10/30 12:53:18 [I] server up and listening on :3000
```

The proxy now accepts requests at `localhost:3000/repos/{owner}/{repo}/statuses/{sha}`. The token used when authenticating to the proxy must be in the `keys` list of the configuration JSON file (example.json above).

### example:

In a separate terminal, you can use curl to post a status to the proxy. If the GITEA_TOKEN has permissions on the target repository, this results in a new status being set for the given commit:

```
curl -X 'POST' \
  'localhost:3000/repos/szarate/test-actions-gitea/statuses/cd5847c92fb65a628bdd6015f96ee7e569e1ad6e4fc487acc149b52e788262f9' \
  -H 'accept: application/json' \
  -H 'Authorization: token $YOUR_TOKEN_GOES_HERE' \
  -H 'Content-Type: application/json' \
  -d '{
    "context": "Proxy test",
    "description": "Status posted from the proxy",
    "state": "success",
    "target_url": "https://src.opensuse.org"
  }'
```

After this you should be able to see the results in the pull request, e.g. from above: https://src.opensuse.org/szarate/test-actions-gitea/pulls/1
go.mod:
@@ -10,12 +10,16 @@ require (
```
	github.com/go-openapi/validate v0.24.0
	github.com/opentracing/opentracing-go v1.2.0
	github.com/rabbitmq/amqp091-go v1.10.0
	github.com/redis/go-redis/v9 v9.11.0
	github.com/tailscale/hujson v0.0.0-20250226034555-ec1d1c113d33
	go.uber.org/mock v0.5.0
	gopkg.in/yaml.v3 v3.0.1
)

require (
	github.com/asaskevich/govalidator v0.0.0-20230301143203-a9d515a09cc2 // indirect
	github.com/cespare/xxhash/v2 v2.3.0 // indirect
	github.com/dgryski/go-rendezvous v0.0.0-20200823014737-9f7001d12a5f // indirect
	github.com/go-logr/logr v1.4.1 // indirect
	github.com/go-logr/stdr v1.2.2 // indirect
	github.com/go-openapi/analysis v0.23.0 // indirect
```
@@ -33,5 +37,4 @@ require (
```
	go.opentelemetry.io/otel/metric v1.24.0 // indirect
	go.opentelemetry.io/otel/trace v1.24.0 // indirect
	golang.org/x/sync v0.7.0 // indirect
)
```
go.sum:
@@ -1,8 +1,16 @@
```
github.com/asaskevich/govalidator v0.0.0-20230301143203-a9d515a09cc2 h1:DklsrG3dyBCFEj5IhUbnKptjxatkF07cF2ak3yi77so=
github.com/asaskevich/govalidator v0.0.0-20230301143203-a9d515a09cc2/go.mod h1:WaHUgvxTVq04UNunO+XhnAqY/wQc+bxr74GqbsZ/Jqw=
github.com/bsm/ginkgo/v2 v2.12.0 h1:Ny8MWAHyOepLGlLKYmXG4IEkioBysk6GpaRTLC8zwWs=
github.com/bsm/ginkgo/v2 v2.12.0/go.mod h1:SwYbGRRDovPVboqFv0tPTcG1sN61LM1Z4ARdbAV9g4c=
github.com/bsm/gomega v1.27.10 h1:yeMWxP2pV2fG3FgAODIY8EiRE3dy0aeFYt4l7wh6yKA=
github.com/bsm/gomega v1.27.10/go.mod h1:JyEr/xRbxbtgWNi8tIEVPUYZ5Dzef52k01W3YH0H+O0=
github.com/cespare/xxhash/v2 v2.3.0 h1:UL815xU9SqsFlibzuggzjXhog7bL6oX9BbNZnL2UFvs=
github.com/cespare/xxhash/v2 v2.3.0/go.mod h1:VGX0DQ3Q6kWi7AoAeZDth3/j3BFtOZR5XLFGgcrjCOs=
github.com/davecgh/go-spew v1.1.0/go.mod h1:J7Y8YcW2NihsgmVo/mv3lAwl/skON4iLHjSsI+c5H38=
github.com/davecgh/go-spew v1.1.1 h1:vj9j/u1bqnvCEfJOwUhtlOARqs3+rkHYY13jYWTU97c=
github.com/davecgh/go-spew v1.1.1/go.mod h1:J7Y8YcW2NihsgmVo/mv3lAwl/skON4iLHjSsI+c5H38=
github.com/dgryski/go-rendezvous v0.0.0-20200823014737-9f7001d12a5f h1:lO4WD4F/rVNCu3HqELle0jiPLLBs70cWOduZpkS1E78=
github.com/dgryski/go-rendezvous v0.0.0-20200823014737-9f7001d12a5f/go.mod h1:cuUVRXasLTGF7a8hSLbxyZXjz+1KgoB3wDUb6vlszIc=
github.com/go-logr/logr v1.2.2/go.mod h1:jdQByPbusPIv2/zmleS9BjJVeZ6kBagPoEUsqbVz/1A=
github.com/go-logr/logr v1.4.1 h1:pKouT5E8xu9zeFC39JXRDukb6JFQPXM5p5I91188VAQ=
github.com/go-logr/logr v1.4.1/go.mod h1:9T104GzyrTigFIr8wt5mBrctHMim0Nb2HLGrmQ40KvY=
```
@@ -50,6 +58,8 @@ github.com/pmezard/go-difflib v1.0.0 h1:4DBwDE0NGyQoBHbLQYPwSUPoCMWR5BEzIk/f1lZb
```
github.com/pmezard/go-difflib v1.0.0/go.mod h1:iKH77koFhYxTK1pcRnkKkqfTogsbg7gZNVY4sRDYZ/4=
github.com/rabbitmq/amqp091-go v1.10.0 h1:STpn5XsHlHGcecLmMFCtg7mqq0RnD+zFr4uzukfVhBw=
github.com/rabbitmq/amqp091-go v1.10.0/go.mod h1:Hy4jKW5kQART1u+JkDTF9YYOQUHXqMuhrgxOEeS7G4o=
github.com/redis/go-redis/v9 v9.11.0 h1:E3S08Gl/nJNn5vkxd2i78wZxWAPNZgUNTp8WIJUAiIs=
github.com/redis/go-redis/v9 v9.11.0/go.mod h1:huWgSWd8mW6+m0VPhJjSSQ+d6Nh1VICQ6Q5lHuCH/Iw=
github.com/rogpeppe/go-internal v1.11.0 h1:cWPaGQEPrBb5/AsnsZesgZZ9yb1OQ+GOISoDNXVBh4M=
github.com/rogpeppe/go-internal v1.11.0/go.mod h1:ddIwULY96R17DhadqLgMfk9H9tvdUzkipdSkR5nkCZA=
github.com/stretchr/objx v0.1.0/go.mod h1:HFkY916IF+rwdDfMAkV7OtwuqBVzrE8GR6GFx+wExME=
```
Group Review Bot
================

This workaround is mainly needed because Gitea does not track which team member performed a review on behalf of a team.

Main Tasks
----------

Awaits a comment in the format "@groupreviewbot-name: approve", then approves the PR with the comment "<user> approved a review on behalf of <groupreviewbot-name>."

Target Usage
------------

Projects where policy reviews are required.

Configuration
-------------

The bot is configured via the `ReviewGroups` field in the `workflow.config` file, located in the ProjectGit repository.

See `ReviewGroups` in the [workflow-pr configuration](../workflow-pr/README.md#config-file).

```json
{
  ...
  "ReviewGroups": [
    {
      "Name": "name of the group user",
      "Reviewers": ["members", "of", "group"],
      "Silent": "(true, false) -- if true, do not explicitly require review requests of group members"
    }
  ],
  ...
}
```

Server configuration
--------------------

**Configuration file:**

| Field | Type | Notes |
| ----- | ----- | ----- |
| root | Array of string | Format **org/repo\#branch** |

Requirements
------------

Gitea token with following permissions:
- R/W PullRequest
- R/W Notification
- R User

Env Variables
-------------

The following variables can be used to override command line parameters.

* `AUTOGITS_CONFIG` - config file location
* `AUTOGITS_URL` - Gitea URL
* `AUTOGITS_RABBITURL` - RabbitMQ URL
* `AUTOGITS_DEBUG` - when set, debug level logging enabled

Authentication env variables
* `GITEA_TOKEN` - Gitea user token
* `AMQP_USERNAME`, `AMQP_PASSWORD` - username and password for RabbitMQ
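The "@groupreviewbot-name: approve" command format described above allows whitespace between the mention and the colon, and accepts either "LGTM" or "approve(d)" as the verdict. A standalone sketch of that matcher (the function name is illustrative; the bot's own implementation follows below):

```go
package main

import (
	"fmt"
	"regexp"
	"strings"
	"unicode"
)

// reviewCommand checks one comment line for "@<group>[spaces]: <verdict>"
// and reports whether it is a command for the group and whether the
// verdict is an approval, per the accept pattern ^:\s*(LGTM|approved?).
func reviewCommand(line, group string) (matched, approved bool) {
	line = strings.TrimSpace(line)
	mention := "@" + group
	if !strings.HasPrefix(line, mention) {
		return false, false
	}
	rest := line[len(mention):]
	for i, r := range rest {
		if unicode.IsSpace(r) {
			continue // whitespace between mention and colon is allowed
		}
		if r != ':' {
			return false, false // first non-space rune must be the colon
		}
		accept := regexp.MustCompile(`^:\s*(LGTM|approved?)`)
		return true, accept.MatchString(rest[i:])
	}
	return false, false
}

func main() {
	fmt.Println(reviewCommand("@release-team: approve", "release-team"))
	fmt.Println(reviewCommand("@release-team : LGTM", "release-team"))
	fmt.Println(reviewCommand("@release-team please look", "release-team"))
}
```

A matched line with any other verdict counts as a rejection in the bot's logic, since the reject pattern only requires the leading colon.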
@@ -5,52 +5,69 @@ import (
```go
	"fmt"
	"log"
	"net/url"
	"os"
	"regexp"
	"runtime/debug"
	"slices"
	"strconv"
	"strings"
	"time"
	"unicode"

	"src.opensuse.org/autogits/common"
	"src.opensuse.org/autogits/common/gitea-generated/models"
)

type ReviewBot struct {
	configs   common.AutogitConfigs
	acceptRx  *regexp.Regexp
	rejectRx  *regexp.Regexp
	groupName string
	gitea     common.Gitea
}

func (bot *ReviewBot) InitRegex(newGroupName string) {
	bot.groupName = newGroupName
	bot.acceptRx = regexp.MustCompile("^:\\s*(LGTM|approved?)")
	bot.rejectRx = regexp.MustCompile("^:\\s*")
}

func (bot *ReviewBot) ParseReviewLine(reviewText string) (bool, string) {
	line := strings.TrimSpace(reviewText)
	groupTextName := "@" + bot.groupName
	glen := len(groupTextName)
	if len(line) < glen || line[0:glen] != groupTextName {
		return false, line
	}

	l := line[glen:]
	for idx, r := range l {
		if unicode.IsSpace(r) {
			continue
		} else if r == ':' {
			return true, l[idx:]
		} else {
			return false, line
		}
	}

	return false, line
}

func (bot *ReviewBot) ReviewAccepted(reviewText string) bool {
	for _, line := range common.SplitStringNoEmpty(reviewText, "\n") {
		if matched, reviewLine := bot.ParseReviewLine(line); matched {
			return bot.acceptRx.MatchString(reviewLine)
		}
	}
	return false
}

func (bot *ReviewBot) ReviewRejected(reviewText string) bool {
	for _, line := range common.SplitStringNoEmpty(reviewText, "\n") {
		if matched, reviewLine := bot.ParseReviewLine(line); matched {
			if bot.rejectRx.MatchString(reviewLine) {
				return !bot.acceptRx.MatchString(reviewLine)
			}
		}
	}
```
@@ -100,10 +117,10 @@ var commentStrings = []string{
```go
	"change_time_estimate",
}*/

func (bot *ReviewBot) FindAcceptableReviewInTimeline(user string, timeline []*models.TimelineComment, reviews []*models.PullReview) *models.TimelineComment {
	for _, t := range timeline {
		if t.Type == common.TimelineCommentType_Comment && t.User.UserName == user && t.Created == t.Updated {
			if bot.ReviewAccepted(t.Body) || bot.ReviewRejected(t.Body) {
				return t
			}
		}
	}
```
@@ -112,13 +129,23 @@ func FindAcceptableReviewInTimeline(user string, timeline []*models.TimelineComm
```go
	return nil
}

func (bot *ReviewBot) FindOurLastReviewInTimeline(timeline []*models.TimelineComment) *models.TimelineComment {
	for _, t := range timeline {
		if t.Type == common.TimelineCommentType_Review && t.User.UserName == bot.groupName && t.Created == t.Updated {
			return t
		}
	}

	return nil
}

func (bot *ReviewBot) UnrequestReviews(org, repo string, id int64, users []string) {
	if err := bot.gitea.UnrequestReview(org, repo, id, users...); err != nil {
		common.LogError("Can't remove reviewers after a review:", err)
	}
}

func (bot *ReviewBot) ProcessNotifications(notification *models.NotificationThread) {
	defer func() {
		if r := recover(); r != nil {
			common.LogInfo("panic caught --- recovered")
```
@@ -126,7 +153,7 @@ func ProcessNotifications(notification *models.NotificationThread, gitea common.
```go
		}
	}()

	rx := regexp.MustCompile(`^/?api/v\d+/repos/(?<org>[_\.a-zA-Z0-9-]+)/(?<project>[_\.a-zA-Z0-9-]+)/(?:issues|pulls)/(?<num>[0-9]+)$`)
	subject := notification.Subject
	u, err := url.Parse(notification.Subject.URL)
	if err != nil {
```
@@ -144,99 +171,110 @@ func ProcessNotifications(notification *models.NotificationThread, gitea common.
```go
	repo := match[2]
	id, _ := strconv.ParseInt(match[3], 10, 64)

	common.LogInfo("processing:", fmt.Sprintf("%s/%s!%d", org, repo, id))
	pr, err := bot.gitea.GetPullRequest(org, repo, id)
	if err != nil {
		common.LogError(" ** Cannot fetch PR associated with review:", subject.URL, "Error:", err)
		return
	}

	if err := bot.ProcessPR(pr); err == nil && !common.IsDryRun {
		if err := bot.gitea.SetNotificationRead(notification.ID); err != nil {
			common.LogDebug(" Cannot set notification as read", err)
		}
	} else if err != nil && err != ReviewNotFinished {
		common.LogError(err)
	}
}

var ReviewNotFinished = fmt.Errorf("Review is not finished")

func (bot *ReviewBot) ProcessPR(pr *models.PullRequest) error {
	org := pr.Base.Repo.Owner.UserName
	repo := pr.Base.Repo.Name
	id := pr.Index

	found := false
	for _, reviewer := range pr.RequestedReviewers {
		if reviewer != nil && reviewer.UserName == bot.groupName {
			found = true
			break
		}
	}
	if !found {
		common.LogInfo(" review is not requested for", bot.groupName)
		return nil
	}

	config := bot.configs.GetPrjGitConfig(org, repo, pr.Base.Name)
	if config == nil {
		return fmt.Errorf("Cannot find config for: %s", pr.URL)
	}
	if pr.State == "closed" {
		// dismiss the review
		common.LogInfo(" -- closed request, so nothing to review")
		return nil
	}

	reviews, err := bot.gitea.GetPullRequestReviews(org, repo, id)
	if err != nil {
		return fmt.Errorf("Failed to fetch reviews for: %v: %w", pr.URL, err)
	}

	timeline, err := common.FetchTimelineSinceReviewRequestOrPush(bot.gitea, bot.groupName, pr.Head.Sha, org, repo, id)
	if err != nil {
		return fmt.Errorf("Failed to fetch timeline to review. %w", err)
	}

	groupConfig, err := config.GetReviewGroup(bot.groupName)
	if err != nil {
		return fmt.Errorf("Failed to fetch review group. %w", err)
	}

	// submitter cannot be reviewer
	requestReviewers := slices.Clone(groupConfig.Reviewers)
	requestReviewers = slices.DeleteFunc(requestReviewers, func(u string) bool { return u == pr.User.UserName })
	// pr.Head.Sha

	for _, reviewer := range requestReviewers {
		if review := bot.FindAcceptableReviewInTimeline(reviewer, timeline, reviews); review != nil {
			if bot.ReviewAccepted(review.Body) {
				if !common.IsDryRun {
					text := reviewer + " approved a review on behalf of " + bot.groupName
					if review := bot.FindOurLastReviewInTimeline(timeline); review == nil || review.Body != text {
						_, err := bot.gitea.AddReviewComment(pr, common.ReviewStateApproved, text)
						if err != nil {
							common.LogError(" -> failed to write approval comment", err)
						}
						bot.UnrequestReviews(org, repo, id, requestReviewers)
					}
				}
				common.LogInfo(" -> approved by", reviewer)
				common.LogInfo("    review at", review.Created)
				return nil
			} else if bot.ReviewRejected(review.Body) {
				if !common.IsDryRun {
					text := reviewer + " requested changes on behalf of " + bot.groupName + ". See " + review.HTMLURL
```
|
||||||
UnrequestReviews(gitea, org, repo, id, requestReviewers)
|
if review := bot.FindOurLastReviewInTimeline(timeline); review == nil || review.Body != text {
|
||||||
if err := gitea.SetNotificationRead(notification.ID); err != nil {
|
_, err := bot.gitea.AddReviewComment(pr, common.ReviewStateRequestChanges, text)
|
||||||
common.LogDebug(" Cannot set notification as read", err)
|
if err != nil {
|
||||||
|
common.LogError(" -> failed to write rejecting comment", err)
|
||||||
|
}
|
||||||
|
bot.UnrequestReviews(org, repo, id, requestReviewers)
|
||||||
}
|
}
|
||||||
}
|
}
|
||||||
common.LogInfo(" -> declined by", reviewer)
|
common.LogInfo(" -> declined by", reviewer)
|
||||||
return
|
return nil
|
||||||
}
|
}
|
||||||
}
|
}
|
||||||
}
|
}
|
||||||
|
|
||||||
// request group member reviews, if missing
|
// request group member reviews, if missing
|
||||||
common.LogDebug(" Review incomplete...")
|
common.LogDebug(" Review incomplete...")
|
||||||
if len(requestReviewers) > 0 {
|
if !groupConfig.Silent && len(requestReviewers) > 0 {
|
||||||
common.LogDebug(" Requesting reviews for:", requestReviewers)
|
common.LogDebug(" Requesting reviews for:", requestReviewers)
|
||||||
if !common.IsDryRun {
|
if !common.IsDryRun {
|
||||||
if _, err := gitea.RequestReviews(pr, requestReviewers...); err != nil {
|
if _, err := bot.gitea.RequestReviews(pr, requestReviewers...); err != nil {
|
||||||
common.LogDebug(" -> err:", err)
|
common.LogDebug(" -> err:", err)
|
||||||
}
|
}
|
||||||
} else {
|
} else {
|
||||||
@@ -249,40 +287,67 @@ func ProcessNotifications(notification *models.NotificationThread, gitea common.
|
|||||||
// add a helpful comment, if not yet added
|
// add a helpful comment, if not yet added
|
||||||
found_help_comment := false
|
found_help_comment := false
|
||||||
for _, t := range timeline {
|
for _, t := range timeline {
|
||||||
if t.Type == common.TimelineCommentType_Comment && t.User != nil && t.User.UserName == groupName {
|
if t.Type == common.TimelineCommentType_Comment && t.User != nil && t.User.UserName == bot.groupName {
|
||||||
found_help_comment = true
|
found_help_comment = true
|
||||||
break
|
break
|
||||||
}
|
}
|
||||||
}
|
}
|
||||||
|
|
||||||
if !found_help_comment && !common.IsDryRun {
|
if !found_help_comment && !common.IsDryRun {
|
||||||
helpComment := fmt.Sprintln("Review by", groupName, "represents a group of reviewers:", strings.Join(requestReviewers, ", "), ". To review as part of this group, create a comment with contents @"+groupName+": LGTM on a separate line to accept a review. To request changes, write @"+groupName+": followed by reason for rejection. Do not use reviews to review as a group. Editing a comment invalidates that comment.")
|
helpComment := fmt.Sprintln("Review by", bot.groupName, "represents a group of reviewers:", strings.Join(requestReviewers, ", "), ".\n\n"+
|
||||||
gitea.AddComment(pr, helpComment)
|
"Do **not** use standard review interface to review on behalf of the group.\n"+
|
||||||
|
"To accept the review on behalf of the group, create the following comment: `@"+bot.groupName+": approve`.\n"+
|
||||||
|
"To request changes on behalf of the group, create the following comment: `@"+bot.groupName+": decline` followed with lines justifying the decision.\n"+
|
||||||
|
"Future edits of the comments are ignored, a new comment is required to change the review state.")
|
||||||
|
if slices.Contains(groupConfig.Reviewers, pr.User.UserName) {
|
||||||
|
helpComment = helpComment + "\n\n" +
|
||||||
|
"Submitter is member of this review group, hence they are excluded from being one of the reviewers here"
|
||||||
|
}
|
||||||
|
bot.gitea.AddComment(pr, helpComment)
|
||||||
}
|
}
|
||||||
|
|
||||||
|
return ReviewNotFinished
|
||||||
}
|
}
|
||||||
|
|
||||||
func PeriodReviewCheck(gitea common.Gitea) {
|
func (bot *ReviewBot) PeriodReviewCheck() {
|
||||||
notifications, err := gitea.GetNotifications(common.GiteaNotificationType_Pull, nil)
|
notifications, err := bot.gitea.GetNotifications(common.GiteaNotificationType_Pull, nil)
|
||||||
if err != nil {
|
if err != nil {
|
||||||
common.LogError(" Error fetching unread notifications: %w", err)
|
common.LogError(" Error fetching unread notifications: %w", err)
|
||||||
return
|
return
|
||||||
}
|
}
|
||||||
|
|
||||||
for _, notification := range notifications {
|
for _, notification := range notifications {
|
||||||
ProcessNotifications(notification, gitea)
|
bot.ProcessNotifications(notification)
|
||||||
|
|
||||||
}
|
}
|
||||||
}
|
}
|
||||||
|
|
||||||
func main() {
|
func main() {
|
||||||
giteaUrl := flag.String("gitea-url", "https://src.opensuse.org", "Gitea instance used for reviews")
|
giteaUrl := flag.String("gitea-url", "https://src.opensuse.org", "Gitea instance used for reviews")
|
||||||
rabbitMqHost := flag.String("rabbit-url", "amqps://rabbit.opensuse.org", "RabbitMQ instance where Gitea webhook notifications are sent")
|
rabbitMqHost := flag.String("rabbit-url", "amqps://rabbit.opensuse.org", "RabbitMQ instance where Gitea webhook notifications are sent")
|
||||||
interval := flag.Int64("interval", 5, "Notification polling interval in minutes (min 1 min)")
|
interval := flag.Int64("interval", 10, "Notification polling interval in minutes (min 1 min)")
|
||||||
configFile := flag.String("config", "", "PrjGit listing config file")
|
configFile := flag.String("config", "", "PrjGit listing config file")
|
||||||
logging := flag.String("logging", "info", "Logging level: [none, error, info, debug]")
|
logging := flag.String("logging", "info", "Logging level: [none, error, info, debug]")
|
||||||
flag.BoolVar(&common.IsDryRun, "dry", false, "Dry run, no effect. For debugging")
|
flag.BoolVar(&common.IsDryRun, "dry", false, "Dry run, no effect. For debugging")
|
||||||
flag.Parse()
|
flag.Parse()
|
||||||
|
|
||||||
|
if err := common.SetLoggingLevelFromString(*logging); err != nil {
|
||||||
|
common.LogError(err.Error())
|
||||||
|
return
|
||||||
|
}
|
||||||
|
|
||||||
|
if cf := os.Getenv("AUTOGITS_CONFIG"); len(cf) > 0 {
|
||||||
|
*configFile = cf
|
||||||
|
}
|
||||||
|
if url := os.Getenv("AUTOGITS_URL"); len(url) > 0 {
|
||||||
|
*giteaUrl = url
|
||||||
|
}
|
||||||
|
if url := os.Getenv("AUTOGITS_RABBITURL"); len(url) > 0 {
|
||||||
|
*rabbitMqHost = url
|
||||||
|
}
|
||||||
|
if debug := os.Getenv("AUTOGITS_DEBUG"); len(debug) > 0 {
|
||||||
|
common.SetLoggingLevel(common.LogLevelDebug)
|
||||||
|
}
|
||||||
|
|
||||||
args := flag.Args()
|
args := flag.Args()
|
||||||
if len(args) != 1 {
|
if len(args) != 1 {
|
||||||
log.Println(" syntax:")
|
log.Println(" syntax:")
|
||||||
@@ -291,7 +356,7 @@ func main() {
|
|||||||
flag.Usage()
|
flag.Usage()
|
||||||
return
|
return
|
||||||
}
|
}
|
||||||
groupName = args[0]
|
targetGroupName := args[0]
|
||||||
|
|
||||||
if *configFile == "" {
|
if *configFile == "" {
|
||||||
common.LogError("Missing config file")
|
common.LogError("Missing config file")
|
||||||
@@ -314,36 +379,35 @@ func main() {
|
|||||||
return
|
return
|
||||||
}
|
}
|
||||||
|
|
||||||
gitea := common.AllocateGiteaTransport(*giteaUrl)
|
giteaTransport := common.AllocateGiteaTransport(*giteaUrl)
|
||||||
configs, err = common.ResolveWorkflowConfigs(gitea, configData)
|
configs, err := common.ResolveWorkflowConfigs(giteaTransport, configData)
|
||||||
if err != nil {
|
if err != nil {
|
||||||
common.LogError("Cannot parse workflow configs:", err)
|
common.LogError("Cannot parse workflow configs:", err)
|
||||||
return
|
return
|
||||||
}
|
}
|
||||||
|
|
||||||
reviewer, err := gitea.GetCurrentUser()
|
reviewer, err := giteaTransport.GetCurrentUser()
|
||||||
if err != nil {
|
if err != nil {
|
||||||
common.LogError("Cannot fetch review user:", err)
|
common.LogError("Cannot fetch review user:", err)
|
||||||
return
|
return
|
||||||
}
|
}
|
||||||
|
|
||||||
if err := common.SetLoggingLevelFromString(*logging); err != nil {
|
|
||||||
common.LogError(err.Error())
|
|
||||||
return
|
|
||||||
}
|
|
||||||
|
|
||||||
if *interval < 1 {
|
if *interval < 1 {
|
||||||
*interval = 1
|
*interval = 1
|
||||||
}
|
}
|
||||||
|
|
||||||
InitRegex(groupName)
|
bot := &ReviewBot{
|
||||||
|
gitea: giteaTransport,
|
||||||
|
configs: configs,
|
||||||
|
}
|
||||||
|
bot.InitRegex(targetGroupName)
|
||||||
|
|
||||||
common.LogInfo(" ** processing group reviews for group:", groupName)
|
common.LogInfo(" ** processing group reviews for group:", bot.groupName)
|
||||||
common.LogInfo(" ** username in Gitea:", reviewer.UserName)
|
common.LogInfo(" ** username in Gitea:", reviewer.UserName)
|
||||||
common.LogInfo(" ** polling interval:", *interval, "min")
|
common.LogInfo(" ** polling interval:", *interval, "min")
|
||||||
common.LogInfo(" ** connecting to RabbitMQ:", *rabbitMqHost)
|
common.LogInfo(" ** connecting to RabbitMQ:", *rabbitMqHost)
|
||||||
|
|
||||||
if groupName != reviewer.UserName {
|
if bot.groupName != reviewer.UserName {
|
||||||
common.LogError(" ***** Reviewer does not match group name. Aborting. *****")
|
common.LogError(" ***** Reviewer does not match group name. Aborting. *****")
|
||||||
return
|
return
|
||||||
}
|
}
|
||||||
@@ -355,22 +419,28 @@ func main() {
|
|||||||
}
|
}
|
||||||
|
|
||||||
config_update := ConfigUpdatePush{
|
config_update := ConfigUpdatePush{
|
||||||
|
bot: bot,
|
||||||
config_modified: make(chan *common.AutogitConfig),
|
config_modified: make(chan *common.AutogitConfig),
|
||||||
}
|
}
|
||||||
|
|
||||||
configUpdates := &common.ListenDefinitions{
|
process_issue_pr := IssueCommentProcessor{
|
||||||
RabbitURL: u,
|
bot: bot,
|
||||||
Orgs: []string{},
|
}
|
||||||
|
|
||||||
|
configUpdates := &common.RabbitMQGiteaEventsProcessor{
|
||||||
|
Orgs: []string{},
|
||||||
Handlers: map[string]common.RequestProcessor{
|
Handlers: map[string]common.RequestProcessor{
|
||||||
common.RequestType_Push: &config_update,
|
common.RequestType_Push: &config_update,
|
||||||
|
common.RequestType_IssueComment: &process_issue_pr,
|
||||||
},
|
},
|
||||||
}
|
}
|
||||||
for _, c := range configs {
|
configUpdates.Connection().RabbitURL = u
|
||||||
|
for _, c := range bot.configs {
|
||||||
if org, _, _ := c.GetPrjGit(); !slices.Contains(configUpdates.Orgs, org) {
|
if org, _, _ := c.GetPrjGit(); !slices.Contains(configUpdates.Orgs, org) {
|
||||||
configUpdates.Orgs = append(configUpdates.Orgs, org)
|
configUpdates.Orgs = append(configUpdates.Orgs, org)
|
||||||
}
|
}
|
||||||
}
|
}
|
||||||
go configUpdates.ProcessRabbitMQEvents()
|
go common.ProcessRabbitMQEvents(configUpdates)
|
||||||
|
|
||||||
for {
|
for {
|
||||||
config_update_loop:
|
config_update_loop:
|
||||||
@@ -378,17 +448,17 @@ func main() {
|
|||||||
select {
|
select {
|
||||||
case configTouched, ok := <-config_update.config_modified:
|
case configTouched, ok := <-config_update.config_modified:
|
||||||
if ok {
|
if ok {
|
||||||
for idx, c := range configs {
|
for idx, c := range bot.configs {
|
||||||
if c == configTouched {
|
if c == configTouched {
|
||||||
org, repo, branch := c.GetPrjGit()
|
org, repo, branch := c.GetPrjGit()
|
||||||
prj := fmt.Sprintf("%s/%s#%s", org, repo, branch)
|
prj := fmt.Sprintf("%s/%s#%s", org, repo, branch)
|
||||||
common.LogInfo("Detected config update for", prj)
|
common.LogInfo("Detected config update for", prj)
|
||||||
|
|
||||||
new_config, err := common.ReadWorkflowConfig(gitea, prj)
|
new_config, err := common.ReadWorkflowConfig(bot.gitea, prj)
|
||||||
if err != nil {
|
if err != nil {
|
||||||
common.LogError("Failed parsing Project config for", prj, err)
|
common.LogError("Failed parsing Project config for", prj, err)
|
||||||
} else {
|
} else {
|
||||||
configs[idx] = new_config
|
bot.configs[idx] = new_config
|
||||||
}
|
}
|
||||||
}
|
}
|
||||||
}
|
}
|
||||||
@@ -398,7 +468,7 @@ func main() {
|
|||||||
}
|
}
|
||||||
}
|
}
|
||||||
|
|
||||||
PeriodReviewCheck(gitea)
|
bot.PeriodReviewCheck()
|
||||||
time.Sleep(time.Duration(*interval * int64(time.Minute)))
|
time.Sleep(time.Duration(*interval * int64(time.Minute)))
|
||||||
}
|
}
|
||||||
}
|
}
|
||||||
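The `@group: approve` / `@group: decline` commands that the timeline scan above reacts to are parsed by `InitRegex`, `ReviewAccepted`, and `ReviewRejected`, whose bodies are not part of this diff. Below is a standalone sketch with the same observable behavior as the `TestReviewApprovalCheck` table later in this PR; all names here (`reviewMatcher`, `newReviewMatcher`, `parse`) are hypothetical, not the bot's actual implementation.

```go
package main

import (
	"fmt"
	"regexp"
	"strings"
)

// reviewMatcher is a hypothetical stand-in for the bot's command parser.
// A mention must start a line (surrounding whitespace allowed), followed
// by optional whitespace, a colon, and the command text.
type reviewMatcher struct {
	mentionRe *regexp.Regexp
}

func newReviewMatcher(group string) *reviewMatcher {
	return &reviewMatcher{
		mentionRe: regexp.MustCompile(`^\s*@` + regexp.QuoteMeta(group) + `\s*:\s*(.*)$`),
	}
}

// parse reports (approved, rejected). The first line mentioning the group
// wins; a mention whose command is not a known approval word counts as a
// rejection, mirroring the test table in this PR.
func (m *reviewMatcher) parse(body string) (approved, rejected bool) {
	for _, line := range strings.Split(body, "\n") {
		sub := m.mentionRe.FindStringSubmatch(line)
		if sub == nil {
			continue
		}
		fields := strings.Fields(sub[1])
		word := ""
		if len(fields) > 0 {
			word = fields[0]
		}
		switch word {
		case "LGTM", "approve", "approved":
			return true, false
		default:
			return false, true
		}
	}
	return false, false
}

func main() {
	m := newReviewMatcher("group")
	fmt.Println(m.parse("@group: LGTM"))                 // true false
	fmt.Println(m.parse("@group: decline because of X")) // false true
	fmt.Println(m.parse("Hello @group: approve"))        // false false
}
```

`regexp.QuoteMeta` matters here in case a group name ever contains regex metacharacters.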
@@ -1,7 +1,492 @@
 package main

-import "testing"
+import (
+	"fmt"
+	"testing"

-func TestReviews(t *testing.T) {
+	"go.uber.org/mock/gomock"
+	"src.opensuse.org/autogits/common"
+	"src.opensuse.org/autogits/common/gitea-generated/models"
+	mock_common "src.opensuse.org/autogits/common/mock"
+)
+
+func TestProcessPR(t *testing.T) {
+	ctrl := gomock.NewController(t)
+	defer ctrl.Finish()
+
+	mockGitea := mock_common.NewMockGitea(ctrl)
+	groupName := "testgroup"
+
+	bot := &ReviewBot{
+		gitea:     mockGitea,
+		groupName: groupName,
+	}
+	bot.InitRegex(groupName)
+
+	org := "myorg"
+	repo := "myrepo"
+	prIndex := int64(1)
+	headSha := "abcdef123456"
+
+	pr := &models.PullRequest{
+		Index: prIndex,
+		URL:   "http://gitea/pr/1",
+		State: "open",
+		Base: &models.PRBranchInfo{
+			Name: "main",
+			Repo: &models.Repository{
+				Name: repo,
+				Owner: &models.User{
+					UserName: org,
+				},
+			},
+		},
+		Head: &models.PRBranchInfo{
+			Sha: headSha,
+		},
+		User: &models.User{
+			UserName: "submitter",
+		},
+		RequestedReviewers: []*models.User{
+			{UserName: groupName},
+		},
+	}
+
+	prjConfig := &common.AutogitConfig{
+		GitProjectName: org + "/" + repo + "#main",
+		ReviewGroups: []*common.ReviewGroup{
+			{
+				Name:      groupName,
+				Reviewers: []string{"reviewer1", "reviewer2"},
+			},
+		},
+	}
+	bot.configs = common.AutogitConfigs{prjConfig}
+
+	t.Run("Review not requested for group", func(t *testing.T) {
+		prNoRequest := *pr
+		prNoRequest.RequestedReviewers = nil
+		err := bot.ProcessPR(&prNoRequest)
+		if err != nil {
+			t.Errorf("Expected no error, got %v", err)
+		}
+	})
+
+	t.Run("PR is closed", func(t *testing.T) {
+		prClosed := *pr
+		prClosed.State = "closed"
+		err := bot.ProcessPR(&prClosed)
+		if err != nil {
+			t.Errorf("Expected no error, got %v", err)
+		}
+	})
+
+	t.Run("Successful Approval", func(t *testing.T) {
+		common.IsDryRun = false
+		mockGitea.EXPECT().GetPullRequestReviews(org, repo, prIndex).Return(nil, nil)
+		// reviewer1 approved in timeline
+		timeline := []*models.TimelineComment{
+			{
+				Type: common.TimelineCommentType_Comment,
+				User: &models.User{UserName: "reviewer1"},
+				Body: "@" + groupName + ": approve",
+			},
+		}
+
+		mockGitea.EXPECT().GetTimeline(org, repo, prIndex).Return(timeline, nil)
+
+		expectedText := "reviewer1 approved a review on behalf of " + groupName
+		mockGitea.EXPECT().AddReviewComment(pr, common.ReviewStateApproved, expectedText).Return(nil, nil)
+		mockGitea.EXPECT().UnrequestReview(org, repo, prIndex, gomock.Any()).Return(nil)
+
+		err := bot.ProcessPR(pr)
+		if err != nil {
+			t.Errorf("Expected nil error, got %v", err)
+		}
+	})
+
+	t.Run("Dry Run - No actions taken", func(t *testing.T) {
+		common.IsDryRun = true
+		mockGitea.EXPECT().GetPullRequestReviews(org, repo, prIndex).Return(nil, nil)
+		timeline := []*models.TimelineComment{
+			{
+				Type: common.TimelineCommentType_Comment,
+				User: &models.User{UserName: "reviewer1"},
+				Body: "@" + groupName + ": approve",
+			},
+		}
+		mockGitea.EXPECT().GetTimeline(org, repo, prIndex).Return(timeline, nil)
+
+		// No AddReviewComment or UnrequestReview should be called
+		err := bot.ProcessPR(pr)
+		if err != nil {
+			t.Errorf("Expected nil error, got %v", err)
+		}
+	})
+
+	t.Run("Approval already exists - No new comment", func(t *testing.T) {
+		common.IsDryRun = false
+		mockGitea.EXPECT().GetPullRequestReviews(org, repo, prIndex).Return(nil, nil)
+
+		approvalText := "reviewer1 approved a review on behalf of " + groupName
+		timeline := []*models.TimelineComment{
+			{
+				Type: common.TimelineCommentType_Review,
+				User: &models.User{UserName: groupName},
+				Body: approvalText,
+			},
+			{
+				Type: common.TimelineCommentType_Comment,
+				User: &models.User{UserName: "reviewer1"},
+				Body: "@" + groupName + ": approve",
+			},
+			{
+				Type: common.TimelineCommentType_Comment,
+				User: &models.User{UserName: groupName},
+				Body: "Help comment",
+			},
+		}
+
+		mockGitea.EXPECT().GetTimeline(org, repo, prIndex).Return(timeline, nil)
+
+		// No AddReviewComment, UnrequestReview, or AddComment should be called
+		err := bot.ProcessPR(pr)
+		if err != nil {
+			t.Errorf("Expected nil error, got %v", err)
+		}
+	})
+
+	t.Run("Rejection already exists - No new comment", func(t *testing.T) {
+		common.IsDryRun = false
+		mockGitea.EXPECT().GetPullRequestReviews(org, repo, prIndex).Return(nil, nil)
+
+		rejectionText := "reviewer1 requested changes on behalf of " + groupName + ". See http://gitea/comment/123"
+		timeline := []*models.TimelineComment{
+			{
+				Type: common.TimelineCommentType_Review,
+				User: &models.User{UserName: groupName},
+				Body: rejectionText,
+			},
+			{
+				Type:    common.TimelineCommentType_Comment,
+				User:    &models.User{UserName: "reviewer1"},
+				Body:    "@" + groupName + ": decline",
+				HTMLURL: "http://gitea/comment/123",
+			},
+			{
+				Type: common.TimelineCommentType_Comment,
+				User: &models.User{UserName: groupName},
+				Body: "Help comment",
+			},
+		}
+
+		mockGitea.EXPECT().GetTimeline(org, repo, prIndex).Return(timeline, nil)
+
+		err := bot.ProcessPR(pr)
+		if err != nil {
+			t.Errorf("Expected nil error, got %v", err)
+		}
+	})
+
+	t.Run("Pending review - Help comment already exists", func(t *testing.T) {
+		common.IsDryRun = false
+		mockGitea.EXPECT().GetPullRequestReviews(org, repo, prIndex).Return(nil, nil)
+
+		timeline := []*models.TimelineComment{
+			{
+				Type: common.TimelineCommentType_Comment,
+				User: &models.User{UserName: groupName},
+				Body: "Some help comment",
+			},
+		}
+
+		mockGitea.EXPECT().GetTimeline(org, repo, prIndex).Return(timeline, nil)
+
+		// It will try to request reviews
+		mockGitea.EXPECT().RequestReviews(pr, "reviewer1", "reviewer2").Return(nil, nil)
+
+		// AddComment should NOT be called because bot already has a comment in timeline
+		err := bot.ProcessPR(pr)
+		if err != ReviewNotFinished {
+			t.Errorf("Expected ReviewNotFinished error, got %v", err)
+		}
+	})
+
+	t.Run("Submitter is group member - Excluded from review request", func(t *testing.T) {
+		common.IsDryRun = false
+		prSubmitterMember := *pr
+		prSubmitterMember.User = &models.User{UserName: "reviewer1"}
+		mockGitea.EXPECT().GetPullRequestReviews(org, repo, prIndex).Return(nil, nil)
+		mockGitea.EXPECT().GetTimeline(org, repo, prIndex).Return(nil, nil)
+		mockGitea.EXPECT().RequestReviews(&prSubmitterMember, "reviewer2").Return(nil, nil)
+		mockGitea.EXPECT().AddComment(&prSubmitterMember, gomock.Any()).Return(nil)
+		err := bot.ProcessPR(&prSubmitterMember)
+		if err != ReviewNotFinished {
+			t.Errorf("Expected ReviewNotFinished error, got %v", err)
+		}
+	})
+
+	t.Run("Successful Rejection", func(t *testing.T) {
+		common.IsDryRun = false
+		mockGitea.EXPECT().GetPullRequestReviews(org, repo, prIndex).Return(nil, nil)
+		timeline := []*models.TimelineComment{
+			{
+				Type:    common.TimelineCommentType_Comment,
+				User:    &models.User{UserName: "reviewer2"},
+				Body:    "@" + groupName + ": decline",
+				HTMLURL: "http://gitea/comment/999",
+			},
+		}
+		mockGitea.EXPECT().GetTimeline(org, repo, prIndex).Return(timeline, nil)
+		expectedText := "reviewer2 requested changes on behalf of " + groupName + ". See http://gitea/comment/999"
+		mockGitea.EXPECT().AddReviewComment(pr, common.ReviewStateRequestChanges, expectedText).Return(nil, nil)
+		mockGitea.EXPECT().UnrequestReview(org, repo, prIndex, gomock.Any()).Return(nil)
+		err := bot.ProcessPR(pr)
+		if err != nil {
+			t.Errorf("Expected nil error, got %v", err)
+		}
+	})
+
+	t.Run("Config not found", func(t *testing.T) {
+		bot.configs = common.AutogitConfigs{}
+		err := bot.ProcessPR(pr)
+		if err == nil {
+			t.Error("Expected error when config is missing, got nil")
+		}
+	})
+
+	t.Run("Gitea error in GetPullRequestReviews", func(t *testing.T) {
+		bot.configs = common.AutogitConfigs{prjConfig}
+		mockGitea.EXPECT().GetPullRequestReviews(org, repo, prIndex).Return(nil, fmt.Errorf("gitea error"))
+		err := bot.ProcessPR(pr)
+		if err == nil {
+			t.Error("Expected error from gitea, got nil")
+		}
+	})
+}
+
+func TestProcessNotifications(t *testing.T) {
+	ctrl := gomock.NewController(t)
+	defer ctrl.Finish()
+
+	mockGitea := mock_common.NewMockGitea(ctrl)
+	groupName := "testgroup"
+
+	bot := &ReviewBot{
+		gitea:     mockGitea,
+		groupName: groupName,
+	}
+	bot.InitRegex(groupName)
+
+	org := "myorg"
+	repo := "myrepo"
+	prIndex := int64(123)
+	notificationID := int64(456)
+
+	notification := &models.NotificationThread{
+		ID: notificationID,
+		Subject: &models.NotificationSubject{
+			URL: fmt.Sprintf("http://gitea/api/v1/repos/%s/%s/pulls/%d", org, repo, prIndex),
+		},
+	}
+
+	t.Run("Notification Success", func(t *testing.T) {
+		common.IsDryRun = false
+		pr := &models.PullRequest{
+			Index: prIndex,
+			Base: &models.PRBranchInfo{
+				Name: "main",
+				Repo: &models.Repository{
+					Name:  repo,
+					Owner: &models.User{UserName: org},
+				},
+			},
+			Head: &models.PRBranchInfo{
+				Sha: "headsha",
+				Repo: &models.Repository{
+					Name:  repo,
+					Owner: &models.User{UserName: org},
+				},
+			},
+			User:               &models.User{UserName: "submitter"},
+			RequestedReviewers: []*models.User{{UserName: groupName}},
+		}
+
+		mockGitea.EXPECT().GetPullRequest(org, repo, prIndex).Return(pr, nil)
+
+		prjConfig := &common.AutogitConfig{
+			GitProjectName: org + "/" + repo + "#main",
+			ReviewGroups:   []*common.ReviewGroup{{Name: groupName, Reviewers: []string{"r1"}}},
+		}
+		bot.configs = common.AutogitConfigs{prjConfig}
+		mockGitea.EXPECT().GetPullRequestReviews(org, repo, prIndex).Return(nil, nil)
+		timeline := []*models.TimelineComment{
+			{
+				Type: common.TimelineCommentType_Comment,
+				User: &models.User{UserName: "r1"},
+				Body: "@" + groupName + ": approve",
+			},
+		}
+		mockGitea.EXPECT().GetTimeline(org, repo, prIndex).Return(timeline, nil)
+		expectedText := "r1 approved a review on behalf of " + groupName
+		mockGitea.EXPECT().AddReviewComment(pr, common.ReviewStateApproved, expectedText).Return(nil, nil)
+		mockGitea.EXPECT().UnrequestReview(org, repo, prIndex, gomock.Any()).Return(nil)
+
+		mockGitea.EXPECT().SetNotificationRead(notificationID).Return(nil)
+
+		bot.ProcessNotifications(notification)
+	})
+
+	t.Run("Invalid Notification URL", func(t *testing.T) {
+		badNotification := &models.NotificationThread{
+			Subject: &models.NotificationSubject{
+				URL: "http://gitea/invalid/url",
+			},
+		}
+		bot.ProcessNotifications(badNotification)
+	})
+
+	t.Run("Gitea error in GetPullRequest", func(t *testing.T) {
+		mockGitea.EXPECT().GetPullRequest(org, repo, prIndex).Return(nil, fmt.Errorf("gitea error"))
+		bot.ProcessNotifications(notification)
+	})
+}
+
+func TestReviewApprovalCheck(t *testing.T) {
+	tests := []struct {
+		Name      string
+		GroupName string
+		InString  string
+		Approved  bool
+		Rejected  bool
+	}{
+		{
+			Name:      "Empty String",
+			GroupName: "group",
+			InString:  "",
+		},
+		{
+			Name:      "Random Text",
+			GroupName: "group",
+			InString:  "some things LGTM",
+		},
+		{
+			Name:      "Group name with Random Text means disapproval",
+			GroupName: "group",
+			InString:  "@group: some things LGTM",
+			Rejected:  true,
+		},
+		{
+			Name:      "Bad name with Approval",
+			GroupName: "group2",
+			InString:  "@group: LGTM",
+		},
+		{
+			Name:      "Bad name with Approval",
+			GroupName: "group2",
+			InString:  "@group: LGTM",
+		},
+		{
+			Name:      "LGTM approval",
+			GroupName: "group2",
+			InString:  "@group2: LGTM",
+			Approved:  true,
+		},
+		{
+			Name:      "approval",
+			GroupName: "group2",
+			InString:  "@group2: approved",
+			Approved:  true,
+		},
+		{
+			Name:      "approval",
+			GroupName: "group2",
+			InString:  "@group2: approve",
+			Approved:  true,
+		},
+		{
+			Name:      "disapproval",
+			GroupName: "group2",
+			InString:  "@group2: disapprove",
+			Rejected:  true,
+		},
+		{
+			Name:      "Whitespace before colon",
+			GroupName: "group",
+			InString:  "@group : LGTM",
+			Approved:  true,
+		},
+		{
+			Name:      "No whitespace after colon",
+			GroupName: "group",
+			InString:  "@group:LGTM",
+			Approved:  true,
+		},
+		{
+			Name:      "Leading and trailing whitespace on line",
+			GroupName: "group",
+			InString:  " @group: LGTM ",
+			Approved:  true,
+		},
+		{
+			Name:      "Multiline: Approved on second line",
+			GroupName: "group",
+			InString:  "Random noise\n@group: approved",
+			Approved:  true,
+		},
+		{
+			Name:      "Multiline: Multiple group mentions, first wins",
+			GroupName: "group",
+			InString:  "@group: decline\n@group: approve",
+			Rejected:  true,
+		},
+		{
+			Name:      "Multiline: Approved on second line",
+			GroupName: "group",
+			InString:  "noise\n@group: approve\nmore noise",
+			Approved:  true,
+		},
+		{
+			Name:      "Not at start of line (even with whitespace)",
+			GroupName: "group",
+			InString:  "Hello @group: approve",
+			Approved:  false,
+		},
+		{
+			Name:      "Rejecting with reason",
+			GroupName: "group",
+			InString:  "@group: decline because of X, Y and Z",
+			Rejected:  true,
+		},
+		{
+			Name:      "No colon after group",
+			GroupName: "group",
+			InString:  "@group LGTM",
+			Approved:  false,
+			Rejected:  false,
+		},
+		{
+			Name:      "Invalid char after group",
+			GroupName: "group",
+			InString:  "@group! LGTM",
+			Approved:  false,
+			Rejected:  false,
+		},
+	}
+	for _, test := range tests {
+		t.Run(test.Name, func(t *testing.T) {
+			bot := &ReviewBot{}
+			bot.InitRegex(test.GroupName)
+
+			if r := bot.ReviewAccepted(test.InString); r != test.Approved {
+				t.Error("ReviewAccepted() returned", r, "expecting", test.Approved)
+			}
+			if r := bot.ReviewRejected(test.InString); r != test.Rejected {
+				t.Error("ReviewRejected() returned", r, "expecting", test.Rejected)
+			}
+		})
+	}
 }
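Several subtests above copy the request with `prClosed := *pr` (or `prNoRequest := *pr`) before mutating one field. That is a shallow copy: value fields like `State` become independent, but pointer fields like `Base`, `Head`, and `RequestedReviewers`' elements stay shared with the original fixture. A minimal illustration with stand-in types (not the Gitea models):

```go
package main

import "fmt"

// branchInfo / pullRequest are stand-ins mirroring the shape of
// models.PullRequest: one value field, one pointer field.
type branchInfo struct{ Name string }

type pullRequest struct {
	State string
	Base  *branchInfo
}

func main() {
	pr := &pullRequest{State: "open", Base: &branchInfo{Name: "main"}}

	// Shallow copy: State is duplicated, Base still points at the same struct.
	prClosed := *pr
	prClosed.State = "closed"
	fmt.Println(pr.State) // open - the original's value field is untouched

	prClosed.Base.Name = "devel"
	fmt.Println(pr.Base.Name) // devel - the pointer field is shared
}
```

This is why the tests only reassign whole fields (`prSubmitterMember.User = &models.User{...}`) rather than mutating through the shared pointers.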
@@ -7,7 +7,29 @@ import (
 	"src.opensuse.org/autogits/common"
 )

+type IssueCommentProcessor struct {
+	bot *ReviewBot
+}
+
+func (s *IssueCommentProcessor) ProcessFunc(req *common.Request) error {
+	if req.Type != common.RequestType_IssueComment {
+		return fmt.Errorf("Unhandled, ignored request type: %s", req.Type)
+	}
+
+	data := req.Data.(*common.IssueCommentWebhookEvent)
+	org := data.Repository.Owner.Username
+	repo := data.Repository.Name
+	index := int64(data.Issue.Number)
+
+	pr, err := s.bot.gitea.GetPullRequest(org, repo, index)
+	if err != nil {
+		return fmt.Errorf("Failed to fetch PullRequest from event: %s/%s!%d Error: %w", org, repo, index, err)
+	}
+	return s.bot.ProcessPR(pr)
+}
+
 type ConfigUpdatePush struct {
+	bot             *ReviewBot
 	config_modified chan *common.AutogitConfig
 }

@@ -27,7 +49,7 @@ func (s *ConfigUpdatePush) ProcessFunc(req *common.Request) error {
 	}
 	branch := data.Ref[len(branch_ref):]

-	c := configs.GetPrjGitConfig(org, repo, branch)
+	c := s.bot.configs.GetPrjGitConfig(org, repo, branch)
 	if c == nil {
 		return nil
 	}
@@ -45,7 +67,7 @@ func (s *ConfigUpdatePush) ProcessFunc(req *common.Request) error {
 	}

 	if modified_config {
-		for _, config := range configs {
+		for _, config := range s.bot.configs {
 			if o, r, _ := config.GetPrjGit(); o == org && r == repo {
 				s.config_modified <- config
 			}
 		}
|
|||||||
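The `config_modified` channel in `ConfigUpdatePush` carries "project config changed" signals that consumers can drain without blocking. The standalone sketch below illustrates that pattern with a plain string channel standing in for `chan *common.AutogitConfig`; the names here are illustrative, not part of the actual code.

```go
package main

import "fmt"

func main() {
	// Buffered with capacity 1, like the configChan in the tests below,
	// so the producer never blocks on a single pending signal.
	modified := make(chan string, 1)

	// Producer side: a push event touched the project config file.
	modified <- "org/repo#main"

	// Consumer side: select/default polls without blocking, which is
	// exactly how the tests assert "signal received" vs "no signal".
	select {
	case c := <-modified:
		fmt.Println("reload config for", c)
	default:
		fmt.Println("no config change")
	}
}
```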
group-review/rabbit_test.go (new file, 203 lines)
@@ -0,0 +1,203 @@
package main

import (
	"fmt"
	"testing"

	"go.uber.org/mock/gomock"
	"src.opensuse.org/autogits/common"
	"src.opensuse.org/autogits/common/gitea-generated/models"
	mock_common "src.opensuse.org/autogits/common/mock"
)

func TestIssueCommentProcessor(t *testing.T) {
	ctrl := gomock.NewController(t)
	defer ctrl.Finish()

	mockGitea := mock_common.NewMockGitea(ctrl)
	groupName := "testgroup"
	bot := &ReviewBot{
		gitea:     mockGitea,
		groupName: groupName,
	}
	bot.InitRegex(groupName)

	processor := &IssueCommentProcessor{bot: bot}

	org := "myorg"
	repo := "myrepo"
	index := 123

	event := &common.IssueCommentWebhookEvent{
		Repository: &common.Repository{
			Name: repo,
			Owner: &common.Organization{
				Username: org,
			},
		},
		Issue: &common.IssueDetail{
			Number: index,
		},
	}

	req := &common.Request{
		Type: common.RequestType_IssueComment,
		Data: event,
	}

	t.Run("Successful Processing", func(t *testing.T) {
		pr := &models.PullRequest{
			Index: int64(index),
			Base: &models.PRBranchInfo{
				Name: "main",
				Repo: &models.Repository{
					Name:  repo,
					Owner: &models.User{UserName: org},
				},
			},
			Head: &models.PRBranchInfo{
				Sha: "headsha",
				Repo: &models.Repository{
					Name:  repo,
					Owner: &models.User{UserName: org},
				},
			},
			User:               &models.User{UserName: "submitter"},
			RequestedReviewers: []*models.User{{UserName: groupName}},
		}

		mockGitea.EXPECT().GetPullRequest(org, repo, int64(index)).Return(pr, nil)

		prjConfig := &common.AutogitConfig{
			GitProjectName: org + "/" + repo + "#main",
			ReviewGroups:   []*common.ReviewGroup{{Name: groupName, Reviewers: []string{"r1"}}},
		}
		bot.configs = common.AutogitConfigs{prjConfig}
		mockGitea.EXPECT().GetPullRequestReviews(org, repo, int64(index)).Return(nil, nil)
		mockGitea.EXPECT().GetTimeline(org, repo, int64(index)).Return(nil, nil)
		mockGitea.EXPECT().RequestReviews(pr, "r1").Return(nil, nil)
		mockGitea.EXPECT().AddComment(pr, gomock.Any()).Return(nil)

		err := processor.ProcessFunc(req)
		if err != ReviewNotFinished {
			t.Errorf("Expected ReviewNotFinished, got %v", err)
		}
	})

	t.Run("Gitea error in GetPullRequest", func(t *testing.T) {
		mockGitea.EXPECT().GetPullRequest(org, repo, int64(index)).Return(nil, fmt.Errorf("gitea error"))
		err := processor.ProcessFunc(req)
		if err == nil {
			t.Error("Expected error, got nil")
		}
	})

	t.Run("Wrong Request Type", func(t *testing.T) {
		wrongReq := &common.Request{Type: common.RequestType_Push}
		err := processor.ProcessFunc(wrongReq)
		if err == nil {
			t.Error("Expected error for wrong request type, got nil")
		}
	})
}

func TestConfigUpdatePush(t *testing.T) {
	ctrl := gomock.NewController(t)
	defer ctrl.Finish()

	groupName := "testgroup"
	bot := &ReviewBot{
		groupName: groupName,
	}
	bot.InitRegex(groupName)

	configChan := make(chan *common.AutogitConfig, 1)
	processor := &ConfigUpdatePush{
		bot:             bot,
		config_modified: configChan,
	}

	org := "myorg"
	repo := "myrepo"
	branch := "main"

	prjConfig := &common.AutogitConfig{
		GitProjectName: org + "/" + repo + "#" + branch,
		Organization:   org,
		Branch:         branch,
	}
	bot.configs = common.AutogitConfigs{prjConfig}

	event := &common.PushWebhookEvent{
		Ref: "refs/heads/" + branch,
		Repository: &common.Repository{
			Name: repo,
			Owner: &common.Organization{
				Username: org,
			},
		},
		Commits: []common.Commit{
			{
				Modified: []string{common.ProjectConfigFile},
			},
		},
	}

	req := &common.Request{
		Type: common.RequestType_Push,
		Data: event,
	}

	t.Run("Config Modified", func(t *testing.T) {
		err := processor.ProcessFunc(req)
		if err != nil {
			t.Errorf("Expected nil error, got %v", err)
		}

		select {
		case modified := <-configChan:
			if modified != prjConfig {
				t.Errorf("Expected modified config to be %v, got %v", prjConfig, modified)
			}
		default:
			t.Error("Expected config modification signal, but none received")
		}
	})

	t.Run("No Config Modified", func(t *testing.T) {
		noConfigEvent := *event
		noConfigEvent.Commits = []common.Commit{{Modified: []string{"README.md"}}}
		noConfigReq := &common.Request{Type: common.RequestType_Push, Data: &noConfigEvent}

		err := processor.ProcessFunc(noConfigReq)
		if err != nil {
			t.Errorf("Expected nil error, got %v", err)
		}

		select {
		case <-configChan:
			t.Error("Did not expect config modification signal")
		default:
			// Success
		}
	})

	t.Run("Wrong Branch Ref", func(t *testing.T) {
		wrongBranchEvent := *event
		wrongBranchEvent.Ref = "refs/tags/v1.0"
		wrongBranchReq := &common.Request{Type: common.RequestType_Push, Data: &wrongBranchEvent}

		err := processor.ProcessFunc(wrongBranchReq)
		if err == nil {
			t.Error("Expected error for wrong branch ref, got nil")
		}
	})

	t.Run("Config Not Found", func(t *testing.T) {
		bot.configs = common.AutogitConfigs{}
		err := processor.ProcessFunc(req)
		if err != nil {
			t.Errorf("Expected nil error even if config not found, got %v", err)
		}
	})
}
integration/Dockerfile (new file, 11 lines)
@@ -0,0 +1,11 @@
FROM opensuse/tumbleweed
ENV container=podman

ENV LANG=en_US.UTF-8

RUN zypper -vvvn install podman podman-compose vim make python3-pytest python3-requests python3-pytest-dependency

COPY . /opt/project/

WORKDIR /opt/project/integration
integration/Makefile (new file, 76 lines)
@@ -0,0 +1,76 @@
# We want to be able to test in two **modes**:
# A. bots are used from official packages, as defined in */Dockerfile.package
# B. bots are picked up from pre-built binaries placed in the corresponding parent directory.

# The topology is defined in the podman-compose file and can be spawned in two ways:
# 1. Privileged container (needs no additional dependencies)
# 2. podman-compose on a local machine (needs dependencies as defined in the Dockerfile)

# Typical workflow:
# A1: - run 'make test_package'
# B1: - run 'make test_local' (make sure the go binaries in the parent folder are built)
# A2:
#  1. 'make build_package' - prepares images (recommended; otherwise an image failing to build during 'make up' can cause surprises)
#  2. 'make up' - spawns podman-compose
#  3. 'pytest -v tests/*' - run tests
#  4. 'make down' - once the containers are no longer needed
# B2: (make sure the go binaries in the parent folder are built)
#  1. 'make build_local' - prepares images (recommended; otherwise an image failing to build during 'make up' can cause surprises)
#  2. 'make up' - spawns podman-compose
#  3. 'pytest -v tests/*' - run tests
#  4. 'make down' - once the containers are no longer needed

AUTO_DETECT_MODE := $(shell if test -e ../workflow-pr/workflow-pr; then echo .local; else echo .package; fi)

# try to detect mode B1, otherwise mode A1
test: GIWTF_IMAGE_SUFFIX=$(AUTO_DETECT_MODE)
test: build_container test_container

# mode A1
test_package: GIWTF_IMAGE_SUFFIX=.package
test_package: build_container test_container

# mode B1
test_local: GIWTF_IMAGE_SUFFIX=.local
test_local: build_container test_container

MODULES := gitea-events-rabbitmq-publisher obs-staging-bot workflow-pr

# Prepare topology 1
build_container:
	podman build ../ -f integration/Dockerfile -t autogits_integration

# Run tests in topology 1
test_container:
	podman run --rm --privileged -t --network integration_gitea-network -e GIWTF_IMAGE_SUFFIX=$(GIWTF_IMAGE_SUFFIX) autogits_integration /usr/bin/bash -c "make build && make up && sleep 25 && pytest -v tests/*"

build_local: AUTO_DETECT_MODE=.local
build_local: build

build_package: AUTO_DETECT_MODE=.package
build_package: build

# parse all service images from podman-compose and build them (topology 2)
build:
	podman pull docker.io/library/rabbitmq:3.13.7-management
	for i in $$(grep -A 1000 services: podman-compose.yml | grep -oE '^ [^: ]+'); do GIWTF_IMAGE_SUFFIX=$(AUTO_DETECT_MODE) podman-compose build $$i || exit 1; done

# this will spawn prebuilt containers (topology 2)
up:
	podman-compose up -d

# tear down (topology 2)
down:
	podman-compose down

# mode A
up-bots-package:
	GIWTF_IMAGE_SUFFIX=.package podman-compose up -d

# mode B
up-bots-local:
	GIWTF_IMAGE_SUFFIX=.local podman-compose up -d
integration/clean.sh (new executable file, 1 line)
@@ -0,0 +1 @@
sudo rm -rf gitea-data/ gitea-logs/ rabbitmq-data/ workflow-pr-repos/
integration/gitea-events-rabbitmq-publisher/Dockerfile (new symbolic link, 1 line)
@@ -0,0 +1 @@
Dockerfile.package
integration/gitea-events-rabbitmq-publisher/Dockerfile.local (new file, 15 lines)
@@ -0,0 +1,15 @@
FROM registry.suse.com/bci/bci-base:15.7

# Add the custom CA to the trust store
COPY integration/rabbitmq-config/certs/cert.pem /usr/share/pki/trust/anchors/gitea-rabbitmq-ca.crt
RUN update-ca-certificates

RUN zypper -n in which binutils

# Copy the pre-built binary into the container
# The user will build this and place it in the same directory as this Dockerfile
COPY gitea-events-rabbitmq-publisher/gitea-events-rabbitmq-publisher /usr/local/bin/
COPY integration/gitea-events-rabbitmq-publisher/entrypoint.sh /usr/local/bin/entrypoint.sh
RUN chmod +x /usr/local/bin/entrypoint.sh

ENTRYPOINT ["/usr/local/bin/entrypoint.sh"]
@@ -0,0 +1,15 @@
FROM registry.suse.com/bci/bci-base:15.7

# Add the custom CA to the trust store
COPY integration/rabbitmq-config/certs/cert.pem /usr/share/pki/trust/anchors/gitea-rabbitmq-ca.crt
RUN update-ca-certificates

RUN zypper ar -f http://download.opensuse.org/repositories/devel:/Factory:/git-workflow/15.7/devel:Factory:git-workflow.repo
RUN zypper --gpg-auto-import-keys ref

RUN zypper -n in git-core curl autogits-gitea-events-rabbitmq-publisher binutils

COPY integration/gitea-events-rabbitmq-publisher/entrypoint.sh /usr/local/bin/entrypoint.sh
RUN chmod +x /usr/local/bin/entrypoint.sh

ENTRYPOINT ["/usr/local/bin/entrypoint.sh"]
integration/gitea-events-rabbitmq-publisher/entrypoint.sh (new file, 13 lines)
@@ -0,0 +1,13 @@
#!/bin/sh
set -e

exe=$(which gitea-events-rabbitmq-publisher 2>/dev/null) || :
exe=${exe:-/usr/local/bin/gitea-events-rabbitmq-publisher}

package=$(rpm -qa | grep autogits-gitea-events-rabbitmq-publisher) || :

echo "!!!!!!!!!!!!!!!! using binary $exe; installed package: $package"
which strings > /dev/null 2>&1 && strings "$exe" | grep -A 2 vcs.revision= | head -4 || :
echo "RABBITMQ_HOST: $RABBITMQ_HOST"

exec $exe "$@"
integration/gitea/Dockerfile (new file, 25 lines)
@@ -0,0 +1,25 @@
FROM registry.suse.com/bci/bci-base:15.7

RUN zypper ar --repo https://download.opensuse.org/repositories/devel:/Factory:/git-workflow/15.7/devel:Factory:git-workflow.repo \
    && zypper -n --gpg-auto-import-keys refresh

RUN zypper -n install \
    git \
    sqlite3 \
    curl \
    gawk \
    openssh \
    jq \
    devel_Factory_git-workflow:gitea \
    && rm -rf /var/cache/zypp/*

# Copy the minimal set of required files from the local 'container-files' directory
COPY container-files/ /

RUN chmod -R 777 /etc/gitea/conf

# Make the setup and entrypoint scripts executable
RUN chmod +x /opt/setup/setup-gitea.sh && chmod +x /opt/setup/entrypoint.sh && chmod +x /opt/setup/setup-webhook.sh && chmod +x /opt/setup/setup-dummy-data.sh

# Use the new entrypoint script to start the container
ENTRYPOINT ["/opt/setup/entrypoint.sh"]
integration/gitea/container-files/etc/gitea/conf/app.ini (new file, 42 lines)
@@ -0,0 +1,42 @@
WORK_PATH = /var/lib/gitea

[server]
CERT_FILE = /etc/gitea/https/cert.pem
KEY_FILE = /etc/gitea/https/key.pem
STATIC_ROOT_PATH = /usr/share/gitea
APP_DATA_PATH = /var/lib/gitea/data
PPROF_DATA_PATH = /var/lib/gitea/data/tmp/pprof
PROTOCOL = http
DOMAIN = gitea-test
SSH_DOMAIN = gitea-test
ROOT_URL = http://gitea-test:3000/
HTTP_PORT = 3000
DISABLE_SSH = false
START_SSH_SERVER = true
SSH_PORT = 3022
LFS_START_SERVER = true

[lfs]
PATH = /var/lib/gitea/data/lfs

[database]
DB_TYPE = sqlite3
PATH = /var/lib/gitea/data/gitea.db

[security]
INSTALL_LOCK = true

[oauth2]
ENABLED = false

[log]
ROOT_PATH = /var/log/gitea
MODE = console, file
; Either "Trace", "Debug", "Info", "Warn", "Error" or "None", default is "Info"
LEVEL = Debug

[service]
ENABLE_BASIC_AUTHENTICATION = true

[webhook]
ALLOWED_HOST_LIST = gitea-publisher
integration/gitea/container-files/opt/setup/entrypoint.sh (new file, 19 lines)
@@ -0,0 +1,19 @@
#!/bin/bash
set -e

# Run setup to ensure permissions, migrations, and the admin user are ready.
# The setup script is now idempotent.
/opt/setup/setup-gitea.sh

# Start the webhook setup script in the background.
# It will wait for the main Gitea process to be ready before creating the webhook.
/opt/setup/setup-webhook.sh &

echo "Starting Gitea..."

# The original systemd service ran as user 'gitea' and group 'gitea'
# with a working directory of '/var/lib/gitea'.
# We switch to that user and run the web command.
# Using exec means Gitea becomes PID 1, allowing it to receive signals correctly.
cd /var/lib/gitea
exec su -s /bin/bash gitea -c "/usr/bin/gitea web --config /etc/gitea/conf/app.ini"
@@ -0,0 +1,2 @@
#!/bin/bash
# This script is now empty as dummy data setup is handled by pytest fixtures.
integration/gitea/container-files/opt/setup/setup-gitea.sh (new file, 100 lines)
@@ -0,0 +1,100 @@
#!/bin/bash
set -x
set -e

# Set ownership on the volume mounts. This allows the 'gitea' user to write to them.
# We use -R to ensure all subdirectories (like /var/lib/gitea/data) are covered.
chown -R gitea:gitea /var/lib/gitea /var/log/gitea

# Set ownership on the config directory.
chown -R gitea:gitea /etc/gitea

# Run database migrations to initialize the sqlite3 db based on app.ini.
su -s /bin/bash gitea -c 'gitea migrate'

# Create a default admin user if it doesn't exist
if ! su -s /bin/bash gitea -c 'gitea admin user list' | awk 'NR>1 && $2 == "admin" {found=1} END {exit !found}'; then
    echo "Creating admin user..."
    su -s /bin/bash gitea -c 'gitea admin user create --username admin --password opensuse --email admin@example.com --must-change-password=false --admin'
else
    echo "Admin user already exists."
fi

# Generate an access token for the admin user
ADMIN_TOKEN_FILE="/var/lib/gitea/admin.token"
if [ -f "$ADMIN_TOKEN_FILE" ]; then
    echo "Admin token already exists at $ADMIN_TOKEN_FILE."
else
    echo "Generating admin token..."
    ADMIN_TOKEN=$(su -s /bin/bash gitea -c "gitea admin user generate-access-token -raw -u admin -t admin-token")
    if [ -n "$ADMIN_TOKEN" ]; then
        printf "%s" "$ADMIN_TOKEN" > "$ADMIN_TOKEN_FILE"
        chmod 777 "$ADMIN_TOKEN_FILE"
        chown gitea:gitea "$ADMIN_TOKEN_FILE"
        echo "Admin token generated and saved to $ADMIN_TOKEN_FILE."
    else
        echo "Failed to generate admin token."
    fi
fi

# Generate SSH key for the admin user if it doesn't exist
SSH_KEY_DIR="/var/lib/gitea/ssh-keys"
mkdir -p "$SSH_KEY_DIR"
if [ ! -f "$SSH_KEY_DIR/id_ed25519" ]; then
    echo "Generating SSH key for admin user..."
    ssh-keygen -t ed25519 -N "" -f "$SSH_KEY_DIR/id_ed25519"
    chown -R gitea:gitea "$SSH_KEY_DIR"
    chmod 700 "$SSH_KEY_DIR"
    chmod 600 "$SSH_KEY_DIR/id_ed25519"
    chmod 644 "$SSH_KEY_DIR/id_ed25519.pub"
fi

# Create an autogits_obs_staging_bot user if it doesn't exist
if ! su -s /bin/bash gitea -c 'gitea admin user list' | awk 'NR>1 && $2 == "autogits_obs_staging_bot" {found=1} END {exit !found}'; then
    echo "Creating autogits_obs_staging_bot user..."
    su -s /bin/bash gitea -c 'gitea admin user create --username autogits_obs_staging_bot --password opensuse --email autogits_obs_staging_bot@example.com --must-change-password=false'
else
    echo "autogits_obs_staging_bot user already exists."
fi

# Generate an access token for the autogits_obs_staging_bot user
BOT_TOKEN_FILE="/var/lib/gitea/autogits_obs_staging_bot.token"
if [ -f "$BOT_TOKEN_FILE" ]; then
    echo "autogits_obs_staging_bot token already exists at $BOT_TOKEN_FILE."
else
    echo "Generating autogits_obs_staging_bot token..."
    BOT_TOKEN=$(su -s /bin/bash gitea -c "gitea admin user generate-access-token -raw -u autogits_obs_staging_bot -t autogits_obs_staging_bot-token")
    if [ -n "$BOT_TOKEN" ]; then
        printf "%s" "$BOT_TOKEN" > "$BOT_TOKEN_FILE"
        chmod 666 "$BOT_TOKEN_FILE"
        chown gitea:gitea "$BOT_TOKEN_FILE"
        echo "autogits_obs_staging_bot token generated and saved to $BOT_TOKEN_FILE."
    else
        echo "Failed to generate autogits_obs_staging_bot token."
    fi
fi

# Create a workflow-pr user if it doesn't exist
if ! su -s /bin/bash gitea -c 'gitea admin user list' | awk 'NR>1 && $2 == "workflow-pr" {found=1} END {exit !found}'; then
    echo "Creating workflow-pr user..."
    su -s /bin/bash gitea -c 'gitea admin user create --username workflow-pr --password opensuse --email workflow-pr@example.com --must-change-password=false'
else
    echo "workflow-pr user already exists."
fi

# Generate an access token for the workflow-pr user
BOT_TOKEN_FILE="/var/lib/gitea/workflow-pr.token"
if [ -f "$BOT_TOKEN_FILE" ]; then
    echo "workflow-pr token already exists at $BOT_TOKEN_FILE."
else
    echo "Generating workflow-pr token..."
    BOT_TOKEN=$(su -s /bin/bash gitea -c "gitea admin user generate-access-token -raw -u workflow-pr -t workflow-pr-token")
    if [ -n "$BOT_TOKEN" ]; then
        printf "%s" "$BOT_TOKEN" > "$BOT_TOKEN_FILE"
        chmod 666 "$BOT_TOKEN_FILE"
        chown gitea:gitea "$BOT_TOKEN_FILE"
        echo "workflow-pr token generated and saved to $BOT_TOKEN_FILE."
    else
        echo "Failed to generate workflow-pr token."
    fi
fi
integration/gitea/container-files/opt/setup/setup-webhook.sh (new file, 92 lines)
@@ -0,0 +1,92 @@
#!/bin/bash
set -e

GITEA_URL="http://localhost:3000"
WEBHOOK_URL="http://gitea-publisher:8002/rabbitmq-forwarder"
TOKEN_NAME="webhook-creator"

echo "Webhook setup script started in background."

# Wait 10s for the main Gitea process to start
sleep 10

# Wait for Gitea API to be ready
echo "Waiting for Gitea API at $GITEA_URL..."
while ! curl -s -f "$GITEA_URL/api/v1/version" > /dev/null; do
    echo "Gitea API not up yet, waiting 5s..."
    sleep 5
done
echo "Gitea API is up."

# The `gitea admin` command needs to be run as the gitea user.
# The -raw flag gives us the token directly.
echo "Generating or retrieving admin token..."
TOKEN_FILE="/var/lib/gitea/admin.token"

if [ -f "$TOKEN_FILE" ]; then
    TOKEN=$(tr -d '\n\r ' < "$TOKEN_FILE")
    echo "Admin token loaded from $TOKEN_FILE."
else
    TOKEN=$(su -s /bin/bash gitea -c "gitea admin user generate-access-token -raw -u admin -t $TOKEN_NAME")
    if [ -n "$TOKEN" ]; then
        printf "%s" "$TOKEN" > "$TOKEN_FILE"
        chmod 666 "$TOKEN_FILE"
        chown gitea:gitea "$TOKEN_FILE"
        echo "Admin token generated and saved to $TOKEN_FILE."
    fi
fi

if [ -z "$TOKEN" ]; then
    echo "Failed to generate or retrieve admin token. This might be because the token already exists in Gitea but not in $TOKEN_FILE. Exiting."
    exit 1
fi

# Run the dummy data setup script
/opt/setup/setup-dummy-data.sh "$GITEA_URL" "$TOKEN"

# Add SSH key via API
PUB_KEY_FILE="/var/lib/gitea/ssh-keys/id_ed25519.pub"
if [ -f "$PUB_KEY_FILE" ]; then
    echo "Checking for existing SSH key 'bot-key'..."
    KEYS_URL="$GITEA_URL/api/v1/admin/users/workflow-pr/keys"
    EXISTING_KEYS=$(curl -s -X GET -H "Authorization: token $TOKEN" "$KEYS_URL")

    if ! echo "$EXISTING_KEYS" | grep -q "\"title\":\"bot-key\""; then
        echo "Registering SSH key 'bot-key' via API..."
        KEY_CONTENT=$(cat "$PUB_KEY_FILE")
        curl -s -X POST "$KEYS_URL" \
            -H "Authorization: token $TOKEN" \
            -H "Content-Type: application/json" \
            -d "{
                \"key\": \"$KEY_CONTENT\",
                \"read_only\": false,
                \"title\": \"bot-key\"
            }"
        echo -e "\nSSH key registered."
    else
        echo "SSH key 'bot-key' already registered."
    fi
fi

# Check if the webhook already exists
echo "Checking for existing system webhook..."
DB_PATH="/var/lib/gitea/data/gitea.db"
EXISTS=$(su -s /bin/bash gitea -c "sqlite3 '$DB_PATH' \"SELECT 1 FROM webhook WHERE url = '$WEBHOOK_URL' AND is_system_webhook = 1 LIMIT 1;\"")

if [ "$EXISTS" = "1" ]; then
    echo "System webhook for $WEBHOOK_URL already exists. Exiting."
    exit 0
fi

echo "Creating Gitea system webhook for $WEBHOOK_URL via direct database INSERT..."
# The events JSON requires escaped double quotes for the sqlite3 command.
EVENTS_JSON='{\"push_only\":false,\"send_everything\":true,\"choose_events\":false,\"branch_filter\":\"*\",\"events\":{\"create\":false,\"delete\":false,\"fork\":false,\"issue_assign\":false,\"issue_comment\":false,\"issue_label\":false,\"issue_milestone\":false,\"issues\":false,\"package\":false,\"pull_request\":false,\"pull_request_assign\":false,\"pull_request_comment\":false,\"pull_request_label\":false,\"pull_request_milestone\":false,\"pull_request_review\":false,\"pull_request_review_request\":false,\"pull_request_sync\":false,\"push\":false,\"release\":false,\"repository\":false,\"status\":false,\"wiki\":false,\"workflow_job\":false,\"workflow_run\":false}}'
NOW_UNIX=$(date +%s)

INSERT_CMD="INSERT INTO webhook (repo_id, owner_id, is_system_webhook, url, http_method, content_type, events, is_active, type, meta, created_unix, updated_unix) VALUES (0, 0, 1, '$WEBHOOK_URL', 'POST', 1, '$EVENTS_JSON', 1, 'gitea', '', $NOW_UNIX, $NOW_UNIX);"

su -s /bin/bash gitea -c "sqlite3 '$DB_PATH' \"$INSERT_CMD\""

echo "System webhook created successfully."

exit 0
integration/mock-obs/Dockerfile (new file, 14 lines)
@@ -0,0 +1,14 @@
# Use a base Python image
FROM registry.suse.com/bci/python:3.11

# Set the working directory
WORKDIR /app

# Copy the server script
COPY server.py .

# Expose the port the server will run on
EXPOSE 8080

# Command to run the server
CMD ["python3", "-u", "server.py"]
@@ -0,0 +1,18 @@
<project name="openSUSE:Leap:16.0:PullRequest">
  <title>Leap 16.0 PullRequest area</title>
  <description>Base project to define the pull request builds</description>
  <person userid="autogits_obs_staging_bot" role="maintainer"/>
  <person userid="maxlin_factory" role="maintainer"/>
  <group groupid="maintenance-opensuse.org" role="maintainer"/>
  <debuginfo>
    <enable/>
  </debuginfo>
  <repository name="standard">
    <path project="openSUSE:Leap:16.0" repository="standard"/>
    <arch>x86_64</arch>
    <arch>i586</arch>
    <arch>aarch64</arch>
    <arch>ppc64le</arch>
    <arch>s390x</arch>
  </repository>
</project>
@@ -0,0 +1,59 @@
<project name="openSUSE:Leap:16.0">
  <title>openSUSE Leap 16.0 based on SLFO</title>
  <description>Leap 16.0 based on SLES 16.0 (specifically SLFO:1.2)</description>
  <link project="openSUSE:Backports:SLE-16.0"/>
  <scmsync>http://gitea-test:3000/products/SLFO#main</scmsync>
  <person userid="dimstar_suse" role="maintainer"/>
  <person userid="lkocman-factory" role="maintainer"/>
  <person userid="maxlin_factory" role="maintainer"/>
  <person userid="factory-auto" role="reviewer"/>
  <person userid="licensedigger" role="reviewer"/>
  <group groupid="autobuild-team" role="maintainer"/>
  <group groupid="factory-maintainers" role="maintainer"/>
  <group groupid="maintenance-opensuse.org" role="maintainer"/>
  <group groupid="factory-staging" role="reviewer"/>
  <build>
    <disable repository="ports"/>
  </build>
  <debuginfo>
    <enable/>
  </debuginfo>
  <repository name="standard" rebuild="local">
    <path project="openSUSE:Backports:SLE-16.0" repository="standard"/>
    <path project="SUSE:SLFO:1.2" repository="standard"/>
    <arch>local</arch>
    <arch>i586</arch>
    <arch>x86_64</arch>
    <arch>aarch64</arch>
    <arch>ppc64le</arch>
    <arch>s390x</arch>
  </repository>
  <repository name="product">
    <releasetarget project="openSUSE:Leap:16.0:ToTest" repository="product" trigger="manual"/>
    <path project="openSUSE:Leap:16.0:NonFree" repository="standard"/>
    <path project="openSUSE:Leap:16.0" repository="images"/>
    <path project="openSUSE:Leap:16.0" repository="standard"/>
    <path project="openSUSE:Backports:SLE-16.0" repository="standard"/>
    <path project="SUSE:SLFO:1.2" repository="standard"/>
    <arch>local</arch>
    <arch>i586</arch>
    <arch>x86_64</arch>
    <arch>aarch64</arch>
    <arch>ppc64le</arch>
    <arch>s390x</arch>
  </repository>
  <repository name="ports">
    <arch>armv7l</arch>
  </repository>
  <repository name="images">
    <releasetarget project="openSUSE:Leap:16.0:ToTest" repository="images" trigger="manual"/>
    <path project="openSUSE:Leap:16.0" repository="standard"/>
    <path project="openSUSE:Backports:SLE-16.0" repository="standard"/>
    <path project="SUSE:SLFO:1.2" repository="standard"/>
    <arch>i586</arch>
    <arch>x86_64</arch>
    <arch>aarch64</arch>
    <arch>ppc64le</arch>
    <arch>s390x</arch>
  </repository>
</project>
140
integration/mock-obs/server.py
Normal file
@@ -0,0 +1,140 @@
import http.server
import socketserver
import os
import logging
import signal
import sys
import threading
import fnmatch

PORT = 8080
RESPONSE_DIR = "/app/responses"
STATE_DIR = "/tmp/mock_obs_state"


class MockOBSHandler(http.server.SimpleHTTPRequestHandler):
    def do_GET(self):
        logging.info(f"GET request for: {self.path}")
        path_without_query = self.path.split('?')[0]

        # Check for state stored by a PUT request first
        sanitized_put_path = 'PUT' + path_without_query.replace('/', '_')
        state_file_path = os.path.join(STATE_DIR, sanitized_put_path)
        if os.path.exists(state_file_path):
            logging.info(f"Found stored PUT state for {self.path} at {state_file_path}")
            self.send_response(200)
            self.send_header("Content-type", "application/xml")
            file_size = os.path.getsize(state_file_path)
            self.send_header("Content-Length", str(file_size))
            self.end_headers()
            with open(state_file_path, 'rb') as f:
                self.wfile.write(f.read())
            return

        # If no PUT state file, fall back to the glob/exact match logic
        self.handle_request('GET')

    def do_PUT(self):
        logging.info(f"PUT request for: {self.path}")
        logging.info(f"Headers: {self.headers}")
        path_without_query = self.path.split('?')[0]

        body = b''
        if self.headers.get('Transfer-Encoding', '').lower() == 'chunked':
            logging.info("Chunked transfer encoding detected")
            while True:
                line = self.rfile.readline().strip()
                if not line:
                    break
                chunk_length = int(line, 16)
                if chunk_length == 0:
                    self.rfile.readline()
                    break
                body += self.rfile.read(chunk_length)
                self.rfile.read(2)  # Read the trailing CRLF
        else:
            content_length = int(self.headers.get('Content-Length', 0))
            body = self.rfile.read(content_length)

        logging.info(f"Body: {body.decode('utf-8', errors='replace')}")
        sanitized_path = 'PUT' + path_without_query.replace('/', '_')
        state_file_path = os.path.join(STATE_DIR, sanitized_path)

        logging.info(f"Saving state for {self.path} to {state_file_path}")
        os.makedirs(os.path.dirname(state_file_path), exist_ok=True)
        with open(state_file_path, 'wb') as f:
            f.write(body)

        self.send_response(200)
        self.send_header("Content-type", "text/plain")
        response_body = b"OK"
        self.send_header("Content-Length", str(len(response_body)))
        self.end_headers()
        self.wfile.write(response_body)

    def do_POST(self):
        logging.info(f"POST request for: {self.path}")
        self.handle_request('POST')

    def do_DELETE(self):
        logging.info(f"DELETE request for: {self.path}")
        self.handle_request('DELETE')

    def handle_request(self, method):
        path_without_query = self.path.split('?')[0]
        sanitized_request_path = method + path_without_query.replace('/', '_')
        logging.info(f"Handling request, looking for match for: {sanitized_request_path}")

        response_file = None
        # Check for glob match first
        if os.path.exists(RESPONSE_DIR):
            for filename in os.listdir(RESPONSE_DIR):
                if fnmatch.fnmatch(sanitized_request_path, filename):
                    response_file = os.path.join(RESPONSE_DIR, filename)
                    logging.info(f"Found matching response file (glob): {response_file}")
                    break

        # Fallback to exact match if no glob match
        if response_file is None:
            exact_file = os.path.join(RESPONSE_DIR, sanitized_request_path)
            if os.path.exists(exact_file):
                response_file = exact_file
                logging.info(f"Found matching response file (exact): {response_file}")

        if response_file:
            logging.info(f"Serving content from {response_file}")
            self.send_response(200)
            self.send_header("Content-type", "application/xml")
            file_size = os.path.getsize(response_file)
            self.send_header("Content-Length", str(file_size))
            self.end_headers()
            with open(response_file, 'rb') as f:
                self.wfile.write(f.read())
        else:
            logging.info(f"Response file not found for {sanitized_request_path}. Sending 404.")
            self.send_response(404)
            self.send_header("Content-type", "text/plain")
            body = f"Mock response not found for {sanitized_request_path}".encode('utf-8')
            self.send_header("Content-Length", str(len(body)))
            self.end_headers()
            self.wfile.write(body)


if __name__ == "__main__":
    logging.basicConfig(level=logging.INFO, format='%(asctime)s - %(message)s')

    if not os.path.exists(STATE_DIR):
        logging.info(f"Creating state directory: {STATE_DIR}")
        os.makedirs(STATE_DIR)
    if not os.path.exists(RESPONSE_DIR):
        os.makedirs(RESPONSE_DIR)

    with socketserver.TCPServer(("", PORT), MockOBSHandler) as httpd:
        logging.info(f"Serving mock OBS API on port {PORT}")

        def graceful_shutdown(sig, frame):
            logging.info("Received SIGTERM, shutting down gracefully...")
            threading.Thread(target=httpd.shutdown).start()

        signal.signal(signal.SIGTERM, graceful_shutdown)

        httpd.serve_forever()
        logging.info("Server has shut down.")
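The handler above resolves each request to a fixture file by prefixing the HTTP method and replacing `/` with `_` (after dropping any query string). A minimal sketch of that naming convention, useful when authoring files under `responses/`:

```python
def sanitize(method: str, path: str) -> str:
    """Mirror the mock server's fixture-name convention:
    strip the query string, prefix the HTTP method,
    and replace '/' with '_'."""
    path_without_query = path.split('?')[0]
    return method + path_without_query.replace('/', '_')


print(sanitize('GET', '/source/openSUSE:Leap:16.0/_meta?rev=1'))
# GET_source_openSUSE:Leap:16.0__meta
```

The resulting string is what `handle_request` matches against the fixture filenames, first via `fnmatch` globs and then exactly.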
1
integration/obs-staging-bot/Dockerfile
Symbolic link
@@ -0,0 +1 @@
./Dockerfile.package
18
integration/obs-staging-bot/Dockerfile.local
Normal file
@@ -0,0 +1,18 @@
# Use a SLE BCI base image
FROM registry.suse.com/bci/bci-base:15.7

# Install any necessary dependencies for the bot
# e.g., git, curl, etc.
RUN zypper -n in git-core curl binutils

# Copy the bot binary and its entrypoint script
COPY obs-staging-bot/obs-staging-bot /usr/local/bin/obs-staging-bot
COPY integration/obs-staging-bot/entrypoint.sh /usr/local/bin/entrypoint.sh
RUN chmod +x /usr/local/bin/entrypoint.sh

# Create a non-root user to run the bot
RUN useradd -m -u 1001 bot
USER 1001

# Set the entrypoint
ENTRYPOINT ["/usr/local/bin/entrypoint.sh"]
19
integration/obs-staging-bot/Dockerfile.package
Normal file
@@ -0,0 +1,19 @@
# Use a SLE BCI base image
FROM registry.suse.com/bci/bci-base:15.7

RUN zypper ar -f http://download.opensuse.org/repositories/devel:/Factory:/git-workflow/15.7/devel:Factory:git-workflow.repo
RUN zypper --gpg-auto-import-keys ref

# Install any necessary dependencies for the bot
# e.g., git, curl, etc.
RUN zypper -n in git-core curl autogits-obs-staging-bot binutils

COPY integration/obs-staging-bot/entrypoint.sh /usr/local/bin/entrypoint.sh
RUN chmod +x /usr/local/bin/entrypoint.sh

# Create a non-root user to run the bot
RUN useradd -m -u 1001 bot
USER 1001

# Set the entrypoint
ENTRYPOINT ["/usr/local/bin/entrypoint.sh"]
28
integration/obs-staging-bot/entrypoint.sh
Normal file
@@ -0,0 +1,28 @@
#!/bin/sh
set -e

# This script waits for the Gitea bot token to be created,
# exports it as an environment variable, and then executes the main container command.

TOKEN_FILE="/gitea-data/autogits_obs_staging_bot.token"

echo "OBS Staging Bot: Waiting for Gitea autogits_obs_staging_bot token at $TOKEN_FILE..."
while [ ! -s "$TOKEN_FILE" ]; do
    sleep 2
done

export GITEA_TOKEN=$(tr -d '\n\r ' < "$TOKEN_FILE")
echo "OBS Staging Bot: GITEA_TOKEN exported."

# Execute the bot as the current user.
echo "OBS Staging Bot: Executing bot..."

# '|| :' keeps 'set -e' from aborting when the binary is not on PATH,
# so the fallback below can take effect.
exe=$(which obs-staging-bot || :)
exe=${exe:-/usr/local/bin/obs-staging-bot}

package=$(rpm -qa | grep autogits-obs-staging-bot) || :

echo "!!!!!!!!!!!!!!!! using binary $exe; installed package: $package"
which strings > /dev/null 2>&1 && strings "$exe" | grep -A 2 vcs.revision= | head -4 || :

exec $exe "$@"
136
integration/podman-compose.yml
Normal file
@@ -0,0 +1,136 @@
version: "3.8"

networks:
  gitea-network:
    driver: bridge

services:
  gitea:
    build: ./gitea
    container_name: gitea-test
    environment:
      - GITEA_WORK_DIR=/var/lib/gitea
    networks:
      - gitea-network
    ports:
      # Map the HTTP and SSH ports defined in your app.ini
      - "3000:3000"
      - "3022:3022"
    volumes:
      # Persist Gitea's data (repositories, sqlite db, etc.) to a local directory
      # The :z flag allows sharing between containers
      - ./gitea-data:/var/lib/gitea:z
      # Persist Gitea's logs to a local directory
      - ./gitea-logs:/var/log/gitea:Z
    restart: unless-stopped

  rabbitmq:
    image: rabbitmq:3.13.7-management
    container_name: rabbitmq-test
    healthcheck:
      test: ["CMD", "rabbitmq-diagnostics", "check_running", "-q"]
      interval: 30s
      timeout: 30s
      retries: 3
    networks:
      - gitea-network
    ports:
      # AMQP protocol port with TLS
      - "5671:5671"
      # HTTP management UI
      - "15672:15672"
    volumes:
      # Persist RabbitMQ data
      - ./rabbitmq-data:/var/lib/rabbitmq:Z
      # Mount TLS certs
      - ./rabbitmq-config/certs:/etc/rabbitmq/certs:Z
      # Mount rabbitmq config
      - ./rabbitmq-config/rabbitmq.conf:/etc/rabbitmq/rabbitmq.conf:Z
      # Mount exchange definitions
      - ./rabbitmq-config/definitions.json:/etc/rabbitmq/definitions.json:Z
    restart: unless-stopped

  gitea-publisher:
    build:
      context: ..
      dockerfile: integration/gitea-events-rabbitmq-publisher/Dockerfile${GIWTF_IMAGE_SUFFIX}
    container_name: gitea-publisher
    networks:
      - gitea-network
    depends_on:
      gitea:
        condition: service_started
      rabbitmq:
        condition: service_healthy
    environment:
      - RABBITMQ_HOST=rabbitmq-test
      - RABBITMQ_USERNAME=gitea
      - RABBITMQ_PASSWORD=gitea
      - SSL_CERT_FILE=/usr/share/pki/trust/anchors/gitea-rabbitmq-ca.crt
    command: [ "-listen", "0.0.0.0:8002", "-topic-domain", "suse", "-debug" ]
    restart: unless-stopped

  workflow-pr:
    build:
      context: ..
      dockerfile: integration/workflow-pr/Dockerfile${GIWTF_IMAGE_SUFFIX}
    container_name: workflow-pr
    networks:
      - gitea-network
    depends_on:
      gitea:
        condition: service_started
      rabbitmq:
        condition: service_healthy
    environment:
      - AMQP_USERNAME=gitea
      - AMQP_PASSWORD=gitea
      - SSL_CERT_FILE=/usr/share/pki/trust/anchors/gitea-rabbitmq-ca.crt
    volumes:
      - ./gitea-data:/var/lib/gitea:ro,z
      - ./workflow-pr/workflow-pr.json:/etc/workflow-pr.json:ro,z
      - ./workflow-pr-repos:/var/lib/workflow-pr/repos:Z
    command: [
      "-check-on-start",
      "-debug",
      "-gitea-url", "http://gitea-test:3000",
      "-url", "amqps://rabbitmq-test:5671",
      "-config", "/etc/workflow-pr.json",
      "-repo-path", "/var/lib/workflow-pr/repos"
    ]
    restart: unless-stopped

  mock-obs:
    build: ./mock-obs
    container_name: mock-obs
    networks:
      - gitea-network
    ports:
      - "8080:8080"
    volumes:
      - ./mock-obs/responses:/app/responses:z # Use :z for shared SELinux label
    restart: unless-stopped

  obs-staging-bot:
    build:
      context: ..
      dockerfile: integration/obs-staging-bot/Dockerfile${GIWTF_IMAGE_SUFFIX}
    container_name: obs-staging-bot
    networks:
      - gitea-network
    depends_on:
      gitea:
        condition: service_started
      mock-obs:
        condition: service_started
    environment:
      - OBS_USER=mock
      - OBS_PASSWORD=mock-long-password
    volumes:
      - ./gitea-data:/gitea-data:ro,z
    command:
      - "-debug"
      - "-gitea-url=http://gitea-test:3000"
      - "-obs=http://mock-obs:8080"
      - "-obs-web=http://mock-obs:8080"
    restart: unless-stopped
30
integration/rabbitmq-config/certs/cert.pem
Normal file
@@ -0,0 +1,30 @@
-----BEGIN CERTIFICATE-----
MIIFKzCCAxOgAwIBAgIUJsg/r0ZyIVxtAkrlZKOr4LvYEvMwDQYJKoZIhvcNAQEL
BQAwGDEWMBQGA1UEAwwNcmFiYml0bXEtdGVzdDAeFw0yNjAxMjQxMjQyMjNaFw0z
NjAxMjIxMjQyMjNaMBgxFjAUBgNVBAMMDXJhYmJpdG1xLXRlc3QwggIiMA0GCSqG
SIb3DQEBAQUAA4ICDwAwggIKAoICAQC9OjTq4DgqVo0mRpS8DGRR6SFrSpb2bqnl
YI7xSI3y67i/oP4weiZSawk2+euxhsN4FfOlsAgvpg4WyRQH5PwnXOA1Lxz51qp1
t0VumE3B1RDheiBTE8loG1FvmikOiek2gzz76nK0R1sbKY1+/NVJpMs6dL6NzJXG
N6aCpWTk7oeY+lW5bPBG0VRA7RUG80w9R9RDtqYc0SYUmm43tjjxPZ81rhCXFx/F
v1kxnNTQJdATNrTn9SofymSfm42f4loOGyGBsqJYybKXOPDxrM1erBN5eCwTpJMS
4J30aMSdQTzza2Z4wi2LR0vq/FU/ouqzlRp7+7tNJbVAsqhiUa2eeAVkFwZl9wRw
lddY0W85U507nw5M3iQv2GTOhJRXwhWpzDUFQ0fT56hAY/V+VbF1iHGAVIz4XlUj
gC21wuXz0xRdqP8cCd8UHLSbp8dmie161GeKVwO037aP+1hZJbm7ePsS5Na+qYG1
LCy0GhfQn71BsYUaGJtfRcaMwIbqaNIYn+Y6S1FVjxDPXCxFXDrIcFvldmJYTyeK
7KrkO2P1RbEiwYyPPUhthbb1Agi9ZutZsnadmPRk27t9bBjNnWaY2z17hijnzVVz
jOHuPlpb7cSaagVzLTT0zrZ+ifnZWwdl0S2ZrjBAeVrkNt7DOCUqwBnuBqYiRZFt
A1QicHxaEQIDAQABo20wazAdBgNVHQ4EFgQU3l25Ghab2k7UhwxftZ2vZ1HO9Sow
HwYDVR0jBBgwFoAU3l25Ghab2k7UhwxftZ2vZ1HO9SowDwYDVR0TAQH/BAUwAwEB
/zAYBgNVHREEETAPgg1yYWJiaXRtcS10ZXN0MA0GCSqGSIb3DQEBCwUAA4ICAQB9
ilcsRqIvnyN25Oh668YC/xxyeNTIaIxjMLyJaMylBRjNwo1WfbdpXToaEXgot5gK
5HGlu3OIBBwBryNAlBtf/usxzLzmkEsm1Dsn9sJNY1ZTkD8MO9yyOtLqBlqAsIse
oPVjzSdjk1fP3uyoG/ZUVAFZHZD3/9BEsftfS13oUVxo7vYz1DSyUATT/4QTYMQB
PytL6EKJ0dLyuy7rIkZVkaUi+P7GuDXj25Mi6Zkxaw2QnssSuoqy1bAMkzEyNFK5
0wlNWEY8H3jRZuAz1T4AXb9sjeCgBKZoWXgmGbzleOophdzvlq66UGAWPWYFGp8Q
4GJognovhKzSY9+3n+rMPLAXSao48SYDlyTOZeBo1DTluR5QjVd+NWbEdIsA6buQ
a6uPTSVKsulm7hyUlEZp+SsYAtVoZx3jzKKjZXjnaxOfUFWx6pTxNXvxR7pQ/8Ls
IfduGy4VjKVQdyuwCE7eVEPDK6d53WWs6itziuj7gfq8mHvZivIA65z05lTwqkvb
1WS2aht+zacqVSYyNrK+/kJA2CST3ggc1EO73lRvbfO9LJZWMdO+f/tkXH4zkfmL
A3JtJcLOWuv+ZrZvHMpKlBFNMySxE3IeGX+Ad9bGyhZvZULut95/QD7Xy4cPRZHF
R3SRn0rn/BeTly+5fkEoFk+ttah8IbwzhduPyPIxng==
-----END CERTIFICATE-----
52
integration/rabbitmq-config/certs/key.pem
Normal file
@@ -0,0 +1,52 @@
-----BEGIN PRIVATE KEY-----
MIIJQwIBADANBgkqhkiG9w0BAQEFAASCCS0wggkpAgEAAoICAQC9OjTq4DgqVo0m
RpS8DGRR6SFrSpb2bqnlYI7xSI3y67i/oP4weiZSawk2+euxhsN4FfOlsAgvpg4W
yRQH5PwnXOA1Lxz51qp1t0VumE3B1RDheiBTE8loG1FvmikOiek2gzz76nK0R1sb
KY1+/NVJpMs6dL6NzJXGN6aCpWTk7oeY+lW5bPBG0VRA7RUG80w9R9RDtqYc0SYU
mm43tjjxPZ81rhCXFx/Fv1kxnNTQJdATNrTn9SofymSfm42f4loOGyGBsqJYybKX
OPDxrM1erBN5eCwTpJMS4J30aMSdQTzza2Z4wi2LR0vq/FU/ouqzlRp7+7tNJbVA
sqhiUa2eeAVkFwZl9wRwlddY0W85U507nw5M3iQv2GTOhJRXwhWpzDUFQ0fT56hA
Y/V+VbF1iHGAVIz4XlUjgC21wuXz0xRdqP8cCd8UHLSbp8dmie161GeKVwO037aP
+1hZJbm7ePsS5Na+qYG1LCy0GhfQn71BsYUaGJtfRcaMwIbqaNIYn+Y6S1FVjxDP
XCxFXDrIcFvldmJYTyeK7KrkO2P1RbEiwYyPPUhthbb1Agi9ZutZsnadmPRk27t9
bBjNnWaY2z17hijnzVVzjOHuPlpb7cSaagVzLTT0zrZ+ifnZWwdl0S2ZrjBAeVrk
Nt7DOCUqwBnuBqYiRZFtA1QicHxaEQIDAQABAoICAA+AWvDpzNgVDouV6R3NkxNN
upXgPqUx9BuNETCtbal6i4AxR1l/zC9gwti82QTKQi2OeM74MHd8zjcqIkiyRsDP
wDNDKIfEAONTT+4LLoWEN5WNDGRZ4Nw1LrLqiVX+ULtNPXvynRJtLQa43PVL74oQ
pLBle23A1n0uNmcJ9w21B6ktysN9q+JVSCZodZpD6Jk1jus8JXgDXy/9Za2NMTV8
A5ShbYz/ETSBJCSnERz7GARW7TN6V0jS6vLTSqMQJyn0KYbHNDr7TPTL7psRuaI5
jP/cqxmx1/WKLo5k3cR3IW/cesDGQXZhMRQvNymXJkxvWMPS36lmfyZtbFNflw4Z
9OD+2RKt5jFDJjG8fYiYoYBdLiTj2Wdvo4mbRPNkTL75o65riDkDCQuZhDXFBm3s
B1aDv5y1AXrzNZ5JSikszKgbLNPYB0rI3unp6i0P1985w6dyel0MGG+ouaeiyrxS
9IgJDnE4BJ79mEzHTXtbZ/+3aGAK/Y6mU8Pz2s6/+6ccT0miievsMS+si1KESF31
WLnsMdcrJcxqcm7Ypo24G0yBJluSDKtD1cqQUGN1MKp+EEv1SCH+4csaa3ooRB0o
YveySjqxtmhVpQuY3egCOaXhPmX7lgYwoe+G4UIkUMwPn20WMg+jFxgPASdh4lqE
mzpePP7STvEZAr+rrLu1AoIBAQDmCEiKOsUTtJlX3awOIRtCkIqBxS1E6rpyjfxK
A6+zpXnE++8MhIJ07+9bPdOshGjS3JbJ+hu+IocbNg++rjRArYQnJh8/qBZ2GB2v
Ryfptsoxtk/xUsmOfchvk4tOjvDHZrJehUtGc+LzX/WUqpgtEk1Gnx7RGRuDNnqS
Q1+yU4NubHwOHPswBBXOnVtopcAHFpKhbKRFOHOwMZN99qcWVIkv4J9c6emcPMLI
I/QPIvwB6WmbLa0o3JNXlD4kPdqCgNW36KEFiW8m+4tgzF3HWYSAyIeBRFG7ouE6
yk5hiptPKhZlTmTAkQSssCXksiTw1rsspFULZSRyaaaPunvVAoIBAQDSlrKu+B2h
AJtxWy5MQDOiroqT3KDneIGXPYgH3/tiDmxy0CIEbSb5SqZ6zAmihs3dWWCmc1JH
YObRrqIxu+qVi4K+Uz8l7WBrS7DkjZjajq+y/mrZYUNRoL2q9mnNqRNan7zxWDJc
U4u2NH9P4LOz6ttE4OG9SC3/gZLoepA+ANZatu93749IT7z8ske0MVPP76jVI1Gl
D7cPIlzcBUdJgNV8UOkxeqU3+S6Jn17Tkx5qMWND/2BCN4voQ4pfGWSkbaHlMLh1
2SbVuR+HYPY3aPJeSY7MEPoc7d2SSVOcVDr2AQwSDSCCgIFZOZlawehUz9R51hK8
LlaccFWXhS9NAoIBAEFZNRJf48DXW4DErq5M5WuhmFeJZnTfohwNDhEQvwdwCQnW
8HBD7LO/veXTyKCH9SeCFyxF6z+2m181mn93Cc0d/h8JC3OQEuF1tGko88PHc+Vv
f4J1HGFohlp8NeUZYnmjSSTlBR98qIqvRhr348daHa3kYmLQmSpLfcKzdSo542qp
UwzHWuynHHLX7THrdIQO+5T0Qi6P/P2e9+GfApSra1W4oE1K/lyuPj+RRzJNo/3/
C0tUTI8BKrKEoKq3D65nX0+hvKzQAE24xD25kSKi4aucTDKC8B04BngnJOE8+SYi
NL6O6Lxz9joAyKMRoMDyn7Xs8WQNVa9TKEhImAkCggEBAMljmIm/egZIoF7thf8h
vr+rD5eL/Myf776E95wgVTVW+dtqs71r7UOmYkM48VXeeO1f1hAYZO0h/Fs2GKJb
RWGyQ1xkHBXXRsgVYJuR1kXdAqW4rNIqM8jSYdAnStOFB5849+YOJEsrEocy+TWY
fAJpbTwXm4n6hxK8BZQR8fN5tYSXQbd+/5V1vBQlInFuYuqOFPWPizrBJp1wjUFU
QvJGJON4NSo+UdaPlDPEl1jabtG7XWTfylxI5qE+RgvgKuEcfyDBUQZSntLw8Pf0
gEJJOM92pPr+mVIlICoPucfcvW4ZXkO9DgP/hLOhY8jpe5fwERBa6xvPbMC6pP/8
PFkCggEBAOLtvboBThe57QRphsKHmCtRJHmT4oZzhMYsE+5GMGYzPNWod1hSyfXn
EB8iTmAFP5r7FdC10B8mMpACXuDdi2jbmlYOTU6xNTprSKtv8r8CvorWJdsQwRsy
pZ7diSCeyi0z/sIx//ov0b3WD0E8BG/HWsFbX0p5xXpaljYEv5dK7xUiWgBW+15a
N1AeVcPiXRDwhQMVcvVOvzgwKsw+Rpls/9W4hihcBHaiMcBUDFWxJtnf4ZAGAZS3
/694MOYlmfgT/cDqF9oOsCdxM0w24kL0dcUM7zPk314ixAAfUwXaxisBhS2roJ88
HsuK9JPSK/AS0IqUtKiq4LZ9ErixYF0=
-----END PRIVATE KEY-----
35
integration/rabbitmq-config/definitions.json
Executable file
@@ -0,0 +1,35 @@
{
  "users": [
    {
      "name": "gitea",
      "password_hash": "5IdZmMJhNb4otX/nz9Xtmkpj9khl6+5eAmXNs/oHYwQNO3jg",
      "hashing_algorithm": "rabbit_password_hashing_sha256",
      "tags": "administrator"
    }
  ],
  "vhosts": [
    {
      "name": "/"
    }
  ],
  "permissions": [
    {
      "user": "gitea",
      "vhost": "/",
      "configure": ".*",
      "write": ".*",
      "read": ".*"
    }
  ],
  "exchanges": [
    {
      "name": "pubsub",
      "vhost": "/",
      "type": "topic",
      "durable": true,
      "auto_delete": false,
      "internal": false,
      "arguments": {}
    }
  ]
}
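The `password_hash` above follows RabbitMQ's `rabbit_password_hashing_sha256` scheme: four random salt bytes, SHA-256 over salt plus the UTF-8 password, and base64 of salt concatenated with the digest. A sketch for generating and checking such hashes, assuming that documented scheme:

```python
import base64
import hashlib
import os
from typing import Optional


def rabbitmq_hash(password: str, salt: Optional[bytes] = None) -> str:
    """Produce a hash in the rabbit_password_hashing_sha256 format:
    base64(salt || sha256(salt || password))."""
    salt = salt if salt is not None else os.urandom(4)
    digest = hashlib.sha256(salt + password.encode("utf-8")).digest()
    return base64.b64encode(salt + digest).decode("ascii")


def rabbitmq_verify(stored: str, password: str) -> bool:
    """Check a candidate password against a stored hash
    by re-hashing with the embedded 4-byte salt."""
    raw = base64.b64decode(stored)
    salt, digest = raw[:4], raw[4:]
    return hashlib.sha256(salt + password.encode("utf-8")).digest() == digest
```

This is useful when regenerating `definitions.json` for a different test password instead of reusing the checked-in hash.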
83
integration/test-plan.md
Normal file
@@ -0,0 +1,83 @@
# Test Plan: workflow-pr Bot

## 1. Introduction

This document outlines the test plan for the `workflow-pr` bot. The bot is responsible for synchronizing pull requests between ProjectGit and PackageGit repositories, managing reviews, and handling merges. This test plan aims to ensure the bot's functionality and reliability.

## 2. Scope

### In Scope

* Pull request synchronization (creation, update, closing).
* Reviewer management (adding, re-adding, mandatory vs. advisory).
* Merge management, including the `ManualMergeOnly` and `ManualMergeProject` flags.
* Configuration parsing (`workflow.config`).
* Label management (`staging/Auto`, `review/Pending`, `review/Done`).
* Maintainership and permissions handling.

### Out of Scope

* Package deletion requests (planned feature).
* Underlying infrastructure (Gitea, RabbitMQ, OBS).
* Performance and load testing.
* Closing a PackageGit PR (currently disabled).

## 3. Test Objectives

* Verify that pull requests are correctly synchronized between ProjectGit and PackageGit.
* Ensure that reviewers are added to pull requests according to the configuration.
* Validate that pull requests are merged only when all conditions are met.
* Confirm that the bot correctly handles the various settings in `workflow.config`.
* Verify that labels are correctly applied to pull requests.
* Ensure that maintainership and permissions are correctly enforced.

## 4. Test Strategy

Testing is conducted in a dedicated test environment that mimics production. The strategy combines:

* **Component Testing:** Testing individual components of the bot in isolation using unit tests written in Go.
* **Integration Testing:** Testing the bot's interaction with Gitea, RabbitMQ, and a mock OBS server using `pytest`.
* **End-to-End Testing:** Testing the complete workflow, from creating a pull request to merging it, using `pytest`.

### Test Automation

* **Unit Tests:** Go's built-in testing framework is used for unit tests of individual functions and methods.
* **Integration and End-to-End Tests:** `pytest` tests drive the Gitea API to create pull requests and verify the bot's behavior.
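As a sketch of what such an integration helper might look like, the following builds the Gitea "create pull request" API call (`POST /api/v1/repos/{owner}/{repo}/pulls`) against the compose environment's Gitea instance; the owner, repo, and token values are illustrative placeholders, not the suite's actual fixtures:

```python
import json
import urllib.request

GITEA_URL = "http://gitea-test:3000"  # Gitea endpoint from the compose file


def make_create_pr_request(token: str, owner: str, repo: str,
                           head: str, base: str, title: str) -> urllib.request.Request:
    """Build (but do not send) a Gitea create-pull-request API call."""
    body = json.dumps({"head": head, "base": base, "title": title}).encode("utf-8")
    return urllib.request.Request(
        f"{GITEA_URL}/api/v1/repos/{owner}/{repo}/pulls",
        data=body,
        headers={
            "Authorization": f"token {token}",
            "Content-Type": "application/json",
        },
        method="POST",
    )
```

A `pytest` case would send this request with `urllib.request.urlopen` (or a richer HTTP client), then poll the corresponding ProjectGit repository until the bot forwards the PR.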

### Success Metrics

* **Test Coverage:** The goal is to achieve at least 80% test coverage for the bot's codebase.
* **Bug Detection Rate:** The number of bugs found during the testing phase.
* **Test Pass Rate:** The percentage of test cases that pass without any issues.

## 5. Test Cases

| Test Case ID | Description | Steps to Reproduce | Expected Results | Priority |
| :--- | :--- | :--- | :--- | :--- |
| **TC-SYNC-001** | **Create ProjectGit PR from PackageGit PR** | 1. Create a new PR in a PackageGit repository. | 1. A new PR is created in the corresponding ProjectGit repository with the title "Forwarded PRs: <package_name>".<br>2. The ProjectGit PR description contains a link to the PackageGit PR (e.g., `PR: org/package_repo!pr_number`).<br>3. The package submodule in the ProjectGit PR points to the PackageGit PR's commit. | High |
| **TC-SYNC-002** | **Update ProjectGit PR from PackageGit PR** | 1. Push a new commit to an existing PackageGit PR. | 1. The corresponding ProjectGit PR's head branch is updated with the new commit. | High |
| **TC-SYNC-003** | **WIP Flag Synchronization** | 1. Mark a PackageGit PR as "Work In Progress".<br>2. Remove the WIP flag from the PackageGit PR. | 1. The corresponding ProjectGit PR is also marked as "Work In Progress".<br>2. The WIP flag on the ProjectGit PR is removed. | Medium |
| **TC-SYNC-004** | **WIP Flag (multiple referenced package PRs)** | 1. Create a ProjectGit PR that references multiple PackageGit PRs.<br>2. Mark one of the PackageGit PRs as "Work In Progress".<br>3. Remove the "Work In Progress" flag from all PackageGit PRs. | 1. The ProjectGit PR is marked as "Work In Progress".<br>2. The "Work In Progress" flag is removed from the ProjectGit PR only after it has been removed from all associated PackageGit PRs. | Medium |
| **TC-SYNC-005** | **NoProjectGitPR = true, edits disabled** | 1. Set `NoProjectGitPR = true` in `workflow.config`.<br>2. Create a PackageGit PR without "Allow edits from maintainers" enabled.<br>3. Push a new commit to the PackageGit PR. | 1. No ProjectGit PR is created.<br>2. The bot adds a warning comment to the PackageGit PR explaining that it cannot update the PR. | High |
| **TC-SYNC-006** | **NoProjectGitPR = true, edits enabled** | 1. Set `NoProjectGitPR = true` in `workflow.config`.<br>2. Create a PackageGit PR with "Allow edits from maintainers" enabled.<br>3. Push a new commit to the PackageGit PR. | 1. No ProjectGit PR is created.<br>2. The submodule commit on the project PR is updated with the new commit from the PackageGit PR. | High |
| **TC-COMMENT-001** | **Detect duplicate comments** | 1. Create a PackageGit PR.<br>2. Wait for the `workflow-pr` bot to act on the PR.<br>3. Edit the body of the PR to trigger the bot a second time. | 1. The bot should not post a duplicate comment. | High |
| **TC-REVIEW-001** | **Add mandatory reviewers** | 1. Create a new PackageGit PR. | 1. All mandatory reviewers are added to both the PackageGit and ProjectGit PRs. | High |
| **TC-REVIEW-002** | **Add advisory reviewers** | 1. Create a new PackageGit PR with advisory reviewers defined in the configuration. | 1. Advisory reviewers are added to the PR, but their approval is not required for merging. | Medium |
| **TC-REVIEW-003** | **Re-add reviewers** | 1. Push a new commit to a PackageGit PR after it has been approved. | 1. The original reviewers are re-added to the PR. | Medium |
| **TC-REVIEW-004** | **Package PR created by a maintainer** | 1. Create a PackageGit PR from the account of a package maintainer. | 1. No review is requested from other package maintainers. | High |
| **TC-REVIEW-005** | **Package PR created by an external user (approve)** | 1. Create a PackageGit PR from the account of a user who is not a package maintainer.<br>2. One of the package maintainers approves the PR. | 1. All package maintainers are added as reviewers.<br>2. Once one maintainer approves the PR, the other maintainers are removed as reviewers. | High |
| **TC-REVIEW-006** | **Package PR created by an external user (reject)** | 1. Create a PackageGit PR from the account of a user who is not a package maintainer.<br>2. One of the package maintainers rejects the PR. | 1. All package maintainers are added as reviewers.<br>2. Once one maintainer rejects the PR, the other maintainers are removed as reviewers. | High |
| **TC-REVIEW-007** | **Package PR created by a maintainer with ReviewRequired=true** | 1. Set `ReviewRequired = true` in `workflow.config`.<br>2. Create a PackageGit PR from the account of a package maintainer. | 1. A review is requested from other package maintainers if available. | High |
| **TC-MERGE-001** | **Automatic Merge** | 1. Create a PackageGit PR.<br>2. Ensure all mandatory reviews are completed on both project and package PRs. | 1. The PR is automatically merged. | High |
| **TC-MERGE-002** | **ManualMergeOnly with Package Maintainer** | 1. Create a PackageGit PR with `ManualMergeOnly` set to `true`.<br>2. Ensure all mandatory reviews are completed on both project and package PRs.<br>3. Comment "merge ok" on the package PR from the account of a package maintainer for that package. | 1. The PR is merged. | High |
| **TC-MERGE-003** | **ManualMergeOnly with unauthorized user** | 1. Create a PackageGit PR with `ManualMergeOnly` set to `true`.<br>2. Ensure all mandatory reviews are completed on both project and package PRs.<br>3. Comment "merge ok" on the package PR from the account of a user who is not a maintainer for that package. | 1. The PR is not merged. | High |
| **TC-MERGE-004** | **ManualMergeOnly with multiple packages** | 1. Create a ProjectGit PR that references multiple PackageGit PRs with `ManualMergeOnly` set to `true`.<br>2. Ensure all mandatory reviews are completed on both project and package PRs.<br>3. Comment "merge ok" on each package PR from the account of a package maintainer. | 1. The PR is merged only after "merge ok" is commented on all associated PackageGit PRs. | High |
| **TC-MERGE-005** | **ManualMergeOnly with Project Maintainer** | 1. Create a PackageGit PR with `ManualMergeOnly` set to `true`.<br>2. Ensure all mandatory reviews are completed on both project and package PRs.<br>3. Comment "merge ok" on the package PR from the account of a project maintainer. | 1. The PR is merged. | High |
| **TC-MERGE-006** | **ManualMergeProject with Project Maintainer** | 1. Create a PackageGit PR with `ManualMergeProject` set to `true`.<br>2. Ensure all mandatory reviews are completed on both project and package PRs.<br>3. Comment "merge ok" on the project PR from the account of a project maintainer. | 1. The PR is merged. | High |
| **TC-MERGE-007** | **ManualMergeProject with unauthorized user** | 1. Create a PackageGit PR with `ManualMergeProject` set to `true`.<br>2. Ensure all mandatory reviews are completed on both project and package PRs.<br>3. Comment "merge ok" on the project PR from the account of a package maintainer. | 1. The PR is not merged. | High |
| **TC-CONFIG-001** | **Invalid Configuration** | 1. Provide an invalid `workflow.config` file. | 1. The bot reports an error and does not process any PRs. | High |
|
||||||
|
| **TC-LABEL-001** | **Apply `staging/Auto` label** | 1. Create a new PackageGit PR. | 1. The `staging/Auto` label is applied to the ProjectGit PR. | High |
|
||||||
|
| **TC-LABEL-002** | **Apply `review/Pending` label** | 1. Create a new PackageGit PR. | 1. The `review/Pending` label is applied to the ProjectGit PR when there are pending reviews. | Medium |
|
||||||
|
| **TC-LABEL-003** | **Apply `review/Done` label** | 1. Ensure all mandatory reviews for a PR are completed. | 1. The `review/Done` label is applied to the ProjectGit PR when all mandatory reviews are completed. | Medium |
|
||||||
|
|
||||||
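The test cases above exercise three `workflow.config` switches: `ReviewRequired`, `ManualMergeOnly`, and `ManualMergeProject`. As an illustration only — the actual file format and any fields beyond these three are not shown in this document, and JSON is assumed here — a minimal configuration toggling them might look like:

```json
{
  "ReviewRequired": true,
  "ManualMergeOnly": false,
  "ManualMergeProject": false
}
```

TC-CONFIG-001 expects that a file that fails to parse against whatever schema the bot uses causes it to report an error and stop processing PRs entirely, rather than falling back to defaults.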
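The merge test cases (TC-MERGE-001 through TC-MERGE-007) together imply a gating decision: reviews must be complete on both sides, and the "merge ok" comment must come from an authorized role that depends on which mode is enabled. The sketch below is a hypothetical model of that decision for cross-checking expected results — the function, data shapes, and field names are illustrative and are not the bot's actual API:

```python
# Hypothetical model of the merge gate exercised by TC-MERGE-001..007.
# `config` holds the workflow.config flags; `pr` summarizes review state
# and any "merge ok" comments collected by the bot.

def may_merge(config: dict, pr: dict) -> bool:
    # All mandatory reviews must be done on both project and package PRs
    # before any mode allows a merge.
    if not (pr["project_reviews_done"] and pr["package_reviews_done"]):
        return False

    if config.get("ManualMergeProject"):
        # "merge ok" must appear on the *project* PR and come from a
        # project maintainer (TC-MERGE-006); a package maintainer's
        # comment is not sufficient (TC-MERGE-007).
        return any(c["author_is_project_maintainer"]
                   for c in pr["project_pr_merge_ok_comments"])

    if config.get("ManualMergeOnly"):
        # Every associated package PR needs a "merge ok" from that
        # package's maintainer or a project maintainer
        # (TC-MERGE-002, -004, -005); other users don't count (TC-MERGE-003).
        return all(
            any(c["author_is_package_maintainer"]
                or c["author_is_project_maintainer"]
                for c in comments)
            for comments in pr["package_pr_merge_ok_comments"].values()
        )

    # Default mode: merge automatically once reviews are complete
    # (TC-MERGE-001).
    return True
```

Note the `all(...)` over package PRs: with multiple packages (TC-MERGE-004), a single missing "merge ok" holds back the whole ProjectGit PR.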