forked from adamm/autogits

250 Commits

Author SHA256 Message Date
fa61af0db6 Implement detection for local repositories
Repositories which build against another repo in the same project need
to do so also in the forked project. This is needed e.g. for consuming RPMs
from one repo in an image build from the same project.
2025-05-05 11:26:07 +02:00
23e2566843 Fix git path compare of meta to pull request
The .git suffix is optional and doesn't matter, so trim it away
2025-05-05 10:54:31 +02:00
0d0fcef7ac staging: fixes 2025-05-04 20:45:33 +02:00
62a597718b fix parsing test 2025-05-03 14:34:33 +02:00
327cb4ceaf fixes if git cat-file has error 2025-05-02 22:46:31 +02:00
aac475ad16 wip 2025-05-02 16:57:13 +02:00
046a60a6ed move staging config to its own config file 2025-05-02 11:18:23 +02:00
dcf964bf7a wip 2025-04-30 17:26:31 +02:00
bff5f1cab7 common: handle case of missing remote
If the repo is present but the remote is not set up, just set it up
2025-04-30 12:29:32 +02:00
6d1ef184e0 workflow-pr: logging 2025-04-29 19:08:37 +02:00
e30d366f2f workflow-pr: logging updates 2025-04-29 18:00:37 +02:00
4a2fe06f05 staging: refactor 2025-04-28 23:47:05 +02:00
210e7588f1 common: actually remove items we process 2025-04-28 22:05:50 +02:00
72b100124d staging: list notification correctly in logging 2025-04-28 19:51:32 +02:00
996d36aaa8 staging: more refactor 2025-04-28 19:47:05 +02:00
82b5b105b1 staging: refactor 2025-04-28 19:44:32 +02:00
248ec4d03c staging: get last results for reference project 2025-04-28 17:34:54 +02:00
faa21f5453 staging: logging adjustments 2025-04-28 16:57:05 +02:00
21c4a7c1e0 wip 2025-04-28 14:23:59 +02:00
f3f76e7d5b Merge commit '96e1c26600f02a81299d4c121a2239c2a28e3184ef306cb0ac2cf00f0f97202e' into refactor 2025-04-28 12:37:38 +02:00
e341b630a2 wip 2025-04-27 22:53:19 +02:00
58532b9b60 wip 2025-04-25 17:40:44 +02:00
a697ccd0ca sync 2025-04-25 16:55:24 +02:00
4bafe0b4ef Merge branch 'refactor' of c3:gitea_test/autogits into refactor 2025-04-25 16:55:09 +02:00
7af2092ae1 wip 2025-04-24 23:51:46 +02:00
32374f76c1 status 2025-04-23 17:51:59 +02:00
9403b563f6 wip 2025-04-22 23:42:41 +02:00
bd492f8d92 no branch if default 2025-04-17 18:40:20 +02:00
fbc84d551d workflow-direct: use correct remote name instead of origin 2025-04-17 18:21:23 +02:00
874a120f88 we are using master for project git .. this may change 2025-04-17 17:58:18 +02:00
199396c210 Use HuJSON
for the comments and glory
2025-04-17 15:33:18 +02:00
f0de3ad54a workflow-direct: no panic if no changes 2025-04-17 15:12:51 +02:00
bfeac63c57 update repo parsing 2025-04-17 13:34:11 +02:00
d65f37739c fixes 2025-04-17 00:38:53 +02:00
5895e3d02c workflow-direct: add no-op mode, for debugging 2025-04-16 23:49:31 +02:00
0e036b5ec6 workflow-direct: move away from prjgit repo being just repo 2025-04-16 18:07:37 +02:00
1d1602852c direct: GitClone instead of running clone directly 2025-04-15 23:38:41 +02:00
9b5013ee45 git clone lock fixes 2025-04-15 18:15:35 +02:00
ed815c3ad1 unique org_repo remote names 2025-04-15 14:55:19 +02:00
8645063e8d git utils 2025-04-15 13:51:08 +02:00
2d044d5664 git: one generator per app, multiple instances
this allows locking for access for each org
2025-04-14 18:33:18 +02:00
51ba81f257 Merge branch 'main' into refactor 2025-04-11 14:00:00 +02:00
bb7a247f66 common: commit status api 2025-04-11 13:58:20 +02:00
c1f71253a4 devel_update: dead code removal 2025-04-11 13:57:56 +02:00
96e1c26600 Fix crash when review.User is nil 2025-04-10 15:29:02 +02:00
9d9964df11 Build in :PR: sub project as wanted for SLFO
Instead of doing it in a home project, which won't scale.
2025-04-10 13:47:58 +02:00
e257b113b9 devel-importer: helpful scripts 2025-04-09 18:48:30 +02:00
11e0bbaed1 devel-importer: remove remote branches 2025-04-09 16:47:19 +02:00
fb430d8c76 group-review: don't use regex for matching group name 2025-04-09 12:21:55 +02:00
7ed2a7082d Fix notification parsing regex 2025-04-09 11:43:54 +02:00
ba7686189e add GitClone for persistent git clones 2025-04-09 00:03:22 +02:00
9dcd25b69a wip 2025-04-08 19:03:33 +02:00
881fad36a0 Initial support of QA subproject setup
- Nothing handed over to external scripts yet
- File format in _obs_staging not agreed yet (YAML!)
- No build monitoring
2025-04-08 17:02:03 +02:00
29906e22d2 Complete project meta
description is not optional

define releasetarget element
2025-04-08 17:02:03 +02:00
d89c77e22d common: use hostname:port instead of just hostname for API calls 2025-04-08 16:48:25 +02:00
f91c61cd20 tests 2025-04-08 00:23:24 +02:00
06aef50047 start refactoring PR bot 2025-04-07 19:03:02 +02:00
52a5cdea94 group-review: fix typo 2025-04-07 14:24:48 +02:00
d3f1b36676 Use "-gitea-url" instead of "-gitea-host" or similar
This allows using a scheme other than https:// to connect to Gitea
2025-04-07 14:20:26 +02:00
5ea5f05b02 common: reviews fix 2025-04-07 09:47:07 +02:00
5877081280 whitespace 2025-04-06 17:32:16 +02:00
c4ce974ddf group-review: fixes 2025-04-05 23:45:40 +02:00
65c718e73b group-review: move config 2025-04-04 18:07:57 +02:00
a8e6c175c0 remove obsolete per-executable go.mod 2025-04-04 13:56:52 +02:00
044416cd2a Merge branch 'main' of c3:gitea_test/autogits 2025-04-04 13:55:54 +02:00
009cc88d54 Merge remote-tracking branch 'gitea/main' 2025-04-04 13:06:28 +02:00
da1f4f4fa0 fix mocks 2025-04-04 13:05:51 +02:00
cfad21e1a3 Set review state only after the end of the build
Instead, use normal comments to inform users of the build project,
or in case the source of the pull request has changed
and the build project has been updated.
2025-04-04 10:09:00 +02:00
5eb54d40e0 Define "unknown" build state 2025-04-04 10:09:00 +02:00
80ff036acb group-review: rerequest reviews missing group review
If a user is a member of the group but doesn't review correctly, request
their review again.
2025-04-04 00:17:55 +02:00
Jan Zerebecki
2ed4f0d05f Build all modules in OBS directly from this repo
Build each go module in a subpackage.
2025-04-03 22:40:04 +02:00
Jan Zerebecki
23ed9b830d Merge all go.mod into a top level one 2025-04-03 22:38:31 +02:00
Jan Zerebecki
4604aaeeba Rename bots-common to common
to make it match the name it is imported as
2025-04-03 22:38:28 +02:00
Jan Zerebecki
2dfe973c51 Generate group-review/go.sum
it wasn't committed before
2025-04-02 18:47:05 +02:00
b7625cd4c4 Fix cloning for src.suse.de instance 2025-04-02 14:15:06 +02:00
12e7a071d9 whitespace only changes 2025-04-02 11:39:52 +02:00
6409741a12 Initial support for SSH based authentication
Moved all HTTP codes to the ObsRequest method.

Make use of the authorization cookie, but only store it in memory.
Should be fine for a constantly running bot to do the authorization once
on startup.
2025-04-02 11:14:02 +02:00
78eb9f11e5 Extend regexp so matched orgs and projects may include - and _ chars 2025-04-01 09:40:39 +02:00
c28f28e852 Merge branch 'main' of c3:gitea_test/autogits 2025-03-28 16:34:10 +01:00
72270c57ed fixes to importer 2025-03-28 16:33:59 +01:00
5d6dc75400 fix maintainership writer 2025-03-28 16:09:36 +01:00
20b02d903c pr: move config to project 2025-03-26 23:20:26 +01:00
58dc4927c2 Flexible OBS api and www endpoints
Allow the endpoints to be configurable
2025-03-25 12:44:17 +01:00
ce48cbee72 Add ability to set build location 2025-03-25 11:52:15 +01:00
3bd179bee1 wip 2025-03-24 00:23:11 +01:00
940e5be2c1 Migrate to prjgit based config
The config now only has a reference to the org or prjgit, and the rest
is defined in the "workflow.config" in the prjgit itself. This
allows the config to be updated in the project.
2025-03-23 16:33:06 +01:00
4a4113aad7 wip 2025-03-21 16:39:50 +01:00
3ee939db1d logging 2025-03-18 17:14:19 +01:00
00f4e11f02 fixes when scmsync for packages is already there 2025-03-18 16:55:30 +01:00
635bdd0f50 Migrate PR related code to common area 2025-03-18 13:08:49 +01:00
82f7a186a9 Only cherry-pick non-merges 2025-03-17 17:58:51 +01:00
030fa43404 Bug fixes in importer 2025-03-17 17:34:11 +01:00
2ad9f6c179 push changes if link changed 2025-03-16 20:29:42 +01:00
80952913c9 prjgit based config 2025-03-16 20:29:15 +01:00
1ce38c9de2 wip links 2025-03-13 18:44:38 +01:00
bbb721c6fa wip 2025-03-13 08:46:21 +01:00
a50d238715 wip 2025-03-12 23:13:13 +01:00
463a6e3236 wip 2025-03-12 19:13:22 +01:00
91ecf88a38 fixes for updates 2025-03-12 14:37:38 +01:00
4f9a99d232 link 2025-03-12 00:28:55 +01:00
02d3a2e159 fixes, etc. 2025-03-11 19:46:40 +01:00
03370871c4 do not clone if already created 2025-03-11 16:04:01 +01:00
1f4e1ac35e update build project if needed when build pending 2025-03-10 13:44:59 +01:00
debbee17eb wip 2025-03-07 17:40:59 +01:00
c63a56bc4e consistent usage parameters 2025-03-07 16:12:48 +01:00
568346ce3d wip 2025-03-04 22:38:17 +01:00
a010618764 do not demand credentials to print help 2025-03-04 18:37:08 +01:00
a80e04065f list maintainers correctly 2025-03-04 18:36:41 +01:00
72c2967d1f fixes 2025-03-03 15:03:43 +01:00
1cacb914b4 maintainer set finished 2025-03-01 00:26:08 +01:00
517ecbb68a wip 2025-02-28 17:36:35 +01:00
e1313105d1 remove rpm and re-org user queries 2025-02-26 18:10:42 +01:00
5b84f9d5ce maintainer info 2025-02-24 18:55:37 +01:00
7c254047a8 devel-importer: no OBS req header print in debug mode 2025-02-24 13:50:24 +01:00
fffdce2c58 better recovery in unexpected situations 2025-02-24 13:11:54 +01:00
4014747712 always skip LFS 2025-02-24 13:11:38 +01:00
b49df4845a add maintainership fetching 2025-02-24 00:12:28 +01:00
9bac2e924c devel-importer update 2025-02-21 17:18:37 +01:00
ee6d704e1e refactor 2025-02-20 19:25:36 +01:00
bfeaed40d5 remove legacy 2025-02-20 19:17:23 +01:00
4dd864c7b6 move maintainership to common/ 2025-02-20 12:20:14 +01:00
205741dde1 typos 2025-02-19 11:46:25 +01:00
a5acc1e34e yolo 2025-02-19 10:51:49 +01:00
fc2dbab782 wip 2025-02-19 00:06:54 +01:00
9236fa3ff1 check stale reviews 2025-02-18 18:14:17 +01:00
334fe5553e tests 2025-02-18 17:52:36 +01:00
9418b33c6c fix 2025-02-18 17:42:55 +01:00
7a8c84d1a6 check status 2025-02-18 12:40:32 +01:00
367d606870 wip 2025-02-17 17:49:02 +01:00
682397975f wip 2025-02-16 22:40:10 +01:00
b4a1c5dc01 docs 2025-02-15 13:30:58 +01:00
1c38c2105b wip 2025-02-14 17:13:51 +01:00
072d7b4825 workflow-direct: ignore non-sha1 repos 2025-02-13 16:42:59 +01:00
9ecda0c58b obs-staging-bot: log polling cycles 2025-02-11 16:26:29 +01:00
8c2cc51a3c obs-staging-bot: closed requests should no longer need review 2025-02-11 16:22:00 +01:00
2f38e559d1 fix obs staging bot 2025-02-10 15:16:48 +01:00
61d9359ce3 remove init() 2025-02-10 14:11:14 +01:00
d46ca05346 Fix against new git interfaces 2025-02-10 13:50:25 +01:00
a84d55c858 rename interfaces 2025-02-06 19:18:09 +01:00
2cd7307291 remove debug code 2025-02-06 18:56:02 +01:00
efde2fad69 reviewers fix in tests 2025-02-06 17:17:06 +01:00
e537e5d00c wip 2025-02-05 18:30:08 +01:00
adffc67ca0 typos 2025-02-05 14:44:38 +01:00
f0b184f4c3 move to reviewers 2025-02-05 14:44:22 +01:00
656a3bacdf add reviewer parsing 2025-02-05 14:43:38 +01:00
c0c467d72b merge 2025-02-04 17:44:49 +01:00
dbee0e8bd3 submodules 2025-02-04 14:24:38 +01:00
c7723fce2d wip 2025-02-03 18:15:01 +01:00
12a641c86f wip 2025-02-02 21:07:51 +01:00
73e817d408 wip 2025-01-31 17:39:46 +01:00
6aa53bdf25 git.status with rename support 2025-01-30 12:45:56 +01:00
d5dbb37e18 git.status 2025-01-29 17:29:09 +01:00
5108019db0 wip 2025-01-28 10:52:54 +01:00
6fc0607823 wip 2025-01-27 17:43:50 +01:00
c1df08dc59 {wip} 2025-01-21 17:20:00 +01:00
92747f0913 {wip} 2025-01-21 17:19:18 +01:00
f77e35731c workflow-direct: fix building 2025-01-21 16:24:50 +01:00
b9e70132ae workflow-pr: print errors from check handlers 2025-01-17 14:46:53 +01:00
245181ad28 workflow-pr: don't recreate branches on every check
Check if the branch exists and if it matches the already-created PR.
If yes, then nothing needs to be updated.
2025-01-16 16:36:53 +01:00
fbaeddfcd8 add support for maintainership directories 2025-01-15 00:46:03 +01:00
e63a450c5d add review checks 2025-01-11 21:37:59 +01:00
8ab35475fc review stuff 2025-01-03 00:46:40 +01:00
69776dc5dc Add ReviewSet.ConsistentSet() check 2025-01-02 14:44:31 +01:00
cfe15a0551 Add ReviewSet.ConsistentSet() check 2025-01-02 13:46:59 +01:00
888582a74a wip 2024-12-18 17:30:00 +01:00
72d5f64f90 wip 2024-12-17 23:33:43 +01:00
fe2a577b3b wip 2024-12-17 18:19:04 +01:00
ac6fb96534 wip 2024-12-16 18:12:54 +01:00
f6bd0c10c0 . 2024-12-16 15:50:33 +01:00
50aab4c662 wip 2024-12-16 08:15:49 +01:00
8c6180a8cf . 2024-12-15 13:00:20 +01:00
044241c71e . 2024-12-13 15:28:38 +01:00
e057cdf0d3 workflow-pr: review processing 2024-12-12 19:16:32 +01:00
7ccbd1deb2 wip 2024-12-11 14:41:51 +01:00
68ba45ca9c wip 2024-12-11 01:12:59 +01:00
a7d81d6013 wip 2024-12-10 19:03:54 +01:00
5f00b10f35 wip 2024-12-09 18:20:56 +01:00
7433ac1d3a wip 2024-12-09 00:39:55 +01:00
db766bacc3 commit: PR desc parser and writer 2024-12-07 14:35:34 +01:00
77751ecc46 workflow-pr: wip 2024-12-05 18:38:35 +01:00
a025328fef wip 2024-12-04 08:55:40 +01:00
0c866e8f89 workflow-pr: wip 2024-12-02 10:26:51 +01:00
2d12899da5 workflow-pr: renamed files 2024-12-01 11:36:26 +01:00
f4462190c9 workflow-pr: maintainership API change 2024-11-29 17:33:01 +01:00
7342dc42e9 workflow-pr: refactor 2024-11-29 12:49:11 +01:00
60c0a118c9 workflow-pr: maintainership function signature change 2024-11-28 18:16:14 +01:00
cf101ef3f0 workflow-pr: maintainership doc update
Sync with https://github.com/openSUSE/openSUSE-git/issues/55

Update README.md with syntax
2024-11-28 17:25:32 +01:00
0331346025 workflow-pr: maintainership parsing 2024-11-28 17:10:26 +01:00
21f7da2257 workflow-pr: maintainership code 2024-11-28 00:15:37 +01:00
2916ec8da5 workflow-pr: tests 2024-11-27 17:50:55 +01:00
2bc9830a7a workflow-pr: tests 2024-11-27 16:13:37 +01:00
f281986c8f workflow-pr: tests 2024-11-26 17:21:17 +01:00
e56f444960 workflow-pr: more unit tests 2024-11-25 17:02:48 +01:00
b96c4d26ca workflow-pr: small refactor 2024-11-14 18:02:11 +01:00
2949e23b11 workflow-pr: test fixes 2024-11-13 16:21:51 +01:00
1d7d0a7b43 workflow-pr: more tests 2024-11-13 16:18:27 +01:00
e8e51e21ca more tests 2024-11-12 14:26:36 +01:00
dc96392b40 pr: more unit tests 2024-11-11 15:52:34 +01:00
c757b50c65 wip 2024-11-10 23:19:23 +01:00
0a7978569e refactor 2024-11-08 16:36:05 +01:00
463e3e198b refactor 2024-11-08 15:05:09 +01:00
8bedcc5195 wip 2024-11-07 18:25:35 +01:00
0d9451e92c {wip} - unit tests
`git submodule status` will display the current state, which will be
overwritten by the submodule checkout. The solution is to `git submodule deinit`
before looking at the submodule status.
2024-11-04 15:13:22 +01:00
a230c2aa52 {wip} tests 2024-11-03 22:21:57 +01:00
0f6cb392d6 wip tests 2024-10-30 16:55:51 +01:00
48a889b353 obs-staging-bot: cleanup modules 2024-10-29 15:37:57 +01:00
a672bb85fb wip 2024-10-29 15:36:20 +01:00
6ecc4ecb3a wip 2024-10-01 17:21:28 +02:00
881cba862f common: refactor IsReviewed() 2024-10-01 12:18:37 +02:00
77bdf7649a common: parse all PRs for associated PrjGit PR
we may have more than one page of PRs to parse
2024-10-01 12:03:50 +02:00
a0a79dcf4d pr: add reviewers to PR workflow 2024-09-30 16:19:40 +02:00
3d7336a3a0 doc formatting 2024-09-30 15:37:43 +02:00
bbdd9eb0be docs 2024-09-30 15:36:54 +02:00
c48ff699f4 docs 2024-09-30 15:35:46 +02:00
27014958be docs 2024-09-30 15:28:25 +02:00
5027e98c04 pr: processing checks 2024-09-30 15:09:45 +02:00
e2498afc4d pr: branch forwarding on check 2024-09-30 10:25:08 +02:00
7234811edc pr: wip 2024-09-29 23:11:51 +02:00
4692cfbe6f workflow-pr: app name 2024-09-29 15:32:32 +02:00
464e807747 pr: WIP 2024-09-27 17:58:09 +02:00
76f2ae8aec common: find pull requests by source and target 2024-09-27 17:55:49 +02:00
a0b65ea8f4 common: print stacktrace when recovering from panic 2024-09-27 17:54:31 +02:00
5de077610c direct: use CloneURL instead of SSH
Make sure that we use the public CloneURL instead of SSH for submodules.
OBS doesn't fetch submodules with the SSH scheme
2024-09-25 16:33:07 +02:00
d7bbe5695c devel-importer: more fixes 2024-09-19 19:00:56 +02:00
86df1921e0 devel-importer: handle history rewrite
Imports can have history rewritten because email addresses
can change in OBS and are not recorded as they are in git commits. This
can be handled by comparing Tree objects and rebasing
new changes on top.
2024-09-18 17:17:24 +02:00
9de8cf698f . 2024-09-18 13:52:50 +02:00
c955811373 devel-importer: handle cases where the remote or factory branch is not main 2024-09-18 13:51:52 +02:00
530318a35b add missing file 2024-09-18 12:23:32 +02:00
798f96e364 devel-importer: adapt for scmsync packages 2024-09-18 11:47:42 +02:00
11bf2aafcd remove temp dir created if using another 2024-09-17 15:16:04 +02:00
3a2c590158 Rename variables 2024-09-17 15:06:29 +02:00
a47d217ab3 [devel-importer] configurable import location 2024-09-17 10:56:31 +02:00
6b40bf7bbf devel-importer: migrate logging to log module
also remove webhook stuff, since we use RabbitMQ now
2024-09-16 16:05:46 +02:00
d36c0c407f devel-importer: use common.GitExec() 2024-09-16 13:10:25 +02:00
e71e6f04e8 rename things 2024-09-13 14:58:10 +02:00
7e663964ee Compiles, ship it!
ported pr-review to rabbitmq
2024-09-13 13:41:40 +02:00
7940a8cc86 refactor 2024-09-12 17:57:09 +02:00
edab8aa9dd fix branch handling in repo check 2024-09-12 17:40:51 +02:00
a552f751f0 refactor 2024-09-12 16:40:43 +02:00
b7ec9a9ffb Handle case when branch doesn't exist
Handle default branch name in push and branch create handlers
Don't panic in this case when the project has multiple configs
2024-09-12 16:25:22 +02:00
b0b39726b8 Do not remove git work directory in debug mode 2024-09-12 14:34:05 +02:00
06228c58f3 remove duplicate RabbitMQ listen topics 2024-09-12 14:30:19 +02:00
d828467d25 fixes 2024-09-12 14:15:59 +02:00
dd316e20b7 direct: change to src as app, instead of gitea 2024-09-11 19:51:49 +02:00
937664dfba enable debug-only rabbitmq webhook handler 2024-09-11 18:56:48 +02:00
4c8eae5e7c simplify error handling 2024-09-11 18:50:49 +02:00
c61d648294 auth token check 2024-09-11 17:51:56 +02:00
630803246c move After= to correct section 2024-09-11 16:32:26 +02:00
69b9e41129 spec file changes 2024-09-11 16:15:38 +02:00
f8ad932e33 spec file fix 2024-09-11 14:08:36 +02:00
1137 changed files with 10989 additions and 4972 deletions

.gitattributes vendored Normal file (+21 lines)

@@ -0,0 +1,21 @@
*.7z filter=lfs diff=lfs merge=lfs -text
*.bsp filter=lfs diff=lfs merge=lfs -text
*.bz2 filter=lfs diff=lfs merge=lfs -text
*.gem filter=lfs diff=lfs merge=lfs -text
*.gz filter=lfs diff=lfs merge=lfs -text
*.jar filter=lfs diff=lfs merge=lfs -text
*.lz filter=lfs diff=lfs merge=lfs -text
*.lzma filter=lfs diff=lfs merge=lfs -text
*.oxt filter=lfs diff=lfs merge=lfs -text
*.pdf filter=lfs diff=lfs merge=lfs -text
*.png filter=lfs diff=lfs merge=lfs -text
*.rpm filter=lfs diff=lfs merge=lfs -text
*.tbz filter=lfs diff=lfs merge=lfs -text
*.tbz2 filter=lfs diff=lfs merge=lfs -text
*.tgz filter=lfs diff=lfs merge=lfs -text
*.ttf filter=lfs diff=lfs merge=lfs -text
*.txz filter=lfs diff=lfs merge=lfs -text
*.whl filter=lfs diff=lfs merge=lfs -text
*.zip filter=lfs diff=lfs merge=lfs -text
*.zst filter=lfs diff=lfs merge=lfs -text
*.changes merge=merge-changes

.gitignore vendored Normal file (+5 lines)

@@ -0,0 +1,5 @@
mock
node_modules
*.obscpio
autogits-tmp.tar.zst
*.osc


@@ -5,18 +5,16 @@ The bots that drive Git Workflow for package management
* devel-importer -- helper to import an OBS devel project into a Gitea organization
* gitea-events-rabbitmq-publisher -- takes all events from a Gitea organization (webhook) and publishes them on a RabbitMQ instance
* maintainer-and-policy-bot -- review bot to make sure a maintainer signed off on reviews, along with other necessary entities
* obs-staging-bot -- build bot for a PR
* obs-status-service -- report build status of an OBS project as an SVG
* pr-review -- keeps PR to _ObsPrj consistent with a PR to a package update
* prjgit-updater -- update _ObsPrj based on direct pushes and repo creations/removals from organization
* workflow-pr -- keeps PR to _ObsPrj consistent with a PR to a package update
* workflow-direct -- update _ObsPrj based on direct pushes and repo creations/removals from organization
* staging-utils -- review tooling for PR
- list PR
- merge PR
- split PR
- diff PR
- accept/reject PR
* random -- random utils and tools
Bugs
----

_service Normal file (+15 lines)

@@ -0,0 +1,15 @@
<services>
<!-- workaround, go_modules needs a tar and obs_scm doesn't take file://. -->
<service name="roast" mode="manual">
<param name="target">.</param>
<param name="reproducible">true</param>
<param name="outfile">autogits-tmp.tar.zst</param>
<param name="exclude">autogits-tmp.tar.zst</param>
</service>
<service name="go_modules" mode="manual">
<param name="basename">./</param>
<param name="compression">zst</param>
<param name="vendorname">vendor</param>
</service>
</services>

autogits.changes Normal file (+10 lines)

@@ -0,0 +1,10 @@
-------------------------------------------------------------------
Wed Sep 11 16:00:58 UTC 2024 - Adam Majer <adam.majer@suse.de>
- enable Authorization bearer token checks
-------------------------------------------------------------------
Wed Sep 11 14:10:18 UTC 2024 - Adam Majer <adam.majer@suse.de>
- rabbitmq publisher


@@ -17,12 +17,11 @@
Name: autogits
Version: 0.0.1
Version: 0
Release: 0
Summary: GitWorkflow utilities
License: GPL-2.0-or-later
URL: https://src.opensuse.org/adamm/autogits/
Source: https://src.opensuse.org/adamm/autogits/0.0.1.tar.gz
URL: https://src.opensuse.org/adamm/autogits
Source1: vendor.tar.zst
BuildRequires: golang-packaging
BuildRequires: systemd-rpm-macros
@@ -33,6 +32,7 @@ BuildRequires: zstd
Git Workflow tooling and utilities enabling automated handling of OBS projects
as git repositories
%package -n gitea-events-rabbitmq-publisher
Summary: Publishes Gitea webhook data via RabbitMQ
@@ -41,18 +41,113 @@ Listens on an HTTP socket and publishes Gitea events on a RabbitMQ instance
with a topic
<scope>.src.$organization.$webhook_type.[$webhook_action_type]
%package -n doc
Summary: Common documentation files
%description -n doc
Common documentation files
%package -n devel-importer
Summary: Imports devel projects from obs to git
%description -n devel-importer
Command-line tool to import devel projects from obs to git
%package -n group-review
Summary: Reviews of groups defined in ProjectGit
%description -n group-review
Handles reviews associated with groups defined in the
ProjectGit.
%package -n obs-staging-bot
Summary: Build a PR against a ProjectGit, if review is requested
%description -n obs-staging-bot
Build a PR against a ProjectGit, if review is requested.
%package -n obs-status-service
Summary: Reports build status of OBS service as an easy-to-produce SVG
%description -n obs-status-service
Reports build status of OBS service as an easy-to-produce SVG
%package -n workflow-direct
Summary: Keep ProjectGit in sync for a devel project
%description -n workflow-direct
Keep ProjectGit in sync with packages in the organization of a devel project
%package -n workflow-pr
Summary: Keeps ProjectGit PR in-sync with a PackageGit PR
%description -n workflow-pr
Keeps ProjectGit PR in-sync with a PackageGit PR
%prep
%autosetup -p1
cp -r /home/abuild/rpmbuild/SOURCES/* ./
tar x --zstd -f %{SOURCE1}
%build
go build \
-C gitea-events-rabbitmq-publisher \
-mod=vendor \
-buildmode=pie
go build \
-C devel-importer \
-mod=vendor \
-buildmode=pie
go build \
-C group-review \
-mod=vendor \
-buildmode=pie
go build \
-C obs-staging-bot \
-mod=vendor \
-buildmode=pie
go build \
-C obs-status-service \
-mod=vendor \
-buildmode=pie
#go build \
# -C workflow-direct \
# -mod=vendor \
# -buildmode=pie
#go build \
# -C workflow-pr \
# -mod=vendor \
# -buildmode=pie
%install
install -D -m0755 gitea-events-rabbitmq-publisher/gitea-events-rabbitmq-publisher %{buildroot}%{_bindir}
install -D -m0755 systemd/gitea-events-rabbitmq-publisher.service %{buildroot}%{_unitdir}
install -D -m0755 gitea-events-rabbitmq-publisher/gitea-events-rabbitmq-publisher %{buildroot}%{_bindir}/gitea-events-rabbitmq-publisher
install -D -m0644 systemd/gitea-events-rabbitmq-publisher.service %{buildroot}%{_unitdir}/gitea-events-rabbitmq-publisher.service
install -D -m0755 devel-importer/devel-importer %{buildroot}%{_bindir}/devel-importer
install -D -m0755 group-review/group-review %{buildroot}%{_bindir}/group-review
install -D -m0755 obs-staging-bot/obs-staging-bot %{buildroot}%{_bindir}/obs-staging-bot
install -D -m0755 obs-status-service/obs-status-service %{buildroot}%{_bindir}/obs-status-service
#install -D -m0755 workflow-direct/workflow-direct %{buildroot}%{_bindir}/workflow-direct
#install -D -m0755 workflow-pr/workflow-pr %{buildroot}%{_bindir}/workflow-pr
%pre -n gitea-events-rabbitmq-publisher
%service_add_pre gitea-events-rabbitmq-publisher.service
%post -n gitea-events-rabbitmq-publisher
%service_add_post gitea-events-rabbitmq-publisher.service
%preun -n gitea-events-rabbitmq-publisher
%service_del_preun gitea-events-rabbitmq-publisher.service
%postun -n gitea-events-rabbitmq-publisher
%service_del_postun gitea-events-rabbitmq-publisher.service
%files -n gitea-events-rabbitmq-publisher
%license COPYING
@@ -60,5 +155,38 @@ install -D -m0755 systemd/gitea-events-rabbitmq-publisher.service %{buildroot}%{
%{_bindir}/gitea-events-rabbitmq-publisher
%{_unitdir}/gitea-events-rabbitmq-publisher.service
%changelog
%files -n doc
%license COPYING
%doc doc/README.md
%doc doc/workflows.md
%files -n devel-importer
%license COPYING
%doc devel-importer/README.md
%{_bindir}/devel-importer
%files -n group-review
%license COPYING
%doc group-review/README.md
%{_bindir}/group-review
%files -n obs-staging-bot
%license COPYING
%doc obs-staging-bot/README.md
%{_bindir}/obs-staging-bot
%files -n obs-status-service
%license COPYING
%doc obs-status-service/README.md
%{_bindir}/obs-status-service
%files -n workflow-direct
%license COPYING
%doc workflow-direct/README.md
#%{_bindir}/workflow-direct
%files -n workflow-pr
%license COPYING
%doc workflow-pr/README.md
#%{_bindir}/workflow-pr


@@ -1,74 +0,0 @@
package common
/*
* This file is part of Autogits.
*
* Copyright © 2024 SUSE LLC
*
* Autogits is free software: you can redistribute it and/or modify it under
* the terms of the GNU General Public License as published by the Free Software
* Foundation, either version 2 of the License, or (at your option) any later
* version.
*
* Autogits is distributed in the hope that it will be useful, but WITHOUT ANY
* WARRANTY; without even the implied warranty of MERCHANTABILITY or FITNESS FOR A
* PARTICULAR PURPOSE. See the GNU General Public License for more details.
*
* You should have received a copy of the GNU General Public License along with
* Foobar. If not, see <https://www.gnu.org/licenses/>.
*/
import (
"encoding/json"
"fmt"
"io"
"os"
"slices"
"strings"
)
type AutogitConfig struct {
Workflows []string // [pr, direct, test]
Organization string
GitProjectName string // Organization/GitProjectName.git is PrjGit
Branch string // branch name of PkgGit that aligns with PrjGit submodules
}
func ReadWorkflowConfigs(reader io.Reader) ([]*AutogitConfig, error) {
data, err := io.ReadAll(reader)
if err != nil {
return nil, fmt.Errorf("Error reading config file. err: %w", err)
}
var config []*AutogitConfig
if err = json.Unmarshal(data, &config); err != nil {
return nil, fmt.Errorf("Error parsing config file. err: %w", err)
}
availableWorkflows := []string{"pr", "direct", "test"}
for _, workflow := range config {
for _, w := range workflow.Workflows {
if !slices.Contains(availableWorkflows, w) {
return nil, fmt.Errorf(
"Invalid Workflow '%s'. Only available workflows are: %s",
w, strings.Join(availableWorkflows, " "),
)
}
}
if len(workflow.GitProjectName) == 0 {
workflow.GitProjectName = DefaultGitPrj
}
}
return config, nil
}
func ReadWorkflowConfigsFile(filename string) ([]*AutogitConfig, error) {
file, err := os.Open(filename)
if err != nil {
return nil, fmt.Errorf("Cannot open config file for reading. err: %w", err)
}
defer file.Close()
return ReadWorkflowConfigs(file)
}
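For illustration only (not part of this diff): a minimal sketch of how the config reader above could be used. The organization, branch and project names are made-up placeholders, and the config is given as plain JSON matching the AutogitConfig fields.

package main

import (
	"fmt"
	"log"
	"strings"

	"src.opensuse.org/autogits/common"
)

func main() {
	// Hypothetical workflow config; all values are placeholders, not taken from the diff.
	cfg := `[
	  {
	    "Workflows": ["pr", "direct"],
	    "Organization": "example-devel",
	    "GitProjectName": "_ObsPrj",
	    "Branch": "factory"
	  }
	]`

	// ReadWorkflowConfigs validates the workflow names and fills in the
	// default GitProjectName when it is empty.
	configs, err := common.ReadWorkflowConfigs(strings.NewReader(cfg))
	if err != nil {
		log.Fatal(err)
	}
	for _, c := range configs {
		fmt.Println(c.Organization, c.Workflows, c.GitProjectName, c.Branch)
	}
}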


@@ -1,705 +0,0 @@
package common
/*
* This file is part of Autogits.
*
* Copyright © 2024 SUSE LLC
*
* Autogits is free software: you can redistribute it and/or modify it under
* the terms of the GNU General Public License as published by the Free Software
* Foundation, either version 2 of the License, or (at your option) any later
* version.
*
* Autogits is distributed in the hope that it will be useful, but WITHOUT ANY
* WARRANTY; without even the implied warranty of MERCHANTABILITY or FITNESS FOR A
* PARTICULAR PURPOSE. See the GNU General Public License for more details.
*
* You should have received a copy of the GNU General Public License along with
* Foobar. If not, see <https://www.gnu.org/licenses/>.
*/
import (
"fmt"
"io"
"log"
"os"
"os/exec"
"path"
"path/filepath"
"strings"
"sync"
)
type GitHandler struct {
DebugLogger bool
GitPath string
GitCommiter string
GitEmail string
}
func CreateGitHandler(git_author, email, name string) (*GitHandler, error) {
var err error
git := new(GitHandler)
git.GitCommiter = git_author
git.GitPath, err = os.MkdirTemp("", name)
if err != nil {
return nil, fmt.Errorf("Cannot create temp dir: %w", err)
}
if err = os.Chmod(git.GitPath, 0700); err != nil {
return nil, fmt.Errorf("Cannot fix permissions of temp dir: %w", err)
}
return git, nil
}
//func (h *GitHandler) ProcessBranchList() []string {
// if h.HasError() {
// return make([]string, 0)
// }
//
// trackedBranches, err := os.ReadFile(path.Join(h.GitPath, DefaultGitPrj, TrackedBranchesFile))
// if err != nil {
// if errors.Is(err, os.ErrNotExist) {
// trackedBranches = []byte("factory")
// } else {
// h.LogError("file error reading '%s' file in repo", TrackedBranchesFile)
// h.Error = err
// return make([]string, 0)
// }
// }
//
// return strings.Split(string(trackedBranches), "\n")
//}
type GitReference struct {
Branch string
Id string
}
type GitReferences struct {
refs []GitReference
}
func (refs *GitReferences) addReference(id, branch string) {
for _, ref := range refs.refs {
if ref.Id == id && ref.Branch == branch {
return
}
}
refs.refs = append(refs.refs, GitReference{Branch: branch, Id: id})
}
func processRefs(gitDir string) ([]GitReference, error) {
packedRefsPath := path.Join(gitDir, "packed-refs")
stat, err := os.Stat(packedRefsPath)
if err != nil {
return nil, err
}
if stat.Size() > 10000 || stat.IsDir() {
return nil, fmt.Errorf("Funny business with 'packed-refs' in '%s'", gitDir)
}
data, err := os.ReadFile(packedRefsPath)
if err != nil {
return nil, err
}
var references GitReferences
for _, line := range strings.Split(string(data), "\n") {
if len(line) < 1 || line[0] == '#' {
continue
}
splitLine := strings.Split(line, " ")
if len(splitLine) != 2 {
return nil, fmt.Errorf("Unexpected packaged-refs entry '%#v' in '%s'", splitLine, packedRefsPath)
}
id, ref := splitLine[0], splitLine[1]
const remoteRefPrefix = "refs/remotes/origin/"
if !strings.HasPrefix(ref, remoteRefPrefix) {
continue
}
references.addReference(id, ref[len(remoteRefPrefix):])
}
return references.refs, nil
}
func findGitDir(p string) (string, error) {
gitFile := path.Join(p, ".git")
stat, err := os.Stat(gitFile)
if err != nil {
return "", err
}
if stat.IsDir() {
return path.Join(p, ".git"), nil
}
data, err := os.ReadFile(gitFile)
if err != nil {
return "", err
}
for _, line := range strings.Split(string(data), "\n") {
refs := strings.Split(line, ":")
if len(refs) != 2 {
return "", fmt.Errorf("Unknown format of .git file: '%s'\n", line)
}
if refs[0] != "gitdir" {
return "", fmt.Errorf("Unknown header of .git file: '%s'\n", refs[0])
}
return path.Join(p, strings.TrimSpace(refs[1])), nil
}
return "", fmt.Errorf("Can't find git subdirectory in '%s'", p)
}
func (e *GitHandler) GitBranchHead(gitDir, branchName string) (string, error) {
path, err := findGitDir(path.Join(e.GitPath, gitDir))
if err != nil {
return "", fmt.Errorf("Error identifying gitdir in `%s`: %w", gitDir, err)
}
refs, err := processRefs(path)
if err != nil {
return "", fmt.Errorf("Error finding branches (%s): %w\n", branchName, err)
}
for _, ref := range refs {
if ref.Branch == branchName {
return ref.Id, nil
}
}
return "", fmt.Errorf("Can't find default remote branch: %s", branchName)
}
func (e *GitHandler) Close() error {
if err := os.RemoveAll(e.GitPath); err != nil {
return err
}
e.GitPath = ""
return nil
}
type writeFunc func(data []byte) (int, error)
func (f writeFunc) Write(data []byte) (int, error) {
return f(data)
}
func (h writeFunc) UnmarshalText(text []byte) error {
_, err := h.Write(text)
return err
}
func (h writeFunc) Close() error {
_, err := h.Write(nil)
return err
}
func (e *GitHandler) GitExec(cwd string, params ...string) error {
cmd := exec.Command("/usr/bin/git", params...)
cmd.Env = []string{
"GIT_CEILING_DIRECTORIES=" + e.GitPath,
"GIT_CONFIG_GLOBAL=/dev/null",
"GIT_AUTHOR_NAME=" + e.GitCommiter,
"GIT_COMMITTER_NAME=" + e.GitCommiter,
"EMAIL=not@exist@src.opensuse.org",
"GIT_LFS_SKIP_SMUDGE=1",
"GIT_SSH_COMMAND=/usr/bin/ssh -o StrictHostKeyChecking=yes",
}
cmd.Dir = filepath.Join(e.GitPath, cwd)
cmd.Stdin = nil
if e.DebugLogger {
log.Printf("git execute: %#v\n", cmd.Args)
}
out, err := cmd.CombinedOutput()
if e.DebugLogger {
log.Println(string(out))
}
if err != nil {
if e.DebugLogger {
log.Printf(" *** error: %v\n", err)
}
return fmt.Errorf("error executing: git %#v \n%s\n err: %w", cmd.Args, out, err)
}
return nil
}
type ChanIO struct {
ch chan byte
}
func (c *ChanIO) Write(p []byte) (int, error) {
for _, b := range p {
c.ch <- b
}
return len(p), nil
}
// read at least 1 byte, but don't block if nothing more in channel
func (c *ChanIO) Read(data []byte) (idx int, err error) {
var ok bool
data[idx], ok = <-c.ch
if !ok {
err = io.EOF
return
}
idx++
for len(c.ch) > 0 && idx < len(data) {
data[idx], ok = <-c.ch
if !ok {
err = io.EOF
return
}
idx++
}
return
}
type gitMsg struct {
hash string
itemType string
size int
}
type commit struct {
Tree string
Msg string
}
type tree_entry struct {
name string
mode int
hash string
size int
}
type tree struct {
items []tree_entry
}
func (t *tree_entry) isSubmodule() bool {
return (t.mode & 0170000) == 0160000
}
func (t *tree_entry) isTree() bool {
return (t.mode & 0170000) == 0040000
}
func (t *tree_entry) isBlob() bool {
return !t.isTree() && !t.isSubmodule()
}
func parseGitMsg(data <-chan byte) (gitMsg, error) {
var id []byte = make([]byte, 64)
var msgType []byte = make([]byte, 16)
var size int
pos := 0
for c := <-data; c != ' '; c = <-data {
if (c >= '0' && c <= '9') || (c >= 'a' && c <= 'f') {
id[pos] = c
pos++
} else {
return gitMsg{}, fmt.Errorf("Invalid character during object hash parse '%c' at %d", c, pos)
}
}
id = id[:pos]
pos = 0
var c byte
for c = <-data; c != ' ' && c != '\x00'; c = <-data {
if c >= 'a' && c <= 'z' {
msgType[pos] = c
pos++
} else {
return gitMsg{}, fmt.Errorf("Invalid character during object type parse '%c' at %d", c, pos)
}
}
msgType = msgType[:pos]
switch string(msgType) {
case "commit", "tree", "blob":
break
case "missing":
if c != '\x00' {
return gitMsg{}, fmt.Errorf("Missing format weird")
}
return gitMsg{
hash: string(id[:]),
itemType: "missing",
size: 0,
}, fmt.Errorf("Object not found: '%s'", string(id))
default:
return gitMsg{}, fmt.Errorf("Invalid object type: '%s'", string(msgType))
}
for c = <-data; c != '\000'; c = <-data {
if c >= '0' && c <= '9' {
size = size*10 + (int(c) - '0')
} else {
return gitMsg{}, fmt.Errorf("Invalid character during object size parse: '%c'", c)
}
}
return gitMsg{
hash: string(id[:]),
itemType: string(msgType),
size: size,
}, nil
}
func parseGitCommitHdr(data <-chan byte) ([2]string, error) {
hdr := make([]byte, 0, 60)
val := make([]byte, 0, 1000)
c := <-data
if c != '\n' { // end of header marker
for ; c != ' '; c = <-data {
hdr = append(hdr, c)
}
for c := <-data; c != '\n'; c = <-data {
val = append(val, c)
}
}
return [2]string{string(hdr), string(val)}, nil
}
func parseGitCommitMsg(data <-chan byte, l int) (string, error) {
msg := make([]byte, 0, l)
for c := <-data; c != '\x00'; c = <-data {
msg = append(msg, c)
l--
}
// l--
if l != 0 {
return "", fmt.Errorf("Unexpected data in the git commit msg: l=%d", l)
}
return string(msg), nil
}
func parseGitCommit(data <-chan byte) (commit, error) {
hdr, err := parseGitMsg(data)
if err != nil {
return commit{}, err
} else if hdr.itemType != "commit" {
return commit{}, fmt.Errorf("expected commit but parsed %s", hdr.itemType)
}
var c commit
l := hdr.size
for {
hdr, err := parseGitCommitHdr(data)
if err != nil {
return commit{}, err
}
if len(hdr[0])+len(hdr[1]) == 0 { // hdr end marker
break
}
switch hdr[0] {
case "tree":
c.Tree = hdr[1]
}
l -= len(hdr[0]) + len(hdr[1]) + 2
}
l--
c.Msg, err = parseGitCommitMsg(data, l)
return c, err
}
func parseTreeEntry(data <-chan byte, hashLen int) (tree_entry, error) {
var e tree_entry
for c := <-data; c != ' '; c = <-data {
e.mode = e.mode*8 + int(c-'0')
e.size++
}
e.size++
name := make([]byte, 0, 128)
for c := <-data; c != '\x00'; c = <-data {
name = append(name, c)
e.size++
}
e.size++
e.name = string(name)
const hexBinToAscii = "0123456789abcdef"
hash := make([]byte, 0, hashLen*2)
for range hashLen {
c := <-data
hash = append(hash, hexBinToAscii[((c&0xF0)>>4)], hexBinToAscii[c&0xF])
}
e.hash = string(hash)
e.size += hashLen
return e, nil
}
func parseGitTree(data <-chan byte) (tree, error) {
hdr, err := parseGitMsg(data)
if err != nil {
return tree{}, err
}
// max capacity to length of hash
t := tree{items: make([]tree_entry, 0, hdr.size/len(hdr.hash))}
parsedLen := 0
for parsedLen < hdr.size {
entry, err := parseTreeEntry(data, len(hdr.hash)/2)
if err != nil {
return tree{}, err
}
t.items = append(t.items, entry)
parsedLen += entry.size
}
c := <-data // \0 read
if c != '\x00' {
return t, fmt.Errorf("Unexpected character during git tree data read")
}
if parsedLen != hdr.size {
return t, fmt.Errorf("Invalid size of git tree data")
}
return t, nil
}
func parseGitBlob(data <-chan byte) ([]byte, error) {
hdr, err := parseGitMsg(data)
if err != nil {
return []byte{}, err
}
d := make([]byte, hdr.size)
for l := 0; l < hdr.size; l++ {
d[l] = <-data
}
eob := <-data
if eob != '\x00' {
return d, fmt.Errorf("invalid byte read in parseGitBlob")
}
return d, nil
}
// TODO: support sub-trees
func (e *GitHandler) GitCatFile(cwd, commitId, filename string) (data []byte, err error) {
var done sync.Mutex
done.Lock()
data_in, data_out := ChanIO{make(chan byte, 256)}, ChanIO{make(chan byte, 70)}
go func() {
defer done.Unlock()
defer close(data_out.ch)
data_out.Write([]byte(commitId))
data_out.ch <- '\x00'
c, err := parseGitCommit(data_in.ch)
if err != nil {
log.Printf("Error parsing git commit: %v\n", err)
return
}
data_out.Write([]byte(c.Tree))
data_out.ch <- '\x00'
tree, err := parseGitTree(data_in.ch)
if err != nil {
if e.DebugLogger {
log.Printf("Error parsing git tree: %v\n", err)
}
return
}
for _, te := range tree.items {
if te.isBlob() && te.name == filename {
data_out.Write([]byte(te.hash))
data_out.ch <- '\x00'
data, err = parseGitBlob(data_in.ch)
return
}
}
err = fmt.Errorf("file not found: '%s'", filename)
}()
cmd := exec.Command("/usr/bin/git", "cat-file", "--batch", "-Z")
cmd.Env = []string{
"GIT_CEILING_DIRECTORIES=" + e.GitPath,
"GIT_CONFIG_GLOBAL=/dev/null",
}
cmd.Dir = filepath.Join(e.GitPath, cwd)
cmd.Stdout = &data_in
cmd.Stdin = &data_out
cmd.Stderr = writeFunc(func(data []byte) (int, error) {
if e.DebugLogger {
log.Print(string(data))
}
return len(data), nil
})
if e.DebugLogger {
log.Printf("command run: %v\n", cmd.Args)
}
err = cmd.Run()
done.Lock()
return
}
// return (filename) -> (hash) map for all submodules
// TODO: recursive? map different orgs, not just assume '.' for path
func (e *GitHandler) GitSubmoduleList(cwd, commitId string) (submoduleList map[string]string, err error) {
var done sync.Mutex
submoduleList = make(map[string]string)
done.Lock()
data_in, data_out := ChanIO{make(chan byte, 256)}, ChanIO{make(chan byte, 70)}
go func() {
defer done.Unlock()
defer close(data_out.ch)
data_out.Write([]byte(commitId))
data_out.ch <- '\x00'
var c commit
c, err = parseGitCommit(data_in.ch)
if err != nil {
err = fmt.Errorf("Error parsing git commit. Err: %w", err)
return
}
data_out.Write([]byte(c.Tree))
data_out.ch <- '\x00'
var tree tree
tree, err = parseGitTree(data_in.ch)
if err != nil {
err = fmt.Errorf("Error parsing git tree: %w", err)
return
}
for _, te := range tree.items {
if te.isSubmodule() {
submoduleList[te.name] = te.hash
}
}
}()
cmd := exec.Command("/usr/bin/git", "cat-file", "--batch", "-Z")
cmd.Env = []string{
"GIT_CEILING_DIRECTORIES=" + e.GitPath,
"GIT_CONFIG_GLOBAL=/dev/null",
}
cmd.Dir = filepath.Join(e.GitPath, cwd)
cmd.Stdout = &data_in
cmd.Stdin = &data_out
cmd.Stderr = writeFunc(func(data []byte) (int, error) {
if e.DebugLogger {
log.Println(string(data))
}
return len(data), nil
})
if e.DebugLogger {
log.Printf("command run: %v\n", cmd.Args)
}
err = cmd.Run()
done.Lock()
return submoduleList, err
}
func (e *GitHandler) GitSubmoduleCommitId(cwd, packageName, commitId string) (subCommitId string, valid bool) {
defer func() {
if recover() != nil {
subCommitId = ""
valid = false
}
}()
data_in, data_out := ChanIO{make(chan byte, 256)}, ChanIO{make(chan byte, 70)}
var wg sync.WaitGroup
wg.Add(1)
if e.DebugLogger {
log.Printf("getting commit id '%s' from git at '%s' with packageName: %s\n", commitId, cwd, packageName)
}
go func() {
defer wg.Done()
defer close(data_out.ch)
data_out.Write([]byte(commitId))
data_out.ch <- '\x00'
c, err := parseGitCommit(data_in.ch)
if err != nil {
log.Panicf("Error parsing git commit: %v\n", err)
}
data_out.Write([]byte(c.Tree))
data_out.ch <- '\x00'
tree, err := parseGitTree(data_in.ch)
if err != nil {
log.Panicf("Error parsing git tree: %v\n", err)
}
for _, te := range tree.items {
if te.name == packageName && te.isSubmodule() {
subCommitId = te.hash
return
}
}
}()
cmd := exec.Command("/usr/bin/git", "cat-file", "--batch", "-Z")
cmd.Env = []string{
"GIT_CEILING_DIRECTORIES=" + e.GitPath,
"GIT_CONFIG_GLOBAL=/dev/null",
}
cmd.Dir = filepath.Join(e.GitPath, cwd)
cmd.Stdout = &data_in
cmd.Stdin = &data_out
cmd.Stderr = writeFunc(func(data []byte) (int, error) {
log.Println(string(data))
return len(data), nil
})
if e.DebugLogger {
log.Printf("command run: %v\n", cmd.Args)
}
if err := cmd.Run(); err != nil {
log.Printf("Error running command %v, err: %v", cmd.Args, err)
}
wg.Wait()
return subCommitId, len(subCommitId) == len(commitId)
}
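For illustration only (not part of this diff): a hypothetical sketch of driving the GitHandler helpers above. The clone URL, directory name and commit reference are placeholders, and error handling is kept minimal.

package main

import (
	"fmt"
	"log"

	"src.opensuse.org/autogits/common"
)

func main() {
	// Creates a temporary work directory (GitPath) for all git operations.
	git, err := common.CreateGitHandler("Autogits Bot", "bot@example.org", "example")
	if err != nil {
		log.Fatal(err)
	}
	defer git.Close() // removes the temporary GitPath again

	// Clone a project git into <GitPath>/prj; GitExec runs relative to GitPath.
	if err := git.GitExec("", "clone", "https://src.example.org/example-devel/_ObsPrj.git", "prj"); err != nil {
		log.Fatal(err)
	}

	// Map of submodule path -> commit hash for the given commit, read via
	// the `git cat-file --batch -Z` based parser above.
	submodules, err := git.GitSubmoduleList("prj", "HEAD")
	if err != nil {
		log.Fatal(err)
	}
	for pkg, hash := range submodules {
		fmt.Println(pkg, hash)
	}
}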


@@ -1,487 +0,0 @@
package common
/*
* This file is part of Autogits.
*
* Copyright © 2024 SUSE LLC
*
* Autogits is free software: you can redistribute it and/or modify it under
* the terms of the GNU General Public License as published by the Free Software
* Foundation, either version 2 of the License, or (at your option) any later
* version.
*
* Autogits is distributed in the hope that it will be useful, but WITHOUT ANY
* WARRANTY; without even the implied warranty of MERCHANTABILITY or FITNESS FOR A
* PARTICULAR PURPOSE. See the GNU General Public License for more details.
*
* You should have received a copy of the GNU General Public License along with
* Foobar. If not, see <https://www.gnu.org/licenses/>.
*/
import (
"fmt"
"io"
"os"
"path/filepath"
"slices"
"strings"
"time"
transport "github.com/go-openapi/runtime/client"
"github.com/go-openapi/strfmt"
apiclient "src.opensuse.org/autogits/common/gitea-generated/client"
"src.opensuse.org/autogits/common/gitea-generated/client/notification"
"src.opensuse.org/autogits/common/gitea-generated/client/organization"
"src.opensuse.org/autogits/common/gitea-generated/client/repository"
"src.opensuse.org/autogits/common/gitea-generated/models"
)
const PrPattern = "PR: %s/%s#%d"
const (
// from Gitea
// ReviewStateApproved pr is approved
ReviewStateApproved models.ReviewStateType = "APPROVED"
// ReviewStatePending pr state is pending
ReviewStatePending models.ReviewStateType = "PENDING"
// ReviewStateComment is a comment review
ReviewStateComment models.ReviewStateType = "COMMENT"
// ReviewStateRequestChanges changes for pr are requested
ReviewStateRequestChanges models.ReviewStateType = "REQUEST_CHANGES"
// ReviewStateRequestReview review is requested from user
ReviewStateRequestReview models.ReviewStateType = "REQUEST_REVIEW"
// ReviewStateUnknown state of pr is unknown
ReviewStateUnknown models.ReviewStateType = ""
)
type GiteaTransport struct {
transport *transport.Runtime
client *apiclient.GiteaAPI
}
func AllocateGiteaTransport(host string) *GiteaTransport {
var r GiteaTransport
r.transport = transport.New(host, apiclient.DefaultBasePath, [](string){"https"})
r.transport.DefaultAuthentication = transport.BearerToken(giteaToken)
r.client = apiclient.New(r.transport, nil)
return &r
}
func (gitea *GiteaTransport) GetPullRequestAndReviews(org, project string, num int64) (*models.PullRequest, []*models.PullReview, error) {
pr, err := gitea.client.Repository.RepoGetPullRequest(
repository.NewRepoGetPullRequestParams().
WithDefaults().
WithOwner(org).
WithRepo(project).
WithIndex(num),
gitea.transport.DefaultAuthentication,
)
if err != nil {
return nil, nil, err
}
limit := int64(1000)
reviews, err := gitea.client.Repository.RepoListPullReviews(
repository.NewRepoListPullReviewsParams().
WithDefaults().
WithOwner(org).
WithRepo(project).
WithIndex(num).
WithLimit(&limit),
gitea.transport.DefaultAuthentication,
)
if err != nil {
return nil, nil, err
}
return pr.Payload, reviews.Payload, nil
}
func (gitea *GiteaTransport) GetPullNotifications(since *time.Time) ([]*models.NotificationThread, error) {
bigLimit := int64(100000)
params := notification.NewNotifyGetListParams().
WithDefaults().
WithSubjectType([]string{"Pull"}).
WithStatusTypes([]string{"unread"}).
WithLimit(&bigLimit)
if since != nil {
s := strfmt.DateTime(*since)
params.SetSince(&s)
}
list, err := gitea.client.Notification.NotifyGetList(params, gitea.transport.DefaultAuthentication)
if err != nil {
return nil, err
}
return list.Payload, nil
}
func (gitea *GiteaTransport) SetNotificationRead(notificationId int64) error {
_, err := gitea.client.Notification.NotifyReadThread(
notification.NewNotifyReadThreadParams().
WithDefaults().
WithID(fmt.Sprint(notificationId)),
gitea.transport.DefaultAuthentication,
)
if err != nil {
return fmt.Errorf("Error setting notification: %d. Err: %w", notificationId, err)
}
return nil
}
func (gitea *GiteaTransport) GetOrganization(orgName string) (*models.Organization, error) {
org, err := gitea.client.Organization.OrgGet(
organization.NewOrgGetParams().WithOrg(orgName),
gitea.transport.DefaultAuthentication,
)
if err != nil {
return nil, fmt.Errorf("Error fetching org: '%s' data. Err: %w", orgName, err)
}
return org.Payload, nil
}
func (gitea *GiteaTransport) GetOrganizationRepositories(orgName string) ([]*models.Repository, error) {
var page int64
repos := make([]*models.Repository, 0, 100)
page = 1
for {
ret, err := gitea.client.Organization.OrgListRepos(
organization.NewOrgListReposParams().WithOrg(orgName).WithPage(&page),
gitea.transport.DefaultAuthentication,
)
if err != nil {
return nil, fmt.Errorf("Error retrieving repository list for org: '%s'. Err: %w", orgName, err)
}
if len(ret.Payload) == 0 {
break
}
repos = append(repos, ret.Payload...)
page++
}
return repos, nil
}
func (gitea *GiteaTransport) CreateRepositoryIfNotExist(git *GitHandler, org Organization, repoName string) (*models.Repository, error) {
repo, err := gitea.client.Repository.RepoGet(
repository.NewRepoGetParams().WithDefaults().WithOwner(org.Username).WithRepo(repoName),
gitea.transport.DefaultAuthentication)
if err != nil {
switch err.(type) {
case *repository.RepoGetNotFound:
repo, err := gitea.client.Organization.CreateOrgRepo(
organization.NewCreateOrgRepoParams().WithDefaults().WithBody(
&models.CreateRepoOption{
AutoInit: false,
Name: &repoName,
ObjectFormatName: models.CreateRepoOptionObjectFormatNameSha256,
},
).WithOrg(org.Username),
nil,
)
if err != nil {
switch err.(type) {
case *organization.CreateOrgRepoCreated:
// weird, but ok, repo created
default:
return nil, fmt.Errorf("error creating repo '%s' under '%s'. Err: %w", repoName, org.Username, err)
}
}
// initialize repository
if err = os.Mkdir(filepath.Join(git.GitPath, DefaultGitPrj), 0700); err != nil {
return nil, err
}
if err = git.GitExec(DefaultGitPrj, "init", "--object-format="+repo.Payload.ObjectFormatName); err != nil {
return nil, err
}
if err = git.GitExec(DefaultGitPrj, "checkout", "-b", repo.Payload.DefaultBranch); err != nil {
return nil, err
}
readmeFilename := filepath.Join(git.GitPath, DefaultGitPrj, "README.md")
{
file, _ := os.Create(readmeFilename)
defer file.Close()
io.WriteString(file, ReadmeBoilerplate)
}
if err = git.GitExec(DefaultGitPrj, "add", "README.md"); err != nil {
return nil, err
}
if err = git.GitExec(DefaultGitPrj, "commit", "-m", "Automatic devel project creation"); err != nil {
return nil, err
}
if err = git.GitExec(DefaultGitPrj, "remote", "add", "origin", repo.Payload.SSHURL); err != nil {
return nil, err
}
return repo.Payload, nil
default:
return nil, fmt.Errorf("cannot fetch repo data for '%s' / '%s' : %w", org.Username, repoName, err)
}
}
return repo.Payload, nil
}
func (gitea *GiteaTransport) CreatePullRequest(repo *models.Repository, srcId, targetId, title, body string) (*models.PullRequest, error) {
prOptions := models.CreatePullRequestOption{
Base: repo.DefaultBranch,
Head: srcId,
Title: title,
Body: body,
}
pr, err := gitea.client.Repository.RepoCreatePullRequest(
repository.
NewRepoCreatePullRequestParams().
WithDefaults().
WithOwner(repo.Owner.UserName).
WithRepo(repo.Name).
WithBody(&prOptions),
gitea.transport.DefaultAuthentication,
)
if err != nil {
return nil, fmt.Errorf("Cannot create pull request. %w", err)
}
return pr.GetPayload(), nil
}
func (gitea *GiteaTransport) RequestReviews(pr *models.PullRequest, reviewer string) ([]*models.PullReview, error) {
reviewOptions := models.PullReviewRequestOptions{
Reviewers: []string{reviewer},
}
review, err := gitea.client.Repository.RepoCreatePullReviewRequests(
repository.
NewRepoCreatePullReviewRequestsParams().
WithOwner(pr.Base.Repo.Owner.UserName).
WithRepo(pr.Base.Repo.Name).
WithIndex(pr.Index).
WithBody(&reviewOptions),
gitea.transport.DefaultAuthentication,
)
if err != nil {
return nil, fmt.Errorf("Cannot create pull request: %w", err)
}
return review.GetPayload(), nil
}
func (gitea *GiteaTransport) IsReviewed(pr *models.PullRequest) (bool, error) {
// TODO: get review from project git
reviewers := pr.RequestedReviewers
var page, limit int64
var reviews []*models.PullReview
page = 1
limit = 20
for {
res, err := gitea.client.Repository.RepoListPullReviews(
repository.NewRepoListPullReviewsParams().
WithOwner(pr.Base.Repo.Owner.UserName).
WithRepo(pr.Base.Repo.Name).
WithPage(&page).
WithLimit(&limit),
gitea.transport.DefaultAuthentication)
if err != nil {
return false, err
}
if res.IsSuccess() {
r := res.Payload
if reviews == nil {
reviews = r
} else {
reviews = append(reviews, r...)
}
if len(r) < int(limit) {
break
}
page++
}
}
slices.Reverse(reviews)
for _, review := range reviews {
if review.Stale || review.Dismissed {
continue
}
next_review:
for i, reviewer := range reviewers {
if review.User.UserName == reviewer.UserName {
switch review.State {
case ReviewStateApproved:
reviewers = slices.Delete(reviewers, i, i+1)
break next_review
case ReviewStateRequestChanges:
return false, nil
}
}
}
}
return len(reviewers) == 0, nil
}
func (gitea *GiteaTransport) AddReviewComment(pr *models.PullRequest, state models.ReviewStateType, comment string) (*models.PullReview, error) {
c, err := gitea.client.Repository.RepoCreatePullReview(
repository.NewRepoCreatePullReviewParams().
WithDefaults().
WithOwner(pr.Base.Repo.Owner.UserName).
WithRepo(pr.Base.Repo.Name).
WithIndex(pr.Index).
WithBody(&models.CreatePullReviewOptions{
Event: state,
Body: comment,
}),
gitea.transport.DefaultAuthentication,
)
/*
c, err := client.Repository.RepoSubmitPullReview(
repository.NewRepoSubmitPullReviewParams().
WithDefaults().
WithOwner(pr.Base.Repo.Owner.UserName).
WithRepo(pr.Base.Repo.Name).
WithIndex(pr.Index).
WithID(review.ID).
WithBody(&models.SubmitPullReviewOptions{
Event: state,
Body: comment,
}),
transport.DefaultAuthentication,
)
*/
/* c, err := client.Issue.IssueCreateComment(
issue.NewIssueCreateCommentParams().
WithDefaults().
WithOwner(pr.Base.Repo.Owner.UserName).
WithRepo(pr.Base.Repo.Name).
WithIndex(pr.Index).
WithBody(&models.CreateIssueCommentOption{
Body: &comment,
}),
transport.DefaultAuthentication)
*/
if err != nil {
return nil, err
}
return c.Payload, nil
}
func (gitea *GiteaTransport) GetAssociatedPrjGitPR(pr *PullRequestWebhookEvent) (*models.PullRequest, error) {
var page, maxSize int64
page = 1
maxSize = 10000
state := "open"
prs, err := gitea.client.Repository.RepoListPullRequests(
repository.
NewRepoListPullRequestsParams().
WithDefaults().
WithOwner(pr.Repository.Owner.Username).
WithRepo(DefaultGitPrj).
WithState(&state).
WithLimit(&maxSize).
WithPage(&page),
gitea.transport.DefaultAuthentication)
if err != nil {
return nil, fmt.Errorf("cannot fetch PR list for %s / %s : %w", pr.Repository.Owner.Username, pr.Repository.Name, err)
}
prLine := fmt.Sprintf(PrPattern, pr.Repository.Owner.Username, pr.Repository.Name, pr.Number)
// h.StdLogger.Printf("attemping to match line: '%s'\n", prLine)
// payload_processing:
for _, pr := range prs.Payload {
lines := strings.Split(pr.Body, "\n")
for _, line := range lines {
if strings.TrimSpace(line) == prLine {
return pr, nil
}
}
}
return nil, nil
}
func (gitea *GiteaTransport) GetRepositoryFileContent(repo *models.Repository, hash, path string) ([]byte, error) {
var retData []byte
dataOut := writeFunc(func(data []byte) (int, error) {
if len(data) == 0 {
return 0, nil
}
retData = data
return len(data), nil
})
_, err := gitea.client.Repository.RepoGetRawFile(
repository.NewRepoGetRawFileParams().
WithOwner(repo.Owner.UserName).
WithRepo(repo.Name).
WithFilepath(path).
WithRef(&hash),
gitea.transport.DefaultAuthentication,
dataOut,
repository.WithContentTypeApplicationOctetStream,
)
if err != nil {
return nil, err
}
return retData, nil
}
func (gitea *GiteaTransport) GetPullRequestFileContent(pr *models.PullRequest, path string) ([]byte, error) {
return gitea.GetRepositoryFileContent(pr.Head.Repo, pr.Head.Sha, path)
}
func (gitea *GiteaTransport) GetRecentCommits(org, repo, branch string, commitNo int64) ([]*models.Commit, error) {
not := false
var page int64
page = 1
commits, err := gitea.client.Repository.RepoGetAllCommits(
repository.NewRepoGetAllCommitsParams().
WithOwner(org).
WithRepo(repo).
WithSha(&branch).
WithPage(&page).
WithStat(&not).
WithFiles(&not).
WithVerification(&not).
WithLimit(&commitNo),
gitea.transport.DefaultAuthentication,
)
if err != nil {
return nil, err
}
return commits.Payload, nil
}
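For illustration only (not part of this diff): a hypothetical sketch of the Gitea client wrapper above. Host, organization, repository and pull request number are placeholders, and the bearer token used by AllocateGiteaTransport is assumed to be configured elsewhere in the common package.

package main

import (
	"fmt"
	"log"

	"src.opensuse.org/autogits/common"
)

func main() {
	// Talks to the Gitea API on the given host over https.
	gitea := common.AllocateGiteaTransport("src.example.org")

	// Fetch a pull request together with its reviews (placeholder org/repo/number).
	pr, reviews, err := gitea.GetPullRequestAndReviews("example-devel", "somepackage", 42)
	if err != nil {
		log.Fatal(err)
	}

	fmt.Println("PR:", pr.Title, "head:", pr.Head.Sha)
	for _, r := range reviews {
		if r.User == nil { // reviews without an attached user are skipped
			continue
		}
		fmt.Println("review by", r.User.UserName, "state:", r.State)
	}
}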


@@ -7,7 +7,8 @@ gitea-generated/client/gitea_api_client.go:: api.json
[ -d gitea-generated ] || mkdir gitea-generated
podman run --rm -v $$(pwd):/api ghcr.io/go-swagger/go-swagger generate client -f /api/api.json -t /api/gitea-generated
api: gitea-generated/client/gitea_api_client.go
api: gitea-generated/client/gitea_api_client.go mock_gitea_utils.go
go generate
build: api
go build


@@ -0,0 +1,118 @@
package common
import (
"bufio"
"errors"
"fmt"
"io"
"regexp"
"slices"
"strconv"
"strings"
)
const PrPattern = "PR: %s/%s#%d"
type BasicPR struct {
Org, Repo string
Num int64
}
var validOrgAndRepoRx *regexp.Regexp = regexp.MustCompile("^[A-Za-z0-9_-]+$")
func parsePrLine(line string) (BasicPR, error) {
var ret BasicPR
trimmedLine := strings.TrimSpace(line)
// min size > 9 -> must fit all parameters in the PrPattern with at least one item per parameter
if len(trimmedLine) < 9 || trimmedLine[0:4] != "PR: " {
return ret, errors.New("Line too short")
}
trimmedLine = trimmedLine[4:]
org := strings.SplitN(trimmedLine, "/", 2)
ret.Org = org[0]
if len(org) != 2 {
return ret, errors.New("missing / separator")
}
repo := strings.SplitN(org[1], "#", 2)
ret.Repo = repo[0]
if len(repo) != 2 {
return ret, errors.New("Missing # separator")
}
// Gitea requires that each org and repo be [A-Za-z0-9_-]+
var err error
if ret.Num, err = strconv.ParseInt(repo[1], 10, 64); err != nil {
return ret, errors.New("Invalid number")
}
if !validOrgAndRepoRx.MatchString(repo[0]) || !validOrgAndRepoRx.MatchString(org[0]) {
return ret, errors.New("Invalid repo or org character set")
}
return ret, nil
}
func ExtractDescriptionAndPRs(data *bufio.Scanner) (string, []BasicPR) {
prs := make([]BasicPR, 0, 1)
var desc strings.Builder
for data.Scan() {
line := data.Text()
pr, err := parsePrLine(line)
if err != nil {
desc.WriteString(line)
desc.WriteByte('\n')
} else {
prs = append(prs, pr)
}
}
return strings.TrimSpace(desc.String()), prs
}
func prToLine(writer io.Writer, pr BasicPR) {
writer.Write([]byte("\n"))
fmt.Fprintf(writer, PrPattern, pr.Org, pr.Repo, pr.Num)
}
// returns:
// <0 for a<b
// >0 for a>b
// =0 when equal
func compareBasicPRs(a BasicPR, b BasicPR) int {
if c := strings.Compare(a.Org, b.Org); c != 0 {
return c
}
if c := strings.Compare(a.Repo, b.Repo); c != 0 {
return c
}
if a.Num > b.Num {
return 1
}
if a.Num < b.Num {
return -1
}
return 0
}
func AppendPRsToDescription(desc string, prs []BasicPR) string {
var out strings.Builder
out.WriteString(strings.TrimSpace(desc))
out.WriteString("\n")
slices.SortFunc(prs, compareBasicPRs)
prs = slices.Compact(prs)
for _, pr := range prs {
prToLine(&out, pr)
}
return out.String()
}
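
A minimal round-trip sketch of the helpers above (same package, so the bufio and strings imports are already present; the appended PR reference is a made-up example, not a real pull request):

func examplePrRoundTrip(description string) string {
	// Split the free-form text from any "PR: org/repo#N" trailer lines.
	text, prs := ExtractDescriptionAndPRs(bufio.NewScanner(strings.NewReader(description)))
	// Record one more associated pull request...
	prs = append(prs, BasicPR{Org: "some-org", Repo: "some-repo", Num: 42})
	// ...and re-emit the description with a sorted, de-duplicated trailer block.
	return AppendPRsToDescription(text, prs)
}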


@@ -0,0 +1,149 @@
package common_test
import (
"bufio"
"slices"
"strings"
"testing"
"src.opensuse.org/autogits/common"
)
func newStringScanner(s string) *bufio.Scanner {
return bufio.NewScanner(strings.NewReader(s))
}
func TestAssociatedPRScanner(t *testing.T) {
testTable := []struct {
name string
input string
prs []common.BasicPR
desc string
}{
{
"No PRs",
"",
[]common.BasicPR{},
"",
},
{
"Single PRs",
"Some header of the issue\n\nFollowed by some description\n\nPR: test/foo#4\n",
[]common.BasicPR{{Org: "test", Repo: "foo", Num: 4}},
"Some header of the issue\n\nFollowed by some description",
},
{
"Multiple PRs",
"Some header of the issue\n\nFollowed by some description\nPR: test/foo#4\n\nPR: test/goo#5\n",
[]common.BasicPR{
{Org: "test", Repo: "foo", Num: 4},
{Org: "test", Repo: "goo", Num: 5},
},
"Some header of the issue\n\nFollowed by some description",
},
{
"Multiple PRs with whitespace",
"Some header of the issue\n\n\tPR: test/goo#5\n\n Followed by some description\n \t PR: test/foo#4\n",
[]common.BasicPR{
{Org: "test", Repo: "foo", Num: 4},
{Org: "test", Repo: "goo", Num: 5},
},
"Some header of the issue\n\n\n Followed by some description",
},
{
"Multiple PRs with missing names and other special cases to ignore",
"Some header of the issue\n\n\n\t PR: foobar#5 \n\t PR: rd/goo5 \n\t PR: test/#5 \n" +
"\t PR: /goo#5 \n\t PR: test/goo# \n\t PR: test / goo # 10 \n\tPR: test/gool# 10 \n" +
"\t PR: test/goo#5 \n\t\n Followed by some description\n\t PR: test/foo#4 \n\t\n\n",
[]common.BasicPR{
{
Org: "test",
Repo: "foo",
Num: 4,
},
{
Org: "test",
Repo: "goo",
Num: 5,
},
},
"Some header of the issue\n\n\n\t PR: foobar#5 \n\t PR: rd/goo5 \n\t PR: test/#5 \n" +
"\t PR: /goo#5 \n\t PR: test/goo# \n\t PR: test / goo # 10 \n\tPR: test/gool# 10 \n" +
"\t\n Followed by some description",
},
}
for _, test := range testTable {
t.Run(test.name, func(t *testing.T) {
desc, prs := common.ExtractDescriptionAndPRs(newStringScanner(test.input))
if len(prs) != len(test.prs) {
t.Error("Unexpected length:", len(prs), "expected:", len(test.prs))
return
}
for _, p := range test.prs {
if !slices.Contains(prs, p) {
t.Error("missing expected PR", p)
}
}
if desc != test.desc {
t.Error("Desc output", len(desc), "!=", len(test.desc), ":", desc)
}
})
}
}
func TestAppendingPRsToDescription(t *testing.T) {
testTable := []struct {
name string
desc string
PRs []common.BasicPR
output string
}{
{
"Append single PR to end of description",
"something",
[]common.BasicPR{
{Org: "a", Repo: "b", Num: 100},
},
"something\n\nPR: a/b#100",
},
{
"Append multiple PR to end of description",
"something",
[]common.BasicPR{
{Org: "a1", Repo: "b", Num: 100},
{Org: "a1", Repo: "c", Num: 100},
{Org: "a1", Repo: "c", Num: 101},
{Org: "b", Repo: "b", Num: 100},
{Org: "c", Repo: "b", Num: 100},
},
"something\n\nPR: a1/b#100\nPR: a1/c#100\nPR: a1/c#101\nPR: b/b#100\nPR: c/b#100",
},
{
"Append multiple sorted PR to end of description and remove dups",
"something",
[]common.BasicPR{
{Org: "a1", Repo: "c", Num: 101},
{Org: "a1", Repo: "c", Num: 100},
{Org: "c", Repo: "b", Num: 100},
{Org: "b", Repo: "b", Num: 100},
{Org: "a1", Repo: "c", Num: 101},
{Org: "a1", Repo: "c", Num: 101},
{Org: "a1", Repo: "b", Num: 100},
},
"something\n\nPR: a1/b#100\nPR: a1/c#100\nPR: a1/c#101\nPR: b/b#100\nPR: c/b#100",
},
}
for _, test := range testTable {
t.Run(test.name, func(t *testing.T) {
d := common.AppendPRsToDescription(test.desc, test.PRs)
if d != test.output {
t.Error(len(d), "vs", len(test.output))
t.Error("unpected output", d)
}
})
}
}

common/config.go (new file, 246 lines)

@@ -0,0 +1,246 @@
package common
/*
* This file is part of Autogits.
*
* Copyright © 2024 SUSE LLC
*
* Autogits is free software: you can redistribute it and/or modify it under
* the terms of the GNU General Public License as published by the Free Software
* Foundation, either version 2 of the License, or (at your option) any later
* version.
*
* Autogits is distributed in the hope that it will be useful, but WITHOUT ANY
* WARRANTY; without even the implied warranty of MERCHANTABILITY or FITNESS FOR A
* PARTICULAR PURPOSE. See the GNU General Public License for more details.
*
* You should have received a copy of the GNU General Public License along with
 * Autogits. If not, see <https://www.gnu.org/licenses/>.
*/
import (
"encoding/json"
"errors"
"fmt"
"io"
"log"
"os"
"strings"
"github.com/tailscale/hujson"
)
//go:generate mockgen -source=config.go -destination=mock/config.go -typed
const (
ProjectConfigFile = "workflow.config"
StagingConfigFile = "staging.config"
)
type ConfigFile struct {
GitProjectNames []string
}
type ReviewGroup struct {
Name string
Reviewers []string
}
type QAConfig struct {
Name string
Origin string
}
type AutogitConfig struct {
Workflows []string // [pr, direct, test]
Organization string
GitProjectName string // Organization/GitProjectName.git is PrjGit
Branch string // branch name of PkgGit that aligns with PrjGit submodules
Reviewers []string // only used by `pr` workflow
ReviewGroups []ReviewGroup
}
type AutogitConfigs []*AutogitConfig
func ReadConfig(reader io.Reader) (*ConfigFile, error) {
data, err := io.ReadAll(reader)
if err != nil {
return nil, fmt.Errorf("Error reading config data: %w", err)
}
config := ConfigFile{}
data, err = hujson.Standardize(data)
if err != nil {
return nil, fmt.Errorf("Failed to parse json: %w", err)
}
if err := json.Unmarshal(data, &config.GitProjectNames); err != nil {
return nil, fmt.Errorf("Error parsing Git Project paths: %w", err)
}
return &config, nil
}
func ReadConfigFile(filename string) (*ConfigFile, error) {
file, err := os.Open(filename)
if err != nil {
return nil, fmt.Errorf("Cannot open config file for reading. err: %w", err)
}
defer file.Close()
return ReadConfig(file)
}
type GiteaFileContentAndRepoFetcher interface {
GiteaFileContentReader
GiteaRepoFetcher
}
func PartiallyParseWorkflowConfig(data []byte) (*AutogitConfig, error) {
var config AutogitConfig
data, err := hujson.Standardize(data)
if err != nil {
return nil, fmt.Errorf("Failed to parse json: %w", err)
}
if err := json.Unmarshal(data, &config); err != nil {
return nil, fmt.Errorf("Error parsing workflow config file: %s: %w", string(data), err)
}
return &config, nil
}
func ReadWorkflowConfig(gitea GiteaFileContentAndRepoFetcher, git_project string) (*AutogitConfig, error) {
hash := strings.Split(git_project, "#")
branch := ""
if len(hash) > 1 {
branch = hash[1]
}
a := strings.Split(hash[0], "/")
prjGitRepo := DefaultGitPrj
switch len(a) {
case 1:
case 2:
prjGitRepo = a[1]
default:
return nil, fmt.Errorf("Missing org/repo in projectgit: %s", git_project)
}
data, _, err := gitea.GetRepositoryFileContent(a[0], prjGitRepo, branch, ProjectConfigFile)
if err != nil {
return nil, fmt.Errorf("Error fetching 'workflow.config' for %s/%s#%s: %w", a[0], prjGitRepo, branch, err)
}
config, err := PartiallyParseWorkflowConfig(data)
if err != nil {
return nil, err
}
if len(config.Organization) < 1 {
config.Organization = a[0]
}
config.GitProjectName = a[0] + "/" + prjGitRepo
if len(branch) == 0 {
if r, err := gitea.GetRepository(a[0], prjGitRepo); err == nil {
branch = r.DefaultBranch
} else {
return nil, fmt.Errorf("Failed to read workflow config in %s: %w", git_project, err)
}
}
config.GitProjectName = config.GitProjectName + "#" + branch
return config, nil
}
func ResolveWorkflowConfigs(gitea GiteaFileContentAndRepoFetcher, config *ConfigFile) (AutogitConfigs, error) {
configs := make([]*AutogitConfig, 0, len(config.GitProjectNames))
for _, git_project := range config.GitProjectNames {
c, err := ReadWorkflowConfig(gitea, git_project)
if err != nil {
// can't sync, so ignore for now
log.Println(err)
} else {
configs = append(configs, c)
}
}
return configs, nil
}
func (configs AutogitConfigs) GetPrjGitConfig(org, repo, branch string) *AutogitConfig {
prjgit := org + "/" + repo + "#" + branch
for _, c := range configs {
if c.GitProjectName == prjgit {
return c
}
if c.Organization == org && c.Branch == branch {
return c
}
}
return nil
}
func (config *AutogitConfig) GetReviewGroupMembers(reviewer string) ([]string, error) {
for _, g := range config.ReviewGroups {
if g.Name == reviewer {
return g.Reviewers, nil
}
}
return nil, errors.New("User " + reviewer + " not found as group reviewer for " + config.GitProjectName)
}
func (config *AutogitConfig) GetPrjGit() (string, string, string) {
org := config.Organization
repo := DefaultGitPrj
branch := "master"
a := strings.Split(config.GitProjectName, "/")
if len(a[0]) > 0 {
repo = strings.TrimSpace(a[0])
}
if len(a) == 2 {
if a[0] = strings.TrimSpace(a[0]); len(a[0]) > 0 {
org = a[0]
}
repo = strings.TrimSpace(a[1])
}
b := strings.Split(repo, "#")
if len(b) == 2 {
if b[0] = strings.TrimSpace(b[0]); len(b[0]) > 0 {
repo = b[0]
} else {
repo = DefaultGitPrj
}
if b[1] = strings.TrimSpace(b[1]); len(b[1]) > 0 {
branch = strings.TrimSpace(b[1])
}
}
return org, repo, branch
}
func (config *AutogitConfig) GetRemoteBranch() string {
return "origin_" + config.Branch
}
type StagingConfig struct {
ObsProject string
RebuildAll bool
// if set, then only use pull request numbers as unique identifiers
StagingProject string
QA []QAConfig
}
func ParseStagingConfig(data []byte) (*StagingConfig, error) {
var staging StagingConfig
data, err := hujson.Standardize(data)
if err != nil {
return nil, err
}
if err := json.Unmarshal(data, &staging); err != nil {
return nil, err
}
return &staging, nil
}
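
A minimal parse sketch for the staging configuration (same package; the project names are made-up placeholders, not a real OBS setup). HuJSON permits the comments and the trailing comma below:

func exampleParseStagingConfig() (*StagingConfig, error) {
	raw := []byte(`{
		// fields as declared in StagingConfig above; values are placeholders
		"ObsProject": "example:project",
		"StagingProject": "example:project:Staging",
		"RebuildAll": false,
		"QA": [
			{"Name": "acceptance", "Origin": "example:project:ToTest"},
		],
	}`)
	return ParseStagingConfig(raw)
}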

common/config_test.go (new file, 130 lines)

@@ -0,0 +1,130 @@
package common_test
import (
"slices"
"testing"
"go.uber.org/mock/gomock"
"src.opensuse.org/autogits/common"
"src.opensuse.org/autogits/common/gitea-generated/models"
mock_common "src.opensuse.org/autogits/common/mock"
)
func TestConfigWorkflowParser(t *testing.T) {
tests := []struct {
name string
config_json string
repo models.Repository
}{
{
name: "Regular workflow file",
config_json: `{
"Workflows": ["direct", "pr"],
"Organization": "testing",
"ReviewGroups": [
{
"Name": "gnuman1",
"Reviewers": ["adamm"]
}
]
}`,
repo: models.Repository{
DefaultBranch: "master",
},
},
}
for _, test := range tests {
t.Run(test.name, func(t *testing.T) {
ctl := gomock.NewController(t)
gitea := mock_common.NewMockGiteaFileContentAndRepoFetcher(ctl)
gitea.EXPECT().GetRepositoryFileContent("foo", "bar", "", "workflow.config").Return([]byte(test.config_json), "abc", nil)
gitea.EXPECT().GetRepository("foo", "bar").Return(&test.repo, nil)
_, err := common.ReadWorkflowConfig(gitea, "foo/bar")
if err != nil {
t.Fatal(err)
}
})
}
}
func TestProjectGitParser(t *testing.T) {
tests := []struct {
name string
prjgit string
org string
branch string
res [3]string
}{
{
name: "repo only",
prjgit: "repo.git",
org: "org",
branch: "br",
res: [3]string{"org", "repo.git", "master"},
},
{
name: "default",
org: "org",
res: [3]string{"org", common.DefaultGitPrj, "master"},
},
{
name: "repo with branch",
org: "org2",
prjgit: "repo.git#somebranch",
res: [3]string{"org2", "repo.git", "somebranch"},
},
{
name: "repo org and branch",
org: "org3",
prjgit: "oorg/foo.bar#point",
res: [3]string{"oorg", "foo.bar", "point"},
},
{
name: "whitespace shouldn't matter",
prjgit: " oorg / \nfoo.bar\t # point ",
res: [3]string{"oorg", "foo.bar", "point"},
},
{
name: "repo org and empty branch",
org: "org3",
prjgit: "oorg/foo.bar#",
res: [3]string{"oorg", "foo.bar", "master"},
},
{
name: "only branch defined",
org: "org3",
prjgit: "#mybranch",
res: [3]string{"org3", "_ObsPrj", "mybranch"},
},
{
name: "only org and branch defined",
org: "org3",
prjgit: "org1/#mybranch",
res: [3]string{"org1", "_ObsPrj", "mybranch"},
},
{
name: "empty org and repo",
org: "org3",
prjgit: "/repo#",
res: [3]string{"org3", "repo", "master"},
},
}
for _, test := range tests {
t.Run(test.name, func(t *testing.T) {
c := &common.AutogitConfig{
Organization: test.org,
Branch: test.branch,
GitProjectName: test.prjgit,
}
i, j, k := c.GetPrjGit()
res := []string{i, j, k}
if !slices.Equal(res, test.res[:]) {
t.Error("Expected", test.res, "but received", res)
}
})
}
}


@@ -22,8 +22,11 @@ const (
GiteaTokenEnv = "GITEA_TOKEN"
ObsUserEnv = "OBS_USER"
ObsPasswordEnv = "OBS_PASSWORD"
ObsSshkeyEnv = "OBS_SSHKEY"
ObsSshkeyFileEnv = "OBS_SSHKEYFILE"
DefaultGitPrj = "_ObsPrj"
PrjLinksFile = "links.json"
GiteaRequestHeader = "X-Gitea-Event-Type"
Bot_BuildReview = "autogits_obs_staging_bot"


@@ -1,4 +1,4 @@
package common
package common_test
/*
* This file is part of Autogits.

common/git_utils.go (new file, 1013 lines)

File diff suppressed because it is too large.


@@ -19,13 +19,78 @@ package common
*/
import (
"bufio"
"bytes"
"os"
"os/exec"
"path"
"slices"
"strings"
"testing"
)
func TestGitClone(t *testing.T) {
tests := []struct {
name string
repo string
branch string
remoteName string
remoteUrl string
}{
{
name: "Basic clone",
repo: "pkgAclone",
branch: "main",
remoteName: "pkgA_main",
remoteUrl: "/pkgA",
},
{
name: "Remote branch is non-existent",
repo: "pkgAclone",
branch: "main_not_here",
remoteName: "pkgA_main",
remoteUrl: "/pkgA",
},
}
execPath, err := os.Getwd()
if err != nil {
t.Fatal(err)
}
d := t.TempDir()
os.Chdir(d)
defer os.Chdir(execPath)
cmd := exec.Command(path.Join(execPath, "test_clone_setup.sh"))
if _, err := cmd.Output(); err != nil {
t.Fatal(err)
}
gh, err := AllocateGitWorkTree(d, "Test", "test@example.com")
if err != nil {
t.Fatal(err)
}
for _, test := range tests {
t.Run(test.name, func(t *testing.T) {
g, err := gh.CreateGitHandler("org")
if err != nil {
t.Fatal(err)
}
if _, err := g.GitClone(test.repo, test.branch, "file://"+d+test.remoteUrl); err != nil {
t.Fatal(err)
}
id, err := g.GitBranchHead(test.repo, test.branch)
if err != nil {
t.Fatal(err)
}
			t.Log("branch head:", id)
})
}
}
func TestGitMsgParsing(t *testing.T) {
t.Run("tree message with size 56", func(t *testing.T) {
const hdr = "f40888ea4515fe2e8eea617a16f5f50a45f652d894de3ad181d58de3aafb8f98 tree 56\x00"
@@ -240,9 +305,36 @@ Reviewed-By: Marco Ippolito <marcoippolito54@gmail.com>` + "\x00"
t.Error("expected submodule not found")
}
})
t.Run("parse nested trees with subtrees", func(t *testing.T) {
const data = "873a323b262ebb3bd77b2592b2e11bdd08dbc721cbf4ac9f97637e58e1fffce7 tree 1083\x00100644\x20\x2Egitattributes\x00\xD8v\xA95\x87\xC1\xA9\xFCPn\xDD\xD4\x13\x9B\x8E\xD2\xCFs\xBD\x11q\x8A\xAE\x8A\x7Cg\xE2C\x14J\x01\xB0100644\x20\x2Egitignore\x00\xC3\xCD\x8En\x887\x3AJ\xA0P\xEEL\xD4\xF5\xD2v\x9C\xA6v\xC5D\x60\x40\x95\xD1\x0B\xA4\xB8\x86\xD4rE100644\x20COPYING\x00\x12\x2A\x28\xC8\xB9\x5D\x9B\x8A\x23\x1F\xE96\x07\x3F\xA9D\x90\xFD\xCE\x2Bi\x2D\x031\x5C\xCC\xC4fx\x00\xC22100644\x20README\x2Emd\x00\x92D\xF7\xFF\x0E0\x5C\xF2\xAC\x0DA\x06\x92\x0B\xD6z\x3CGh\x00y\x7EW1\xB9a\x8Ch\x215Fa100644\x20_service\x00\xC51\xF2\x12\xF3\x24\x9C\xD9\x9F\x0A\x93Mp\x12\xC1\xF7i\x05\x95\xC5Z\x06\x95i\x3Az\xC3\xF59\x7E\xF8\x1B100644\x20autogits\x2Echanges\x00\xF7\x8D\xBF\x0A\xCB\x5D\xB7y\x8C\xA9\x9C\xEB\x92\xAFd\x2C\x98\x23\x0C\x13\x13\xED\xDE\x5D\xBALD6\x3BR\x5B\xCA100644\x20autogits\x2Espec\x00\xD2\xBC\x20v\xD3\xE5F\xCA\xEE\xEA\x18\xC84\x0D\xA7\xCA\xD8O\xF2\x0A\xAB\x40\x2A\xFAL\x3B\xB4\xE6\x11\xE7o\xD140000\x20common\x00\xE2\xC9dg\xD0\x5D\xD1\xF1\x8ARW\xF0\x96\xD6\x29\x2F\x8F\xD9\xC7\x82\x1A\xB7\xAAw\xB0\xCE\xA8\xFE\xC8\xD7D\xF2100755\x20dev_test_helper\x2Esh\x00\xECY\xDD\xB3rz\x9Fh\xD4\x2E\x85\x02\x13\xF8\xFE\xB57\x8B\x1B6\x8E\x09dC\x1E\xE0\x90\x09\x08\xED\xBD_40000\x20devel\x2Dimporter\x00v\x98\x9B\x92\xD8\x24lu\xFC\xB2d\xC9\xCENb\xEE\x0F\x21\x8B\x92\x88\xDBs\xF8\x2E\xA8\xC8W\x1C\x20\xCF\xD440000\x20doc\x00\x8Akyq\xD0\xCF\xB8\x2F\x80Y\x2F\x11\xF0\x14\xA9\xFE\x96\x14\xE0W\x2C\xCF\xB9\x86\x7E\xFDi\xD7\x1F\x08Q\xFB40000\x20gitea\x2Devents\x2Drabbitmq\x2Dpublisher\x00\x5Cb\x3Fh\xA2\x06\x06\x0Cd\x09\xA5\xD9\xF7\x23\x5C\xF85\xF5\xB8\xBE\x7F\xD4O\x25t\xEF\xCC\xAB\x18\x7C\x0C\xF3100644\x20go\x2Emod\x00j\x85\x0B\x03\xC8\x9F\x9F\x0F\xC8\xE0\x8C\xF7\x3D\xC19\xF7\x12gk\xD6\x18JN\x24\xC0\x1C\xBE\x97oY\x02\x8D100644\x20go\x2Esum\x00h\x88\x2E\x27\xED\xD39\x8D\x12\x0F\x7D\x97\xA2\x5DE\xB9\x82o\x0Cu\xF4l\xA17s\x28\x2BQT\xE6\x12\x9040000\x20group\x2Dreview\x00\x7E\x7B\xB42\x0F\x3B\xC9o\x2C\xE79\x1DR\xE2\xE4i\xAE\xF6u\x90\x09\xD8\xC9c\xE7\xF7\xC7\x92\xFB\xD7\xDD140000\x20obs\x2Dstaging\x2Dbot\x00\x12\xE8\xAF\x09\xD4\x5D\x13\x8D\xC9\x0AvPDc\xB6\x7C\xAC4\xD9\xC5\xD4_\x98i\xBE2\xA7\x25aj\xE2k40000\x20obs\x2Dstatus\x2Dservice\x00MATY\xA3\xFA\xED\x05\xBE\xEB\x2B\x07\x9CN\xA9\xF3SB\x22MlV\xA4\x5D\xDA\x0B\x0F\x23\xA1\xA8z\xD740000\x20systemd\x00\x2D\xE2\x03\x7E\xBD\xEB6\x8F\xC5\x0E\x12\xD4\xBD\x97P\xDD\xA2\x92\xCE6n\x08Q\xCA\xE4\x15\x97\x8F\x26V\x3DW100644\x20vendor\x2Etar\x2Ezst\x00\xD9\x2Es\x03I\x91\x22\x24\xC86q\x91\x95\xEF\xA3\xC9\x3C\x06D\x90w\xAD\xCB\xAE\xEEu2i\xCE\x05\x09u40000\x20workflow\x2Ddirect\x00\x94\xDB\xDFc\xB5A\xD5\x16\xB3\xC3ng\x94J\xE7\x101jYF\x15Q\xE97\xCFg\x14\x12\x28\x3A\xFC\xDB40000\x20workflow\x2Dpr\x00\xC1\xD8Z9\x18\x60\xA2\xE2\xEF\xB0\xFC\xD7\x2Ah\xF07\x0D\xEC\x8A7\x7E\x1A\xAAn\x13\x9C\xEC\x05s\xE8\xBDf\x00"
ch := make(chan byte, 2000)
for _, b := range []byte(data) {
ch <- b
}
tree, err := parseGitTree(ch)
if err != nil {
t.Error(err)
}
found := false
for _, item := range tree.items {
t.Log(item)
if item.name == "workflow-pr" && item.hash == "c1d85a391860a2e2efb0fcd72a68f0370dec8a377e1aaa6e139cec0573e8bd66" && item.isTree() {
found = true
break
}
}
if !found {
t.Error("expected submodule not found")
}
})
}
func TestCommitTreeParsingOfHead(t *testing.T) {
func TestCommitTreeParsing(t *testing.T) {
gitDir := t.TempDir()
testDir, _ := os.Getwd()
var commitId string
@@ -257,11 +349,58 @@ func TestCommitTreeParsingOfHead(t *testing.T) {
t.Fatal(err.Error())
}
gh, err := AllocateGitWorkTree(gitDir, "", "")
if err != nil {
t.Fatal(err)
}
t.Run("GitCatFile commit", func(t *testing.T) {
h, _ := gh.ReadExistingPath(".")
defer h.Close()
file, err := h.GitCatFile("", commitId, "help")
if err != nil {
t.Error("failed", err)
}
if string(file) != "help\n" {
t.Error("expected 'help\\n' but got", string(file))
}
})
t.Run("GitCatFile commit", func(t *testing.T) {
h, _ := gh.ReadExistingPath(".")
defer h.Close()
file, err := h.GitCatFile("", "HEAD", "help")
if err != nil {
t.Error("failed", err)
}
if string(file) != "help\n" {
t.Error("expected 'help\\n' but got", string(file))
}
})
t.Run("GitCatFile bad commit", func(t *testing.T) {
h, _ := gh.ReadExistingPath(".")
defer h.Close()
file, err := h.GitCatFile("", "518b468f391bf01d5d76d497d7cbecfa8b46d185714cf8745800ae18afb21afd", "help")
		if err == nil {
			t.Error("expected an error, but got none")
		}
		if string(file) != "" {
			t.Error("expected empty output but got", file)
		}
})
t.Run("reads HEAD and parses the tree", func(t *testing.T) {
const nodejs21 = "c678c57007d496a98bec668ae38f2c26a695f94af78012f15d044ccf066ccb41"
h := GitHandler{
GitPath: gitDir,
}
h, _ := gh.ReadExistingPath(".")
defer h.Close()
id, ok := h.GitSubmoduleCommitId("", "nodejs21", commitId)
if !ok {
t.Error("failed parse")
@@ -272,9 +411,9 @@ func TestCommitTreeParsingOfHead(t *testing.T) {
})
t.Run("reads README.md", func(t *testing.T) {
h := GitHandler{
GitPath: gitDir,
}
h, _ := gh.ReadExistingPath(".")
defer h.Close()
data, err := h.GitCatFile("", commitId, "README.md")
if err != nil {
t.Errorf("failed parse: %v", err)
@@ -285,9 +424,8 @@ func TestCommitTreeParsingOfHead(t *testing.T) {
})
t.Run("read HEAD", func(t *testing.T) {
h := GitHandler{
GitPath: gitDir,
}
h, _ := gh.ReadExistingPath(".")
defer h.Close()
data, err := h.GitSubmoduleList("", "HEAD")
if err != nil {
@@ -302,3 +440,109 @@ func TestCommitTreeParsingOfHead(t *testing.T) {
t.Run("try to parse unknown item", func(t *testing.T) {
})
}
func TestGitStatusParse(t *testing.T) {
testData := []struct {
name string
data []byte
res []GitStatusData
}{
{
name: "Single modified line",
data: []byte("1 .M N... 100644 100644 100644 dbe4b3d5a0a2e385f78fd41d726baa20e9190f7b5a2e78cbd4885586832f39e7 dbe4b3d5a0a2e385f78fd41d726baa20e9190f7b5a2e78cbd4885586832f39e7 bots-common/git_utils.go\x00"),
res: []GitStatusData{
{
Path: "bots-common/git_utils.go",
Status: GitStatus_Modified,
},
},
},
{
name: "Untracked entries",
data: []byte("1 .M N... 100644 100644 100644 dbe4b3d5a0a2e385f78fd41d726baa20e9190f7b5a2e78cbd4885586832f39e7 dbe4b3d5a0a2e385f78fd41d726baa20e9190f7b5a2e78cbd4885586832f39e7 bots-common/git_utils.go\x00? bots-common/c.out\x00? doc/Makefile\x00"),
res: []GitStatusData{
{
Path: "bots-common/git_utils.go",
Status: GitStatus_Modified,
},
{
Path: "bots-common/c.out",
Status: GitStatus_Untracked,
},
{
Path: "doc/Makefile",
Status: GitStatus_Untracked,
},
},
},
{
name: "Untracked entries",
data: []byte("1 .M N... 100644 100644 100644 dbe4b3d5a0a2e385f78fd41d726baa20e9190f7b5a2e78cbd4885586832f39e7 dbe4b3d5a0a2e385f78fd41d726baa20e9190f7b5a2e78cbd4885586832f39e7 bots-common/git_utils.go\x00? bots-common/c.out\x00! doc/Makefile\x00"),
res: []GitStatusData{
{
Path: "bots-common/git_utils.go",
Status: GitStatus_Modified,
},
{
Path: "bots-common/c.out",
Status: GitStatus_Untracked,
},
{
Path: "doc/Makefile",
Status: GitStatus_Ignored,
},
},
},
{
name: "Nothing",
},
{
name: "Unmerged .gitmodules during a merge",
data: []byte("1 A. S... 000000 160000 160000 0000000000000000000000000000000000000000000000000000000000000000 ed07665aea0522096c88a7555f1fa9009ed0e0bac92de4613c3479516dd3d147 pkgB2\x00u UU N... 100644 100644 100644 100644 587ec403f01113f2629da538f6e14b84781f70ac59c41aeedd978ea8b1253a76 d23eb05d9ca92883ab9f4d28f3ec90c05f667f3a5c8c8e291bd65e03bac9ae3c 087b1d5f22dbf0aa4a879fff27fff03568b334c90daa5f2653f4a7961e24ea33 .gitmodules\x00"),
res: []GitStatusData{
{
Path: "pkgB2",
Status: GitStatus_Modified,
},
{
Path: ".gitmodules",
Status: GitStatus_Unmerged,
States: [3]string{"587ec403f01113f2629da538f6e14b84781f70ac59c41aeedd978ea8b1253a76", "d23eb05d9ca92883ab9f4d28f3ec90c05f667f3a5c8c8e291bd65e03bac9ae3c", "087b1d5f22dbf0aa4a879fff27fff03568b334c90daa5f2653f4a7961e24ea33"},
},
},
},
{
name: "Renamed file",
data: []byte("1 M. N... 100644 100644 100644 d23eb05d9ca92883ab9f4d28f3ec90c05f667f3a5c8c8e291bd65e03bac9ae3c 896cd09f36d39e782d66ae32dd5614d4f4d83fc689f132aab2dfc019a9f5b6f3 .gitmodules\x002 R. S... 160000 160000 160000 3befe051a34612530acfa84c736d2454278453ec0f78ec028f25d2980f8c3559 3befe051a34612530acfa84c736d2454278453ec0f78ec028f25d2980f8c3559 R100 pkgQ\x00pkgC\x00"),
res: []GitStatusData{
{
Path: "pkgQ",
Status: GitStatus_Renamed,
States: [3]string{"pkgC"},
},
{
Path: ".gitmodules",
Status: GitStatus_Modified,
},
},
},
}
for _, test := range testData {
t.Run(test.name, func(t *testing.T) {
r, err := parseGitStatusData(bufio.NewReader(bytes.NewReader(test.data)))
if err != nil {
t.Fatal(err)
}
if len(r) != len(test.res) {
t.Fatal("len(r):", len(r), "is not expected", len(test.res))
}
for _, expected := range test.res {
if !slices.Contains(r, expected) {
t.Fatal("result", r, "doesn't contains expected", expected)
}
}
})
}
}

Some files were not shown because too many files have changed in this diff.