45 Commits

Author SHA256 Message Date
ba20810c99 common: parser error fix resulting in deadlock
All checks were successful
go-generate-check / go-generate-check (pull_request) Successful in 26s
Integration tests / t (pull_request) Successful in 8m21s
2026-03-05 13:25:51 +01:00
Frank Schreiner
f5a32792e0 t: implemented test_003_refuse_manual_merge
All checks were successful
Integration tests / t (pull_request) Successful in 8m13s
Integration tests / t (push) Successful in 8m18s
Fixes: git-workflow/tests #19
2026-03-05 09:20:25 +01:00
b4945a8ae4 pr(docs): remove ReviewPending and ReviewDone label
All checks were successful
Integration tests / t (push) Successful in 7m35s
Not yet implemented. Will re-add when implemented
2026-03-04 21:57:02 +01:00
1451266ddc README: add missing MergeMode project config option
All checks were successful
Integration tests / t (push) Successful in 9m2s
2026-03-04 16:00:04 +01:00
bd618983e9 obs-staging-bot: Temporary hack for current factory setup
All checks were successful
go-generate-check / go-generate-check (push) Successful in 26s
Integration tests / t (push) Successful in 7m44s
We currently accept the temporary openSUSE:Factory:git not being the
master of openSUSE:Factory:PullRequest. We want to have it at the final
place. Once Factory switches to git, the content of openSUSE:Factory:git
will move to openSUSE:Factory and we can drop this exception again.
2026-03-04 00:06:42 +01:00
c7840ddd47 obs: Adding new linkedbuild mode
All checks were successful
go-generate-check / go-generate-check (push) Successful in 13s
Integration tests / t (push) Successful in 8m8s
2026-03-03 19:58:08 +01:00
3f110ce5f6 Always handle build results as building when dirty flag is set 2026-03-03 19:57:03 +01:00
3a2c87b4af pr: adjusts test expectations.
All checks were successful
go-generate-check / go-generate-check (push) Successful in 14s
Integration tests / t (push) Successful in 8m32s
2026-03-03 18:11:09 +01:00
d0056ed461 tests: fix tests 2026-03-03 18:03:20 +01:00
e5e1b5d9a5 Merge remote-tracking branch 'gitea/t-mergemodes'
Some checks failed
go-generate-check / go-generate-check (push) Successful in 8s
Integration tests / t (push) Failing after 17m21s
2026-03-03 17:29:07 +01:00
Andrii Nikitin
96a908d0be t: Simplify create_gitea_pr using new_branch param
All checks were successful
go-generate-check / go-generate-check (pull_request) Successful in 8s
Integration tests / t (pull_request) Successful in 8m31s
Refactor create_gitea_pr() in common_test_utils.py to leverage the
'new_branch' parameter in the Gitea 'diffpatch' API call. This allows
the target branch to be created automatically while applying the diff,
eliminating the need for explicit branch creation via the branches
endpoint. This also fixes strange behavior where diffpatch damaged the
history of a pre-created branch.
2026-03-03 13:03:32 +01:00
Andrii Nikitin
6aaff89179 t: test MergeMode of workflow-pr
- Add TC-MERGE-008 to 013 for testing MergeMode of workflow-pr
- Synchronize integration/test-plan.md with the actual test implementations
2026-03-03 13:03:00 +01:00
7ec663db27 rabbitmq: dial with timeout
Some checks failed
go-generate-check / go-generate-check (pull_request) Successful in 22s
Integration tests / t (pull_request) Failing after 5m50s
go-generate-check / go-generate-check (push) Successful in 14s
Integration tests / t (push) Successful in 4m26s
Hardcode a 10-second timeout when no connection can be established,
instead of waiting forever.
2026-03-03 09:28:44 +01:00
3b83ba96e4 pr: fix spam
Some checks failed
go-generate-check / go-generate-check (push) Successful in 12s
Integration tests / t (push) Failing after 7m46s
2026-03-02 16:55:57 +01:00
3a0445e857 rabbitmq: dial with timeout
Hardcoded 10 second timeout on no connection instead of waiting
forever.
2026-03-02 11:56:35 +01:00
ef5db8ca28 pr: No need to try to merge changes
We can reset current worktree and clobber it with the merged changes.
We want to emulate the `git merge -s theirs` strategy, while
`git merge -Xtheirs` only picks `theirs` in case of conflicts.
So resetting the changes and reading the exact tree is sufficient.

`git read-tree -u` updates the current work tree so we do not have
unstaged changes.
2026-03-02 11:04:22 +01:00
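The commit message above describes taking the other side's tree wholesale rather than merging. A self-contained sketch of that `read-tree` technique, driving git via os/exec (assumes `git` >= 2.28 on PATH; repo layout, file names, and the commit sequence are illustrative, not the bot's actual code):

```go
package main

import (
	"fmt"
	"os"
	"os/exec"
	"path/filepath"
)

// run executes git in dir, panicking on failure (sketch only).
func run(dir string, args ...string) {
	cmd := exec.Command("git", args...)
	cmd.Dir = dir
	cmd.Env = append(os.Environ(),
		"GIT_AUTHOR_NAME=t", "GIT_AUTHOR_EMAIL=t@example.com",
		"GIT_COMMITTER_NAME=t", "GIT_COMMITTER_EMAIL=t@example.com")
	if out, err := cmd.CombinedOutput(); err != nil {
		panic(fmt.Sprintf("git %v: %v\n%s", args, err, out))
	}
}

func write(dir, content string) {
	if err := os.WriteFile(filepath.Join(dir, "f"), []byte(content), 0644); err != nil {
		panic(err)
	}
}

// demo builds a repo where "main" and "theirs" diverge, takes the
// "theirs" tree wholesale, and returns the resulting content of "f".
func demo() string {
	dir, err := os.MkdirTemp("", "theirs-demo")
	if err != nil {
		panic(err)
	}
	defer os.RemoveAll(dir)

	run(dir, "init", "-q", "-b", "main")
	write(dir, "base\n")
	run(dir, "add", "f")
	run(dir, "commit", "-q", "-m", "base")
	run(dir, "checkout", "-q", "-b", "theirs")
	write(dir, "theirs\n")
	run(dir, "commit", "-q", "-am", "theirs change")
	run(dir, "checkout", "-q", "main")
	write(dir, "ours\n")
	run(dir, "commit", "-q", "-am", "ours change")

	// emulate `git merge -s theirs`: reset index and worktree to the
	// exact tree of the other side, then commit the result
	run(dir, "read-tree", "-u", "--reset", "theirs")
	run(dir, "commit", "-q", "-m", "take theirs wholesale")

	data, err := os.ReadFile(filepath.Join(dir, "f"))
	if err != nil {
		panic(err)
	}
	return string(data)
}

func main() {
	fmt.Print(demo())
}
```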
10f74f681d pr: add function that checks and prepares PRs 2026-03-02 11:04:22 +01:00
b514f9784c pr: add merge modes documentation and config parsing 2026-03-02 11:04:22 +01:00
18435a8820 pr: unit tests for remote PrjGit changes
All checks were successful
go-generate-check / go-generate-check (push) Successful in 26s
Integration tests / t (push) Successful in 5m8s
Add unit tests covering when remote ProjectGit changes are allowed
and when they are not.
2026-02-27 16:30:30 +01:00
5f646f4520 Disable temporary comment adding in case of lacking permissions 2026-02-27 16:28:21 +01:00
88aa8c32fd Merge commit 'refs/pull/149/head' of src.opensuse.org:git-workflow/autogits
Some checks failed
go-generate-check / go-generate-check (push) Successful in 11s
Integration tests / t (push) Failing after 7m32s
2026-02-27 15:51:34 +01:00
99d27a48ff Support remote source in pull requests
This requires write permission by maintainer there
2026-02-27 15:50:31 +01:00
Andrii Nikitin
7832ef90c0 t: rename Gitea objects to avoid accidental collision with real repos
All checks were successful
go-generate-check / go-generate-check (pull_request) Successful in 8s
Integration tests / t (pull_request) Successful in 4m38s
Also address potential race condition between requested reviews and sending approvals in TC-MERGE-002
2026-02-27 13:22:26 +01:00
Andrii Nikitin
4691747038 t: add manual merge test and improve robustness
All checks were successful
go-generate-check / go-generate-check (pull_request) Successful in 26s
Integration tests / t (pull_request) Successful in 4m35s
- Add TC-MERGE-002: new end-to-end test for ManualMergeOnly functionality.
- Implement global object tracking in conftest.py to prevent redundant setup.
- Update test-plan.md to reflect current test implementation and skip statuses.
- Improve review tests robustness by using unique filenames and better assertions.
- Configure SLFO staging-main and manual-merge branches for monitored tests.
- Move flaky NoProjectGitPR tests from xfail to skip.
2026-02-27 02:30:47 +01:00
af2ff0bdd2 common: check for old pending request reviews
Timeline events will contain Reviews and ReviewRequests and
ReviewDismissed events. We need to handle this at event parsing
time and not to punt this to the query functions later on.

If the last event is an actual review, we use it.
If there is no review, check whether the last event associated with
the reviewer is Dismissed or Requested Review, but not if a dismissed
review precedes it.
2026-02-27 01:03:26 +01:00
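The timeline rule above can be sketched as a walk from the newest event backwards. The event kinds and the `pendingReview` helper are illustrative simplifications, not the actual Gitea API types or the bot's exact logic:

```go
package main

import "fmt"

// Event is a simplified timeline entry (names illustrative).
type Event struct {
	Kind     string // "review", "review_request", "review_dismissed"
	Reviewer string
}

// pendingReview walks the timeline newest-first for one reviewer:
// an actual review wins; otherwise a requested review counts as
// pending, and a dismissal clears it.
func pendingReview(timeline []Event, reviewer string) bool {
	for i := len(timeline) - 1; i >= 0; i-- {
		e := timeline[i]
		if e.Reviewer != reviewer {
			continue
		}
		switch e.Kind {
		case "review":
			return false // an actual review exists, use it
		case "review_request":
			return true // still waiting on this reviewer
		case "review_dismissed":
			return false // dismissed, nothing pending
		}
	}
	return false
}

func main() {
	tl := []Event{
		{"review_request", "alice"},
		{"review", "alice"},
		{"review_request", "bob"},
	}
	fmt.Println(pendingReview(tl, "alice"), pendingReview(tl, "bob"))
}
```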
5669083388 pr: handle case of nil user in reviews
All checks were successful
go-generate-check / go-generate-check (push) Successful in 21s
Integration tests / t (push) Successful in 6m29s
This can happen when a review request is assigned automatically via
CODEOWNERS, or when the requesting user's account has been removed.
2026-02-26 13:15:58 +01:00
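The fix amounts to a nil guard before touching the review's user. A minimal sketch — the `Review`/`User` shapes here only mirror the Gitea models enough to illustrate the guard:

```go
package main

import "fmt"

// User and Review mirror just enough of the Gitea review model for the
// sketch; User can be nil (CODEOWNERS assignment or a deleted account).
type User struct{ Login string }

type Review struct {
	User  *User
	State string
}

// reviewerStates collects the latest state per reviewer, skipping
// nil users instead of dereferencing them.
func reviewerStates(reviews []Review) map[string]string {
	states := make(map[string]string)
	for _, r := range reviews {
		if r.User == nil {
			continue // no associated account; nothing to attribute
		}
		states[r.User.Login] = r.State
	}
	return states
}

func main() {
	reviews := []Review{
		{User: &User{"alice"}, State: "APPROVED"},
		{User: nil, State: "REQUEST_REVIEW"}, // would panic without the guard
	}
	fmt.Println(reviewerStates(reviews))
}
```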
Andrii Nikitin
cb9131a5dd t: add init process to all services
Some checks failed
Integration tests / t (pull_request) Successful in 7m0s
Integration tests / t (push) Failing after 6m58s
Enable init: true for all services in podman-compose.yml to ensure
proper signal handling and zombie process reaping within containers.
2026-02-26 12:17:29 +01:00
582df2555b Merge branch 'staging-updates'
Some checks failed
go-generate-check / go-generate-check (push) Successful in 29s
Integration tests / t (push) Has been cancelled
2026-02-25 14:48:56 +01:00
91d22f7eea staging: add tests for idempotency and label changes
All checks were successful
go-generate-check / go-generate-check (pull_request) Successful in 33s
We do not want duplicate comments. And if we do have label changes,
new comments should be added.
2026-02-24 18:22:08 +01:00
913b8c8a4b staging: Match previous message format
All checks were successful
go-generate-check / go-generate-check (pull_request) Successful in 28s
Match the older message format. That is,

    Build is started in https://host/project/show/SUSE:SLFO:2.2:PullRequest:2162 .

    Additional QA builds:
    https://host/project/show/SUSE:SLFO:2.2:PullRequest:2162:SLES
    https://host/project/show/SUSE:SLFO:2.2:PullRequest:2162:SL-Micro

Add unit test to verify this exact format.
2026-02-24 12:23:35 +01:00
e1825dc658 staging: CommentPROnce everywhere
All checks were successful
go-generate-check / go-generate-check (pull_request) Successful in 32s
This replaces last usage of gitea.AddComment() where we do not
check if the comment already exists.
2026-02-23 19:16:40 +01:00
59965e7b5c staging: comment once on PRs using timeline
We need to comment once on PRs and verify via the issue timeline
that only one comment is present.

Furthermore, staging and secondary QA links should be present
in a single comment as tooling already expects this format.
2026-02-23 19:05:44 +01:00
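The comment-once behavior described above reduces to a scan of the existing timeline before posting. A sketch under the assumption that the timeline is already fetched as plain strings (the real `CommentPROnce` works against the Gitea issue timeline API):

```go
package main

import "fmt"

// commentOnce posts msg only if it is not already present among the
// PR's timeline comments, preventing duplicate bot comments.
func commentOnce(timeline []string, msg string, post func(string)) bool {
	for _, c := range timeline {
		if c == msg {
			return false // already commented, avoid spam
		}
	}
	post(msg)
	return true
}

func main() {
	timeline := []string{}
	post := func(m string) { timeline = append(timeline, m) }

	msg := "Build is started in https://host/project/show/X:PullRequest:1 ."
	fmt.Println(commentOnce(timeline, msg, post)) // posted
	fmt.Println(commentOnce(timeline, msg, post)) // duplicate, skipped
}
```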
24a4a592a7 staging: add PollWorkNotifications coverage 2026-02-23 16:01:47 +01:00
d3d9d66797 staging: add tests on commentOnPackagePR 2026-02-23 15:48:38 +01:00
7a2f7a6ee7 staging: test default projectgit repo 2026-02-23 15:44:45 +01:00
34a3a4795b staging: increase coverage of PullRequest processing 2026-02-23 15:39:02 +01:00
bb5daebdfa staging: return correct error
Don't clobber our error before returning it
2026-02-23 15:37:41 +01:00
70bba5e239 staging: improve CreateQASubProject unit coverage 2026-02-23 15:11:30 +01:00
5793391586 staging: add core logic unit tests 2026-02-23 15:05:51 +01:00
d923db3f87 staging: tests for Notification and Review handling 2026-02-23 14:47:51 +01:00
fc4547f9a9 tests: sanitize check 2026-02-23 14:44:17 +01:00
6fa57fc4d4 staging: Fix logic error
We need to report only once all builds are finished, not partial
results. Partial results are not yet final, so until then we can only
report that the build is still in progress.

Add unit tests to cover these scenarios
2026-02-23 14:33:51 +01:00
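This commit, together with "Always handle build results as building when dirty flag is set" above, aggregates per-repository results so nothing is reported while any result is still settling. A sketch of that aggregation rule (the `RepoResult` shape and state names are simplified stand-ins for the OBS result codes):

```go
package main

import "fmt"

// RepoResult is a simplified build result; Dirty mirrors the OBS dirty
// flag, meaning the scheduler has not settled yet.
type RepoResult struct {
	Code  string // "succeeded", "failed", "building", ...
	Dirty bool
}

// overallState reports "building" while any repo is dirty or still
// building, so partial results are never published as final.
func overallState(results []RepoResult) string {
	for _, r := range results {
		if r.Dirty || r.Code == "building" {
			return "building"
		}
	}
	for _, r := range results {
		if r.Code != "succeeded" {
			return "failed"
		}
	}
	return "succeeded"
}

func main() {
	partial := []RepoResult{{Code: "succeeded"}, {Code: "succeeded", Dirty: true}}
	done := []RepoResult{{Code: "succeeded"}, {Code: "succeeded"}}
	fmt.Println(overallState(partial), overallState(done))
}
```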
82d4e2ed5d staging: mock interface setup 2026-02-23 14:17:53 +01:00
8920644792 staging: Use interfaces allowing dependency injection
This includes also a few formatting changes
2026-02-23 14:10:10 +01:00
06772ca662 common: Add ObsClientInterface
This allows for dependency injection for future unit tests.
2026-02-23 13:43:23 +01:00
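The two commits above introduce interfaces so tests can inject doubles instead of a live OBS connection. A minimal sketch of that dependency-injection shape — `ObsClient` here is a hypothetical one-method slice of the surface, not the repo's actual `ObsClientInterface`, and the repo generates its mocks with gomock rather than writing stubs by hand:

```go
package main

import "fmt"

// ObsClient is a hypothetical slice of the OBS client surface.
type ObsClient interface {
	GetProjectMeta(project string) (string, error)
}

// stubObs is a hand-written test double for the sketch.
type stubObs struct{ meta map[string]string }

func (s *stubObs) GetProjectMeta(p string) (string, error) {
	m, ok := s.meta[p]
	if !ok {
		return "", fmt.Errorf("no such project: %s", p)
	}
	return m, nil
}

// describeProject depends only on the interface, so unit tests can
// inject stubObs instead of talking to a real OBS instance.
func describeProject(c ObsClient, project string) string {
	meta, err := c.GetProjectMeta(project)
	if err != nil {
		return "unavailable"
	}
	return meta
}

func main() {
	c := &stubObs{meta: map[string]string{"openSUSE:Factory": "<project/>"}}
	fmt.Println(describeProject(c, "openSUSE:Factory"))
	fmt.Println(describeProject(c, "missing"))
}
```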
35 changed files with 3830 additions and 647 deletions

View File

@@ -39,6 +39,10 @@ const (
 	Permission_ForceMerge = "force-merge"
 	Permission_Group      = "release-engineering"
+
+	MergeModeFF      = "ff-only"
+	MergeModeReplace = "replace"
+	MergeModeDevel   = "devel"
 )
 
 type ConfigFile struct {
@@ -52,9 +56,9 @@ type ReviewGroup struct {
 }
 
 type QAConfig struct {
 	Name              string
 	Origin            string
 	Label             string   // requires this gitea lable to be set or skipped
 	BuildDisableRepos []string // which repos to build disable in the new project
 }
@@ -89,7 +93,8 @@ type AutogitConfig struct {
 	Committers []string          // group in addition to Reviewers and Maintainers that can order the bot around, mostly as helper for factory-maintainers
 	Subdirs    []string          // list of directories to sort submodules into. Needed b/c _manifest cannot list non-existent directories
 	Labels     map[string]string // list of tags, if not default, to apply
+	MergeMode  string            // project merge mode
 
 	NoProjectGitPR  bool // do not automatically create project git PRs, just assign reviewers and assume somethign else creates the ProjectGit PR
 	ManualMergeOnly bool // only merge with "Merge OK" comment by Project Maintainers and/or Package Maintainers and/or reviewers
@@ -184,6 +189,17 @@ func ReadWorkflowConfig(gitea GiteaFileContentAndRepoFetcher, git_project string
 		}
 	}
 	config.GitProjectName = config.GitProjectName + "#" + branch
 
+	// verify merge modes
+	switch config.MergeMode {
+	case MergeModeFF, MergeModeDevel, MergeModeReplace:
+		break // good results
+	case "":
+		config.MergeMode = MergeModeFF
+	default:
+		return nil, fmt.Errorf("Unsupported merge mode in %s: %s", git_project, config.MergeMode)
+	}
+
 	return config, nil
 }

View File

@@ -4,6 +4,7 @@ import (
 	"slices"
 	"testing"
 
+	"go.uber.org/mock/gomock"
 	"src.opensuse.org/autogits/common"
 	"src.opensuse.org/autogits/common/gitea-generated/models"
 	mock_common "src.opensuse.org/autogits/common/mock"
@@ -341,3 +342,67 @@ func TestConfigPermissions(t *testing.T) {
 		})
 	}
 }
+
+func TestConfigMergeModeParser(t *testing.T) {
+	tests := []struct {
+		name      string
+		json      string
+		mergeMode string
+		wantErr   bool
+	}{
+		{
+			name:      "empty",
+			json:      "{}",
+			mergeMode: common.MergeModeFF,
+		},
+		{
+			name:      "ff-only",
+			json:      `{"MergeMode": "ff-only"}`,
+			mergeMode: common.MergeModeFF,
+		},
+		{
+			name:      "replace",
+			json:      `{"MergeMode": "replace"}`,
+			mergeMode: common.MergeModeReplace,
+		},
+		{
+			name:      "devel",
+			json:      `{"MergeMode": "devel"}`,
+			mergeMode: common.MergeModeDevel,
+		},
+		{
+			name:    "unsupported",
+			json:    `{"MergeMode": "invalid"}`,
+			wantErr: true,
+		},
+	}
+
+	for _, test := range tests {
+		t.Run(test.name, func(t *testing.T) {
+			repo := models.Repository{
+				DefaultBranch: "master",
+			}
+			ctl := gomock.NewController(t)
+			gitea := mock_common.NewMockGiteaFileContentAndRepoFetcher(ctl)
+			gitea.EXPECT().GetRepositoryFileContent("foo", "bar", "", "workflow.config").Return([]byte(test.json), "abc", nil)
+			gitea.EXPECT().GetRepository("foo", "bar").Return(&repo, nil)
+
+			config, err := common.ReadWorkflowConfig(gitea, "foo/bar")
+			if test.wantErr {
+				if err == nil {
+					t.Fatal("Expected error, got nil")
+				}
+				return
+			}
+			if err != nil {
+				t.Fatal(err)
+			}
+			if config.MergeMode != test.mergeMode {
+				t.Errorf("Expected MergeMode %s, got %s", test.mergeMode, config.MergeMode)
+			}
+		})
+	}
+}

View File

@@ -288,7 +288,7 @@ func (e *GitHandlerImpl) GitClone(repo, branch, remoteUrl string) (string, error
 func (e *GitHandlerImpl) GitBranchHead(gitDir, branchName string) (string, error) {
 	id, err := e.GitExecWithOutput(gitDir, "show-ref", "--heads", "--hash", branchName)
 	if err != nil {
-		return "", fmt.Errorf("Can't find default branch: %s", branchName)
+		return "", fmt.Errorf("Can't find default branch: %s in %s", branchName, gitDir)
 	}
 
 	id = strings.TrimSpace(SplitLines(id)[0])
@@ -302,7 +302,7 @@ func (e *GitHandlerImpl) GitBranchHead(gitDir, branchName string) (string, error
 func (e *GitHandlerImpl) GitRemoteHead(gitDir, remote, branchName string) (string, error) {
 	id, err := e.GitExecWithOutput(gitDir, "show-ref", "--hash", "--verify", "refs/remotes/"+remote+"/"+branchName)
 	if err != nil {
-		return "", fmt.Errorf("Can't find default branch: %s", branchName)
+		return "", fmt.Errorf("Can't find default branch: %s in %s", branchName, gitDir)
 	}
 
 	return strings.TrimSpace(id), nil
@@ -396,12 +396,17 @@ func (e *GitHandlerImpl) GitExecQuietOrPanic(cwd string, params ...string) {
 }
 
 type ChanIO struct {
 	ch   chan byte
+	done chan struct{}
 }
 
 func (c *ChanIO) Write(p []byte) (int, error) {
 	for _, b := range p {
-		c.ch <- b
+		select {
+		case c.ch <- b:
+		case <-c.done:
+			return 0, io.EOF
+		}
 	}
 	return len(p), nil
 }
@@ -410,21 +415,32 @@ func (c *ChanIO) Write(p []byte) (int, error) {
 func (c *ChanIO) Read(data []byte) (idx int, err error) {
 	var ok bool
 
-	data[idx], ok = <-c.ch
-	if !ok {
-		err = io.EOF
-		return
-	}
-	idx++
-	for len(c.ch) > 0 && idx < len(data) {
-		data[idx], ok = <-c.ch
+	select {
+	case data[idx], ok = <-c.ch:
 		if !ok {
 			err = io.EOF
 			return
 		}
 		idx++
+	case <-c.done:
+		err = io.EOF
+		return
+	}
+	for len(c.ch) > 0 && idx < len(data) {
+		select {
+		case data[idx], ok = <-c.ch:
+			if !ok {
+				err = io.EOF
+				return
+			}
+			idx++
+		case <-c.done:
+			err = io.EOF
+			return
+		default:
+			return
+		}
 	}
 	return
@@ -471,7 +487,14 @@ func parseGitMsg(data <-chan byte) (GitMsg, error) {
 	var size int
 	pos := 0
-	for c := <-data; c != ' '; c = <-data {
+	for {
+		c, ok := <-data
+		if !ok {
+			return GitMsg{}, io.EOF
+		}
+		if c == ' ' {
+			break
+		}
 		if (c >= '0' && c <= '9') || (c >= 'a' && c <= 'f') {
 			id[pos] = c
 			pos++
@@ -483,7 +506,15 @@ func parseGitMsg(data <-chan byte) (GitMsg, error) {
 	pos = 0
 	var c byte
-	for c = <-data; c != ' ' && c != '\x00'; c = <-data {
+	for {
+		var ok bool
+		c, ok = <-data
+		if !ok {
+			return GitMsg{}, io.EOF
+		}
+		if c == ' ' || c == '\x00' {
+			break
+		}
 		if c >= 'a' && c <= 'z' {
 			msgType[pos] = c
 			pos++
@@ -509,7 +540,14 @@ func parseGitMsg(data <-chan byte) (GitMsg, error) {
 		return GitMsg{}, fmt.Errorf("Invalid object type: '%s'", string(msgType))
 	}
-	for c = <-data; c != '\000'; c = <-data {
+	for {
+		c, ok := <-data
+		if !ok {
+			return GitMsg{}, io.EOF
+		}
+		if c == '\x00' {
+			break
+		}
 		if c >= '0' && c <= '9' {
 			size = size*10 + (int(c) - '0')
 		} else {
@@ -528,18 +566,37 @@ func parseGitCommitHdr(oldHdr [2]string, data <-chan byte) ([2]string, int, erro
 	hdr := make([]byte, 0, 60)
 	val := make([]byte, 0, 1000)
-	c := <-data
+	c, ok := <-data
+	if !ok {
+		return [2]string{}, 0, io.EOF
+	}
 	size := 1
 	if c != '\n' { // end of header marker
-		for ; c != ' '; c = <-data {
+		for {
+			if c == ' ' {
+				break
+			}
 			hdr = append(hdr, c)
 			size++
+			var ok bool
+			c, ok = <-data
+			if !ok {
+				return [2]string{}, size, io.EOF
+			}
 		}
 		if size == 1 { // continuation header here
 			hdr = []byte(oldHdr[0])
 			val = append([]byte(oldHdr[1]), '\n')
 		}
-		for c := <-data; c != '\n'; c = <-data {
+		for {
+			var ok bool
+			c, ok = <-data
+			if !ok {
+				return [2]string{}, size, io.EOF
+			}
+			if c == '\n' {
+				break
+			}
 			val = append(val, c)
 			size++
 		}
@@ -552,7 +609,14 @@ func parseGitCommitHdr(oldHdr [2]string, data <-chan byte) ([2]string, int, erro
 func parseGitCommitMsg(data <-chan byte, l int) (string, error) {
 	msg := make([]byte, 0, l)
-	for c := <-data; c != '\x00'; c = <-data {
+	for {
+		c, ok := <-data
+		if !ok {
+			return string(msg), io.EOF
+		}
+		if c == '\x00' {
+			break
+		}
 		msg = append(msg, c)
 		l--
 	}
@@ -578,7 +642,7 @@ func parseGitCommit(data <-chan byte) (GitCommit, error) {
 		var hdr [2]string
 		hdr, size, err := parseGitCommitHdr(hdr, data)
 		if err != nil {
-			return GitCommit{}, nil
+			return GitCommit{}, err
 		}
 		l -= size
@@ -599,14 +663,28 @@ func parseGitCommit(data <-chan byte) (GitCommit, error) {
 func parseTreeEntry(data <-chan byte, hashLen int) (GitTreeEntry, error) {
 	var e GitTreeEntry
-	for c := <-data; c != ' '; c = <-data {
+	for {
+		c, ok := <-data
+		if !ok {
+			return e, io.EOF
+		}
+		if c == ' ' {
+			break
+		}
 		e.mode = e.mode*8 + int(c-'0')
 		e.size++
 	}
 	e.size++
 
 	name := make([]byte, 0, 128)
-	for c := <-data; c != '\x00'; c = <-data {
+	for {
+		c, ok := <-data
+		if !ok {
+			return e, io.EOF
+		}
+		if c == '\x00' {
+			break
+		}
 		name = append(name, c)
 		e.size++
 	}
@@ -617,7 +695,10 @@ func parseTreeEntry(data <-chan byte, hashLen int) (GitTreeEntry, error) {
 	hash := make([]byte, 0, hashLen*2)
 	for range hashLen {
-		c := <-data
+		c, ok := <-data
+		if !ok {
+			return e, io.EOF
+		}
 		hash = append(hash, hexBinToAscii[((c&0xF0)>>4)], hexBinToAscii[c&0xF])
 	}
 	e.hash = string(hash)
@@ -638,13 +719,16 @@ func parseGitTree(data <-chan byte) (GitTree, error) {
 	for parsedLen < hdr.size {
 		entry, err := parseTreeEntry(data, len(hdr.hash)/2)
 		if err != nil {
-			return GitTree{}, nil
+			return GitTree{}, err
 		}
 		t.items = append(t.items, entry)
 		parsedLen += entry.size
 	}
-	c := <-data // \0 read
+	c, ok := <-data // \0 read
+	if !ok {
+		return t, io.EOF
+	}
 	if c != '\x00' {
 		return t, fmt.Errorf("Unexpected character during git tree data read")
@@ -665,9 +749,16 @@ func parseGitBlob(data <-chan byte) ([]byte, error) {
 	d := make([]byte, hdr.size)
 	for l := 0; l < hdr.size; l++ {
-		d[l] = <-data
+		var ok bool
+		d[l], ok = <-data
+		if !ok {
+			return d, io.EOF
+		}
+	}
+	eob, ok := <-data
+	if !ok {
+		return d, io.EOF
 	}
-	eob := <-data
 	if eob != '\x00' {
 		return d, fmt.Errorf("invalid byte read in parseGitBlob")
 	}
@@ -679,16 +770,25 @@ func (e *GitHandlerImpl) GitParseCommits(cwd string, commitIDs []string) (parsed
 	var done sync.Mutex
 	done.Lock()
-	data_in, data_out := ChanIO{make(chan byte)}, ChanIO{make(chan byte)}
+	done_signal := make(chan struct{})
+	var once sync.Once
+	close_done := func() {
+		once.Do(func() {
+			close(done_signal)
+		})
+	}
+	data_in, data_out := ChanIO{make(chan byte), done_signal}, ChanIO{make(chan byte), done_signal}
 	parsedCommits = make([]GitCommit, 0, len(commitIDs))
 
 	go func() {
 		defer done.Unlock()
+		defer close_done()
 		defer close(data_out.ch)
 
 		for _, id := range commitIDs {
 			data_out.Write([]byte(id))
-			data_out.ch <- '\x00'
+			data_out.Write([]byte{0})
 			c, e := parseGitCommit(data_in.ch)
 			if e != nil {
 				err = fmt.Errorf("Error parsing git commit: %w", e)
@@ -715,12 +815,14 @@ func (e *GitHandlerImpl) GitParseCommits(cwd string, commitIDs []string) (parsed
 	LogDebug("command run:", cmd.Args)
 	if e := cmd.Run(); e != nil {
 		LogError(e)
+		close_done()
 		close(data_in.ch)
-		close(data_out.ch)
 		return nil, e
 	}
 	done.Lock()
+	close_done()
+	close(data_in.ch)
 	return
 }
@@ -729,15 +831,21 @@ func (e *GitHandlerImpl) GitCatFile(cwd, commitId, filename string) (data []byte
 	var done sync.Mutex
 	done.Lock()
-	data_in, data_out := ChanIO{make(chan byte)}, ChanIO{make(chan byte)}
+	done_signal := make(chan struct{})
+	var once sync.Once
+	close_done := func() {
+		once.Do(func() {
+			close(done_signal)
+		})
+	}
+	data_in, data_out := ChanIO{make(chan byte), done_signal}, ChanIO{make(chan byte), done_signal}
 
 	go func() {
 		defer done.Unlock()
+		defer close_done()
 		defer close(data_out.ch)
 
 		data_out.Write([]byte(commitId))
-		data_out.ch <- '\x00'
+		data_out.Write([]byte{0})
 		var c GitCommit
 		c, err = parseGitCommit(data_in.ch)
 		if err != nil {
@@ -745,11 +853,9 @@ func (e *GitHandlerImpl) GitCatFile(cwd, commitId, filename string) (data []byte
 			return
 		}
 		data_out.Write([]byte(c.Tree))
-		data_out.ch <- '\x00'
+		data_out.Write([]byte{0})
 		var tree GitTree
 		tree, err = parseGitTree(data_in.ch)
 		if err != nil {
 			LogError("Error parsing git tree:", err)
 			return
@@ -759,7 +865,7 @@ func (e *GitHandlerImpl) GitCatFile(cwd, commitId, filename string) (data []byte
 		if te.isBlob() && te.name == filename {
 			LogInfo("blob", te.hash)
 			data_out.Write([]byte(te.hash))
-			data_out.ch <- '\x00'
+			data_out.Write([]byte{0})
 			data, err = parseGitBlob(data_in.ch)
 			return
 		}
@@ -784,11 +890,13 @@ func (e *GitHandlerImpl) GitCatFile(cwd, commitId, filename string) (data []byte
 	LogDebug("command run:", cmd.Args)
 	if e := cmd.Run(); e != nil {
 		LogError(e)
+		close_done()
 		close(data_in.ch)
-		close(data_out.ch)
 		return nil, e
 	}
 	done.Lock()
+	close_done()
+	close(data_in.ch)
 	return
 }
@@ -798,16 +906,24 @@ func (e *GitHandlerImpl) GitDirectoryList(gitPath, commitId string) (directoryLi
 	directoryList = make(map[string]string)
 	done.Lock()
-	data_in, data_out := ChanIO{make(chan byte)}, ChanIO{make(chan byte)}
+	done_signal := make(chan struct{})
+	var once sync.Once
+	close_done := func() {
+		once.Do(func() {
+			close(done_signal)
+		})
+	}
+	data_in, data_out := ChanIO{make(chan byte), done_signal}, ChanIO{make(chan byte), done_signal}
 
 	LogDebug("Getting directory for:", commitId)
 
 	go func() {
 		defer done.Unlock()
+		defer close_done()
 		defer close(data_out.ch)
 
 		data_out.Write([]byte(commitId))
-		data_out.ch <- '\x00'
+		data_out.Write([]byte{0})
 		var c GitCommit
 		c, err = parseGitCommit(data_in.ch)
 		if err != nil {
@@ -823,7 +939,7 @@ func (e *GitHandlerImpl) GitDirectoryList(gitPath, commitId string) (directoryLi
 			delete(trees, p)
 
 			data_out.Write([]byte(tree))
-			data_out.ch <- '\x00'
+			data_out.Write([]byte{0})
 
 			var tree GitTree
 			tree, err = parseGitTree(data_in.ch)
@@ -857,12 +973,14 @@ func (e *GitHandlerImpl) GitDirectoryList(gitPath, commitId string) (directoryLi
 	LogDebug("command run:", cmd.Args)
 	if e := cmd.Run(); e != nil {
 		LogError(e)
+		close_done()
 		close(data_in.ch)
-		close(data_out.ch)
 		return directoryList, e
 	}
 	done.Lock()
+	close_done()
+	close(data_in.ch)
 	return directoryList, err
 }
@@ -872,7 +990,14 @@ func (e *GitHandlerImpl) GitDirectoryContentList(gitPath, commitId string) (dire
 	directoryList = make(map[string]string)
 	done.Lock()
-	data_in, data_out := ChanIO{make(chan byte)}, ChanIO{make(chan byte)}
+	done_signal := make(chan struct{})
+	var once sync.Once
+	close_done := func() {
+		once.Do(func() {
+			close(done_signal)
+		})
+	}
+	data_in, data_out := ChanIO{make(chan byte), done_signal}, ChanIO{make(chan byte), done_signal}
 
 	LogDebug("Getting directory content for:", commitId)
@@ -881,7 +1006,7 @@ func (e *GitHandlerImpl) GitDirectoryContentList(gitPath, commitId string) (dire
 		defer close(data_out.ch)
 
 		data_out.Write([]byte(commitId))
-		data_out.ch <- '\x00'
+		data_out.Write([]byte{0})
 		var c GitCommit
 		c, err = parseGitCommit(data_in.ch)
 		if err != nil {
@@ -897,7 +1022,7 @@ func (e *GitHandlerImpl) GitDirectoryContentList(gitPath, commitId string) (dire
 			delete(trees, p)
 
 			data_out.Write([]byte(tree))
-			data_out.ch <- '\x00'
+			data_out.Write([]byte{0})
 
 			var tree GitTree
 			tree, err = parseGitTree(data_in.ch)
@@ -933,12 +1058,14 @@ func (e *GitHandlerImpl) GitDirectoryContentList(gitPath, commitId string) (dire
 	LogDebug("command run:", cmd.Args)
 	if e := cmd.Run(); e != nil {
 		LogError(e)
+		close_done()
 		close(data_in.ch)
-		close(data_out.ch)
 		return directoryList, e
 	}
 	done.Lock()
+	close_done()
+	close(data_in.ch)
 	return directoryList, err
 }
@@ -948,16 +1075,24 @@ func (e *GitHandlerImpl) GitSubmoduleList(gitPath, commitId string) (submoduleLi
 	submoduleList = make(map[string]string)
 	done.Lock()
-	data_in, data_out := ChanIO{make(chan byte)}, ChanIO{make(chan byte)}
+	done_signal := make(chan struct{})
+	var once sync.Once
+	close_done := func() {
+		once.Do(func() {
+			close(done_signal)
+		})
+	}
+	data_in, data_out := ChanIO{make(chan byte), done_signal}, ChanIO{make(chan byte), done_signal}
 
 	LogDebug("Getting submodules for:", commitId)
 
 	go func() {
 		defer done.Unlock()
+		defer close_done()
 		defer close(data_out.ch)
 
 		data_out.Write([]byte(commitId))
-		data_out.ch <- '\x00'
+		data_out.Write([]byte{0})
 		var c GitCommit
 		c, err = parseGitCommit(data_in.ch)
 		if err != nil {
@@ -973,7 +1108,7 @@ func (e *GitHandlerImpl) GitSubmoduleList(gitPath, commitId string) (submoduleLi
 			delete(trees, p)
 
 			data_out.Write([]byte(tree))
-			data_out.ch <- '\x00'
+			data_out.Write([]byte{0})
 
 			var tree GitTree
 			tree, err = parseGitTree(data_in.ch)
@@ -1010,17 +1145,26 @@ func (e *GitHandlerImpl) GitSubmoduleList(gitPath, commitId string) (submoduleLi
 	LogDebug("command run:", cmd.Args)
 	if e := cmd.Run(); e != nil {
 		LogError(e)
+		close_done()
 		close(data_in.ch)
-		close(data_out.ch)
 		return submoduleList, e
 	}
 	done.Lock()
+	close_done()
+	close(data_in.ch)
 	return submoduleList, err
 }
 
 func (e *GitHandlerImpl) GitSubmoduleCommitId(cwd, packageName, commitId string) (subCommitId string, valid bool) {
-	data_in, data_out := ChanIO{make(chan byte)}, ChanIO{make(chan byte)}
+	done_signal := make(chan struct{})
+	var once sync.Once
+	close_done := func() {
+		once.Do(func() {
+			close(done_signal)
+		})
+	}
+	data_in, data_out := ChanIO{make(chan byte), done_signal}, ChanIO{make(chan byte), done_signal}
 	var wg sync.WaitGroup
 	wg.Add(1)
@@ -1036,17 +1180,18 @@ func (e *GitHandlerImpl) GitSubmoduleCommitId(cwd, packageName, commitId string)
 		}()
 		defer wg.Done()
+		defer close_done()
 		defer close(data_out.ch)
 
 		data_out.Write([]byte(commitId))
-		data_out.ch <- '\x00'
+		data_out.Write([]byte{0})
 		c, err := parseGitCommit(data_in.ch)
 		if err != nil {
 			LogError("Error parsing git commit:", err)
 			panic(err)
 		}
 		data_out.Write([]byte(c.Tree))
-		data_out.ch <- '\x00'
+		data_out.Write([]byte{0})
 
 		tree, err := parseGitTree(data_in.ch)
 		if err != nil {
@@ -1078,12 +1223,14 @@ func (e *GitHandlerImpl) GitSubmoduleCommitId(cwd, packageName, commitId string)
 	LogDebug("command run:", cmd.Args)
 	if e := cmd.Run(); e != nil {
 		LogError(e)
+		close_done()
 		close(data_in.ch)
-		close(data_out.ch)
 		return subCommitId, false
 	}
 	wg.Wait()
+	close_done()
+	close(data_in.ch)
 	return subCommitId, len(subCommitId) > 0
 }


@@ -28,6 +28,7 @@ import (
	"slices"
	"strings"
	"testing"
+	"time"
)
func TestGitClone(t *testing.T) {
@@ -717,3 +718,44 @@ func TestGitDirectoryListRepro(t *testing.T) {
		t.Errorf("Expected 'subdir' in directory list, got %v", dirs)
	}
}
func TestGitDeadlockFix(t *testing.T) {
gitDir := t.TempDir()
testDir, _ := os.Getwd()
cmd := exec.Command("/usr/bin/bash", path.Join(testDir, "tsetup.sh"))
cmd.Dir = gitDir
_, err := cmd.CombinedOutput()
gh, err := AllocateGitWorkTree(gitDir, "Test", "test@example.com")
if err != nil {
t.Fatal(err)
}
h, err := gh.ReadExistingPath(".")
if err != nil {
t.Fatal(err)
}
defer h.Close()
// Use a blob ID to trigger error in GitParseCommits
// This ensures that the function returns error immediately and doesn't deadlock
blobId := "81aba862107f1e2f5312e165453955485f424612f313d6c2fb1b31fef9f82a14"
done := make(chan error)
go func() {
_, err := h.GitParseCommits("", []string{blobId})
done <- err
}()
select {
case err := <-done:
if err == nil {
t.Error("Expected error from GitParseCommits with blob ID, got nil")
} else {
// This is expected
t.Logf("Got expected error: %v", err)
}
case <-time.After(2 * time.Second):
t.Fatal("GitParseCommits deadlocked! Fix is NOT working.")
}
}


@@ -83,3 +83,260 @@ func (c *MockObsStatusFetcherWithStateBuildStatusWithStateCall) DoAndReturn(f fu
	c.Call = c.Call.DoAndReturn(f)
	return c
}
// MockObsClientInterface is a mock of ObsClientInterface interface.
type MockObsClientInterface struct {
ctrl *gomock.Controller
recorder *MockObsClientInterfaceMockRecorder
isgomock struct{}
}
// MockObsClientInterfaceMockRecorder is the mock recorder for MockObsClientInterface.
type MockObsClientInterfaceMockRecorder struct {
mock *MockObsClientInterface
}
// NewMockObsClientInterface creates a new mock instance.
func NewMockObsClientInterface(ctrl *gomock.Controller) *MockObsClientInterface {
mock := &MockObsClientInterface{ctrl: ctrl}
mock.recorder = &MockObsClientInterfaceMockRecorder{mock}
return mock
}
// EXPECT returns an object that allows the caller to indicate expected use.
func (m *MockObsClientInterface) EXPECT() *MockObsClientInterfaceMockRecorder {
return m.recorder
}
// BuildStatus mocks base method.
func (m *MockObsClientInterface) BuildStatus(project string, packages ...string) (*common.BuildResultList, error) {
m.ctrl.T.Helper()
varargs := []any{project}
for _, a := range packages {
varargs = append(varargs, a)
}
ret := m.ctrl.Call(m, "BuildStatus", varargs...)
ret0, _ := ret[0].(*common.BuildResultList)
ret1, _ := ret[1].(error)
return ret0, ret1
}
// BuildStatus indicates an expected call of BuildStatus.
func (mr *MockObsClientInterfaceMockRecorder) BuildStatus(project any, packages ...any) *MockObsClientInterfaceBuildStatusCall {
mr.mock.ctrl.T.Helper()
varargs := append([]any{project}, packages...)
call := mr.mock.ctrl.RecordCallWithMethodType(mr.mock, "BuildStatus", reflect.TypeOf((*MockObsClientInterface)(nil).BuildStatus), varargs...)
return &MockObsClientInterfaceBuildStatusCall{Call: call}
}
// MockObsClientInterfaceBuildStatusCall wrap *gomock.Call
type MockObsClientInterfaceBuildStatusCall struct {
*gomock.Call
}
// Return rewrite *gomock.Call.Return
func (c *MockObsClientInterfaceBuildStatusCall) Return(arg0 *common.BuildResultList, arg1 error) *MockObsClientInterfaceBuildStatusCall {
c.Call = c.Call.Return(arg0, arg1)
return c
}
// Do rewrite *gomock.Call.Do
func (c *MockObsClientInterfaceBuildStatusCall) Do(f func(string, ...string) (*common.BuildResultList, error)) *MockObsClientInterfaceBuildStatusCall {
c.Call = c.Call.Do(f)
return c
}
// DoAndReturn rewrite *gomock.Call.DoAndReturn
func (c *MockObsClientInterfaceBuildStatusCall) DoAndReturn(f func(string, ...string) (*common.BuildResultList, error)) *MockObsClientInterfaceBuildStatusCall {
c.Call = c.Call.DoAndReturn(f)
return c
}
// DeleteProject mocks base method.
func (m *MockObsClientInterface) DeleteProject(project string) error {
m.ctrl.T.Helper()
ret := m.ctrl.Call(m, "DeleteProject", project)
ret0, _ := ret[0].(error)
return ret0
}
// DeleteProject indicates an expected call of DeleteProject.
func (mr *MockObsClientInterfaceMockRecorder) DeleteProject(project any) *MockObsClientInterfaceDeleteProjectCall {
mr.mock.ctrl.T.Helper()
call := mr.mock.ctrl.RecordCallWithMethodType(mr.mock, "DeleteProject", reflect.TypeOf((*MockObsClientInterface)(nil).DeleteProject), project)
return &MockObsClientInterfaceDeleteProjectCall{Call: call}
}
// MockObsClientInterfaceDeleteProjectCall wrap *gomock.Call
type MockObsClientInterfaceDeleteProjectCall struct {
*gomock.Call
}
// Return rewrite *gomock.Call.Return
func (c *MockObsClientInterfaceDeleteProjectCall) Return(arg0 error) *MockObsClientInterfaceDeleteProjectCall {
c.Call = c.Call.Return(arg0)
return c
}
// Do rewrite *gomock.Call.Do
func (c *MockObsClientInterfaceDeleteProjectCall) Do(f func(string) error) *MockObsClientInterfaceDeleteProjectCall {
c.Call = c.Call.Do(f)
return c
}
// DoAndReturn rewrite *gomock.Call.DoAndReturn
func (c *MockObsClientInterfaceDeleteProjectCall) DoAndReturn(f func(string) error) *MockObsClientInterfaceDeleteProjectCall {
c.Call = c.Call.DoAndReturn(f)
return c
}
// GetHomeProject mocks base method.
func (m *MockObsClientInterface) GetHomeProject() string {
m.ctrl.T.Helper()
ret := m.ctrl.Call(m, "GetHomeProject")
ret0, _ := ret[0].(string)
return ret0
}
// GetHomeProject indicates an expected call of GetHomeProject.
func (mr *MockObsClientInterfaceMockRecorder) GetHomeProject() *MockObsClientInterfaceGetHomeProjectCall {
mr.mock.ctrl.T.Helper()
call := mr.mock.ctrl.RecordCallWithMethodType(mr.mock, "GetHomeProject", reflect.TypeOf((*MockObsClientInterface)(nil).GetHomeProject))
return &MockObsClientInterfaceGetHomeProjectCall{Call: call}
}
// MockObsClientInterfaceGetHomeProjectCall wrap *gomock.Call
type MockObsClientInterfaceGetHomeProjectCall struct {
*gomock.Call
}
// Return rewrite *gomock.Call.Return
func (c *MockObsClientInterfaceGetHomeProjectCall) Return(arg0 string) *MockObsClientInterfaceGetHomeProjectCall {
c.Call = c.Call.Return(arg0)
return c
}
// Do rewrite *gomock.Call.Do
func (c *MockObsClientInterfaceGetHomeProjectCall) Do(f func() string) *MockObsClientInterfaceGetHomeProjectCall {
c.Call = c.Call.Do(f)
return c
}
// DoAndReturn rewrite *gomock.Call.DoAndReturn
func (c *MockObsClientInterfaceGetHomeProjectCall) DoAndReturn(f func() string) *MockObsClientInterfaceGetHomeProjectCall {
c.Call = c.Call.DoAndReturn(f)
return c
}
// GetProjectMeta mocks base method.
func (m *MockObsClientInterface) GetProjectMeta(project string) (*common.ProjectMeta, error) {
m.ctrl.T.Helper()
ret := m.ctrl.Call(m, "GetProjectMeta", project)
ret0, _ := ret[0].(*common.ProjectMeta)
ret1, _ := ret[1].(error)
return ret0, ret1
}
// GetProjectMeta indicates an expected call of GetProjectMeta.
func (mr *MockObsClientInterfaceMockRecorder) GetProjectMeta(project any) *MockObsClientInterfaceGetProjectMetaCall {
mr.mock.ctrl.T.Helper()
call := mr.mock.ctrl.RecordCallWithMethodType(mr.mock, "GetProjectMeta", reflect.TypeOf((*MockObsClientInterface)(nil).GetProjectMeta), project)
return &MockObsClientInterfaceGetProjectMetaCall{Call: call}
}
// MockObsClientInterfaceGetProjectMetaCall wrap *gomock.Call
type MockObsClientInterfaceGetProjectMetaCall struct {
*gomock.Call
}
// Return rewrite *gomock.Call.Return
func (c *MockObsClientInterfaceGetProjectMetaCall) Return(arg0 *common.ProjectMeta, arg1 error) *MockObsClientInterfaceGetProjectMetaCall {
c.Call = c.Call.Return(arg0, arg1)
return c
}
// Do rewrite *gomock.Call.Do
func (c *MockObsClientInterfaceGetProjectMetaCall) Do(f func(string) (*common.ProjectMeta, error)) *MockObsClientInterfaceGetProjectMetaCall {
c.Call = c.Call.Do(f)
return c
}
// DoAndReturn rewrite *gomock.Call.DoAndReturn
func (c *MockObsClientInterfaceGetProjectMetaCall) DoAndReturn(f func(string) (*common.ProjectMeta, error)) *MockObsClientInterfaceGetProjectMetaCall {
c.Call = c.Call.DoAndReturn(f)
return c
}
// SetHomeProject mocks base method.
func (m *MockObsClientInterface) SetHomeProject(project string) {
m.ctrl.T.Helper()
m.ctrl.Call(m, "SetHomeProject", project)
}
// SetHomeProject indicates an expected call of SetHomeProject.
func (mr *MockObsClientInterfaceMockRecorder) SetHomeProject(project any) *MockObsClientInterfaceSetHomeProjectCall {
mr.mock.ctrl.T.Helper()
call := mr.mock.ctrl.RecordCallWithMethodType(mr.mock, "SetHomeProject", reflect.TypeOf((*MockObsClientInterface)(nil).SetHomeProject), project)
return &MockObsClientInterfaceSetHomeProjectCall{Call: call}
}
// MockObsClientInterfaceSetHomeProjectCall wrap *gomock.Call
type MockObsClientInterfaceSetHomeProjectCall struct {
*gomock.Call
}
// Return rewrite *gomock.Call.Return
func (c *MockObsClientInterfaceSetHomeProjectCall) Return() *MockObsClientInterfaceSetHomeProjectCall {
c.Call = c.Call.Return()
return c
}
// Do rewrite *gomock.Call.Do
func (c *MockObsClientInterfaceSetHomeProjectCall) Do(f func(string)) *MockObsClientInterfaceSetHomeProjectCall {
c.Call = c.Call.Do(f)
return c
}
// DoAndReturn rewrite *gomock.Call.DoAndReturn
func (c *MockObsClientInterfaceSetHomeProjectCall) DoAndReturn(f func(string)) *MockObsClientInterfaceSetHomeProjectCall {
c.Call = c.Call.DoAndReturn(f)
return c
}
// SetProjectMeta mocks base method.
func (m *MockObsClientInterface) SetProjectMeta(meta *common.ProjectMeta) error {
m.ctrl.T.Helper()
ret := m.ctrl.Call(m, "SetProjectMeta", meta)
ret0, _ := ret[0].(error)
return ret0
}
// SetProjectMeta indicates an expected call of SetProjectMeta.
func (mr *MockObsClientInterfaceMockRecorder) SetProjectMeta(meta any) *MockObsClientInterfaceSetProjectMetaCall {
mr.mock.ctrl.T.Helper()
call := mr.mock.ctrl.RecordCallWithMethodType(mr.mock, "SetProjectMeta", reflect.TypeOf((*MockObsClientInterface)(nil).SetProjectMeta), meta)
return &MockObsClientInterfaceSetProjectMetaCall{Call: call}
}
// MockObsClientInterfaceSetProjectMetaCall wrap *gomock.Call
type MockObsClientInterfaceSetProjectMetaCall struct {
*gomock.Call
}
// Return rewrite *gomock.Call.Return
func (c *MockObsClientInterfaceSetProjectMetaCall) Return(arg0 error) *MockObsClientInterfaceSetProjectMetaCall {
c.Call = c.Call.Return(arg0)
return c
}
// Do rewrite *gomock.Call.Do
func (c *MockObsClientInterfaceSetProjectMetaCall) Do(f func(*common.ProjectMeta) error) *MockObsClientInterfaceSetProjectMetaCall {
c.Call = c.Call.Do(f)
return c
}
// DoAndReturn rewrite *gomock.Call.DoAndReturn
func (c *MockObsClientInterfaceSetProjectMetaCall) DoAndReturn(f func(*common.ProjectMeta) error) *MockObsClientInterfaceSetProjectMetaCall {
c.Call = c.Call.DoAndReturn(f)
return c
}


@@ -46,6 +46,15 @@ type ObsStatusFetcherWithState interface {
	BuildStatusWithState(project string, opts *BuildResultOptions, packages ...string) (*BuildResultList, error)
}
+type ObsClientInterface interface {
+	GetProjectMeta(project string) (*ProjectMeta, error)
+	SetProjectMeta(meta *ProjectMeta) error
+	DeleteProject(project string) error
+	BuildStatus(project string, packages ...string) (*BuildResultList, error)
+	GetHomeProject() string
+	SetHomeProject(project string)
+}
type ObsClient struct {
	baseUrl *url.URL
	client  *http.Client
@@ -57,6 +66,14 @@ type ObsClient struct {
	HomeProject string
}
+func (c *ObsClient) GetHomeProject() string {
+	return c.HomeProject
+}
+func (c *ObsClient) SetHomeProject(project string) {
+	c.HomeProject = project
+}
func NewObsClient(host string) (*ObsClient, error) {
	baseUrl, err := url.Parse(host)
	if err != nil {
@@ -520,7 +537,7 @@ func ObsSafeProjectName(prjname string) string {
}
var ValidBlockModes []string = []string{"all", "local", "never"}
-var ValidPrjLinkModes []string = []string{"off", "localdep", "alldirect", "all"}
+var ValidPrjLinkModes []string = []string{"off", "localdep", "alldirect", "alldirect_or_localdep", "all"}
var ValidTriggerModes []string = []string{"transitive", "direct", "local"}
func (c *ObsClient) SetProjectMeta(meta *ProjectMeta) error {
@@ -694,13 +711,15 @@ func (r *BuildResultList) BuildResultSummary() (success, finished bool) {
		if !ok {
			panic("Unknown result code: " + result.Code)
		}
-		if r.isLastBuild && result.Code == "unknown" {
-			// it means the package has never build yet,
-			// but we don't know the reason
-			detail.Finished = true
+		if r.isLastBuild {
+			// we are always finished, since it is the last result
+			// also when there is "unknown" state, it just means it
+			// it was never done
+			finished = true
+		} else {
+			finished = finished && detail.Finished
		}
-		finished = finished && detail.Finished
		success = success && detail.Success
		if !finished {


@@ -468,7 +468,7 @@ func (rs *PRSet) IsApproved(gitea GiteaPRChecker, maintainers MaintainershipData
		LogError("Cannot fetch gita reaviews for PR:", err)
		return false
	}
-	r.RequestedReviewers = reviewers
+	r.SetRequiredReviewers(reviewers)
	prjgit.Reviews = r
	if prjgit.Reviews.IsManualMergeOK() {
		is_manually_reviewed_ok = true
@@ -489,7 +489,7 @@ func (rs *PRSet) IsApproved(gitea GiteaPRChecker, maintainers MaintainershipData
		LogError("Cannot fetch gita reaviews for PR:", err)
		return false
	}
-	r.RequestedReviewers = reviewers
+	r.SetRequiredReviewers(reviewers)
	pr.Reviews = r
	if !pr.Reviews.IsManualMergeOK() {
		LogInfo("Not approved manual merge. PR:", pr.PR.URL)
@@ -530,7 +530,7 @@ func (rs *PRSet) IsApproved(gitea GiteaPRChecker, maintainers MaintainershipData
		LogError("Cannot fetch gitea reaviews for PR:", err)
		return false
	}
-	r.RequestedReviewers = reviewers
+	r.SetRequiredReviewers(reviewers)
	is_manually_reviewed_ok = r.IsApproved()
	LogDebug("PR to", pr.PR.Base.Repo.Name, "reviewed?", is_manually_reviewed_ok)
@@ -554,6 +554,144 @@ func (rs *PRSet) IsApproved(gitea GiteaPRChecker, maintainers MaintainershipData
	return is_manually_reviewed_ok
}
func (rs *PRSet) AddMergeCommit(git Git, remote string, pr int) bool {
prinfo := rs.PRs[pr]
LogDebug("Adding merge commit for %s", PRtoString(prinfo.PR))
if !prinfo.PR.AllowMaintainerEdit {
LogError(" PR is not editable by maintainer")
return false
}
repo := prinfo.PR.Base.Repo
head := prinfo.PR.Head
br := rs.Config.Branch
if len(br) == 0 {
br = prinfo.PR.Base.Name
}
msg := fmt.Sprintf("Merge branch '%s' into %s", br, head.Name)
if err := git.GitExec(repo.Name, "merge", "--no-ff", "--no-commit", "-X", "theirs", head.Sha); err != nil {
if err := git.GitExec(repo.Name, "merge", "--no-ff", "--no-commit", "--allow-unrelated-histories", "-X", "theirs", head.Sha); err != nil {
return false
}
LogError("WARNING: Merging unrelated histories")
}
// ensure only files that are in head.Sha are kept
git.GitExecOrPanic(repo.Name, "read-tree", "--reset", "-u", head.Sha)
git.GitExecOrPanic(repo.Name, "commit", "-m", msg)
if !IsDryRun {
git.GitExecOrPanic(repo.Name, "push", remote, "HEAD:"+head.Name)
prinfo.PR.Head.Sha = strings.TrimSpace(git.GitExecWithOutputOrPanic(repo.Name, "rev-list", "-1", "HEAD")) // need to update as it's pushed but pr not refetched
}
return true
}
func (rs *PRSet) HasMerge(git Git, pr int) bool {
prinfo := rs.PRs[pr]
repo := prinfo.PR.Base.Repo
head := prinfo.PR.Head
br := rs.Config.Branch
if len(br) == 0 {
br = prinfo.PR.Base.Name
}
parents, err := git.GitExecWithOutput(repo.Name, "show", "-s", "--format=%P", head.Sha)
if err == nil {
p := strings.Fields(strings.TrimSpace(parents))
if len(p) == 2 {
targetHead, _ := git.GitExecWithOutput(repo.Name, "rev-parse", "HEAD")
targetHead = strings.TrimSpace(targetHead)
if p[0] == targetHead || p[1] == targetHead {
return true
}
}
}
return false
}
func (rs *PRSet) PrepareForMerge(git Git) bool {
// verify that package can merge here. Checkout current target branch of each PRSet, make a temporary branch
// PR_#_mergetest and perform the merge based
if rs.Config.MergeMode == MergeModeDevel {
return true // always can merge as we set branch here, not merge anything
} else {
// make sure that all the package PRs are in mergeable state
for idx, prinfo := range rs.PRs {
if rs.IsPrjGitPR(prinfo.PR) {
continue
}
repo := prinfo.PR.Base.Repo
head := prinfo.PR.Head
br := rs.Config.Branch
if len(br) == 0 {
br = prinfo.PR.Base.Name
}
remote, err := git.GitClone(repo.Name, br, repo.SSHURL)
if err != nil {
return false
}
git.GitExecOrPanic(repo.Name, "fetch", remote, head.Sha)
switch rs.Config.MergeMode {
case MergeModeFF:
if err := git.GitExec(repo.Name, "merge-base", "--is-ancestor", "HEAD", head.Sha); err != nil {
return false
}
case MergeModeReplace:
Verify:
if err := git.GitExec(repo.Name, "merge-base", "--is-ancestor", "HEAD", head.Sha); err != nil {
if !rs.HasMerge(git, idx) {
forkRemote, err := git.GitClone(repo.Name, head.Name, head.Repo.SSHURL)
if err != nil {
LogError("Failed to clone head repo:", head.Name, head.Repo.SSHURL)
return false
}
LogDebug("Merge commit is missing and this is not FF merge possibility")
git.GitExecOrPanic(repo.Name, "checkout", remote+"/"+br)
if !rs.AddMergeCommit(git, forkRemote, idx) {
return false
}
if !IsDryRun {
goto Verify
}
}
}
}
}
}
// now we check project git if mergeable
prjgit_info, err := rs.GetPrjGitPR()
if err != nil {
return false
}
prjgit := prjgit_info.PR
_, _, prjgitBranch := rs.Config.GetPrjGit()
remote, err := git.GitClone(DefaultGitPrj, prjgitBranch, prjgit.Base.Repo.SSHURL)
if err != nil {
return false
}
testBranch := fmt.Sprintf("PR_%d_mergetest", prjgit.Index)
git.GitExecOrPanic(DefaultGitPrj, "fetch", remote, prjgit.Head.Sha)
if err := git.GitExec(DefaultGitPrj, "checkout", "-B", testBranch, prjgit.Base.Sha); err != nil {
return false
}
if err := git.GitExec(DefaultGitPrj, "merge", "--no-ff", "--no-commit", prjgit.Head.Sha); err != nil {
return false
}
return true
}
func (rs *PRSet) Merge(gitea GiteaReviewUnrequester, git Git) error {
	prjgit_info, err := rs.GetPrjGitPR()
	if err != nil {
@@ -718,10 +856,8 @@ func (rs *PRSet) Merge(gitea GiteaReviewUnrequester, git Git) error {
		prinfo.RemoteName, err = git.GitClone(repo.Name, br, repo.SSHURL)
		PanicOnError(err)
		git.GitExecOrPanic(repo.Name, "fetch", prinfo.RemoteName, head.Sha)
-		if isNewRepo {
-			LogInfo("Force-pushing new repository branch", br, "to", head.Sha)
-			// we don't merge, we just set the branch to this commit
+		if rs.Config.MergeMode == MergeModeDevel || isNewRepo {
+			git.GitExecOrPanic(repo.Name, "checkout", "-B", br, head.Sha)
		} else {
			git.GitExecOrPanic(repo.Name, "merge", "--ff", head.Sha)
		}
@@ -748,11 +884,15 @@ func (rs *PRSet) Merge(gitea GiteaReviewUnrequester, git Git) error {
	}
	if !IsDryRun {
-		if isNewRepo {
-			git.GitExecOrPanic(repo.Name, "push", "-f", prinfo.RemoteName, prinfo.PR.Head.Sha+":"+prinfo.PR.Base.Name)
-		} else {
-			git.GitExecOrPanic(repo.Name, "push", prinfo.RemoteName)
-		}
+		params := []string{"push"}
+		if rs.Config.MergeMode == MergeModeDevel || isNewRepo {
+			params = append(params, "-f")
+		}
+		params = append(params, prinfo.RemoteName)
+		if isNewRepo {
+			params = append(params, prinfo.PR.Head.Sha+":"+prinfo.PR.Base.Name)
+		}
+		git.GitExecOrPanic(repo.Name, params...)
	} else {
		LogInfo("*** WOULD push", repo.Name, "to", prinfo.RemoteName)
	}


@@ -73,6 +73,7 @@ func TestPRSet_Merge_Special(t *testing.T) {
	// Clone and fetch for new-pkg
	mockGit.EXPECT().GitClone("new-pkg", "main", "pkg-ssh-url").Return("origin", nil)
	mockGit.EXPECT().GitExecOrPanic("new-pkg", "fetch", "origin", "pkg-head-sha")
+	mockGit.EXPECT().GitExecOrPanic("new-pkg", "checkout", "-B", "main", "pkg-head-sha")
	// Pushing changes
	mockGit.EXPECT().GitExecOrPanic("_ObsPrj", "push", "origin")


@@ -2,6 +2,7 @@ package common_test
import (
	"errors"
+	"fmt"
	"os"
	"os/exec"
	"path"
@@ -807,9 +808,8 @@ func TestFindMissingAndExtraReviewers(t *testing.T) {
Base: &models.PRBranchInfo{Name: "main", Repo: &models.Repository{Name: "pkg", Owner: &models.User{UserName: "org"}}}, Base: &models.PRBranchInfo{Name: "main", Repo: &models.Repository{Name: "pkg", Owner: &models.User{UserName: "org"}}},
}, },
Reviews: &common.PRReviews{ Reviews: &common.PRReviews{
Reviews: []*models.PullReview{{State: common.ReviewStateRequestReview, User: &models.User{UserName: "m1"}}}, Reviews: []*models.PullReview{{State: common.ReviewStateRequestReview, User: &models.User{UserName: "m1"}}},
RequestedReviewers: []string{"m1"}, RequestedReviewers: []*models.TimelineComment{
FullTimeline: []*models.TimelineComment{
{User: &models.User{UserName: "bot"}, Assignee: &models.User{UserName: "m1"}, Type: common.TimelineCommentType_ReviewRequested}, {User: &models.User{UserName: "bot"}, Assignee: &models.User{UserName: "m1"}, Type: common.TimelineCommentType_ReviewRequested},
}, },
}, },
@@ -919,8 +919,7 @@ func TestFindMissingAndExtraReviewers(t *testing.T) {
}, },
Reviews: &common.PRReviews{ Reviews: &common.PRReviews{
Reviews: []*models.PullReview{{State: common.ReviewStateRequestReview, User: &models.User{UserName: "reviewer"}}}, Reviews: []*models.PullReview{{State: common.ReviewStateRequestReview, User: &models.User{UserName: "reviewer"}}},
RequestedReviewers: []string{"reviewer"}, RequestedReviewers: []*models.TimelineComment{{Type: common.TimelineCommentType_ReviewRequested, User: &models.User{UserName: "bot"}, Assignee: &models.User{UserName: "reviewer"}}},
FullTimeline: []*models.TimelineComment{{Type: common.TimelineCommentType_ReviewRequested, User: &models.User{UserName: "bot"}, Assignee: &models.User{UserName: "reviewer"}}},
}, },
}, },
{ {
@@ -930,8 +929,7 @@ func TestFindMissingAndExtraReviewers(t *testing.T) {
}, },
Reviews: &common.PRReviews{ Reviews: &common.PRReviews{
Reviews: []*models.PullReview{{State: common.ReviewStateRequestReview, User: &models.User{UserName: "reviewer"}}}, Reviews: []*models.PullReview{{State: common.ReviewStateRequestReview, User: &models.User{UserName: "reviewer"}}},
RequestedReviewers: []string{"reviewer"}, RequestedReviewers: []*models.TimelineComment{{Type: common.TimelineCommentType_ReviewRequested, User: &models.User{UserName: "bot"}, Assignee: &models.User{UserName: "reviewer"}}},
FullTimeline: []*models.TimelineComment{{Type: common.TimelineCommentType_ReviewRequested, User: &models.User{UserName: "bot"}, Assignee: &models.User{UserName: "reviewer"}}},
}, },
}, },
}, },
@@ -966,8 +964,7 @@ func TestFindMissingAndExtraReviewers(t *testing.T) {
{State: common.ReviewStateApproved, User: &models.User{UserName: "pkgmaintainer"}}, {State: common.ReviewStateApproved, User: &models.User{UserName: "pkgmaintainer"}},
{State: common.ReviewStatePending, User: &models.User{UserName: "prjmaintainer"}}, {State: common.ReviewStatePending, User: &models.User{UserName: "prjmaintainer"}},
}, },
RequestedReviewers: []string{"user2", "pkgmaintainer", "prjmaintainer"}, RequestedReviewers: []*models.TimelineComment{
FullTimeline: []*models.TimelineComment{
{Type: common.TimelineCommentType_ReviewRequested, User: &models.User{UserName: "bot"}, Assignee: &models.User{UserName: "user2"}}, {Type: common.TimelineCommentType_ReviewRequested, User: &models.User{UserName: "bot"}, Assignee: &models.User{UserName: "user2"}},
{Type: common.TimelineCommentType_ReviewRequested, User: &models.User{UserName: "bot"}, Assignee: &models.User{UserName: "pkgmaintainer"}}, {Type: common.TimelineCommentType_ReviewRequested, User: &models.User{UserName: "bot"}, Assignee: &models.User{UserName: "pkgmaintainer"}},
{Type: common.TimelineCommentType_ReviewRequested, User: &models.User{UserName: "bot"}, Assignee: &models.User{UserName: "prjmaintainer"}}, {Type: common.TimelineCommentType_ReviewRequested, User: &models.User{UserName: "bot"}, Assignee: &models.User{UserName: "prjmaintainer"}},
@@ -985,8 +982,7 @@ func TestFindMissingAndExtraReviewers(t *testing.T) {
{State: common.ReviewStateRequestChanges, User: &models.User{UserName: "user1"}}, {State: common.ReviewStateRequestChanges, User: &models.User{UserName: "user1"}},
{State: common.ReviewStateRequestReview, User: &models.User{UserName: "autogits_obs_staging_bot"}}, {State: common.ReviewStateRequestReview, User: &models.User{UserName: "autogits_obs_staging_bot"}},
}, },
RequestedReviewers: []string{"user1", "autogits_obs_staging_bot"}, RequestedReviewers: []*models.TimelineComment{
FullTimeline: []*models.TimelineComment{
{Type: common.TimelineCommentType_ReviewRequested, User: &models.User{UserName: "bot"}, Assignee: &models.User{UserName: "user1"}}, {Type: common.TimelineCommentType_ReviewRequested, User: &models.User{UserName: "bot"}, Assignee: &models.User{UserName: "user1"}},
{Type: common.TimelineCommentType_ReviewRequested, User: &models.User{UserName: "bot"}, Assignee: &models.User{UserName: "autogits_obs_staging_bot"}}, {Type: common.TimelineCommentType_ReviewRequested, User: &models.User{UserName: "bot"}, Assignee: &models.User{UserName: "autogits_obs_staging_bot"}},
}, },
@@ -1026,8 +1022,7 @@ func TestFindMissingAndExtraReviewers(t *testing.T) {
{State: common.ReviewStatePending, User: &models.User{UserName: "prj2"}}, {State: common.ReviewStatePending, User: &models.User{UserName: "prj2"}},
{State: common.ReviewStatePending, User: &models.User{UserName: "someother"}}, {State: common.ReviewStatePending, User: &models.User{UserName: "someother"}},
}, },
RequestedReviewers: []string{"user2", "pkgmaintainer", "prjmaintainer", "pkgm1", "pkgm2", "someother", "prj1", "prj2"}, RequestedReviewers: []*models.TimelineComment{
FullTimeline: []*models.TimelineComment{
{Type: common.TimelineCommentType_ReviewRequested, User: &models.User{UserName: "bot"}, Assignee: &models.User{UserName: "pkgmaintainer"}}, {Type: common.TimelineCommentType_ReviewRequested, User: &models.User{UserName: "bot"}, Assignee: &models.User{UserName: "pkgmaintainer"}},
{Type: common.TimelineCommentType_ReviewRequested, User: &models.User{UserName: "bot"}, Assignee: &models.User{UserName: "prjmaintainer"}}, {Type: common.TimelineCommentType_ReviewRequested, User: &models.User{UserName: "bot"}, Assignee: &models.User{UserName: "prjmaintainer"}},
{Type: common.TimelineCommentType_ReviewRequested, User: &models.User{UserName: "bot"}, Assignee: &models.User{UserName: "prj1"}}, {Type: common.TimelineCommentType_ReviewRequested, User: &models.User{UserName: "bot"}, Assignee: &models.User{UserName: "prj1"}},
@@ -1050,8 +1045,7 @@ func TestFindMissingAndExtraReviewers(t *testing.T) {
{State: common.ReviewStatePending, User: &models.User{UserName: "prj1"}}, {State: common.ReviewStatePending, User: &models.User{UserName: "prj1"}},
{State: common.ReviewStatePending, User: &models.User{UserName: "prj2"}}, {State: common.ReviewStatePending, User: &models.User{UserName: "prj2"}},
}, },
RequestedReviewers: []string{"user1", "autogits_obs_staging_bot", "prj1", "prj2"}, RequestedReviewers: []*models.TimelineComment{
FullTimeline: []*models.TimelineComment{
{Type: common.TimelineCommentType_ReviewRequested, User: &models.User{UserName: "bot"}, Assignee: &models.User{UserName: "autogits_obs_staging_bot"}}, {Type: common.TimelineCommentType_ReviewRequested, User: &models.User{UserName: "bot"}, Assignee: &models.User{UserName: "autogits_obs_staging_bot"}},
{Type: common.TimelineCommentType_ReviewRequested, User: &models.User{UserName: "bot"}, Assignee: &models.User{UserName: "prj1"}}, {Type: common.TimelineCommentType_ReviewRequested, User: &models.User{UserName: "bot"}, Assignee: &models.User{UserName: "prj1"}},
{Type: common.TimelineCommentType_ReviewRequested, User: &models.User{UserName: "bot"}, Assignee: &models.User{UserName: "prj2"}}, {Type: common.TimelineCommentType_ReviewRequested, User: &models.User{UserName: "bot"}, Assignee: &models.User{UserName: "prj2"}},
@@ -1090,8 +1084,7 @@ func TestFindMissingAndExtraReviewers(t *testing.T) {
			{State: common.ReviewStateRequestReview, User: &models.User{UserName: "prj1"}},
			{State: common.ReviewStateRequestReview, User: &models.User{UserName: "someother"}},
		},
-		RequestedReviewers: []string{"user2", "pkgmaintainer", "prjmaintainer", "pkgm1", "someother", "prj1"},
-		FullTimeline: []*models.TimelineComment{
+		RequestedReviewers: []*models.TimelineComment{
			{Type: common.TimelineCommentType_ReviewRequested, User: &models.User{UserName: "bot"}, Assignee: &models.User{UserName: "pkgm1"}},
			{Type: common.TimelineCommentType_ReviewRequested, User: &models.User{UserName: "bot"}, Assignee: &models.User{UserName: "pkgmaintainer"}},
			{Type: common.TimelineCommentType_ReviewRequested, User: &models.User{UserName: "bot"}, Assignee: &models.User{UserName: "prjmaintainer"}},
@@ -1112,8 +1105,7 @@ func TestFindMissingAndExtraReviewers(t *testing.T) {
			{State: common.ReviewStateRequestReview, User: &models.User{UserName: "autogits_obs_staging_bot"}},
			{State: common.ReviewStateRequestReview, User: &models.User{UserName: "prj1"}},
		},
-		RequestedReviewers: []string{"user1", "autogits_obs_staging_bot", "prj1"},
-		FullTimeline: []*models.TimelineComment{
+		RequestedReviewers: []*models.TimelineComment{
			{Type: common.TimelineCommentType_ReviewRequested, User: &models.User{UserName: "bot"}, Assignee: &models.User{UserName: "autogits_obs_staging_bot"}},
			{Type: common.TimelineCommentType_ReviewRequested, User: &models.User{UserName: "bot"}, Assignee: &models.User{UserName: "prj1"}},
			{Type: common.TimelineCommentType_ReviewRequested, User: &models.User{UserName: "!bot"}, Assignee: &models.User{UserName: "user1"}},
@@ -1199,6 +1191,9 @@ func TestFindMissingAndExtraReviewers(t *testing.T) {
		t.Run(test.name, func(t *testing.T) {
			test.prset.HasAutoStaging = !test.noAutoStaging
			for idx, pr := range test.prset.PRs {
if pr.Reviews != nil {
pr.Reviews.SetRequiredReviewers(test.prset.Config.Reviewers)
}
				missing, extra := test.prset.FindMissingAndExtraReviewers(test.maintainers, idx)
				// avoid nil dereference below, by adding empty array elements
@@ -1273,7 +1268,7 @@ func TestPRMerge(t *testing.T) {
				Owner: &models.User{
					UserName: "org",
				},
-				SSHURL: "file://" + path.Join(repoDir, "prjgit"),
+				SSHURL: "ssh://git@src.opensuse.org/org/prj.git",
			},
		},
		Head: &models.PRBranchInfo{
@@ -1295,7 +1290,7 @@ func TestPRMerge(t *testing.T) {
				Owner: &models.User{
					UserName: "org",
				},
-				SSHURL: "file://" + path.Join(cmd.Dir, "prjgit"),
+				SSHURL: "ssh://git@src.opensuse.org/org/prj.git",
			},
		},
		Head: &models.PRBranchInfo{
@@ -1405,3 +1400,345 @@ func TestPRChanges(t *testing.T) {
		})
	}
}
func TestPRPrepareForMerge(t *testing.T) {
tests := []struct {
name string
setup func(*mock_common.MockGit, *models.PullRequest, *models.PullRequest)
config *common.AutogitConfig
expected bool
editable bool
}{
{
name: "Success Devel",
config: &common.AutogitConfig{
Organization: "org",
GitProjectName: "org/_ObsPrj#master",
MergeMode: common.MergeModeDevel,
},
setup: func(m *mock_common.MockGit, prjPR, pkgPR *models.PullRequest) {},
expected: true,
},
{
name: "Success FF",
config: &common.AutogitConfig{
Organization: "org",
GitProjectName: "org/_ObsPrj#master",
MergeMode: common.MergeModeFF,
},
setup: func(m *mock_common.MockGit, prjPR, pkgPR *models.PullRequest) {
m.EXPECT().GitClone("pkg", "master", pkgPR.Base.Repo.SSHURL).Return("origin", nil)
m.EXPECT().GitExecOrPanic("pkg", "fetch", "origin", pkgPR.Head.Sha)
m.EXPECT().GitExec("pkg", "merge-base", "--is-ancestor", "HEAD", pkgPR.Head.Sha).Return(nil)
m.EXPECT().GitClone("_ObsPrj", "master", prjPR.Base.Repo.SSHURL).Return("origin", nil)
m.EXPECT().GitExecOrPanic("_ObsPrj", "fetch", "origin", prjPR.Head.Sha)
m.EXPECT().GitExec("_ObsPrj", "checkout", "-B", "PR_1_mergetest", prjPR.Base.Sha).Return(nil)
m.EXPECT().GitExec("_ObsPrj", "merge", "--no-ff", "--no-commit", prjPR.Head.Sha).Return(nil)
},
expected: true,
},
{
name: "Success Replace MergeCommit",
config: &common.AutogitConfig{
Organization: "org",
GitProjectName: "org/_ObsPrj#master",
MergeMode: common.MergeModeReplace,
},
setup: func(m *mock_common.MockGit, prjPR, pkgPR *models.PullRequest) {
m.EXPECT().GitClone("pkg", "master", pkgPR.Base.Repo.SSHURL).Return("origin", nil)
m.EXPECT().GitExecOrPanic("pkg", "fetch", "origin", pkgPR.Head.Sha)
// merge-base fails initially
m.EXPECT().GitExec("pkg", "merge-base", "--is-ancestor", "HEAD", pkgPR.Head.Sha).Return(fmt.Errorf("not ancestor"))
// HasMerge returns true
m.EXPECT().GitExecWithOutput("pkg", "show", "-s", "--format=%P", pkgPR.Head.Sha).Return("parent1 target_head", nil)
m.EXPECT().GitExecWithOutput("pkg", "rev-parse", "HEAD").Return("target_head", nil)
m.EXPECT().GitClone("_ObsPrj", "master", prjPR.Base.Repo.SSHURL).Return("origin", nil)
m.EXPECT().GitExecOrPanic("_ObsPrj", "fetch", "origin", prjPR.Head.Sha)
m.EXPECT().GitExec("_ObsPrj", "checkout", "-B", "PR_1_mergetest", prjPR.Base.Sha).Return(nil)
m.EXPECT().GitExec("_ObsPrj", "merge", "--no-ff", "--no-commit", prjPR.Head.Sha).Return(nil)
},
expected: true,
},
{
name: "Merge Conflict in PrjGit",
config: &common.AutogitConfig{
Organization: "org",
GitProjectName: "org/_ObsPrj#master",
MergeMode: common.MergeModeFF,
},
setup: func(m *mock_common.MockGit, prjPR, pkgPR *models.PullRequest) {
m.EXPECT().GitClone("pkg", "master", pkgPR.Base.Repo.SSHURL).Return("origin", nil)
m.EXPECT().GitExecOrPanic("pkg", "fetch", "origin", pkgPR.Head.Sha)
m.EXPECT().GitExec("pkg", "merge-base", "--is-ancestor", "HEAD", pkgPR.Head.Sha).Return(nil)
m.EXPECT().GitClone("_ObsPrj", "master", prjPR.Base.Repo.SSHURL).Return("origin", nil)
m.EXPECT().GitExecOrPanic("_ObsPrj", "fetch", "origin", prjPR.Head.Sha)
m.EXPECT().GitExec("_ObsPrj", "checkout", "-B", "PR_1_mergetest", prjPR.Base.Sha).Return(nil)
m.EXPECT().GitExec("_ObsPrj", "merge", "--no-ff", "--no-commit", prjPR.Head.Sha).Return(fmt.Errorf("conflict"))
},
expected: false,
},
{
name: "Not FF in PkgGit",
config: &common.AutogitConfig{
Organization: "org",
GitProjectName: "org/_ObsPrj#master",
MergeMode: common.MergeModeFF,
},
setup: func(m *mock_common.MockGit, prjPR, pkgPR *models.PullRequest) {
m.EXPECT().GitClone("pkg", "master", pkgPR.Base.Repo.SSHURL).Return("origin", nil)
m.EXPECT().GitExecOrPanic("pkg", "fetch", "origin", pkgPR.Head.Sha)
m.EXPECT().GitExec("pkg", "merge-base", "--is-ancestor", "HEAD", pkgPR.Head.Sha).Return(fmt.Errorf("not ancestor"))
},
expected: false,
},
{
name: "Success Replace with AddMergeCommit",
config: &common.AutogitConfig{
Organization: "org",
GitProjectName: "org/_ObsPrj#master",
MergeMode: common.MergeModeReplace,
},
editable: true,
setup: func(m *mock_common.MockGit, prjPR, pkgPR *models.PullRequest) {
m.EXPECT().GitClone("pkg", "master", pkgPR.Base.Repo.SSHURL).Return("origin", nil)
m.EXPECT().GitExecOrPanic("pkg", "fetch", "origin", pkgPR.Head.Sha)
// First merge-base fails
m.EXPECT().GitExec("pkg", "merge-base", "--is-ancestor", "HEAD", pkgPR.Head.Sha).Return(fmt.Errorf("not ancestor"))
// HasMerge returns false
m.EXPECT().GitExecWithOutput("pkg", "show", "-s", "--format=%P", pkgPR.Head.Sha).Return("parent1", nil)
m.EXPECT().GitClone("pkg", pkgPR.Head.Name, pkgPR.Base.Repo.SSHURL).Return("origin_fork", nil)
// AddMergeCommit is called
m.EXPECT().GitExecOrPanic("pkg", "checkout", "origin/master")
m.EXPECT().GitExec("pkg", "merge", "--no-ff", "--no-commit", "-X", "theirs", pkgPR.Head.Sha).Return(nil)
m.EXPECT().GitExecOrPanic("pkg", "read-tree", "--reset", "-u", pkgPR.Head.Sha)
m.EXPECT().GitExecOrPanic("pkg", "commit", "-m", gomock.Any())
m.EXPECT().GitExecOrPanic("pkg", "push", "origin_fork", "HEAD:"+pkgPR.Head.Name)
m.EXPECT().GitExecWithOutputOrPanic("pkg", "rev-list", "-1", "HEAD").Return("new_pkg_head_sha")
// Second merge-base succeeds (after goto Verify)
m.EXPECT().GitExec("pkg", "merge-base", "--is-ancestor", "HEAD", "new_pkg_head_sha").Return(nil)
m.EXPECT().GitClone("_ObsPrj", "master", prjPR.Base.Repo.SSHURL).Return("origin", nil)
m.EXPECT().GitExecOrPanic("_ObsPrj", "fetch", "origin", prjPR.Head.Sha)
m.EXPECT().GitExec("_ObsPrj", "checkout", "-B", "PR_1_mergetest", prjPR.Base.Sha).Return(nil)
m.EXPECT().GitExec("_ObsPrj", "merge", "--no-ff", "--no-commit", prjPR.Head.Sha).Return(nil)
},
expected: true,
},
}
for _, test := range tests {
t.Run(test.name, func(t *testing.T) {
prjPR := &models.PullRequest{
Index: 1,
Base: &models.PRBranchInfo{
Name: "master",
Sha: "base_sha",
Repo: &models.Repository{
Owner: &models.User{UserName: "org"},
Name: "_ObsPrj",
SSHURL: "ssh://git@src.opensuse.org/org/_ObsPrj.git",
},
},
Head: &models.PRBranchInfo{
Sha: "head_sha",
Repo: &models.Repository{
Owner: &models.User{UserName: "org"},
Name: "_ObsPrj",
SSHURL: "ssh://git@src.opensuse.org/org/_ObsPrj.git",
},
},
}
pkgPR := &models.PullRequest{
Index: 2,
Base: &models.PRBranchInfo{
Name: "master",
Sha: "pkg_base_sha",
Repo: &models.Repository{
Owner: &models.User{UserName: "org"},
Name: "pkg",
SSHURL: "ssh://git@src.opensuse.org/org/pkg.git",
},
},
Head: &models.PRBranchInfo{
Name: "branch_name",
Sha: "pkg_head_sha",
Repo: &models.Repository{
Owner: &models.User{UserName: "org"},
Name: "pkg",
SSHURL: "ssh://git@src.opensuse.org/org/pkg.git",
},
},
AllowMaintainerEdit: test.editable,
}
ctl := gomock.NewController(t)
git := mock_common.NewMockGit(ctl)
test.setup(git, prjPR, pkgPR)
prset := &common.PRSet{
Config: test.config,
PRs: []*common.PRInfo{
{PR: prjPR},
{PR: pkgPR},
},
}
if res := prset.PrepareForMerge(git); res != test.expected {
t.Errorf("Expected %v, got %v", test.expected, res)
}
})
}
}
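The FF cases above stub out `git merge-base --is-ancestor`, which succeeds when the first commit is reachable from the second via parent links (i.e. the target can be fast-forwarded). A minimal in-memory sketch of that reachability test (hypothetical helper and commit names, not the project's code):

```go
package main

import "fmt"

// dag maps each commit to its parent commits, mimicking a git history graph.
type dag map[string][]string

// isAncestor reports whether anc is reachable from desc by following parent
// links — conceptually what `git merge-base --is-ancestor anc desc` checks.
func isAncestor(d dag, anc, desc string) bool {
	if anc == desc {
		return true
	}
	seen := map[string]bool{}
	stack := []string{desc}
	for len(stack) > 0 {
		c := stack[len(stack)-1]
		stack = stack[:len(stack)-1]
		if seen[c] {
			continue
		}
		seen[c] = true
		for _, p := range d[c] {
			if p == anc {
				return true
			}
			stack = append(stack, p)
		}
	}
	return false
}

func main() {
	// "head" builds on "base", so "base" is an ancestor (fast-forward OK);
	// "diverged" branched off separately, so it is not.
	d := dag{"head": {"base"}, "base": {"root"}, "diverged": {"root"}}
	fmt.Println(isAncestor(d, "base", "head"))     // true
	fmt.Println(isAncestor(d, "diverged", "head")) // false
}
```

When the check fails, the bot falls back to the merge-commit handling exercised by the Replace test cases instead of refusing outright.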
func TestPRMergeMock(t *testing.T) {
tests := []struct {
name string
setup func(*mock_common.MockGit, *models.PullRequest, *models.PullRequest)
config *common.AutogitConfig
}{
{
name: "Success FF",
config: &common.AutogitConfig{
Organization: "org",
GitProjectName: "org/_ObsPrj#master",
MergeMode: common.MergeModeFF,
},
setup: func(m *mock_common.MockGit, prjPR, pkgPR *models.PullRequest) {
m.EXPECT().GitClone("_ObsPrj", "master", prjPR.Base.Repo.SSHURL).Return("origin", nil)
m.EXPECT().GitExecOrPanic("_ObsPrj", "fetch", "origin", prjPR.Head.Sha)
m.EXPECT().GitExec("_ObsPrj", "merge", "--no-ff", "-m", gomock.Any(), prjPR.Head.Sha).Return(nil)
m.EXPECT().GitClone("pkg", "master", pkgPR.Base.Repo.SSHURL).Return("origin_pkg", nil)
m.EXPECT().GitExecOrPanic("pkg", "fetch", "origin_pkg", pkgPR.Head.Sha)
m.EXPECT().GitExecOrPanic("pkg", "merge", "--ff", pkgPR.Head.Sha)
m.EXPECT().GitExecOrPanic("pkg", "push", "origin_pkg")
m.EXPECT().GitExecOrPanic("_ObsPrj", "push", "origin")
},
},
{
name: "Success Devel",
config: &common.AutogitConfig{
Organization: "org",
GitProjectName: "org/_ObsPrj#master",
MergeMode: common.MergeModeDevel,
},
setup: func(m *mock_common.MockGit, prjPR, pkgPR *models.PullRequest) {
m.EXPECT().GitClone("_ObsPrj", "master", prjPR.Base.Repo.SSHURL).Return("origin", nil)
m.EXPECT().GitExecOrPanic("_ObsPrj", "fetch", "origin", prjPR.Head.Sha)
m.EXPECT().GitExec("_ObsPrj", "merge", "--no-ff", "-m", gomock.Any(), prjPR.Head.Sha).Return(nil)
m.EXPECT().GitClone("pkg", "master", pkgPR.Base.Repo.SSHURL).Return("origin_pkg", nil)
m.EXPECT().GitExecOrPanic("pkg", "fetch", "origin_pkg", pkgPR.Head.Sha)
m.EXPECT().GitExecOrPanic("pkg", "checkout", "-B", "master", pkgPR.Head.Sha)
m.EXPECT().GitExecOrPanic("pkg", "push", "-f", "origin_pkg")
m.EXPECT().GitExecOrPanic("_ObsPrj", "push", "origin")
},
},
}
for _, test := range tests {
t.Run(test.name, func(t *testing.T) {
prjPR := &models.PullRequest{
Index: 1,
Base: &models.PRBranchInfo{
Name: "master",
Sha: "prj_base_sha",
Repo: &models.Repository{
Owner: &models.User{UserName: "org"},
Name: "_ObsPrj",
SSHURL: "ssh://git@src.opensuse.org/org/_ObsPrj.git",
},
},
Head: &models.PRBranchInfo{
Sha: "prj_head_sha",
},
}
pkgPR := &models.PullRequest{
Index: 2,
Base: &models.PRBranchInfo{
Name: "master",
Sha: "pkg_base_sha",
Repo: &models.Repository{
Owner: &models.User{UserName: "org"},
Name: "pkg",
SSHURL: "ssh://git@src.opensuse.org/org/pkg.git",
},
},
Head: &models.PRBranchInfo{
Sha: "pkg_head_sha",
},
}
ctl := gomock.NewController(t)
git := mock_common.NewMockGit(ctl)
reviewUnrequestMock := mock_common.NewMockGiteaReviewUnrequester(ctl)
reviewUnrequestMock.EXPECT().UnrequestReview(gomock.Any(), gomock.Any(), gomock.Any(), gomock.Any()).AnyTimes().Return(nil)
test.setup(git, prjPR, pkgPR)
prset := &common.PRSet{
Config: test.config,
PRs: []*common.PRInfo{
{PR: prjPR},
{PR: pkgPR},
},
}
if err := prset.Merge(reviewUnrequestMock, git); err != nil {
t.Errorf("Unexpected error: %v", err)
}
})
}
}
func TestPRAddMergeCommit(t *testing.T) {
pkgPR := &models.PullRequest{
Index: 2,
Base: &models.PRBranchInfo{
Name: "master",
Sha: "pkg_base_sha",
Repo: &models.Repository{
Owner: &models.User{UserName: "org"},
Name: "pkg",
SSHURL: "ssh://git@src.opensuse.org/org/pkg.git",
},
},
Head: &models.PRBranchInfo{
Name: "branch_name",
Sha: "pkg_head_sha",
},
AllowMaintainerEdit: true,
}
config := &common.AutogitConfig{
Organization: "org",
GitProjectName: "org/_ObsPrj#master",
MergeMode: common.MergeModeReplace,
}
ctl := gomock.NewController(t)
git := mock_common.NewMockGit(ctl)
git.EXPECT().GitExec("pkg", "merge", "--no-ff", "--no-commit", "-X", "theirs", pkgPR.Head.Sha).Return(nil)
git.EXPECT().GitExecOrPanic("pkg", "read-tree", "--reset", "-u", pkgPR.Head.Sha)
git.EXPECT().GitExecOrPanic("pkg", "commit", "-m", gomock.Any())
git.EXPECT().GitExecOrPanic("pkg", "push", "origin", "HEAD:branch_name")
git.EXPECT().GitExecWithOutputOrPanic("pkg", "rev-list", "-1", "HEAD").Return("new_head_sha")
prset := &common.PRSet{
Config: config,
PRs: []*common.PRInfo{
{PR: &models.PullRequest{}}, // prjgit at index 0
{PR: pkgPR}, // pkg at index 1
},
}
if res := prset.AddMergeCommit(git, "origin", 1); !res {
t.Errorf("Expected true, got %v", res)
}
}
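TestPRAddMergeCommit mocks the `merge --no-ff --no-commit -X theirs` plus `read-tree --reset -u` sequence. Run against a real throwaway repository, that same sequence produces a merge commit whose tree is byte-identical to the PR head — a standalone sketch (branch and file names are made up for illustration):

```shell
set -e
dir=$(mktemp -d)
cd "$dir"
git init -q .
git config user.email demo@example.com
git config user.name demo
echo base > file; git add file; git commit -qm base
git checkout -q -b pr
echo pr-version > file; git commit -qam pr       # the PR branch
git checkout -q -
echo target-version > file; git commit -qam target  # diverged target branch
# start the merge, auto-resolving toward the PR side, but do not commit yet
git merge --no-ff --no-commit -X theirs pr
# then force index and worktree to the PR head's tree verbatim
git read-tree --reset -u pr
git commit -qm "merge pr, replacing target content"
# the merged tree is now identical to the PR branch's tree:
git rev-parse 'HEAD^{tree}' 'pr^{tree}'
```

The `read-tree --reset -u` step is what makes this a "replace" merge: the merge commit records both parents, but its content is exactly the PR head, regardless of what `-X theirs` resolved.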

View File

@@ -8,12 +8,28 @@ import (
	"src.opensuse.org/autogits/common/gitea-generated/models"
)
type ReviewInterface interface {
IsManualMergeOK() bool
IsApproved() bool
	MissingReviews() []string
FindReviewRequester(reviewer string) *models.TimelineComment
HasPendingReviewBy(reviewer string) bool
IsReviewedBy(reviewer string) bool
IsReviewedByOneOf(reviewers ...string) bool
SetRequiredReviewers(reviewers []string)
}
type PRReviews struct {
	Reviews []*models.PullReview
-	RequestedReviewers []string
+	RequestedReviewers []*models.TimelineComment
	Comments []*models.TimelineComment
-	FullTimeline []*models.TimelineComment
+	RequiredReviewers []string
}
func (r *PRReviews) SetRequiredReviewers(reviewers []string) {
r.RequiredReviewers = reviewers
}
func FetchGiteaReviews(rf GiteaReviewTimelineFetcher, org, repo string, no int64) (*PRReviews, error) {
@@ -28,11 +44,11 @@ func FetchGiteaReviews(rf GiteaReviewTimelineFetcher, org, repo string, no int64
	}
	reviews := make([]*models.PullReview, 0, 10)
-	needNewReviews := []string{}
	var comments []*models.TimelineComment
	var foundUsers []string
	alreadyHaveUserReview := func(user string) bool {
-		if slices.Contains(needNewReviews, user) {
+		if slices.Contains(foundUsers, user) {
			return true
		}
		for _, r := range reviews {
@@ -48,20 +64,24 @@ func FetchGiteaReviews(rf GiteaReviewTimelineFetcher, org, repo string, no int64
	LogDebug("Number of items in timeline:", len(timeline))
	cutOffIdx := len(timeline)
	var PendingRequestedReviews []*models.TimelineComment
	for idx, item := range timeline {
-		if item.Type == TimelineCommentType_Review || item.Type == TimelineCommentType_ReviewRequested {
+		if item.Type == TimelineCommentType_Review {
			for _, r := range rawReviews {
-				if r.ID == item.ReviewID {
+				if r.ID == item.ReviewID && r.User != nil {
					if !alreadyHaveUserReview(r.User.UserName) {
-						if item.Type == TimelineCommentType_Review && idx > cutOffIdx {
-							needNewReviews = append(needNewReviews, r.User.UserName)
-						} else {
+						if idx < cutOffIdx {
							reviews = append(reviews, r)
						}
						foundUsers = append(foundUsers, r.User.UserName)
					}
					break
				}
			}
		} else if item.Type == TimelineCommentType_ReviewRequested && item.Assignee != nil && !alreadyHaveUserReview(item.Assignee.UserName) {
			PendingRequestedReviews = append(PendingRequestedReviews, item)
		} else if item.Type == TimelineCommentType_ReviewDismissed && item.Assignee != nil && !alreadyHaveUserReview(item.Assignee.UserName) {
			foundUsers = append(foundUsers, item.Assignee.UserName)
		} else if item.Type == TimelineCommentType_Comment && cutOffIdx > idx {
			comments = append(comments, item)
		} else if item.Type == TimelineCommentType_PushPull && cutOffIdx == len(timeline) {
@@ -74,9 +94,9 @@ func FetchGiteaReviews(rf GiteaReviewTimelineFetcher, org, repo string, no int64
	LogDebug("num comments:", len(comments), "timeline:", len(reviews))
	return &PRReviews{
		Reviews:  reviews,
		Comments: comments,
-		FullTimeline: timeline,
+		RequestedReviewers: PendingRequestedReviews,
	}, nil
}
@@ -104,7 +124,7 @@ func (r *PRReviews) IsManualMergeOK() bool {
			continue
		}
		LogDebug("comment:", c.User.UserName, c.Body)
-		if slices.Contains(r.RequestedReviewers, c.User.UserName) {
+		if slices.Contains(r.RequiredReviewers, c.User.UserName) {
			if bodyCommandManualMergeOK(c.Body) {
				return true
			}
@@ -115,7 +135,7 @@ func (r *PRReviews) IsManualMergeOK() bool {
		if c.Updated != c.Submitted {
			continue
		}
-		if slices.Contains(r.RequestedReviewers, c.User.UserName) {
+		if slices.Contains(r.RequiredReviewers, c.User.UserName) {
			if bodyCommandManualMergeOK(c.Body) {
				return true
			}
@@ -131,7 +151,7 @@ func (r *PRReviews) IsApproved() bool {
	}
	goodReview := true
-	for _, reviewer := range r.RequestedReviewers {
+	for _, reviewer := range r.RequiredReviewers {
		goodReview = false
		for _, review := range r.Reviews {
			if review.User.UserName == reviewer && review.State == ReviewStateApproved && !review.Stale && !review.Dismissed {
@@ -155,7 +175,7 @@ func (r *PRReviews) MissingReviews() []string {
		return missing
	}
-	for _, reviewer := range r.RequestedReviewers {
+	for _, reviewer := range r.RequiredReviewers {
		if !r.IsReviewedBy(reviewer) {
			missing = append(missing, reviewer)
		}
@@ -168,12 +188,11 @@ func (r *PRReviews) FindReviewRequester(reviewer string) *models.TimelineComment
		return nil
	}
-	for _, r := range r.FullTimeline {
-		if r.Type == TimelineCommentType_ReviewRequested && r.Assignee.UserName == reviewer {
-			return r
+	for _, t := range r.RequestedReviewers {
+		if t.Assignee.UserName == reviewer {
+			return t
		}
	}
	return nil
}
@@ -193,6 +212,13 @@ func (r *PRReviews) HasPendingReviewBy(reviewer string) bool {
		}
	}

	// at this point, we do not have actual review by user. Check if we have a pending review
	for _, t := range r.RequestedReviewers {
		if t.Assignee != nil && t.Assignee.UserName == reviewer {
			return true
		}
	}
	return false
}
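The changes above split the reviewers the config requires (`RequiredReviewers`) out of the pending request-review timeline entries (`RequestedReviewers`); `IsApproved` then demands a live approval from every required reviewer. A stand-alone sketch of that rule, using hypothetical stand-in types (the real ones live in `common` and the generated `models` package):

```go
package main

import "fmt"

// review is a hypothetical stand-in for models.PullReview.
type review struct {
	User      string
	State     string // e.g. "APPROVED", "REQUEST_CHANGES"
	Stale     bool   // invalidated by a later push
	Dismissed bool
}

// isApproved mirrors the rule in PRReviews.IsApproved: every required
// reviewer must have a non-stale, non-dismissed approval on record.
func isApproved(required []string, reviews []review) bool {
	for _, reviewer := range required {
		ok := false
		for _, r := range reviews {
			if r.User == reviewer && r.State == "APPROVED" && !r.Stale && !r.Dismissed {
				ok = true
				break
			}
		}
		if !ok {
			return false
		}
	}
	return true
}

func main() {
	reviews := []review{
		{User: "prj1", State: "APPROVED"},
		{User: "user1", State: "APPROVED", Stale: true}, // approval predates last push
	}
	fmt.Println(isApproved([]string{"prj1"}, reviews))          // true
	fmt.Println(isApproved([]string{"prj1", "user1"}, reviews)) // false
}
```

Reviewers who are merely requested on the PR but not in the required list never block the approval check; they only surface through `HasPendingReviewBy` and `FindReviewRequester`.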

View File

@@ -137,6 +137,61 @@ func TestReviews(t *testing.T) {
			isApproved: false,
			isReviewedByTest1: true,
		},
{
name: "Ghost user review",
reviews: []*models.PullReview{
{State: common.ReviewStateApproved, User: nil},
},
reviewers: []string{"user1"},
isApproved: false,
},
{
name: "ReviewRequested predates PushPull should be seen as pending",
reviews: []*models.PullReview{},
timeline: []*models.TimelineComment{
{Type: common.TimelineCommentType_PushPull},
{Type: common.TimelineCommentType_ReviewRequested, Assignee: &models.User{UserName: "user1"}},
},
reviewers: []string{"user1"},
isPendingByTest1: true,
},
{
name: "ReviewRequested postdates PushPull but blocked by older dismiss",
reviews: []*models.PullReview{},
timeline: []*models.TimelineComment{
{Type: common.TimelineCommentType_ReviewRequested, Assignee: &models.User{UserName: "user1"}},
{Type: common.TimelineCommentType_PushPull},
{Type: common.TimelineCommentType_ReviewDismissed, Assignee: &models.User{UserName: "user1"}},
},
reviewers: []string{"user1"},
isPendingByTest1: true,
},
{
			name: "ReviewRequested predates PushPull with existing request review should be seen as pending",
reviews: []*models.PullReview{
{ID: 101, State: common.ReviewStateRequestReview, User: &models.User{UserName: "user1"}},
},
timeline: []*models.TimelineComment{
{Type: common.TimelineCommentType_PushPull},
{Type: common.TimelineCommentType_ReviewRequested, Assignee: &models.User{UserName: "user1"}},
},
reviewers: []string{"user1"},
isPendingByTest1: true,
},
{
name: "Review requested, review, then push needs re-requesting",
reviews: []*models.PullReview{
{ID: 100, State: common.ReviewStateRequestChanges, User: &models.User{UserName: "user1"}},
},
timeline: []*models.TimelineComment{
{Type: common.TimelineCommentType_PushPull},
{Type: common.TimelineCommentType_Review, ReviewID: 100},
{Type: common.TimelineCommentType_ReviewRequested, Assignee: &models.User{UserName: "user1"}},
},
reviewers: []string{"user1"},
isReviewedByTest1: false, // Should be stale
isPendingByTest1: false, // Should be stale
},
	}
	for _, test := range tests {
@@ -158,7 +213,7 @@ func TestReviews(t *testing.T) {
			}
			return
		}
-		reviews.RequestedReviewers = test.reviewers
+		reviews.SetRequiredReviewers(test.reviewers)
		if r := reviews.IsApproved(); r != test.isApproved {
			t.Fatal("Unexpected IsReviewed():", r, "vs. expected", test.isApproved)

View File

@@ -8,6 +8,7 @@ import (
)

const (
	TimelineCommentType_ReviewDismissed = "dismiss_review"
	TimelineCommentType_ReviewRequested = "review_request"
	TimelineCommentType_Review          = "review"
	TimelineCommentType_PushPull        = "pull_push"

View File

@@ -92,10 +92,13 @@ func ConnectToExchangeForPublish(host, username, password string) {
		auth = username + ":" + password + "@"
	}
-	connection, err := rabbitmq.DialTLS("amqps://"+auth+host, &tls.Config{
-		ServerName: host,
+	connection, err := rabbitmq.DialConfig("amqps://"+auth+host, rabbitmq.Config{
+		Dial: rabbitmq.DefaultDial(10 * time.Second),
+		TLSClientConfig: &tls.Config{
+			ServerName: host,
+		},
	})
-	failOnError(err, "Cannot connect to rabbit.opensuse.org")
+	failOnError(err, "Cannot connect to "+host)
	defer connection.Close()
	ch, err := connection.Channel()

View File

@@ -2,7 +2,7 @@
  <title>openSUSE Leap 16.0 based on SLFO</title>
  <description>Leap 16.0 based on SLES 16.0 (specifically SLFO:1.2)</description>
  <link project="openSUSE:Backports:SLE-16.0"/>
-  <scmsync>http://gitea-test:3000/products/SLFO#main</scmsync>
+  <scmsync>http://gitea-test:3000/myproducts/mySLFO#staging-main</scmsync>
  <person userid="dimstar_suse" role="maintainer"/>
  <person userid="lkocman-factory" role="maintainer"/>
  <person userid="maxlin_factory" role="maintainer"/>

View File

@@ -8,6 +8,7 @@ services:
  gitea:
    build: ./gitea
    container_name: gitea-test
    init: true
    environment:
      - GITEA_WORK_DIR=/var/lib/gitea
    networks:
@@ -27,6 +28,7 @@ services:
  rabbitmq:
    image: rabbitmq:3.13.7-management
    container_name: rabbitmq-test
    init: true
    healthcheck:
      test: ["CMD", "rabbitmq-diagnostics", "check_running", "-q"]
      interval: 30s
@@ -55,6 +57,7 @@ services:
      context: ..
      dockerfile: integration/gitea-events-rabbitmq-publisher/Dockerfile${GIWTF_IMAGE_SUFFIX}
    container_name: gitea-publisher
    init: true
    networks:
      - gitea-network
    depends_on:
@@ -75,6 +78,7 @@ services:
      context: ..
      dockerfile: integration/workflow-pr/Dockerfile${GIWTF_IMAGE_SUFFIX}
    container_name: workflow-pr
    init: true
    networks:
      - gitea-network
    depends_on:
@@ -103,6 +107,7 @@ services:
  mock-obs:
    build: ./mock-obs
    container_name: mock-obs
    init: true
    networks:
      - gitea-network
    ports:
@@ -116,6 +121,7 @@ services:
      context: ..
      dockerfile: integration/obs-staging-bot/Dockerfile${GIWTF_IMAGE_SUFFIX}
    container_name: obs-staging-bot
    init: true
    networks:
      - gitea-network
    depends_on:

View File

@@ -7,4 +7,10 @@ markers =
    t005: Test case 005
    t006: Test case 006
    t007: Test case 007
t008: Test case 008
t009: Test case 009
t010: Test case 010
t011: Test case 011
t012: Test case 012
t013: Test case 013
    dependency: pytest-dependency marker

View File

@@ -59,23 +59,29 @@ The testing will be conducted in a dedicated test environment that mimics the pr
| **TC-SYNC-002** | P | **Update ProjectGit PR from PackageGit PR** | 1. Push a new commit to an existing PackageGit PR. | 1. The corresponding ProjectGit PR's head branch is updated with the new commit. | High |
| **TC-SYNC-003** | P | **WIP Flag Synchronization** | 1. Mark a PackageGit PR as "Work In Progress".<br>2. Remove the WIP flag from the PackageGit PR. | 1. The corresponding ProjectGit PR is also marked as "Work In Progress".<br>2. The WIP flag on the ProjectGit PR is removed. | Medium |
| **TC-SYNC-004** | - | **WIP Flag (multiple referenced package PRs)** | 1. Create a ProjectGit PR that references multiple PackageGit PRs.<br>2. Mark one of the PackageGit PRs as "Work In Progress".<br>3. Remove the "Work In Progress" flag from all PackageGit PRs. | 1. The ProjectGit PR is marked as "Work In Progress".<br>2. The "Work In Progress" flag is removed from the ProjectGit PR only after it has been removed from all associated PackageGit PRs. | Medium |
-| **TC-SYNC-005** | x | **NoProjectGitPR = true, edits disabled** | 1. Set `NoProjectGitPR = true` in `workflow.config`.<br>2. Create a PackageGit PR without "Allow edits from maintainers" enabled. <br>3. Push a new commit to the PackageGit PR. | 1. No ProjectGit PR is created.<br>2. The bot adds a warning comment to the PackageGit PR explaining that it cannot update the PR. | High |
+| **TC-SYNC-005** | S | **NoProjectGitPR = true, edits disabled** | 1. Set `NoProjectGitPR = true` in `workflow.config`.<br>2. Create a PackageGit PR without "Allow edits from maintainers" enabled. <br>3. Push a new commit to the PackageGit PR. | 1. No ProjectGit PR is created.<br>2. The bot adds a warning comment to the PackageGit PR explaining that it cannot update the PR. | High |
-| **TC-SYNC-006** | x | **NoProjectGitPR = true, edits enabled** | 1. Set `NoProjectGitPR = true` in `workflow.config`.<br>2. Create a PackageGit PR with "Allow edits from maintainers" enabled.<br>3. Push a new commit to the PackageGit PR. | 1. No ProjectGit PR is created.<br>2. The submodule commit on the project PR is updated with the new commit from the PackageGit PR. | High |
+| **TC-SYNC-006** | S | **NoProjectGitPR = true, edits enabled** | 1. Set `NoProjectGitPR = true` in `workflow.config`.<br>2. Create a PackageGit PR with "Allow edits from maintainers" enabled.<br>3. Push a new commit to the PackageGit PR. | 1. No ProjectGit PR is created.<br>2. The submodule commit on the project PR is updated with the new commit from the PackageGit PR. | High |
| **TC-COMMENT-001** | - | **Detect duplicate comments** | 1. Create a PackageGit PR.<br>2. Wait for the `workflow-pr` bot to act on the PR.<br>3. Edit the body of the PR to trigger the bot a second time. | 1. The bot should not post a duplicate comment. | High |
| **TC-REVIEW-001** | P | **Add mandatory reviewers** | 1. Create a new PackageGit PR. | 1. All mandatory reviewers are added to both the PackageGit and ProjectGit PRs. | High |
| **TC-REVIEW-002** | - | **Add advisory reviewers** | 1. Create a new PackageGit PR with advisory reviewers defined in the configuration. | 1. Advisory reviewers are added to the PR, but their approval is not required for merging. | Medium |
| **TC-REVIEW-003** | - | **Re-add reviewers** | 1. Push a new commit to a PackageGit PR after it has been approved. | 1. The original reviewers are re-added to the PR. | Medium |
-| **TC-REVIEW-004** | x | **Package PR created by a maintainer** | 1. Create a PackageGit PR from the account of a package maintainer. | 1. No review is requested from other package maintainers. | High |
+| **TC-REVIEW-004** | P | **Package PR created by a maintainer** | 1. Create a PackageGit PR from the account of a package maintainer. | 1. No review is requested from other package maintainers. | High |
| **TC-REVIEW-005** | P | **Package PR created by an external user (approve)** | 1. Create a PackageGit PR from the account of a user who is not a package maintainer.<br>2. One of the package maintainers approves the PR. | 1. All package maintainers are added as reviewers.<br>2. Once one maintainer approves the PR, the other maintainers are removed as reviewers. | High |
| **TC-REVIEW-006** | P | **Package PR created by an external user (reject)** | 1. Create a PackageGit PR from the account of a user who is not a package maintainer.<br>2. One of the package maintainers rejects the PR. | 1. All package maintainers are added as reviewers.<br>2. Once one maintainer rejects the PR, the other maintainers are removed as reviewers. | High |
| **TC-REVIEW-007** | P | **Package PR created by a maintainer with ReviewRequired=true** | 1. Set `ReviewRequired = true` in `workflow.config`.<br>2. Create a PackageGit PR from the account of a package maintainer. | 1. A review is requested from other package maintainers if available. | High | | **TC-REVIEW-007** | P | **Package PR created by a maintainer with ReviewRequired=true** | 1. Set `ReviewRequired = true` in `workflow.config`.<br>2. Create a PackageGit PR from the account of a package maintainer. | 1. A review is requested from other package maintainers if available. | High |
| **TC-MERGE-001** | x | **Automatic Merge** | 1. Create a PackageGit PR.<br>2. Ensure all mandatory reviews are completed on both project and package PRs. | 1. The PR is automatically merged. | High | | **TC-MERGE-001** | P | **Automatic Merge** | 1. Create a PackageGit PR.<br>2. Ensure all mandatory reviews are completed on both project and package PRs. | 1. The PR is automatically merged. | High |
| **TC-MERGE-002** | - | **ManualMergeOnly with Package Maintainer** | 1. Create a PackageGit PR with `ManualMergeOnly` set to `true`.<br>2. Ensure all mandatory reviews are completed on both project and package PRs.<br>3. Comment "merge ok" on the package PR from the account of a package maintainer for that package. | 1. The PR is merged. | High | | **TC-MERGE-002** | P | **ManualMergeOnly with Package Maintainer** | 1. Create a PackageGit PR with `ManualMergeOnly` set to `true`.<br>2. Ensure all mandatory reviews are completed on both project and package PRs.<br>3. Comment "merge ok" on the package PR from the account of a package maintainer (or requested reviewer). | 1. The PR is merged. | High |
| **TC-MERGE-003** | - | **ManualMergeOnly with unauthorized user** | 1. Create a PackageGit PR with `ManualMergeOnly` set to `true`.<br>2. Ensure all mandatory reviews are completed on both project and package PRs.<br>3. Comment "merge ok" on the package PR from the account of a user who is not a maintainer for that package. | 1. The PR is not merged. | High | | **TC-MERGE-003** | - | **ManualMergeOnly with unauthorized user** | 1. Create a PackageGit PR with `ManualMergeOnly` set to `true`.<br>2. Ensure all mandatory reviews are completed on both project and package PRs.<br>3. Comment "merge ok" on the package PR from the account of a user who is not a maintainer for that package. | 1. The PR is not merged. | High |
| **TC-MERGE-004** | - | **ManualMergeOnly with multiple packages** | 1. Create a ProjectGit PR that references multiple PackageGit PRs with `ManualMergeOnly` set to `true`.<br>2. Ensure all mandatory reviews are completed on both project and package PRs.<br>3. Comment "merge ok" on each package PR from the account of a package maintainer. | 1. The PR is merged only after "merge ok" is commented on all associated PackageGit PRs. | High | | **TC-MERGE-004** | - | **ManualMergeOnly with multiple packages** | 1. Create a ProjectGit PR that references multiple PackageGit PRs with `ManualMergeOnly` set to `true`.<br>2. Ensure all mandatory reviews are completed on both project and package PRs.<br>3. Comment "merge ok" on each package PR from the account of a package maintainer. | 1. The PR is merged only after "merge ok" is commented on all associated PackageGit PRs. | High |
| **TC-MERGE-005** | - | **ManualMergeOnly with Project Maintainer** | 1. Create a PackageGit PR with `ManualMergeOnly` set to `true`.<br>2. Ensure all mandatory reviews are completed on both project and package PRs.<br>3. Comment "merge ok" on the package PR from the account of a project maintainer. | 1. The PR is merged. | High | | **TC-MERGE-005** | - | **ManualMergeOnly with Project Maintainer** | 1. Create a PackageGit PR with `ManualMergeOnly` set to `true`.<br>2. Ensure all mandatory reviews are completed on both project and package PRs.<br>3. Comment "merge ok" on the package PR from the account of a project maintainer. | 1. The PR is merged. | High |
| **TC-MERGE-006** | - | **ManualMergeProject with Project Maintainer** | 1. Create a PackageGit PR with `ManualMergeProject` set to `true`.<br>2. Ensure all mandatory reviews are completed on both project and package PRs.<br>3. Comment "merge ok" on the project PR from the account of a project maintainer. | 1. The PR is merged. | High | | **TC-MERGE-006** | - | **ManualMergeProject with Project Maintainer** | 1. Create a PackageGit PR with `ManualMergeProject` set to `true`.<br>2. Ensure all mandatory reviews are completed on both project and package PRs.<br>3. Comment "merge ok" on the project PR from the account of a project maintainer. | 1. The PR is merged. | High |
| **TC-MERGE-007** | - | **ManualMergeProject with unauthorized user** | 1. Create a PackageGit PR with `ManualMergeProject` set to `true`.<br>2. Ensure all mandatory reviews are completed on both project and package PRs.<br>3. Comment "merge ok" on the project PR from the account of a package maintainer. | 1. The PR is not merged. | High | | **TC-MERGE-007** | - | **ManualMergeProject with unauthorized user** | 1. Create a PackageGit PR with `ManualMergeProject` set to `true`.<br>2. Ensure all mandatory reviews are completed on both project and package PRs.<br>3. Comment "merge ok" on the project PR from the account of a package maintainer. | 1. The PR is not merged. | High |
| **TC-MERGE-008** | P | **MergeMode: ff-only (Success)** | 1. Set `MergeMode = "ff-only"`.<br>2. Create a FF-mergeable PackageGit PR.<br>3. Approve reviews on both PRs. | 1. Both PRs are automatically merged successfully. | High |
| **TC-MERGE-009** | P | **MergeMode: ff-only (Failure)** | 1. Set `MergeMode = "ff-only"`.<br>2. Create a PackageGit PR that adds a new file.<br>3. Commit the same file with different content to the base branch to create a content conflict.<br>4. Approve reviews and trigger a sync by pushing another change. | 1. The bot detects it is not FF-mergeable.<br>2. The PR is NOT merged. | High |
| **TC-MERGE-010** | P | **MergeMode: devel (Force-push)** | 1. Set `MergeMode = "devel"`.<br>2. Create a PackageGit PR that adds a new file.<br>3. Commit the same file with different content to the base branch to create a content conflict.<br>4. Approve reviews. | 1. Both PRs are merged.<br>2. The `pkgA` submodule points to the PR's head SHA. | High |
| **TC-MERGE-011** | P | **MergeMode: replace (Merge-commit)** | 1. Set `MergeMode = "replace"`.<br>2. Create a PackageGit PR that adds a new file.<br>3. Enable "Allow edits from maintainers" on the PR.<br>4. Commit the same file with different content to the base branch to create a content conflict.<br>5. Approve reviews. | 1. Both PRs are merged.<br>2. The project branch HEAD is a merge commit (has >1 parent).<br>3. The `pkgA` submodule points to the PR's head SHA. | High |
| **TC-MERGE-012** | P | **MergeMode: devel (No Conflict, Fast-forward)** | 1. Set `MergeMode = "devel"`.<br>2. Create a FF-mergeable PackageGit PR.<br>3. Approve reviews. | 1. Both PRs are merged.<br>2. The package branch HEAD matches the PR head (FF). | High |
| **TC-MERGE-013** | P | **MergeMode: replace (No Conflict, Fast-forward)** | 1. Set `MergeMode = "replace"`.<br>2. Create a FF-mergeable PackageGit PR.<br>3. Approve reviews. | 1. Both PRs are merged.<br>2. The package branch HEAD matches the PR head (FF). | High |
| **TC-CONFIG-001** | - | **Invalid Configuration** | 1. Provide an invalid `workflow.config` file. | 1. The bot reports an error and does not process any PRs. | High | | **TC-CONFIG-001** | - | **Invalid Configuration** | 1. Provide an invalid `workflow.config` file. | 1. The bot reports an error and does not process any PRs. | High |
| **TC-LABEL-001** | P | **Apply `staging/Auto` label** | 1. Create a new PackageGit PR. | 1. The `staging/Auto` label is applied to the ProjectGit PR. | High | | **TC-LABEL-001** | P | **Apply `staging/Auto` label** | 1. Create a new PackageGit PR. | 1. The `staging/Auto` label is applied to the ProjectGit PR. | High |
| **TC-LABEL-002** | x | **Apply `review/Pending` label** | 1. Create a new PackageGit PR. | 1. The `review/Pending` label is applied to the ProjectGit PR when there are pending reviews. | Medium | | **TC-LABEL-002** | x | **Apply `review/Pending` label** | 1. Create a new PackageGit PR. | 1. The `review/Pending` label is applied to the ProjectGit PR when there are pending reviews. | Medium |
@@ -84,6 +90,7 @@ The testing will be conducted in a dedicated test environment that mimics the pr
#### Legend: #### Legend:
* P = implemented and passing; * P = implemented and passing;
* S = skipped because the feature is not implemented yet, to save execution time;
* x = likely implemented, but investigation is needed; * x = likely implemented, but investigation is needed;
* X = implemented and likely to pass, but sometimes fails; troubleshooting is needed; * X = implemented and likely to pass, but sometimes fails; troubleshooting is needed;
* - = test is not implemented * - = test is not implemented

View File

@@ -13,9 +13,9 @@ from tests.lib.common_test_utils import GiteaAPIClient
BRANCH_CONFIG_COMMON = { BRANCH_CONFIG_COMMON = {
"workflow.config": { "workflow.config": {
"Workflows": ["pr"], "Workflows": ["pr"],
"Organization": "pool", "Organization": "mypool",
"Reviewers": ["-autogits_obs_staging_bot"], "Reviewers": ["-autogits_obs_staging_bot"],
"GitProjectName": "products/SLFO#{branch}" "GitProjectName": "myproducts/mySLFO#{branch}"
}, },
"_maintainership.json": { "_maintainership.json": {
"": ["ownerX", "ownerY"], "": ["ownerX", "ownerY"],
@@ -25,7 +25,8 @@ BRANCH_CONFIG_COMMON = {
} }
BRANCH_CONFIG_CUSTOM = { BRANCH_CONFIG_CUSTOM = {
"main": { "main": {},
"staging-main": {
"workflow.config": { "workflow.config": {
"ManualMergeProject": True "ManualMergeProject": True
}, },
@@ -54,6 +55,12 @@ BRANCH_CONFIG_CUSTOM = {
"NoProjectGitPR": True "NoProjectGitPR": True
} }
}, },
"manual-merge": {
"workflow.config": {
"ManualMergeOnly": True,
"Reviewers": ["+usera", "+userb", "-autogits_obs_staging_bot"]
}
},
"label-test": { "label-test": {
"workflow.config": { "workflow.config": {
"ManualMergeProject": True, "ManualMergeProject": True,
@@ -64,9 +71,31 @@ BRANCH_CONFIG_CUSTOM = {
"ReviewPending": "review/Pending" "ReviewPending": "review/Pending"
} }
} }
},
"merge-ff": {
"workflow.config": {
"MergeMode": "ff-only"
}
},
"merge-replace": {
"workflow.config": {
"MergeMode": "replace"
}
},
"merge-devel": {
"workflow.config": {
"MergeMode": "devel"
}
} }
} }
# Global state to track created Gitea objects during a pytest run
_CREATED_ORGS = set()
_CREATED_REPOS = set()
_CREATED_USERS = set()
_CREATED_LABELS = set()
_ADDED_COLLABORATORS = set() # format: (org_repo, username)
def setup_users_from_config(client: GiteaAPIClient, wf: dict, mt: dict): def setup_users_from_config(client: GiteaAPIClient, wf: dict, mt: dict):
""" """
Parses workflow.config and _maintainership.json, creates users, and adds them as collaborators. Parses workflow.config and _maintainership.json, creates users, and adds them as collaborators.
@@ -87,18 +116,27 @@ def setup_users_from_config(client: GiteaAPIClient, wf: dict, mt: dict):
# Create all users # Create all users
for username in all_users: for username in all_users:
client.create_user(username, "password123", f"{username}@example.com") if username not in _CREATED_USERS:
client.add_collaborator("products", "SLFO", username, "write") client.create_user(username, "password123", f"{username}@example.com")
_CREATED_USERS.add(username)
if ("myproducts/mySLFO", username) not in _ADDED_COLLABORATORS:
client.add_collaborator("myproducts", "mySLFO", username, "write")
_ADDED_COLLABORATORS.add(("myproducts/mySLFO", username))
# Set specific repository permissions based on maintainership # Set specific repository permissions based on maintainership
for pkg, users in mt.items(): for pkg, users in mt.items():
repo_name = pkg if pkg else None repo_name = pkg if pkg else None
for username in users: for username in users:
if not repo_name: if not repo_name:
client.add_collaborator("pool", "pkgA", username, "write") for r in ["pkgA", "pkgB"]:
client.add_collaborator("pool", "pkgB", username, "write") if (f"mypool/{r}", username) not in _ADDED_COLLABORATORS:
client.add_collaborator("mypool", r, username, "write")
_ADDED_COLLABORATORS.add((f"mypool/{r}", username))
else: else:
client.add_collaborator("pool", repo_name, username, "write") if (f"mypool/{repo_name}", username) not in _ADDED_COLLABORATORS:
client.add_collaborator("mypool", repo_name, username, "write")
_ADDED_COLLABORATORS.add((f"mypool/{repo_name}", username))
def ensure_config_file(client: GiteaAPIClient, owner: str, repo: str, branch: str, file_name: str, expected_content_dict: dict): def ensure_config_file(client: GiteaAPIClient, owner: str, repo: str, branch: str, file_name: str, expected_content_dict: dict):
""" """
@@ -149,33 +187,41 @@ def gitea_env():
raise Exception("Gitea not available.") raise Exception("Gitea not available.")
print("--- Starting Gitea Global Setup ---") print("--- Starting Gitea Global Setup ---")
client.create_org("products") for org in ["myproducts", "mypool"]:
client.create_org("pool") if org not in _CREATED_ORGS:
client.create_repo("products", "SLFO") client.create_org(org)
client.create_repo("pool", "pkgA") _CREATED_ORGS.add(org)
client.create_repo("pool", "pkgB")
client.update_repo_settings("products", "SLFO") for org, repo in [("myproducts", "mySLFO"), ("mypool", "pkgA"), ("mypool", "pkgB")]:
client.update_repo_settings("pool", "pkgA") if f"{org}/{repo}" not in _CREATED_REPOS:
client.update_repo_settings("pool", "pkgB") client.create_repo(org, repo)
client.update_repo_settings(org, repo)
_CREATED_REPOS.add(f"{org}/{repo}")
# Create labels # Create labels
client.create_label("products", "SLFO", "staging/Backlog", color="#0000ff") for name, color in [("staging/Backlog", "#0000ff"), ("review/Pending", "#ffff00")]:
client.create_label("products", "SLFO", "review/Pending", color="#ffff00") if ("myproducts/mySLFO", name) not in _CREATED_LABELS:
client.create_label("myproducts", "mySLFO", name, color=color)
_CREATED_LABELS.add(("myproducts/mySLFO", name))
# Submodules in SLFO # Submodules in mySLFO
client.add_submodules("products", "SLFO") client.add_submodules("myproducts", "mySLFO")
client.add_collaborator("products", "SLFO", "autogits_obs_staging_bot", "write") for repo_full, bot in [("myproducts/mySLFO", "autogits_obs_staging_bot"),
client.add_collaborator("products", "SLFO", "workflow-pr", "write") ("myproducts/mySLFO", "workflow-pr"),
client.add_collaborator("pool", "pkgA", "workflow-pr", "write") ("mypool/pkgA", "workflow-pr"),
client.add_collaborator("pool", "pkgB", "workflow-pr", "write") ("mypool/pkgB", "workflow-pr")]:
if (repo_full, bot) not in _ADDED_COLLABORATORS:
org_part, repo_part = repo_full.split("/")
client.add_collaborator(org_part, repo_part, bot, "write")
_ADDED_COLLABORATORS.add((repo_full, bot))
restart_needed = False restart_needed = False
# Setup all branches and configs # Setup all branches and configs
for branch_name, custom_configs in BRANCH_CONFIG_CUSTOM.items(): for branch_name, custom_configs in BRANCH_CONFIG_CUSTOM.items():
# Ensure branch exists in all 3 repos # Ensure branch exists in all 3 repos
for owner, repo in [("products", "SLFO"), ("pool", "pkgA"), ("pool", "pkgB")]: for owner, repo in [("myproducts", "mySLFO"), ("mypool", "pkgA"), ("mypool", "pkgB")]:
if branch_name != "main": if branch_name != "main":
try: try:
main_sha = client._request("GET", f"repos/{owner}/{repo}/branches/main").json()["commit"]["id"] main_sha = client._request("GET", f"repos/{owner}/{repo}/branches/main").json()["commit"]["id"]
@@ -201,9 +247,9 @@ def gitea_env():
else: else:
merged_configs[file_name] = custom_content merged_configs[file_name] = custom_content
# Ensure config files in products/SLFO # Ensure config files in myproducts/mySLFO
for file_name, content_dict in merged_configs.items(): for file_name, content_dict in merged_configs.items():
if ensure_config_file(client, "products", "SLFO", branch_name, file_name, content_dict): if ensure_config_file(client, "myproducts", "mySLFO", branch_name, file_name, content_dict):
restart_needed = True restart_needed = True
# Setup users (using configs from this branch) # Setup users (using configs from this branch)
@@ -218,23 +264,47 @@ def gitea_env():
@pytest.fixture(scope="session") @pytest.fixture(scope="session")
def automerge_env(gitea_env): def automerge_env(gitea_env):
return gitea_env, "products/SLFO", "merge" return gitea_env, "myproducts/mySLFO", "merge"
@pytest.fixture(scope="session")
def staging_main_env(gitea_env):
return gitea_env, "myproducts/mySLFO", "staging-main"
@pytest.fixture(scope="session")
def manual_merge_env(gitea_env):
return gitea_env, "myproducts/mySLFO", "manual-merge"
@pytest.fixture(scope="session") @pytest.fixture(scope="session")
def maintainer_env(gitea_env): def maintainer_env(gitea_env):
return gitea_env, "products/SLFO", "maintainer-merge" return gitea_env, "myproducts/mySLFO", "maintainer-merge"
@pytest.fixture(scope="session") @pytest.fixture(scope="session")
def review_required_env(gitea_env): def review_required_env(gitea_env):
return gitea_env, "products/SLFO", "review-required" return gitea_env, "myproducts/mySLFO", "review-required"
@pytest.fixture(scope="session") @pytest.fixture(scope="session")
def no_project_git_pr_env(gitea_env): def no_project_git_pr_env(gitea_env):
return gitea_env, "products/SLFO", "dev" return gitea_env, "myproducts/mySLFO", "dev"
@pytest.fixture(scope="session") @pytest.fixture(scope="session")
def label_env(gitea_env): def label_env(gitea_env):
return gitea_env, "products/SLFO", "label-test" return gitea_env, "myproducts/mySLFO", "label-test"
@pytest.fixture(scope="session")
def merge_ff_env(gitea_env):
return gitea_env, "myproducts/mySLFO", "merge-ff"
@pytest.fixture(scope="session")
def merge_replace_env(gitea_env):
return gitea_env, "myproducts/mySLFO", "merge-replace"
@pytest.fixture(scope="session")
def merge_devel_env(gitea_env):
return gitea_env, "myproducts/mySLFO", "merge-devel"
@pytest.fixture(scope="session")
def usera_client(gitea_env):
return GiteaAPIClient(base_url=gitea_env.base_url, token=gitea_env.headers["Authorization"].split(" ")[1], sudo="usera")
@pytest.fixture(scope="session") @pytest.fixture(scope="session")
def ownerA_client(gitea_env): def ownerA_client(gitea_env):
@@ -256,6 +326,6 @@ def staging_bot_client(gitea_env):
def test_user_client(gitea_env): def test_user_client(gitea_env):
username = f"test-user-{int(time.time())}" username = f"test-user-{int(time.time())}"
gitea_env.create_user(username, "password123", f"{username}@example.com") gitea_env.create_user(username, "password123", f"{username}@example.com")
gitea_env.add_collaborator("pool", "pkgA", username, "write") gitea_env.add_collaborator("mypool", "pkgA", username, "write")
gitea_env.add_collaborator("products", "SLFO", username, "write") gitea_env.add_collaborator("myproducts", "mySLFO", username, "write")
return GiteaAPIClient(base_url=gitea_env.base_url, token=gitea_env.headers["Authorization"].split(" ")[1], sudo=username) return GiteaAPIClient(base_url=gitea_env.base_url, token=gitea_env.headers["Authorization"].split(" ")[1], sudo=username)

View File

@@ -1,21 +1,21 @@
<resultlist state="0fef640bfb56c3e76fcfb698b19b59c0"> <resultlist state="0fef640bfb56c3e76fcfb698b19b59c0">
<result project="SUSE:SLFO:Main:PullRequest:1881" repository="standard" arch="aarch64" code="unpublished" state="unpublished"> <result project="SUSE:SLFO:Main:PullRequest:1881" repository="standard" arch="aarch64" code="unpublished" state="unpublished">
<scmsync>https://src.suse.de/products/SLFO.git?onlybuild=openjpeg2#d99ac14dedf9f44e1744c71aaf221d15f6bed479ca11f15738e98f3bf9ae05a1</scmsync> <scmsync>https://src.suse.de/myproducts/mySLFO.git?onlybuild=openjpeg2#d99ac14dedf9f44e1744c71aaf221d15f6bed479ca11f15738e98f3bf9ae05a1</scmsync>
<scminfo>d99ac14dedf9f44e1744c71aaf221d15f6bed479ca11f15738e98f3bf9ae05a1</scminfo> <scminfo>d99ac14dedf9f44e1744c71aaf221d15f6bed479ca11f15738e98f3bf9ae05a1</scminfo>
<status package="openjpeg2" code="succeeded"/> <status package="openjpeg2" code="succeeded"/>
</result> </result>
<result project="SUSE:SLFO:Main:PullRequest:1881" repository="standard" arch="ppc64le" code="unpublished" state="unpublished"> <result project="SUSE:SLFO:Main:PullRequest:1881" repository="standard" arch="ppc64le" code="unpublished" state="unpublished">
<scmsync>https://src.suse.de/products/SLFO.git?onlybuild=openjpeg2#d99ac14dedf9f44e1744c71aaf221d15f6bed479ca11f15738e98f3bf9ae05a1</scmsync> <scmsync>https://src.suse.de/myproducts/mySLFO.git?onlybuild=openjpeg2#d99ac14dedf9f44e1744c71aaf221d15f6bed479ca11f15738e98f3bf9ae05a1</scmsync>
<scminfo>d99ac14dedf9f44e1744c71aaf221d15f6bed479ca11f15738e98f3bf9ae05a1</scminfo> <scminfo>d99ac14dedf9f44e1744c71aaf221d15f6bed479ca11f15738e98f3bf9ae05a1</scminfo>
<status package="openjpeg2" code="succeeded"/> <status package="openjpeg2" code="succeeded"/>
</result> </result>
<result project="SUSE:SLFO:Main:PullRequest:1881" repository="standard" arch="x86_64" code="unpublished" state="unpublished"> <result project="SUSE:SLFO:Main:PullRequest:1881" repository="standard" arch="x86_64" code="unpublished" state="unpublished">
<scmsync>https://src.suse.de/products/SLFO.git?onlybuild=openjpeg2#d99ac14dedf9f44e1744c71aaf221d15f6bed479ca11f15738e98f3bf9ae05a1</scmsync> <scmsync>https://src.suse.de/myproducts/mySLFO.git?onlybuild=openjpeg2#d99ac14dedf9f44e1744c71aaf221d15f6bed479ca11f15738e98f3bf9ae05a1</scmsync>
<scminfo>d99ac14dedf9f44e1744c71aaf221d15f6bed479ca11f15738e98f3bf9ae05a1</scminfo> <scminfo>d99ac14dedf9f44e1744c71aaf221d15f6bed479ca11f15738e98f3bf9ae05a1</scminfo>
<status package="openjpeg2" code="succeeded"/> <status package="openjpeg2" code="succeeded"/>
</result> </result>
<result project="SUSE:SLFO:Main:PullRequest:1881" repository="standard" arch="s390x" code="unpublished" state="unpublished"> <result project="SUSE:SLFO:Main:PullRequest:1881" repository="standard" arch="s390x" code="unpublished" state="unpublished">
<scmsync>https://src.suse.de/products/SLFO.git?onlybuild=openjpeg2#d99ac14dedf9f44e1744c71aaf221d15f6bed479ca11f15738e98f3bf9ae05a1</scmsync> <scmsync>https://src.suse.de/myproducts/mySLFO.git?onlybuild=openjpeg2#d99ac14dedf9f44e1744c71aaf221d15f6bed479ca11f15738e98f3bf9ae05a1</scmsync>
<scminfo>d99ac14dedf9f44e1744c71aaf221d15f6bed479ca11f15738e98f3bf9ae05a1</scminfo> <scminfo>d99ac14dedf9f44e1744c71aaf221d15f6bed479ca11f15738e98f3bf9ae05a1</scminfo>
<status package="openjpeg2" code="succeeded"/> <status package="openjpeg2" code="succeeded"/>
</result> </result>

View File

@@ -2,7 +2,7 @@
<title>openSUSE Leap 16.0 based on SLFO</title> <title>openSUSE Leap 16.0 based on SLFO</title>
<description>Leap 16.0 based on SLES 16.0 (specifically SLFO:1.2)</description> <description>Leap 16.0 based on SLES 16.0 (specifically SLFO:1.2)</description>
<link project="openSUSE:Backports:SLE-16.0"/> <link project="openSUSE:Backports:SLE-16.0"/>
<scmsync>http://gitea-test:3000/products/SLFO#main</scmsync> <scmsync>http://gitea-test:3000/myproducts/mySLFO#staging-main</scmsync>
<person userid="dimstar_suse" role="maintainer"/> <person userid="dimstar_suse" role="maintainer"/>
<person userid="lkocman-factory" role="maintainer"/> <person userid="lkocman-factory" role="maintainer"/>
<person userid="maxlin_factory" role="maintainer"/> <person userid="maxlin_factory" role="maintainer"/>
@@ -56,4 +56,4 @@
<arch>ppc64le</arch> <arch>ppc64le</arch>
<arch>s390x</arch> <arch>s390x</arch>
</repository> </repository>
</project> </project>

View File

@@ -3,6 +3,7 @@ import time
import pytest import pytest
import requests import requests
import json import json
import re
import xml.etree.ElementTree as ET import xml.etree.ElementTree as ET
from pathlib import Path from pathlib import Path
import base64 import base64
@@ -172,8 +173,8 @@ class GiteaAPIClient:
raise raise
# Get latest commit SHAs for the submodules # Get latest commit SHAs for the submodules
pkg_a_sha = self._request("GET", "repos/pool/pkgA/branches/main").json()["commit"]["id"] pkg_a_sha = self._request("GET", "repos/mypool/pkgA/branches/main").json()["commit"]["id"]
pkg_b_sha = self._request("GET", "repos/pool/pkgB/branches/main").json()["commit"]["id"] pkg_b_sha = self._request("GET", "repos/mypool/pkgB/branches/main").json()["commit"]["id"]
if not pkg_a_sha or not pkg_b_sha: if not pkg_a_sha or not pkg_b_sha:
raise Exception("Error: Could not get submodule commit SHAs. Cannot apply patch.") raise Exception("Error: Could not get submodule commit SHAs. Cannot apply patch.")
@@ -186,10 +187,10 @@ index 0000000..f1838bd
@@ -0,0 +1,6 @@ @@ -0,0 +1,6 @@
+[submodule "pkgA"] +[submodule "pkgA"]
+ path = pkgA + path = pkgA
+ url = ../../pool/pkgA.git + url = ../../mypool/pkgA.git
+[submodule "pkgB"] +[submodule "pkgB"]
+ path = pkgB + path = pkgB
+ url = ../../pool/pkgB.git + url = ../../mypool/pkgB.git
diff --git a/pkgA b/pkgA diff --git a/pkgA b/pkgA
new file mode 160000 new file mode 160000
index 0000000..{pkg_a_sha} index 0000000..{pkg_a_sha}
@@ -272,12 +273,12 @@ index 0000000..{pkg_b_sha}
owner, repo = repo_full_name.split("/") owner, repo = repo_full_name.split("/")
head_owner, head_repo = owner, repo head_owner, head_repo = owner, repo
new_branch_name = f"pr-branch-{int(time.time()*1000)}"
if use_fork: if use_fork:
sudo_user = self.headers.get("Sudo") sudo_user = self.headers.get("Sudo")
head_owner = sudo_user head_owner = sudo_user
head_repo = repo head_repo = repo
new_branch_name = f"pr-branch-{int(time.time()*1000)}"
print(f"--- Forking {repo_full_name} ---") print(f"--- Forking {repo_full_name} ---")
try: try:
@@ -290,29 +291,11 @@ index 0000000..{pkg_b_sha}
else: else:
raise raise
# Create a unique branch in the FORK # Apply the diff using diffpatch and create the new branch automatically
base_commit_sha = self._request("GET", f"repos/{owner}/{repo}/branches/{base_branch}").json()["commit"]["id"] print(f"--- Applying diff to {head_owner}/{head_repo} from {base_branch} to new branch {new_branch_name} ---")
print(f"--- Creating branch {new_branch_name} in {head_owner}/{head_repo} from {base_branch} ({base_commit_sha}) ---")
self._request("POST", f"repos/{head_owner}/{head_repo}/branches", json={
"new_branch_name": new_branch_name,
"old_ref": base_commit_sha
})
else:
new_branch_name = f"pr-branch-{int(time.time()*1000)}"
# Get the latest commit SHA of the base branch from the ORIGINAL repo
base_commit_sha = self._request("GET", f"repos/{owner}/{repo}/branches/{base_branch}").json()["commit"]["id"]
# Try to create the branch in the ORIGINAL repo
print(f"--- Creating branch {new_branch_name} in {repo_full_name} ---")
self._request("POST", f"repos/{owner}/{repo}/branches", json={
"new_branch_name": new_branch_name,
"old_ref": base_commit_sha
})
# Apply the diff using diffpatch in the branch (wherever it is)
print(f"--- Applying diff to {head_owner}/{head_repo} branch {new_branch_name} ---")
self._request("POST", f"repos/{head_owner}/{head_repo}/diffpatch", json={ self._request("POST", f"repos/{head_owner}/{head_repo}/diffpatch", json={
"branch": new_branch_name, "branch": base_branch,
"new_branch": new_branch_name,
"content": diff_content, "content": diff_content,
"message": title "message": title
}) })
@@ -389,6 +372,14 @@ index 0000000..{pkg_b_sha}
response = self._request("PATCH", url, json=kwargs) response = self._request("PATCH", url, json=kwargs)
return response.json() return response.json()
def create_issue_comment(self, repo_full_name: str, issue_number: int, body: str):
owner, repo = repo_full_name.split("/")
url = f"repos/{owner}/{repo}/issues/{issue_number}/comments"
data = {"body": body}
print(f"--- Creating comment on {repo_full_name} issue #{issue_number} ---")
response = self._request("POST", url, json=data)
return response.json()
def get_timeline_events(self, repo_full_name: str, pr_number: int): def get_timeline_events(self, repo_full_name: str, pr_number: int):
owner, repo = repo_full_name.split("/") owner, repo = repo_full_name.split("/")
url = f"repos/{owner}/{repo}/issues/{pr_number}/timeline" url = f"repos/{owner}/{repo}/issues/{pr_number}/timeline"
@@ -522,3 +513,45 @@ index 0000000..{pkg_b_sha}
print(f"Error restarting service {service_name}: {e}") print(f"Error restarting service {service_name}: {e}")
raise raise
+    def wait_for_project_pr(self, package_pr_repo, package_pr_number, project_pr_repo="myproducts/mySLFO", timeout=60):
+        print(f"Polling {package_pr_repo} PR #{package_pr_number} timeline for forwarded PR event in {project_pr_repo}...")
+        for _ in range(timeout):
+            time.sleep(1)
+            timeline_events = self.get_timeline_events(package_pr_repo, package_pr_number)
+            for event in timeline_events:
+                if event.get("type") == "pull_ref":
+                    if not (ref_issue := event.get("ref_issue")):
+                        continue
+                    url_to_check = ref_issue.get("html_url", "")
+                    match = re.search(fr"{project_pr_repo}/pulls/(\d+)", url_to_check)
+                    if match:
+                        return int(match.group(1))
+        return None
+
+    def approve_and_wait_merge(self, package_pr_repo, package_pr_number, project_pr_number, project_pr_repo="myproducts/mySLFO", timeout=30):
+        print(f"Approving reviews and verifying both PRs are merged ({package_pr_repo}#{package_pr_number} and {project_pr_repo}#{project_pr_number})...")
+        package_merged = False
+        project_merged = False
+        for i in range(timeout):
+            self.approve_requested_reviews(package_pr_repo, package_pr_number)
+            self.approve_requested_reviews(project_pr_repo, project_pr_number)
+            if not package_merged:
+                pkg_details = self.get_pr_details(package_pr_repo, package_pr_number)
+                if pkg_details.get("merged"):
+                    package_merged = True
+                    print(f"Package PR {package_pr_repo}#{package_pr_number} merged.")
+            if not project_merged:
+                prj_details = self.get_pr_details(project_pr_repo, project_pr_number)
+                if prj_details.get("merged"):
+                    project_merged = True
+                    print(f"Project PR {project_pr_repo}#{project_pr_number} merged.")
+            if package_merged and project_merged:
+                return True, True
+            time.sleep(1)
+        return package_merged, project_merged
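Both helpers added above share the same poll-with-timeout shape: scan the package PR's timeline for a `pull_ref` event pointing at the project repo, retrying until a deadline. A minimal, self-contained sketch of that pattern — `FakeClient` and its canned timeline are illustrative stand-ins for `GiteaAPIClient` and a live Gitea instance, and the per-iteration `time.sleep(1)` is omitted for brevity:

```python
import re

class FakeClient:
    """Stub standing in for GiteaAPIClient; returns a canned timeline."""
    def get_timeline_events(self, repo, pr_number):
        return [
            {"type": "comment", "body": "unrelated"},
            {"type": "pull_ref",
             "ref_issue": {"html_url": "http://gitea/myproducts/mySLFO/pulls/7"}},
        ]

def wait_for_project_pr(client, repo, pr_number, project_repo="myproducts/mySLFO", timeout=3):
    # Same shape as the helper in the diff: look for a pull_ref event whose
    # ref_issue URL points into the project repo, up to `timeout` attempts.
    for _ in range(timeout):
        for event in client.get_timeline_events(repo, pr_number):
            if event.get("type") != "pull_ref":
                continue
            ref_issue = event.get("ref_issue")
            if not ref_issue:
                continue
            match = re.search(fr"{project_repo}/pulls/(\d+)", ref_issue.get("html_url", ""))
            if match:
                return int(match.group(1))
    return None

print(wait_for_project_pr(FakeClient(), "mypool/pkgA", 1))  # → 7
```

Returning `int(match.group(1))` (rather than the raw string, as the old inline copies in the workflow tests did) keeps later comparisons and API calls consistent.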


@@ -14,42 +14,26 @@ from tests.lib.common_test_utils import (
# =============================================================================
-def test_pr_workflow_succeeded(gitea_env, mock_build_result):
+def test_pr_workflow_succeeded(staging_main_env, mock_build_result):
    """End-to-end test for a successful PR workflow."""
+    gitea_env, test_full_repo_name, merge_branch_name = staging_main_env
    diff = "diff --git a/test.txt b/test.txt\nnew file mode 100644\nindex 0000000..e69de29\n"
-    pr = gitea_env.create_gitea_pr("pool/pkgA", diff, "Test PR - should succeed", False)
+    pr = gitea_env.create_gitea_pr("mypool/pkgA", diff, "Test PR - should succeed", False, base_branch=merge_branch_name)
    initial_pr_number = pr["number"]
    compose_dir = Path(__file__).parent.parent
-    forwarded_pr_number = None
-    print(
-        f"Polling pool/pkgA PR #{initial_pr_number} timeline for forwarded PR event..."
-    )
-    for _ in range(20):
-        time.sleep(1)
-        timeline_events = gitea_env.get_timeline_events("pool/pkgA", initial_pr_number)
-        for event in timeline_events:
-            if event.get("type") == "pull_ref":
-                if not (ref_issue := event.get("ref_issue")):
-                    continue
-                url_to_check = ref_issue.get("html_url", "")
-                match = re.search(r"products/SLFO/pulls/(\d+)", url_to_check)
-                if match:
-                    forwarded_pr_number = match.group(1)
-                    break
-        if forwarded_pr_number:
-            break
+    forwarded_pr_number = gitea_env.wait_for_project_pr("mypool/pkgA", initial_pr_number)
    assert (
        forwarded_pr_number is not None
-    ), "Workflow bot did not create a pull_ref event on the timeline."
-    print(f"Found forwarded PR: products/SLFO #{forwarded_pr_number}")
-    print(f"Polling products/SLFO PR #{forwarded_pr_number} for reviewer assignment...")
+    ), "Workflow bot did not create a project PR."
+    print(f"Found forwarded PR: myproducts/mySLFO #{forwarded_pr_number}")
+    print(f"Polling myproducts/mySLFO PR #{forwarded_pr_number} for reviewer assignment...")
    reviewer_added = False
    for _ in range(15):
        time.sleep(1)
-        pr_details = gitea_env.get_pr_details("products/SLFO", forwarded_pr_number)
+        pr_details = gitea_env.get_pr_details("myproducts/mySLFO", forwarded_pr_number)
        if any(
            r.get("login") == "autogits_obs_staging_bot"
            for r in pr_details.get("requested_reviewers", [])
@@ -69,11 +53,11 @@ def test_pr_workflow_succeeded(gitea_env, mock_build_result):
        capture_output=True,
    )
-    print(f"Polling products/SLFO PR #{forwarded_pr_number} for final status...")
+    print(f"Polling myproducts/mySLFO PR #{forwarded_pr_number} for final status...")
    status_comment_found = False
    for _ in range(20):
        time.sleep(1)
-        timeline_events = gitea_env.get_timeline_events("products/SLFO", forwarded_pr_number)
+        timeline_events = gitea_env.get_timeline_events("myproducts/mySLFO", forwarded_pr_number)
        for event in timeline_events:
            print(event.get("body", "not a body"))
            if event.get("body") and "successful" in event["body"]:
@@ -84,42 +68,26 @@ def test_pr_workflow_succeeded(gitea_env, mock_build_result):
    assert status_comment_found, "Staging bot did not post a 'successful' comment."
-def test_pr_workflow_failed(gitea_env, mock_build_result):
+def test_pr_workflow_failed(staging_main_env, mock_build_result):
    """End-to-end test for a failed PR workflow."""
+    gitea_env, test_full_repo_name, merge_branch_name = staging_main_env
    diff = "diff --git a/another_test.txt b/another_test.txt\nnew file mode 100644\nindex 0000000..e69de29\n"
-    pr = gitea_env.create_gitea_pr("pool/pkgA", diff, "Test PR - should fail", False)
+    pr = gitea_env.create_gitea_pr("mypool/pkgA", diff, "Test PR - should fail", False, base_branch=merge_branch_name)
    initial_pr_number = pr["number"]
    compose_dir = Path(__file__).parent.parent
-    forwarded_pr_number = None
-    print(
-        f"Polling pool/pkgA PR #{initial_pr_number} timeline for forwarded PR event..."
-    )
-    for _ in range(20):
-        time.sleep(1)
-        timeline_events = gitea_env.get_timeline_events("pool/pkgA", initial_pr_number)
-        for event in timeline_events:
-            if event.get("type") == "pull_ref":
-                if not (ref_issue := event.get("ref_issue")):
-                    continue
-                url_to_check = ref_issue.get("html_url", "")
-                match = re.search(r"products/SLFO/pulls/(\d+)", url_to_check)
-                if match:
-                    forwarded_pr_number = match.group(1)
-                    break
-        if forwarded_pr_number:
-            break
+    forwarded_pr_number = gitea_env.wait_for_project_pr("mypool/pkgA", initial_pr_number)
    assert (
        forwarded_pr_number is not None
-    ), "Workflow bot did not create a pull_ref event on the timeline."
-    print(f"Found forwarded PR: products/SLFO #{forwarded_pr_number}")
-    print(f"Polling products/SLFO PR #{forwarded_pr_number} for reviewer assignment...")
+    ), "Workflow bot did not create a project PR."
+    print(f"Found forwarded PR: myproducts/mySLFO #{forwarded_pr_number}")
+    print(f"Polling myproducts/mySLFO PR #{forwarded_pr_number} for reviewer assignment...")
    reviewer_added = False
    for _ in range(15):
        time.sleep(1)
-        pr_details = gitea_env.get_pr_details("products/SLFO", forwarded_pr_number)
+        pr_details = gitea_env.get_pr_details("myproducts/mySLFO", forwarded_pr_number)
        if any(
            r.get("login") == "autogits_obs_staging_bot"
            for r in pr_details.get("requested_reviewers", [])
@@ -139,11 +107,11 @@ def test_pr_workflow_failed(gitea_env, mock_build_result):
        capture_output=True,
    )
-    print(f"Polling products/SLFO PR #{forwarded_pr_number} for final status...")
+    print(f"Polling myproducts/mySLFO PR #{forwarded_pr_number} for final status...")
    status_comment_found = False
    for _ in range(20):
        time.sleep(1)
-        timeline_events = gitea_env.get_timeline_events("products/SLFO", forwarded_pr_number)
+        timeline_events = gitea_env.get_timeline_events("myproducts/mySLFO", forwarded_pr_number)
        for event in timeline_events:
            if event.get("body") and "failed" in event["body"]:
                status_comment_found = True


@@ -29,40 +29,24 @@ def test_001_project_pr_labels(label_env, staging_bot_client):
new file mode 100644
index 0000000..e69de29
"""
-    print(f"--- Creating package PR in pool/pkgA on branch {branch_name} ---")
-    package_pr = gitea_env.create_gitea_pr("pool/pkgA", diff, "Test Labels Fixture", False, base_branch=branch_name)
+    print(f"--- Creating package PR in mypool/pkgA on branch {branch_name} ---")
+    package_pr = gitea_env.create_gitea_pr("mypool/pkgA", diff, "Test Labels Fixture", False, base_branch=branch_name)
    package_pr_number = package_pr["number"]
-    print(f"Created package PR pool/pkgA#{package_pr_number}")
+    print(f"Created package PR mypool/pkgA#{package_pr_number}")
    # 2. Make sure the workflow-pr service created related project PR
-    project_pr_number = None
-    print(f"Polling pool/pkgA PR #{package_pr_number} timeline for forwarded PR event...")
-    for _ in range(40):
-        time.sleep(1)
-        timeline_events = gitea_env.get_timeline_events("pool/pkgA", package_pr_number)
-        for event in timeline_events:
-            if event.get("type") == "pull_ref":
-                if not (ref_issue := event.get("ref_issue")):
-                    continue
-                url_to_check = ref_issue.get("html_url", "")
-                match = re.search(r"products/SLFO/pulls/(\d+)", url_to_check)
-                if match:
-                    project_pr_number = int(match.group(1))
-                    break
-        if project_pr_number:
-            break
+    project_pr_number = gitea_env.wait_for_project_pr("mypool/pkgA", package_pr_number)
    assert project_pr_number is not None, "Workflow bot did not create a project PR."
-    print(f"Found project PR: products/SLFO#{project_pr_number}")
+    print(f"Found project PR: myproducts/mySLFO#{project_pr_number}")
    # 3. Wait for the project PR to have the label "staging/Backlog"
-    print(f"Checking for 'staging/Backlog' label on project PR products/SLFO#{project_pr_number}...")
+    print(f"Checking for 'staging/Backlog' label on project PR myproducts/mySLFO#{project_pr_number}...")
    backlog_label_found = False
    expected_backlog_label = "staging/Backlog"
    for _ in range(20):
-        project_pr_details = gitea_env.get_pr_details("products/SLFO", project_pr_number)
+        project_pr_details = gitea_env.get_pr_details("myproducts/mySLFO", project_pr_number)
        labels = project_pr_details.get("labels", [])
        label_names = [l["name"] for l in labels]
        if expected_backlog_label in label_names:
@@ -70,21 +54,21 @@ index 0000000..e69de29
            break
        time.sleep(1)
-    assert backlog_label_found, f"Project PR products/SLFO#{project_pr_number} does not have the expected label '{expected_backlog_label}'."
-    print(f"Project PR products/SLFO#{project_pr_number} has the expected label '{expected_backlog_label}'.")
+    assert backlog_label_found, f"Project PR myproducts/mySLFO#{project_pr_number} does not have the expected label '{expected_backlog_label}'."
+    print(f"Project PR myproducts/mySLFO#{project_pr_number} has the expected label '{expected_backlog_label}'.")
    # 4. Post approval from autogits_obs_staging_bot
-    print(f"--- Posting approval from autogits_obs_staging_bot on project PR products/SLFO#{project_pr_number} ---")
-    staging_bot_client.create_review("products/SLFO", project_pr_number, event="APPROVED", body="Staging OK")
+    print(f"--- Posting approval from autogits_obs_staging_bot on project PR myproducts/mySLFO#{project_pr_number} ---")
+    staging_bot_client.create_review("myproducts/mySLFO", project_pr_number, event="APPROVED", body="Staging OK")
    # 5. Check that the project PR has the label "review/Pending"
-    print(f"Checking for 'review/Pending' label on project PR products/SLFO#{project_pr_number}...")
+    print(f"Checking for 'review/Pending' label on project PR myproducts/mySLFO#{project_pr_number}...")
    pending_label_found = False
    expected_pending_label = "review/Pending"
    for _ in range(20):
-        project_pr_details = gitea_env.get_pr_details("products/SLFO", project_pr_number)
+        project_pr_details = gitea_env.get_pr_details("myproducts/mySLFO", project_pr_number)
        labels = project_pr_details.get("labels", [])
        label_names = [l["name"] for l in labels]
        print(f"Current labels: {label_names}")
@@ -93,5 +77,5 @@ index 0000000..e69de29
            break
        time.sleep(1)
-    assert pending_label_found, f"Project PR products/SLFO#{project_pr_number} does not have the expected label '{expected_pending_label}'."
-    print(f"Project PR products/SLFO#{project_pr_number} has the expected label '{expected_pending_label}'.")
+    assert pending_label_found, f"Project PR myproducts/mySLFO#{project_pr_number} does not have the expected label '{expected_pending_label}'."
+    print(f"Project PR myproducts/mySLFO#{project_pr_number} has the expected label '{expected_pending_label}'.")


@@ -5,78 +5,486 @@ from pathlib import Path
from tests.lib.common_test_utils import GiteaAPIClient
@pytest.mark.t001
+@pytest.mark.xfail(reason="The bot sometimes re-request reviews despite having all the approvals")
def test_001_automerge(automerge_env, test_user_client):
    """
-    Test scenario:
-    1. Setup custom workflow.config with mandatory reviewers (+usera, +userb).
-    2. Create a package PR in 'merge' branch.
-    3. Make sure the workflow-pr service created related project PR in 'merge' branch.
-    4. React on 'requested' reviews by approving them.
-    5. Make sure both PRs are merged automatically by the workflow-pr service.
+    Test scenario TC-MERGE-001:
+    1. Create a PackageGit PR.
+    2. Ensure all mandatory reviews are completed on both project and package PRs.
+    3. Verify the PR is merged automatically.
    """
    gitea_env, test_full_repo_name, merge_branch_name = automerge_env
    # 1. Create a package PR
-    diff = """diff --git a/merge_test_fixture.txt b/merge_test_fixture.txt
+    diff = """diff --git a/automerge_test.txt b/automerge_test.txt
new file mode 100644
index 0000000..e69de29
"""
-    print(f"--- Creating package PR in pool/pkgA on branch {merge_branch_name} ---")
-    package_pr = test_user_client.create_gitea_pr("pool/pkgA", diff, "Test Automerge Fixture", False, base_branch=merge_branch_name)
+    print(f"--- Creating package PR in mypool/pkgA on branch {merge_branch_name} ---")
+    package_pr = test_user_client.create_gitea_pr("mypool/pkgA", diff, "Test Automerge Fixture", False, base_branch=merge_branch_name)
    package_pr_number = package_pr["number"]
-    print(f"Created package PR pool/pkgA#{package_pr_number}")
+    print(f"Created package PR mypool/pkgA#{package_pr_number}")
    # 2. Make sure the workflow-pr service created related project PR
-    project_pr_number = None
-    print(f"Polling pool/pkgA PR #{package_pr_number} timeline for forwarded PR event...")
-    for _ in range(40):
-        time.sleep(1)
-        timeline_events = gitea_env.get_timeline_events("pool/pkgA", package_pr_number)
-        for event in timeline_events:
-            if event.get("type") == "pull_ref":
-                if not (ref_issue := event.get("ref_issue")):
-                    continue
-                url_to_check = ref_issue.get("html_url", "")
-                match = re.search(r"products/SLFO/pulls/(\d+)", url_to_check)
-                if match:
-                    project_pr_number = int(match.group(1))
-                    break
-        if project_pr_number:
-            break
+    project_pr_number = gitea_env.wait_for_project_pr("mypool/pkgA", package_pr_number)
    assert project_pr_number is not None, "Workflow bot did not create a project PR."
-    print(f"Found project PR: products/SLFO#{project_pr_number}")
-    # 4. Make sure both PRs are merged automatically by the workflow-pr service
-    print("Polling for PR merge status and reacting on REQUEST_REVIEW...")
+    print(f"Found project PR: myproducts/mySLFO#{project_pr_number}")
+    # 3. Approve reviews and verify merged
+    pkg_merged, prj_merged = gitea_env.approve_and_wait_merge("mypool/pkgA", package_pr_number, project_pr_number)
+    assert pkg_merged, f"Package PR mypool/pkgA#{package_pr_number} was not merged automatically."
+    assert prj_merged, f"Project PR myproducts/mySLFO#{project_pr_number} was not merged automatically."
+    print("Both PRs merged successfully.")
+@pytest.mark.t002
+def test_002_manual_merge(manual_merge_env, test_user_client, usera_client, staging_bot_client):
+    """
+    Test scenario TC-MERGE-002:
+    1. Create a PackageGit PR with ManualMergeOnly set to true.
+    2. Ensure all mandatory reviews are completed on both project and package PRs.
+    3. Comment "merge ok" on the package PR from the account of a requested reviewer.
+    4. Verify the PR is merged.
+    """
+    gitea_env, test_full_repo_name, merge_branch_name = manual_merge_env
+    # 1. Create a package PR
+    diff = """diff --git a/manual_merge_test.txt b/manual_merge_test.txt
+new file mode 100644
+index 0000000..e69de29
+"""
+    print(f"--- Creating package PR in mypool/pkgA on branch {merge_branch_name} ---")
+    package_pr = test_user_client.create_gitea_pr("mypool/pkgA", diff, "Test Manual Merge Fixture", False, base_branch=merge_branch_name)
+    package_pr_number = package_pr["number"]
+    print(f"Created package PR mypool/pkgA#{package_pr_number}")
+    # 2. Make sure the workflow-pr service created related project PR
+    project_pr_number = gitea_env.wait_for_project_pr("mypool/pkgA", package_pr_number)
+    assert project_pr_number is not None, "Workflow bot did not create a project PR."
+    print(f"Found project PR: myproducts/mySLFO#{project_pr_number}")
+    # 3. Approve reviews and verify NOT merged
+    print("Waiting for all expected review requests and approving them...")
+    # Expected reviewers based on manual-merge branch config and pkgA maintainership
+    expected_reviewers = {"usera", "userb", "ownerA", "ownerX", "ownerY"}
+    # ManualMergeOnly still requires regular reviews to be satisfied.
+    # We poll until all expected reviewers are requested, then approve them.
+    all_requested = False
+    for _ in range(30):
+        # Trigger approvals for whatever is already requested
+        gitea_env.approve_requested_reviews("mypool/pkgA", package_pr_number)
+        gitea_env.approve_requested_reviews("myproducts/mySLFO", project_pr_number)
+        # Explicitly handle staging bot if it is requested or pending
+        prj_reviews = gitea_env.list_reviews("myproducts/mySLFO", project_pr_number)
+        if any(r["user"]["login"] == "autogits_obs_staging_bot" and r["state"] in ["REQUEST_REVIEW", "PENDING"] for r in prj_reviews):
+            print("Staging bot has a pending/requested review. Approving...")
+            staging_bot_client.create_review("myproducts/mySLFO", project_pr_number, event="APPROVED", body="Staging bot approves")
+        # Check if all expected reviewers have at least one review record (any state)
+        pkg_reviews = gitea_env.list_reviews("mypool/pkgA", package_pr_number)
+        current_reviewers = {r["user"]["login"] for r in pkg_reviews}
+        if expected_reviewers.issubset(current_reviewers):
+            # Also ensure they are all approved (not just requested)
+            approved_reviewers = {r["user"]["login"] for r in pkg_reviews if r["state"] == "APPROVED"}
+            if expected_reviewers.issubset(approved_reviewers):
+                # And check project PR for bot approval
+                prj_approved = any(r["user"]["login"] == "autogits_obs_staging_bot" and r["state"] == "APPROVED" for r in prj_reviews)
+                if prj_approved:
+                    all_requested = True
+                    print(f"All expected reviewers {expected_reviewers} and staging bot have approved.")
+                    break
+        pkg_details = gitea_env.get_pr_details("mypool/pkgA", package_pr_number)
+        prj_details = gitea_env.get_pr_details("myproducts/mySLFO", project_pr_number)
+        assert not pkg_details.get("merged"), "Package PR merged prematurely (ManualMergeOnly ignored?)"
+        assert not prj_details.get("merged"), "Project PR merged prematurely (ManualMergeOnly ignored?)"
+        time.sleep(2)
+    assert all_requested, f"Timed out waiting for all expected reviewers {expected_reviewers} to approve. Current: {current_reviewers}"
+    print("Both PRs have all required approvals but are not merged (as expected with ManualMergeOnly).")
+    # 4. Comment "merge ok" from a requested reviewer (usera)
+    print("Commenting 'merge ok' on package PR...")
+    usera_client.create_issue_comment("mypool/pkgA", package_pr_number, "merge ok")
+    # 5. Verify both PRs are merged
+    print("Polling for PR merge status...")
-    package_merged = False
-    project_merged = False
-    for i in range(15):  # Poll for up to 15 seconds
-        # Package PR
-        if not package_merged:
-            pkg_details = gitea_env.get_pr_details("pool/pkgA", package_pr_number)
-            if pkg_details.get("merged"):
-                package_merged = True
-                print(f"Package PR pool/pkgA#{package_pr_number} merged.")
-            else:
-                gitea_env.approve_requested_reviews("pool/pkgA", package_pr_number)
-        # Project PR
-        if not project_merged:
-            prj_details = gitea_env.get_pr_details("products/SLFO", project_pr_number)
-            if prj_details.get("merged"):
-                project_merged = True
-                print(f"Project PR products/SLFO#{project_pr_number} merged.")
-            else:
-                gitea_env.approve_requested_reviews("products/SLFO", project_pr_number)
-        if package_merged and project_merged:
-            break
-        time.sleep(1)
-    assert package_merged, f"Package PR pool/pkgA#{package_pr_number} was not merged automatically."
-    assert project_merged, f"Project PR products/SLFO#{project_pr_number} was not merged automatically."
-    print("Both PRs merged successfully.")
+    package_merged = False
+    project_merged = False
+    for i in range(20):  # Poll for up to 20 seconds
+        if not package_merged:
+            pkg_details = gitea_env.get_pr_details("mypool/pkgA", package_pr_number)
+            if pkg_details.get("merged"):
+                package_merged = True
+                print(f"Package PR mypool/pkgA#{package_pr_number} merged.")
+        if not project_merged:
+            prj_details = gitea_env.get_pr_details("myproducts/mySLFO", project_pr_number)
+            if prj_details.get("merged"):
+                project_merged = True
+                print(f"Project PR myproducts/mySLFO#{project_pr_number} merged.")
+        if package_merged and project_merged:
+            break
+        time.sleep(1)
+    assert package_merged, f"Package PR mypool/pkgA#{package_pr_number} was not merged after 'merge ok'."
+    assert project_merged, f"Project PR myproducts/mySLFO#{project_pr_number} was not merged after 'merge ok'."
+    print("Both PRs merged successfully after 'merge ok'.")
+@pytest.mark.t003
+def test_003_refuse_manual_merge(manual_merge_env, test_user_client, ownerB_client, staging_bot_client):
+    """
+    Test scenario TC-MERGE-003:
+    1. Create a PackageGit PR with ManualMergeOnly set to true.
+    2. Ensure all mandatory reviews are completed on both project and package PRs.
+    3. Comment "merge ok" on the package PR from the account of a reviewer that was not requested.
+    4. Verify the PR is not merged.
+    """
+    gitea_env, test_full_repo_name, merge_branch_name = manual_merge_env
+    # 1. Create a package PR
+    diff = """diff --git a/manual_merge_test.txt b/manual_merge_test.txt
+new file mode 100644
+index 0000000..e69de29
+"""
+    print(f"--- Creating package PR in mypool/pkgA on branch {merge_branch_name} ---")
+    package_pr = test_user_client.create_gitea_pr("mypool/pkgA", diff, "Test Manual Merge Fixture", False, base_branch=merge_branch_name)
+    package_pr_number = package_pr["number"]
+    print(f"Created package PR mypool/pkgA#{package_pr_number}")
+    # 2. Make sure the workflow-pr service created related project PR
+    project_pr_number = gitea_env.wait_for_project_pr("mypool/pkgA", package_pr_number)
+    assert project_pr_number is not None, "Workflow bot did not create a project PR."
+    print(f"Found project PR: myproducts/mySLFO#{project_pr_number}")
+    # 3. Approve reviews and verify NOT merged
+    print("Waiting for all expected review requests and approving them...")
+    # Expected reviewers based on manual-merge branch config and pkgA maintainership
+    expected_reviewers = {"usera", "userb", "ownerA", "ownerX", "ownerY"}
+    # ManualMergeOnly still requires regular reviews to be satisfied.
+    # We poll until all expected reviewers are requested, then approve them.
+    all_requested = False
+    for _ in range(30):
+        # Trigger approvals for whatever is already requested
+        gitea_env.approve_requested_reviews("mypool/pkgA", package_pr_number)
+        gitea_env.approve_requested_reviews("myproducts/mySLFO", project_pr_number)
+        # Explicitly handle staging bot if it is requested or pending
+        prj_reviews = gitea_env.list_reviews("myproducts/mySLFO", project_pr_number)
+        if any(r["user"]["login"] == "autogits_obs_staging_bot" and r["state"] in ["REQUEST_REVIEW", "PENDING"] for r in prj_reviews):
+            print("Staging bot has a pending/requested review. Approving...")
+            staging_bot_client.create_review("myproducts/mySLFO", project_pr_number, event="APPROVED", body="Staging bot approves")
+        # Check if all expected reviewers have at least one review record (any state)
+        pkg_reviews = gitea_env.list_reviews("mypool/pkgA", package_pr_number)
+        current_reviewers = {r["user"]["login"] for r in pkg_reviews}
+        if expected_reviewers.issubset(current_reviewers):
+            # Also ensure they are all approved (not just requested)
+            approved_reviewers = {r["user"]["login"] for r in pkg_reviews if r["state"] == "APPROVED"}
+            if expected_reviewers.issubset(approved_reviewers):
+                # And check project PR for bot approval
+                prj_approved = any(r["user"]["login"] == "autogits_obs_staging_bot" and r["state"] == "APPROVED" for r in prj_reviews)
+                if prj_approved:
+                    all_requested = True
+                    print(f"All expected reviewers {expected_reviewers} and staging bot have approved.")
+                    break
+        pkg_details = gitea_env.get_pr_details("mypool/pkgA", package_pr_number)
+        prj_details = gitea_env.get_pr_details("myproducts/mySLFO", project_pr_number)
+        assert not pkg_details.get("merged"), "Package PR merged prematurely (ManualMergeOnly ignored?)"
+        assert not prj_details.get("merged"), "Project PR merged prematurely (ManualMergeOnly ignored?)"
+        time.sleep(2)
+    assert all_requested, f"Timed out waiting for all expected reviewers {expected_reviewers} to approve. Current: {current_reviewers}"
+    print("Both PRs have all required approvals but are not merged (as expected with ManualMergeOnly).")
+    # 4. Comment "merge ok" from a reviewer that was not requested (ownerB)
+    print("Commenting 'merge ok' on package PR as user ownerB ...")
+    ownerB_client.create_issue_comment("mypool/pkgA", package_pr_number, "merge ok")
+    # 5. Verify both PRs stay unmerged
+    print("Polling for PR merge status...")
+    package_merged = False
+    project_merged = False
+    for i in range(20):  # Poll for up to 20 seconds
+        if not package_merged:
+            pkg_details = gitea_env.get_pr_details("mypool/pkgA", package_pr_number)
+            if pkg_details.get("merged"):
+                package_merged = True
+                print(f"Package PR mypool/pkgA#{package_pr_number} merged.")
+        if not project_merged:
+            prj_details = gitea_env.get_pr_details("myproducts/mySLFO", project_pr_number)
+            if prj_details.get("merged"):
+                project_merged = True
+                print(f"Project PR myproducts/mySLFO#{project_pr_number} merged.")
+        if package_merged and project_merged:
+            break
+        time.sleep(1)
+    assert not package_merged, f"Package PR mypool/pkgA#{package_pr_number} was merged after 'merge ok'."
+    assert not project_merged, f"Project PR myproducts/mySLFO#{project_pr_number} was merged after 'merge ok'."
+    print("Neither PR was merged after 'merge ok' from a non-requested reviewer.")
+@pytest.mark.t008
+def test_008_merge_mode_ff_only_success(merge_ff_env, test_user_client):
+    """
+    Test MergeMode "ff-only" - Success case (FF-mergeable)
+    """
+    gitea_env, test_full_repo_name, merge_branch_name = merge_ff_env
+    # 1. Create a package PR (this will be FF-mergeable by default)
+    diff = """diff --git a/ff_test.txt b/ff_test.txt
+new file mode 100644
+index 0000000..e69de29
+"""
+    package_pr = test_user_client.create_gitea_pr("mypool/pkgA", diff, "Test FF Merge", False, base_branch=merge_branch_name)
+    package_pr_number = package_pr["number"]
+    project_pr_number = gitea_env.wait_for_project_pr("mypool/pkgA", package_pr_number)
+    assert project_pr_number is not None
+    pkg_merged, prj_merged = gitea_env.approve_and_wait_merge("mypool/pkgA", package_pr_number, project_pr_number)
+    assert pkg_merged and prj_merged
+@pytest.mark.t009
+def test_009_merge_mode_ff_only_failure(merge_ff_env, ownerA_client):
+    """
+    Test MergeMode "ff-only" - Failure case (Content Conflict, should NOT merge)
+    """
+    gitea_env, test_full_repo_name, merge_branch_name = merge_ff_env
+    ts = time.strftime("%H%M%S")
+    filename = f"ff_fail_test_{ts}.txt"
+    # 1. Create a package PR that adds a file
+    diff = f"""diff --git a/{filename} b/{filename}
+new file mode 100644
+index 0000000..e69de29
+--- /dev/null
++++ b/{filename}
+@@ -0,0 +1 @@
++PR content
+"""
+    package_pr = ownerA_client.create_gitea_pr("mypool/pkgA", diff, "Test FF Merge Failure (Conflict)", False, base_branch=merge_branch_name)
+    package_pr_number = package_pr["number"]
+    # 2. Wait for project PR to be created
+    project_pr_number = gitea_env.wait_for_project_pr("mypool/pkgA", package_pr_number)
+    assert project_pr_number is not None
+    print("Making PR non-FF by creating a content conflict in the base branch...")
+    gitea_env.create_file("mypool", "pkgA", filename, "Conflicting base content\n", branch=merge_branch_name)
+    print("Approving reviews initially...")
+    gitea_env.approve_requested_reviews("mypool/pkgA", package_pr_number)
+    gitea_env.approve_requested_reviews("myproducts/mySLFO", project_pr_number)
+    print("Pushing another change to PR branch to trigger sync...")
+    gitea_env.modify_gitea_pr("mypool/pkgA", package_pr_number,
+                              "diff --git a/sync_test.txt b/sync_test.txt\nnew file mode 100644\nindex 0000000..e69de29\n",
+                              "Trigger Sync")
+    # The bot should detect it's not FF and NOT merge, and re-request reviews because of the new commit
+    print("Waiting for reviews to be re-requested and approving again...")
+    time.sleep(10)  # Wait for bot to process sync
+    # Approve again and verify it is NOT merged
+    print("Approving again and verifying PR is NOT merged (because it's not FF)...")
+    for i in range(15):
+        gitea_env.approve_requested_reviews("mypool/pkgA", package_pr_number)
+        gitea_env.approve_requested_reviews("myproducts/mySLFO", project_pr_number)
+        time.sleep(1)
+    pkg_details = gitea_env.get_pr_details("mypool/pkgA", package_pr_number)
+    assert not pkg_details.get("merged"), "Package PR merged despite NOT being FF-mergeable!"
+    print("FF-only failure (not merged after sync) verified.")
+@pytest.mark.t010
+def test_010_merge_mode_devel_success(merge_devel_env, ownerA_client):
+    """
+    Test MergeMode "devel" - Success case (Content Conflict, should still merge via force-push)
+    """
+    gitea_env, test_full_repo_name, merge_branch_name = merge_devel_env
+    ts = time.strftime("%H%M%S")
+    filename = f"devel_test_{ts}.txt"
+    # 1. Create a package PR that adds a file
+    diff = f"""diff --git a/{filename} b/{filename}
+new file mode 100644
+index 0000000..e69de29
+--- /dev/null
++++ b/{filename}
+@@ -0,0 +1 @@
++PR content
+"""
+    package_pr = ownerA_client.create_gitea_pr("mypool/pkgA", diff, "Test Devel Merge (Conflict)", False, base_branch=merge_branch_name)
+    package_pr_number = package_pr["number"]
+    # 2. Create a content conflict by committing the same file to the base branch
+    gitea_env.create_file("mypool", "pkgA", filename, "Conflicting base content\n", branch=merge_branch_name)
+    project_pr_number = gitea_env.wait_for_project_pr("mypool/pkgA", package_pr_number)
+    assert project_pr_number is not None
+    # Before merge, get the head sha of the package pr and project pr
+    pkg_details = gitea_env.get_pr_details("mypool/pkgA", package_pr_number)
+    pkg_head_sha = pkg_details["head"]["sha"]
+    prj_details = gitea_env.get_pr_details("myproducts/mySLFO", project_pr_number)
+    prj_head_sha = prj_details["head"]["sha"]
+    pkg_merged, prj_merged = gitea_env.approve_and_wait_merge("mypool/pkgA", package_pr_number, project_pr_number)
+    assert pkg_merged and prj_merged
+    print("Devel merge (force-push) successful.")
+    # Verify that pkgA submodule points to the correct SHA
+    pkgA_submodule_info = gitea_env.get_file_info("myproducts", "mySLFO", "pkgA", branch=merge_branch_name)
+    assert pkgA_submodule_info["sha"] == pkg_head_sha, f"Submodule pkgA should point to {pkg_head_sha} but points to {pkgA_submodule_info['sha']}"
@pytest.mark.t011
def test_011_merge_mode_replace_success(merge_replace_env, ownerA_client):
"""
Test MergeMode "replace" - Success case (Content Conflict, bot should add merge commit)
"""
gitea_env, test_full_repo_name, merge_branch_name = merge_replace_env
ts = time.strftime("%H%M%S")
filename = f"replace_test_{ts}.txt"
# 1. Create a package PR that adds a file
diff = f"""diff --git a/{filename} b/{filename}
new file mode 100644
index 0000000..e69de29
--- /dev/null
+++ b/{filename}
@@ -0,0 +1 @@
+PR content
"""
package_pr = ownerA_client.create_gitea_pr("mypool/pkgA", diff, "Test Replace Merge (Conflict)", False, base_branch=merge_branch_name)
package_pr_number = package_pr["number"]
# Enable "Allow edits from maintainers"
ownerA_client.update_gitea_pr_properties("mypool/pkgA", package_pr_number, allow_maintainer_edit=True)
# 2. Create a content conflict by committing the same file to the base branch
gitea_env.create_file("mypool", "pkgA", filename, "Conflicting base content\n", branch=merge_branch_name)
project_pr_number = gitea_env.wait_for_project_pr("mypool/pkgA", package_pr_number)
assert project_pr_number is not None
# Before merge, get the head sha of the package pr and project pr
pkg_details = gitea_env.get_pr_details("mypool/pkgA", package_pr_number)
pkg_head_sha = pkg_details["head"]["sha"]
prj_details = gitea_env.get_pr_details("myproducts/mySLFO", project_pr_number)
prj_head_sha = prj_details["head"]["sha"]
pkg_merged, prj_merged = gitea_env.approve_and_wait_merge("mypool/pkgA", package_pr_number, project_pr_number, timeout=60)
assert pkg_merged and prj_merged
print("Replace merge successful.")
# Verify that the project branch HEAD is a merge commit
branch_info = gitea_env._request("GET", f"repos/myproducts/mySLFO/branches/{merge_branch_name}").json()
new_head_sha = branch_info["commit"]["id"]
commit_details = gitea_env._request("GET", f"repos/myproducts/mySLFO/git/commits/{new_head_sha}").json()
assert len(commit_details["parents"]) > 1, f"Project branch {merge_branch_name} HEAD should be a merge commit but has {len(commit_details['parents'])} parents"
# Verify that pkgA submodule points to the correct SHA
pkgA_submodule_info = gitea_env.get_file_info("myproducts", "mySLFO", "pkgA", branch=merge_branch_name)
assert pkgA_submodule_info["sha"] == pkg_head_sha, f"Submodule pkgA should point to {pkg_head_sha} but points to {pkgA_submodule_info['sha']}"
@pytest.mark.t012
def test_012_merge_mode_devel_ff_success(merge_devel_env, ownerA_client):
"""
Test MergeMode "devel" - Success case (No Conflict, should fast-forward)
"""
gitea_env, test_full_repo_name, merge_branch_name = merge_devel_env
ts = time.strftime("%H%M%S")
filename = f"devel_ff_test_{ts}.txt"
# 1. Create a package PR (this will be FF-mergeable by default)
diff = f"""diff --git a/{filename} b/{filename}
new file mode 100644
index 0000000..e69de29
--- /dev/null
+++ b/{filename}
@@ -0,0 +1 @@
+PR content
"""
package_pr = ownerA_client.create_gitea_pr("mypool/pkgA", diff, "Test Devel FF Merge", False, base_branch=merge_branch_name)
package_pr_number = package_pr["number"]
project_pr_number = gitea_env.wait_for_project_pr("mypool/pkgA", package_pr_number)
assert project_pr_number is not None
pkg_details = gitea_env.get_pr_details("mypool/pkgA", package_pr_number)
pkg_head_sha = pkg_details["head"]["sha"]
pkg_merged, prj_merged = gitea_env.approve_and_wait_merge("mypool/pkgA", package_pr_number, project_pr_number)
assert pkg_merged and prj_merged
print("Devel FF merge successful.")
# Verify that the package base branch HEAD is the same as the PR head (FF)
branch_info = gitea_env._request("GET", f"repos/mypool/pkgA/branches/{merge_branch_name}").json()
new_head_sha = branch_info["commit"]["id"]
assert new_head_sha == pkg_head_sha, f"Package branch {merge_branch_name} HEAD should be {pkg_head_sha} but is {new_head_sha}"
commit_details = gitea_env._request("GET", f"repos/mypool/pkgA/git/commits/{new_head_sha}").json()
assert len(commit_details["parents"]) == 1, f"Package branch {merge_branch_name} HEAD should have 1 parent but has {len(commit_details['parents'])}"
@pytest.mark.t013
def test_013_merge_mode_replace_ff_success(merge_replace_env, ownerA_client):
"""
Test MergeMode "replace" - Success case (No Conflict, should fast-forward)
"""
gitea_env, test_full_repo_name, merge_branch_name = merge_replace_env
ts = time.strftime("%H%M%S")
filename = f"replace_ff_test_{ts}.txt"
# 1. Create a package PR (this will be FF-mergeable by default)
diff = f"""diff --git a/{filename} b/{filename}
new file mode 100644
index 0000000..e69de29
--- /dev/null
+++ b/{filename}
@@ -0,0 +1 @@
+PR content
"""
package_pr = ownerA_client.create_gitea_pr("mypool/pkgA", diff, "Test Replace FF Merge", False, base_branch=merge_branch_name)
package_pr_number = package_pr["number"]
project_pr_number = gitea_env.wait_for_project_pr("mypool/pkgA", package_pr_number)
assert project_pr_number is not None
pkg_details = gitea_env.get_pr_details("mypool/pkgA", package_pr_number)
pkg_head_sha = pkg_details["head"]["sha"]
pkg_merged, prj_merged = gitea_env.approve_and_wait_merge("mypool/pkgA", package_pr_number, project_pr_number)
assert pkg_merged and prj_merged
print("Replace FF merge successful.")
# Verify that the package base branch HEAD is the same as the PR head (FF)
branch_info = gitea_env._request("GET", f"repos/mypool/pkgA/branches/{merge_branch_name}").json()
new_head_sha = branch_info["commit"]["id"]
assert new_head_sha == pkg_head_sha, f"Package branch {merge_branch_name} HEAD should be {pkg_head_sha} but is {new_head_sha}"
commit_details = gitea_env._request("GET", f"repos/mypool/pkgA/git/commits/{new_head_sha}").json()
assert len(commit_details["parents"]) == 1, f"Package branch {merge_branch_name} HEAD should have 1 parent but has {len(commit_details['parents'])}"
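The merge-mode tests above all funnel through `gitea_env.approve_and_wait_merge(...)`. A minimal sketch of what such a helper could look like, assuming the `get_pr_details` / `list_reviews` / `approve_requested_reviews` calls used elsewhere in this suite; the parameter names, `attempts` count, and `delay` knob here are assumptions, and the polling structure mirrors the loop in test_004:

```python
import time


def approve_and_wait_merge(env, pkg_repo, pkg_pr_number, prj_pr_number,
                           prj_repo="myproducts/mySLFO", attempts=15, delay=1):
    """Poll the package and project PRs until both are merged.

    Along the way, approve ONLY the staging bot's review requests --
    never a human reviewer's. Returns (pkg_merged, prj_merged).
    """
    pkg_merged = prj_merged = False
    for _ in range(attempts):
        for repo, number in ((pkg_repo, pkg_pr_number), (prj_repo, prj_pr_number)):
            if env.get_pr_details(repo, number).get("merged"):
                continue
            # Approve the bot's pending review request, if any.
            reviews = env.list_reviews(repo, number)
            if any(r["state"] == "REQUEST_REVIEW"
                   and r["user"]["login"] == "autogits_obs_staging_bot"
                   for r in reviews):
                env.approve_requested_reviews(repo, number)
        pkg_merged = bool(env.get_pr_details(pkg_repo, pkg_pr_number).get("merged"))
        prj_merged = bool(env.get_pr_details(prj_repo, prj_pr_number).get("merged"))
        if pkg_merged and prj_merged:
            break
        time.sleep(delay)
    return pkg_merged, prj_merged
```

Keeping the bot-only approval inside the helper is what lets the tests assert that human approvals (or their absence) alone decide the outcome.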


@@ -15,22 +15,24 @@ def test_001_review_requests_matching_config(automerge_env, ownerA_client):
     """
     gitea_env, test_full_repo_name, branch_name = automerge_env
-    # 1. Create a package PR for pool/pkgB as ownerA
-    diff = """diff --git a/pkgB_test_001.txt b/pkgB_test_001.txt
+    # 1. Create a package PR for mypool/pkgB as ownerA
+    ts = int(time.time() * 1000)
+    filename = f"pkgB_test_{ts}.txt"
+    diff = f"""diff --git a/{filename} b/{filename}
     new file mode 100644
     index 0000000..e69de29
     """
-    print(f"--- Creating package PR in pool/pkgB on branch {branch_name} as ownerA ---")
-    package_pr = ownerA_client.create_gitea_pr("pool/pkgB", diff, "Test Review Requests Config", True, base_branch=branch_name)
+    print(f"--- Creating package PR in mypool/pkgB on branch {branch_name} as ownerA ---")
+    package_pr = ownerA_client.create_gitea_pr("mypool/pkgB", diff, "Test Review Requests Config", True, base_branch=branch_name)
     package_pr_number = package_pr["number"]
-    print(f"Created package PR pool/pkgB#{package_pr_number}")
+    print(f"Created package PR mypool/pkgB#{package_pr_number}")
     # 2. Check that review requests came to ownerB, ownerBB, usera, and userb
     print("Checking for review requests from maintainers and workflow.config...")
     reviewers_requested = set()
     expected_reviewers = {"ownerB", "ownerBB", "usera", "userb"}
     for _ in range(30):
-        reviews = gitea_env.list_reviews("pool/pkgB", package_pr_number)
+        reviews = gitea_env.list_reviews("mypool/pkgB", package_pr_number)
         reviewers_requested = {r["user"]["login"] for r in reviews if r["state"] == "REQUEST_REVIEW"}
         if expected_reviewers.issubset(reviewers_requested):
             break
@@ -43,7 +45,6 @@ index 0000000..e69de29
 @pytest.mark.t004
-@pytest.mark.xfail(reason="the bot sometimes re-requests review from autogits_obs_staging_bot despite having the approval")
 def test_004_maintainer(maintainer_env, ownerA_client):
     """
     Test scenario:
@@ -62,9 +63,9 @@ def test_004_maintainer(maintainer_env, ownerA_client):
     # 0.1 Verify all users from config exist
     print("--- Verifying all users from config exist ---")
     import json
-    wf_file = gitea_env.get_file_info("products", "SLFO", "workflow.config", branch=branch_name)
+    wf_file = gitea_env.get_file_info("myproducts", "mySLFO", "workflow.config", branch=branch_name)
     wf = json.loads(base64.b64decode(wf_file["content"]).decode("utf-8"))
-    mt_file = gitea_env.get_file_info("products", "SLFO", "_maintainership.json", branch=branch_name)
+    mt_file = gitea_env.get_file_info("myproducts", "mySLFO", "_maintainership.json", branch=branch_name)
     mt = json.loads(base64.b64decode(mt_file["content"]).decode("utf-8"))
     expected_users = set()
@@ -81,35 +82,21 @@ def test_004_maintainer(maintainer_env, ownerA_client):
         print(f"Verified user exists: {username}")
     # 1. Create a package PR as ownerA
-    diff = """diff --git a/maintainer_test_fixture.txt b/maintainer_test_fixture.txt
+    ts = int(time.time() * 1000)
+    filename = f"maintainer_test_{ts}.txt"
+    diff = f"""diff --git a/{filename} b/{filename}
     new file mode 100644
     index 0000000..e69de29
     """
-    print(f"--- Creating package PR in pool/pkgA on branch {branch_name} as ownerA ---")
-    package_pr = ownerA_client.create_gitea_pr("pool/pkgA", diff, "Test Maintainer Merge", True, base_branch=branch_name)
+    print(f"--- Creating package PR in mypool/pkgA on branch {branch_name} as ownerA ---")
+    package_pr = ownerA_client.create_gitea_pr("mypool/pkgA", diff, "Test Maintainer Merge", True, base_branch=branch_name)
     package_pr_number = package_pr["number"]
-    print(f"Created package PR pool/pkgA#{package_pr_number}")
+    print(f"Created package PR mypool/pkgA#{package_pr_number}")
     # 2. Make sure the workflow-pr service created related project PR
-    project_pr_number = None
-    print(f"Polling pool/pkgA PR #{package_pr_number} timeline for forwarded PR event...")
-    for _ in range(40):
-        time.sleep(1)
-        timeline_events = gitea_env.get_timeline_events("pool/pkgA", package_pr_number)
-        for event in timeline_events:
-            if event.get("type") == "pull_ref":
-                if not (ref_issue := event.get("ref_issue")):
-                    continue
-                url_to_check = ref_issue.get("html_url", "")
-                match = re.search(r"products/SLFO/pulls/(\d+)", url_to_check)
-                if match:
-                    project_pr_number = int(match.group(1))
-                    break
-        if project_pr_number:
-            break
+    project_pr_number = gitea_env.wait_for_project_pr("mypool/pkgA", package_pr_number)
     assert project_pr_number is not None, "Workflow bot did not create a project PR."
-    print(f"Found project PR: products/SLFO#{project_pr_number}")
+    print(f"Found project PR: myproducts/mySLFO#{project_pr_number}")
     # 3. Make sure both PRs are merged automatically WITHOUT manual approvals
     print("Polling for PR merge status (only bot approval allowed)...")
@@ -119,35 +106,35 @@ index 0000000..e69de29
     for i in range(15):  # Poll for up to 15 seconds
         # Package PR
         if not package_merged:
-            pkg_details = gitea_env.get_pr_details("pool/pkgA", package_pr_number)
+            pkg_details = gitea_env.get_pr_details("mypool/pkgA", package_pr_number)
             if pkg_details.get("merged"):
                 package_merged = True
-                print(f"Package PR pool/pkgA#{package_pr_number} merged.")
+                print(f"Package PR mypool/pkgA#{package_pr_number} merged.")
             else:
                 # Approve ONLY bot if requested
-                reviews = gitea_env.list_reviews("pool/pkgA", package_pr_number)
+                reviews = gitea_env.list_reviews("mypool/pkgA", package_pr_number)
                 if any(r["state"] == "REQUEST_REVIEW" and r["user"]["login"] == "autogits_obs_staging_bot" for r in reviews):
-                    gitea_env.approve_requested_reviews("pool/pkgA", package_pr_number)
+                    gitea_env.approve_requested_reviews("mypool/pkgA", package_pr_number)
         # Project PR
         if not project_merged:
-            prj_details = gitea_env.get_pr_details("products/SLFO", project_pr_number)
+            prj_details = gitea_env.get_pr_details("myproducts/mySLFO", project_pr_number)
             if prj_details.get("merged"):
                 project_merged = True
-                print(f"Project PR products/SLFO#{project_pr_number} merged.")
+                print(f"Project PR myproducts/mySLFO#{project_pr_number} merged.")
             else:
                 # Approve ONLY bot if requested
-                reviews = gitea_env.list_reviews("products/SLFO", project_pr_number)
+                reviews = gitea_env.list_reviews("myproducts/mySLFO", project_pr_number)
                 if any(r["state"] == "REQUEST_REVIEW" and r["user"]["login"] == "autogits_obs_staging_bot" for r in reviews):
-                    gitea_env.approve_requested_reviews("products/SLFO", project_pr_number)
+                    gitea_env.approve_requested_reviews("myproducts/mySLFO", project_pr_number)
         if package_merged and project_merged:
             break
         time.sleep(1)
-    assert package_merged, f"Package PR pool/pkgA#{package_pr_number} was not merged automatically."
-    assert project_merged, f"Project PR products/SLFO#{project_pr_number} was not merged automatically."
+    assert package_merged, f"Package PR mypool/pkgA#{package_pr_number} was not merged automatically."
+    assert project_merged, f"Project PR myproducts/mySLFO#{project_pr_number} was not merged automatically."
     print("Both PRs merged successfully by maintainer rule.")
@@ -163,42 +150,28 @@ def test_005_any_maintainer_approval_sufficient(maintainer_env, ownerA_client, o
     """
     gitea_env, test_full_repo_name, branch_name = maintainer_env
-    # 1. Create a package PR for pool/pkgB as ownerA
-    diff = """diff --git a/pkgB_test_fixture.txt b/pkgB_test_fixture.txt
+    # 1. Create a package PR for mypool/pkgB as ownerA
+    ts = int(time.time() * 1000)
+    filename = f"pkgB_test_{ts}.txt"
+    diff = f"""diff --git a/{filename} b/{filename}
     new file mode 100644
     index 0000000..e69de29
     """
-    print(f"--- Creating package PR in pool/pkgB on branch {branch_name} as ownerA ---")
-    package_pr = ownerA_client.create_gitea_pr("pool/pkgB", diff, "Test Single Maintainer Merge", True, base_branch=branch_name)
+    print(f"--- Creating package PR in mypool/pkgB on branch {branch_name} as ownerA ---")
+    package_pr = ownerA_client.create_gitea_pr("mypool/pkgB", diff, "Test Single Maintainer Merge", True, base_branch=branch_name)
     package_pr_number = package_pr["number"]
-    print(f"Created package PR pool/pkgB#{package_pr_number}")
+    print(f"Created package PR mypool/pkgB#{package_pr_number}")
     # 2. Make sure the workflow-pr service created related project PR
-    project_pr_number = None
-    print(f"Polling pool/pkgB PR #{package_pr_number} timeline for forwarded PR event...")
-    for _ in range(40):
-        time.sleep(1)
-        timeline_events = gitea_env.get_timeline_events("pool/pkgB", package_pr_number)
-        for event in timeline_events:
-            if event.get("type") == "pull_ref":
-                if not (ref_issue := event.get("ref_issue")):
-                    continue
-                url_to_check = ref_issue.get("html_url", "")
-                match = re.search(r"products/SLFO/pulls/(\d+)", url_to_check)
-                if match:
-                    project_pr_number = int(match.group(1))
-                    break
-        if project_pr_number:
-            break
+    project_pr_number = gitea_env.wait_for_project_pr("mypool/pkgB", package_pr_number)
     assert project_pr_number is not None, "Workflow bot did not create a project PR."
-    print(f"Found project PR: products/SLFO#{project_pr_number}")
+    print(f"Found project PR: myproducts/mySLFO#{project_pr_number}")
     # 3. Check that review requests came to ownerB and ownerBB
     print("Checking for review requests from ownerB and ownerBB...")
     reviewers_requested = set()
     for _ in range(20):
-        reviews = gitea_env.list_reviews("pool/pkgB", package_pr_number)
+        reviews = gitea_env.list_reviews("mypool/pkgB", package_pr_number)
         reviewers_requested = {r["user"]["login"] for r in reviews if r["state"] == "REQUEST_REVIEW"}
         if "ownerB" in reviewers_requested and "ownerBB" in reviewers_requested:
             break
@@ -210,41 +183,21 @@ index 0000000..e69de29
     # 4. ownerBB leaves review, ownerB does not.
     print("ownerBB approving the PR...")
-    ownerBB_client.create_review("pool/pkgB", package_pr_number, event="APPROVED", body="Approval from ownerBB")
-    # 5. Check that both PRs are merged automatically
-    print("Polling for PR merge status (only bot approval allowed for project PR)...")
-    package_merged = False
-    project_merged = False
-    for i in range(15):  # Poll for up to 15 seconds
-        # Package PR
-        if not package_merged:
-            pkg_details = gitea_env.get_pr_details("pool/pkgB", package_pr_number)
-            if pkg_details.get("merged"):
-                package_merged = True
-                print(f"Package PR pool/pkgB#{package_pr_number} merged.")
-        # Project PR
-        if not project_merged:
-            prj_details = gitea_env.get_pr_details("products/SLFO", project_pr_number)
-            if prj_details.get("merged"):
-                project_merged = True
-                print(f"Project PR products/SLFO#{project_pr_number} merged.")
-            else:
-                # Approve ONLY bot if requested
-                reviews = gitea_env.list_reviews("products/SLFO", project_pr_number)
-                if any(r["state"] == "REQUEST_REVIEW" and r["user"]["login"] == "autogits_obs_staging_bot" for r in reviews):
-                    gitea_env.approve_requested_reviews("products/SLFO", project_pr_number)
-        if package_merged and project_merged:
-            break
-        time.sleep(1)
-    assert package_merged, f"Package PR pool/pkgB#{package_pr_number} was not merged automatically."
-    assert project_merged, f"Project PR products/SLFO#{project_pr_number} was not merged automatically."
-    print("Both PRs merged successfully with only one maintainer approval.")
+    ownerBB_client.create_review("mypool/pkgB", package_pr_number, event="APPROVED", body="Approval from ownerBB")
+    # 5. Check that review request for ownerB is removed
+    print("Polling for ownerB review request removal...")
+    ownerB_removed = False
+    for _ in range(30):
+        time.sleep(1)
+        reviews = gitea_env.list_reviews("mypool/pkgB", package_pr_number)
+        reviewers_requested = {r["user"]["login"] for r in reviews if r["state"] == "REQUEST_REVIEW"}
+        if "ownerB" not in reviewers_requested:
+            ownerB_removed = True
+            break
+    assert ownerB_removed, f"ownerB review request was not removed after ownerBB approval. Current requested: {reviewers_requested}"
+    print("Confirmed: ownerB review request removed after single maintainer approval.")
 
 @pytest.mark.t006
@@ -259,37 +212,39 @@ def test_006_maintainer_rejection_removes_other_requests(maintainer_env, ownerA_
     """
     gitea_env, test_full_repo_name, branch_name = maintainer_env
-    # 1. Create a package PR for pool/pkgB as ownerA
-    diff = """diff --git a/pkgB_rejection_test.txt b/pkgB_rejection_test.txt
+    # 1. Create a package PR for mypool/pkgB as ownerA
+    ts = int(time.time() * 1000)
+    filename = f"pkgB_rejection_test_{ts}.txt"
+    diff = f"""diff --git a/{filename} b/{filename}
     new file mode 100644
     index 0000000..e69de29
     """
-    print(f"--- Creating package PR in pool/pkgB on branch {branch_name} as ownerA ---")
-    package_pr = ownerA_client.create_gitea_pr("pool/pkgB", diff, "Test Maintainer Rejection", True, base_branch=branch_name)
+    print(f"--- Creating package PR in mypool/pkgB on branch {branch_name} as ownerA ---")
+    package_pr = ownerA_client.create_gitea_pr("mypool/pkgB", diff, "Test Maintainer Rejection", True, base_branch=branch_name)
     package_pr_number = package_pr["number"]
-    print(f"Created package PR pool/pkgB#{package_pr_number}")
+    print(f"Created package PR mypool/pkgB#{package_pr_number}")
     # 2. Check that review requests came to ownerB and ownerBB
     print("Checking for review requests from ownerB and ownerBB...")
     for _ in range(20):
-        reviews = gitea_env.list_reviews("pool/pkgB", package_pr_number)
+        reviews = gitea_env.list_reviews("mypool/pkgB", package_pr_number)
         reviewers_requested = {r["user"]["login"] for r in reviews if r["state"] == "REQUEST_REVIEW"}
         if "ownerB" in reviewers_requested and "ownerBB" in reviewers_requested:
             break
         time.sleep(1)
     else:
-        reviews = gitea_env.list_reviews("pool/pkgB", package_pr_number)
+        reviews = gitea_env.list_reviews("mypool/pkgB", package_pr_number)
         reviewers_requested = {r["user"]["login"] for r in reviews if r["state"] == "REQUEST_REVIEW"}
         pytest.fail(f"ownerB and ownerBB were not both requested. Got: {reviewers_requested}")
     # 3. ownerBB rejects the PR
     print("ownerBB rejecting the PR...")
-    ownerBB_client.create_review("pool/pkgB", package_pr_number, event="REQUEST_CHANGES", body="Rejecting from ownerBB")
+    ownerBB_client.create_review("mypool/pkgB", package_pr_number, event="REQUEST_CHANGES", body="Rejecting from ownerBB")
     # 4. Check that review request for ownerB is removed
     print("Checking if ownerB's review request is removed...")
     for _ in range(20):
-        reviews = gitea_env.list_reviews("pool/pkgB", package_pr_number)
+        reviews = gitea_env.list_reviews("mypool/pkgB", package_pr_number)
         reviewers_requested = {r["user"]["login"] for r in reviews if r["state"] == "REQUEST_REVIEW"}
         if "ownerB" not in reviewers_requested:
             print("Confirmed: ownerB's review request was removed.")
@@ -317,67 +272,53 @@ def test_007_review_required_needs_all_approvals(review_required_env, ownerA_cli
     ownerA_client._request("GET", "users/admin")
     print(f"ownerA_client smoke test passed")
-    # 1. Create a package PR for pool/pkgB as ownerA
-    diff = """diff --git a/pkgB_review_required_test.txt b/pkgB_review_required_test.txt
+    # 1. Create a package PR for mypool/pkgB as ownerA
+    ts = int(time.time() * 1000)
+    filename = f"pkgB_review_required_test_{ts}.txt"
+    diff = f"""diff --git a/{filename} b/{filename}
     new file mode 100644
     index 0000000..e69de29
     """
-    print(f"--- Creating package PR in pool/pkgB on branch {branch_name} as ownerA ---")
-    package_pr = ownerA_client.create_gitea_pr("pool/pkgB", diff, "Test Review Required", True, base_branch=branch_name)
+    print(f"--- Creating package PR in mypool/pkgB on branch {branch_name} as ownerA ---")
+    package_pr = ownerA_client.create_gitea_pr("mypool/pkgB", diff, "Test Review Required", True, base_branch=branch_name)
     package_pr_number = package_pr["number"]
-    print(f"Created package PR pool/pkgB#{package_pr_number}")
+    print(f"Created package PR mypool/pkgB#{package_pr_number}")
     # 2. Make sure the workflow-pr service created related project PR
-    project_pr_number = None
-    print(f"Polling pool/pkgB PR #{package_pr_number} timeline for forwarded PR event...")
-    for _ in range(40):
-        time.sleep(1)
-        timeline_events = gitea_env.get_timeline_events("pool/pkgB", package_pr_number)
-        for event in timeline_events:
-            if event.get("type") == "pull_ref":
-                if not (ref_issue := event.get("ref_issue")):
-                    continue
-                url_to_check = ref_issue.get("html_url", "")
-                match = re.search(r"products/SLFO/pulls/(\d+)", url_to_check)
-                if match:
-                    project_pr_number = int(match.group(1))
-                    break
-        if project_pr_number:
-            break
+    project_pr_number = gitea_env.wait_for_project_pr("mypool/pkgB", package_pr_number)
     assert project_pr_number is not None, "Workflow bot did not create a project PR."
-    print(f"Found project PR: products/SLFO#{project_pr_number}")
+    print(f"Found project PR: myproducts/mySLFO#{project_pr_number}")
     # 3. Check that review requests came to ownerB and ownerBB
     print("Checking for review requests from ownerB and ownerBB...")
     for _ in range(20):
-        reviews = gitea_env.list_reviews("pool/pkgB", package_pr_number)
+        reviews = gitea_env.list_reviews("mypool/pkgB", package_pr_number)
         reviewers_requested = {r["user"]["login"] for r in reviews if r["state"] == "REQUEST_REVIEW"}
         if "ownerB" in reviewers_requested and "ownerBB" in reviewers_requested:
             break
         time.sleep(1)
     else:
-        reviews = gitea_env.list_reviews("pool/pkgB", package_pr_number)
+        reviews = gitea_env.list_reviews("mypool/pkgB", package_pr_number)
         reviewers_requested = {r["user"]["login"] for r in reviews if r["state"] == "REQUEST_REVIEW"}
         pytest.fail(f"ownerB and ownerBB were not both requested. Got: {reviewers_requested}")
     # 4. ownerBB leaves review, ownerB does not.
     print("ownerBB approving the PR...")
-    ownerBB_client.create_review("pool/pkgB", package_pr_number, event="APPROVED", body="Approval from ownerBB")
+    ownerBB_client.create_review("mypool/pkgB", package_pr_number, event="APPROVED", body="Approval from ownerBB")
     # 5. Check that the PR is NOT merged automatically and ownerB request remains
     print("Waiting to ensure PR is NOT merged and ownerB request remains...")
     for i in range(10):
-        pkg_details = gitea_env.get_pr_details("pool/pkgB", package_pr_number)
-        reviews = gitea_env.list_reviews("pool/pkgB", package_pr_number)
+        pkg_details = gitea_env.get_pr_details("mypool/pkgB", package_pr_number)
+        reviews = gitea_env.list_reviews("mypool/pkgB", package_pr_number)
         review_states = [(r["user"]["login"], r["state"]) for r in reviews]
         print(f"Attempt {i+1}: Merged={pkg_details.get('merged')}, Reviews={review_states}")
         time.sleep(2)
-    pkg_details = gitea_env.get_pr_details("pool/pkgB", package_pr_number)
+    pkg_details = gitea_env.get_pr_details("mypool/pkgB", package_pr_number)
     assert not pkg_details.get("merged"), "Package PR was merged automatically but it should NOT have been (ReviewRequired=true)."
-    reviews = gitea_env.list_reviews("pool/pkgB", package_pr_number)
+    reviews = gitea_env.list_reviews("mypool/pkgB", package_pr_number)
     reviewers_requested = {r["user"]["login"] for r in reviews if r["state"] == "REQUEST_REVIEW"}
     assert "ownerB" in reviewers_requested, f"ownerB's review request was removed, but it should have remained. All reviews: {[(r['user']['login'], r['state']) for r in reviews]}"
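Across these hunks, the repeated timeline-polling loops are replaced by a shared `gitea_env.wait_for_project_pr` helper. A minimal sketch of such a helper, reconstructed from the removed loops; the standalone `get_timeline_events` callable parameter, the `project_repo` default, and the `attempts`/`delay` values are assumptions, not the suite's actual signature:

```python
import re
import time


def wait_for_project_pr(get_timeline_events, repo, pr_number,
                        project_repo="myproducts/mySLFO", attempts=40, delay=1):
    """Poll a package PR's timeline until the bot cross-references a project PR.

    get_timeline_events(repo, pr_number) must return Gitea timeline event
    dicts. Returns the project PR number, or None on timeout.
    """
    pattern = re.compile(re.escape(project_repo) + r"/pulls/(\d+)")
    for _ in range(attempts):
        for event in get_timeline_events(repo, pr_number):
            # Only "pull_ref" events carry the forwarded-PR cross-reference.
            if event.get("type") != "pull_ref":
                continue
            ref_issue = event.get("ref_issue")
            if not ref_issue:
                continue
            match = pattern.search(ref_issue.get("html_url", ""))
            if match:
                return int(match.group(1))
        time.sleep(delay)
    return None
```

Centralizing the poll also fixes a subtlety the old copies disagreed on: one copy returned `match.group(1)` as a string while others cast to `int`; a single helper keeps the return type consistent.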


@@ -23,35 +23,16 @@ pytest.forwarded_pr_number = None
 def test_001_project_pr(gitea_env):
     """Forwarded PR correct title"""
     diff = "diff --git a/another_test.txt b/another_test.txt\nnew file mode 100644\nindex 0000000..e69de29\n"
-    pytest.pr = gitea_env.create_gitea_pr("pool/pkgA", diff, "Test PR", False)
+    pytest.pr = gitea_env.create_gitea_pr("mypool/pkgA", diff, "Test PR", False)
     pytest.initial_pr_number = pytest.pr["number"]
     time.sleep(5)  # Give Gitea some time to process the PR and make the timeline available
-    compose_dir = Path(__file__).parent.parent
-    pytest.forwarded_pr_number = None
-    print(
-        f"Polling pool/pkgA PR #{pytest.initial_pr_number} timeline for forwarded PR event..."
-    )
-    # Instead of polling timeline, check if forwarded PR exists directly
-    for _ in range(20):
-        time.sleep(1)
-        timeline_events = gitea_env.get_timeline_events("pool/pkgA", pytest.initial_pr_number)
-        for event in timeline_events:
-            if event.get("type") == "pull_ref":
-                if not (ref_issue := event.get("ref_issue")):
-                    continue
-                url_to_check = ref_issue.get("html_url", "")
-                match = re.search(r"products/SLFO/pulls/(\d+)", url_to_check)
-                if match:
-                    pytest.forwarded_pr_number = match.group(1)
-                    break
-        if pytest.forwarded_pr_number:
-            break
+    pytest.forwarded_pr_number = gitea_env.wait_for_project_pr("mypool/pkgA", pytest.initial_pr_number)
     assert (
         pytest.forwarded_pr_number is not None
     ), "Workflow bot did not create a forwarded PR."
-    pytest.pr_details = gitea_env.get_pr_details("products/SLFO", pytest.forwarded_pr_number)
+    pytest.pr_details = gitea_env.get_pr_details("myproducts/mySLFO", pytest.forwarded_pr_number)
     assert (
         pytest.pr_details["title"] == "Forwarded PRs: pkgA"
     ), "Forwarded PR correct title"
@@ -62,13 +43,13 @@ def test_001_project_pr(gitea_env):
 def test_002_updated_project_pr(gitea_env):
     """Forwarded PR head is updated"""
     diff = "diff --git a/another_test.txt b/another_test.txt\nnew file mode 100444\nindex 0000000..e69de21\n"
-    gitea_env.modify_gitea_pr("pool/pkgA", pytest.initial_pr_number, diff, "Tweaks")
+    gitea_env.modify_gitea_pr("mypool/pkgA", pytest.initial_pr_number, diff, "Tweaks")
     sha_old = pytest.pr_details["head"]["sha"]
     sha_changed = False
     for _ in range(20):
         time.sleep(1)
-        new_pr_details = gitea_env.get_pr_details("products/SLFO", pytest.forwarded_pr_number)
+        new_pr_details = gitea_env.get_pr_details("myproducts/mySLFO", pytest.forwarded_pr_number)
         sha_new = new_pr_details["head"]["sha"]
         if sha_new != sha_old:
             print(f"Sha changed from {sha_old} to {sha_new}")
@@ -82,17 +63,17 @@ def test_002_updated_project_pr(gitea_env):
 @pytest.mark.dependency(depends=["test_001_project_pr"])
 def test_003_wip(gitea_env):
     """WIP flag set for PR"""
-    # 1. set WIP flag in PR f"pool/pkgA#{pytest.initial_pr_number}"
-    initial_pr_details = gitea_env.get_pr_details("pool/pkgA", pytest.initial_pr_number)
+    # 1. set WIP flag in PR f"mypool/pkgA#{pytest.initial_pr_number}"
+    initial_pr_details = gitea_env.get_pr_details("mypool/pkgA", pytest.initial_pr_number)
     wip_title = "WIP: " + initial_pr_details["title"]
-    gitea_env.update_gitea_pr_properties("pool/pkgA", pytest.initial_pr_number, title=wip_title)
-    # 2. in loop check whether WIP flag is set for PR f"products/SLFO #{pytest.forwarded_pr_number}"
+    gitea_env.update_gitea_pr_properties("mypool/pkgA", pytest.initial_pr_number, title=wip_title)
+    # 2. in loop check whether WIP flag is set for PR f"myproducts/mySLFO #{pytest.forwarded_pr_number}"
     wip_flag_set = False
     for _ in range(20):
         time.sleep(1)
         forwarded_pr_details = gitea_env.get_pr_details(
-            "products/SLFO", pytest.forwarded_pr_number
+            "myproducts/mySLFO", pytest.forwarded_pr_number
         )
         if "WIP: " in forwarded_pr_details["title"]:
             wip_flag_set = True
@@ -100,19 +81,19 @@ def test_003_wip(gitea_env):
     assert wip_flag_set, "WIP flag was not set in the forwarded PR."
-    # Remove WIP flag from PR f"pool/pkgA#{pytest.initial_pr_number}"
-    initial_pr_details = gitea_env.get_pr_details("pool/pkgA", pytest.initial_pr_number)
+    # Remove WIP flag from PR f"mypool/pkgA#{pytest.initial_pr_number}"
+    initial_pr_details = gitea_env.get_pr_details("mypool/pkgA", pytest.initial_pr_number)
     non_wip_title = initial_pr_details["title"].replace("WIP: ", "")
     gitea_env.update_gitea_pr_properties(
-        "pool/pkgA", pytest.initial_pr_number, title=non_wip_title
+        "mypool/pkgA", pytest.initial_pr_number, title=non_wip_title
     )
-    # In loop check whether WIP flag is removed for PR f"products/SLFO #{pytest.forwarded_pr_number}"
+    # In loop check whether WIP flag is removed for PR f"myproducts/mySLFO #{pytest.forwarded_pr_number}"
     wip_flag_removed = False
     for _ in range(20):
         time.sleep(1)
         forwarded_pr_details = gitea_env.get_pr_details(
-            "products/SLFO", pytest.forwarded_pr_number
+            "myproducts/mySLFO", pytest.forwarded_pr_number
         )
         if "WIP: " not in forwarded_pr_details["title"]:
             wip_flag_removed = True
@@ -121,7 +102,7 @@ def test_003_wip(gitea_env):
 @pytest.mark.t005
-@pytest.mark.xfail(reason="works only in ibs_state branch?")
+@pytest.mark.skip(reason="works only in ibs_state branch?")
 @pytest.mark.dependency()
 def test_005_NoProjectGitPR_edits_disabled(no_project_git_pr_env, test_user_client):
     """
@@ -139,37 +120,23 @@ index 0000000..e69de29
 @@ -0,0 +1 @@
 +Initial content
 """
-    package_pr = test_user_client.create_gitea_pr("pool/pkgA", initial_diff, "Test PR for No Project PR, No Edits", False, base_branch=dev_branch_name)
+    package_pr = test_user_client.create_gitea_pr("mypool/pkgA", initial_diff, "Test PR for No Project PR, No Edits", False, base_branch=dev_branch_name)
     package_pr_number = package_pr["number"]
     print(f"Created Package PR #{package_pr_number}")
     # 2. Verify that the workflow-pr bot did not create a Project PR
-    project_pr_created = False
-    for i in range(10):  # Poll for some time
-        time.sleep(2)
-        timeline_events = gitea_env.get_timeline_events("pool/pkgA", package_pr_number)
-        for event in timeline_events:
-            if event.get("type") == "pull_ref":
-                if not (ref_issue := event.get("ref_issue")):
-                    continue
-                url_to_check = ref_issue.get("html_url", "")
-                match = re.search(r"products/SLFO/pulls/(\d+)", url_to_check)
-                if match:
-                    project_pr_created = True
-                    break
-        if project_pr_created:
-            break
-    assert not project_pr_created, "Workflow bot unexpectedly created a Project PR in products/SLFO."
+    project_pr_number = gitea_env.wait_for_project_pr("mypool/pkgA", package_pr_number, timeout=10)
+    assert project_pr_number is None, "Workflow bot unexpectedly created a Project PR in myproducts/mySLFO."
     print("Verification complete: No Project PR was created by the bot.")
     # 3. Manually create the Project PR
-    pkgA_main_sha = gitea_env._request("GET", f"repos/pool/pkgA/branches/{dev_branch_name}").json()["commit"]["id"]
-    package_pr_details = gitea_env.get_pr_details("pool/pkgA", package_pr_number)
+    pkgA_main_sha = gitea_env._request("GET", f"repos/mypool/pkgA/branches/{dev_branch_name}").json()["commit"]["id"]
+    package_pr_details = gitea_env.get_pr_details("mypool/pkgA", package_pr_number)
     pkgA_pr_head_sha = package_pr_details["head"]["sha"]
     project_pr_title = "Forwarded PRs: pkgA (Manual)"
-    project_pr_body = f"Manual Project PR for NoProjectGitPR. \nPR: pool/pkgA!{package_pr_number}"
+    project_pr_body = f"Manual Project PR for NoProjectGitPR. \nPR: mypool/pkgA!{package_pr_number}"
     project_pr_diff = f"""diff --git a/pkgA b/pkgA
 index {pkgA_main_sha[:7]}..{pkgA_pr_head_sha[:7]} 160000
 --- a/pkgA
@@ -199,14 +166,14 @@ index 0000000..e69de29
 @@ -0,0 +1 @@
 +Trigger content
 """
-    test_user_client.modify_gitea_pr("pool/pkgA", package_pr_number, new_diff_content, "Trigger bot update")
+    test_user_client.modify_gitea_pr("mypool/pkgA", package_pr_number, new_diff_content, "Trigger bot update")
     # 5. Verify that the bot adds a warning comment because it cannot update the manual PR (edits disabled)
     warning_found = False
     print(f"Polling Package PR #{package_pr_number} for warning comment...")
     for _ in range(20):
         time.sleep(3)
-        comments = gitea_env.get_comments("pool/pkgA", package_pr_number)
+        comments = gitea_env.get_comments("mypool/pkgA", package_pr_number)
         for comment in comments:
             # According to test-plan.md, the warning explains that it cannot update the PR.
             if "cannot update" in comment.get("body", "").lower():
@@ -221,7 +188,7 @@ index 0000000..e69de29
 @pytest.mark.t006
-@pytest.mark.xfail(reason="works only in ibs_state branch?")
+@pytest.mark.skip(reason="works only in ibs_state branch?")
 @pytest.mark.dependency()
 def test_006_NoProjectGitPR_edits_enabled(no_project_git_pr_env, test_user_client):
     """
@@ -239,42 +206,27 @@ index 0000000..e69de29
 @@ -0,0 +1 @@
 +New feature content
 """
-    package_pr = test_user_client.create_gitea_pr("pool/pkgA", diff, "Test PR for NoProjectGitPR", False, base_branch=dev_branch_name)
+    package_pr = test_user_client.create_gitea_pr("mypool/pkgA", diff, "Test PR for NoProjectGitPR", False, base_branch=dev_branch_name)
     package_pr_number = package_pr["number"]
     # Enable "Allow edits from maintainers"
-    test_user_client.update_gitea_pr_properties("pool/pkgA", package_pr_number, allow_maintainer_edit=True)
+    test_user_client.update_gitea_pr_properties("mypool/pkgA", package_pr_number, allow_maintainer_edit=True)
     print(f"Created Package PR #{package_pr_number} and enabled 'Allow edits from maintainers'.")
     # Get SHAs needed for the manual Project PR diff
-    pkgA_main_sha = gitea_env._request("GET", f"repos/pool/pkgA/branches/{dev_branch_name}").json()["commit"]["id"]
-    package_pr_details = gitea_env.get_pr_details("pool/pkgA", package_pr_number)
+    pkgA_main_sha = gitea_env._request("GET", f"repos/mypool/pkgA/branches/{dev_branch_name}").json()["commit"]["id"]
+    package_pr_details = gitea_env.get_pr_details("mypool/pkgA", package_pr_number)
     pkgA_pr_head_sha = package_pr_details["head"]["sha"]
-    # 3. Assert that the workflow-pr bot did not create a Project PR in the products/SLFO repository
-    project_pr_created = False
-    for i in range(20):  # Poll for a reasonable time
-        time.sleep(2)  # Wait a bit longer to be sure
-        timeline_events = gitea_env.get_timeline_events("pool/pkgA", package_pr_number)
-        for event in timeline_events:
-            if event.get("type") == "pull_ref":
-                if not (ref_issue := event.get("ref_issue")):
-                    continue
-                url_to_check = ref_issue.get("html_url", "")
-                # Regex now searches for products/SLFO/pulls/(\d+)
-                match = re.search(r"products/SLFO/pulls/(\d+)", url_to_check)
-                if match:
-                    project_pr_created = True
-                    break
-        if project_pr_created:
-            break
-    assert not project_pr_created, "Workflow bot unexpectedly created a Project PR in products/SLFO."
-    print("Verification complete: No Project PR was created in products/SLFO as expected.")
+    # 3. Assert that the workflow-pr bot did not create a Project PR in the myproducts/mySLFO repository
+    project_pr_number = gitea_env.wait_for_project_pr("mypool/pkgA", package_pr_number, timeout=10)
+    assert project_pr_number is None, "Workflow bot unexpectedly created a Project PR in myproducts/mySLFO."
+    print("Verification complete: No Project PR was created in myproducts/mySLFO as expected.")
     # 1. Create that Project PR from the test code.
     project_pr_title = "Forwarded PRs: pkgA"
-    project_pr_body = f"Test Project PR for NoProjectGitPR. \nPR: pool/pkgA!{package_pr_number}"
+    project_pr_body = f"Test Project PR for NoProjectGitPR. \nPR: mypool/pkgA!{package_pr_number}"
     project_pr_diff = f"""diff --git a/pkgA b/pkgA
 index {pkgA_main_sha[:7]}..{pkgA_pr_head_sha[:7]} 160000
 --- a/pkgA
@@ -304,7 +256,7 @@ index 0000000..f587a12
 @@ -0,0 +1 @@
 +Another file content
 """
-    test_user_client.modify_gitea_pr("pool/pkgA", package_pr_number, new_diff_content, "Add another file to Package PR")
+    test_user_client.modify_gitea_pr("mypool/pkgA", package_pr_number, new_diff_content, "Add another file to Package PR")
     print(f"Added new commit to Package PR #{package_pr_number}.")
     time.sleep(5)  # Give the bot time to react
@@ -322,5 +274,3 @@ index 0000000..f587a12
     assert project_pr_updated, "Manually created Project PR was not updated by the bot."
     print("Verification complete: Manually created Project PR was updated by the bot as expected.")

View File

@@ -19,8 +19,8 @@ export GITEA_TOKEN
 echo "GITEA_TOKEN exported (length: ${#GITEA_TOKEN})"
 # Wait for the dummy data to be created by the gitea setup script
-echo "Waiting for workflow.config in products/SLFO..."
-API_URL="http://gitea-test:3000/api/v1/repos/products/SLFO/contents/workflow.config"
+echo "Waiting for workflow.config in myproducts/mySLFO..."
+API_URL="http://gitea-test:3000/api/v1/repos/myproducts/mySLFO/contents/workflow.config"
 HTTP_STATUS=$(curl -s -o /dev/null -w "%{http_code}" -H "Authorization: token $GITEA_TOKEN" "$API_URL")
 while [ "$HTTP_STATUS" != "200" ]; do

View File

@@ -1,8 +1,13 @@
 [
-    "products/SLFO#main",
-    "products/SLFO#dev",
-    "products/SLFO#merge",
-    "products/SLFO#maintainer-merge",
-    "products/SLFO#review-required",
-    "products/SLFO#label-test"
+    "myproducts/mySLFO#main",
+    "myproducts/mySLFO#staging-main",
+    "myproducts/mySLFO#dev",
+    "myproducts/mySLFO#merge",
+    "myproducts/mySLFO#maintainer-merge",
+    "myproducts/mySLFO#review-required",
+    "myproducts/mySLFO#label-test",
+    "myproducts/mySLFO#manual-merge",
+    "myproducts/mySLFO#merge-ff",
+    "myproducts/mySLFO#merge-replace",
+    "myproducts/mySLFO#merge-devel"
 ]

View File

@@ -50,6 +50,10 @@ const (
 var runId uint
+var GitWorkTreeAllocate func(string, string, string) (common.GitHandlerGenerator, error) = func(basePath, gitAuthor, email string) (common.GitHandlerGenerator, error) {
+	return common.AllocateGitWorkTree(basePath, gitAuthor, email)
+}
 func FetchPrGit(git common.Git, pr *models.PullRequest) error {
 	// clone PR head via base (target) repo
 	cloneURL := pr.Base.Repo.CloneURL
@@ -127,6 +131,10 @@ func ProcessBuildStatus(project *common.BuildResultList) BuildStatusSummary {
 found:
 	for j := 0; j < len(project.Result); j++ {
 		common.LogDebug("  found match for @ idx:", j)
+		if project.Result[i].Dirty {
+			// ignore possible temporary failures and wait for settling
+			return BuildStatusSummaryBuilding
+		}
 		res := ProcessRepoBuildStatus(project.Result[i].Status)
 		switch res {
 		case BuildStatusSummarySuccess:
@@ -144,9 +152,9 @@ func ProcessBuildStatus(project *common.BuildResultList) BuildStatusSummary {
 func ProcessRepoBuildStatus(results []*common.PackageBuildStatus) (status BuildStatusSummary) {
 	PackageBuildStatusSorter := func(a, b *common.PackageBuildStatus) int {
 		return strings.Compare(a.Package, b.Package)
 	}
 	common.LogDebug("******* RESULTS: ")
 	data, _ := xml.MarshalIndent(results, "", " ")
@@ -191,24 +199,23 @@ func GetPackageBuildStatus(project *common.BuildResultList, packageName string)
 		return true, BuildStatusSummaryUnknown // true for 'missing'
 	}
-	// Check for any failures
+	// Check for any unfinished builds
 	for _, pkgStatus := range packageStatuses {
 		res, ok := common.ObsBuildStatusDetails[pkgStatus.Code]
 		if !ok {
 			common.LogInfo("unknown package result code:", pkgStatus.Code, "for package:", pkgStatus.Package)
 			return false, BuildStatusSummaryUnknown
 		}
-		if !res.Success {
-			return false, BuildStatusSummaryFailed
+		if !res.Finished {
+			return false, BuildStatusSummaryBuilding
 		}
 	}
-	// Check for any unfinished builds
+	// Check for any failures
 	for _, pkgStatus := range packageStatuses {
 		res, _ := common.ObsBuildStatusDetails[pkgStatus.Code]
-		// 'ok' is already checked in the loop above
-		if !res.Finished {
-			return false, BuildStatusSummaryBuilding
+		if !res.Success {
+			return false, BuildStatusSummaryFailed
 		}
 	}
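This hunk swaps the order of the two checks in `GetPackageBuildStatus`: unfinished builds are now detected before failures, so a repository that is still building is reported as building rather than as a hard failure on a stale `failed` code. The same decision logic can be sketched in Python; the status table below is illustrative only, not the full `common.ObsBuildStatusDetails` set.

```python
# Illustrative status table: code -> (finished, success).
STATUS = {
    "succeeded": (True, True),
    "failed": (True, False),
    "building": (False, False),
    "scheduled": (False, False),
}


def summarize(codes):
    """Summarize a package's per-repository build codes, checking
    unfinished builds before failures (mirrors the reordered Go loops)."""
    details = [STATUS.get(code) for code in codes]
    if any(d is None for d in details):
        return "unknown"     # unrecognized result code
    if any(not finished for finished, _ in details):
        return "building"    # wait for all repos to settle first
    if any(not success for _, success in details):
        return "failed"      # only a settled failure is a real failure
    return "success"
```

With the old order, `["failed", "building"]` would have yielded a failure verdict while one repository was still in flight; with this order it yields building.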
@@ -216,7 +223,7 @@ func GetPackageBuildStatus(project *common.BuildResultList, packageName string)
 	return false, BuildStatusSummarySuccess
 }
-func GenerateObsPrjMeta(git common.Git, gitea common.Gitea, pr *models.PullRequest, stagingPrj, buildPrj string, stagingMasterPrj string) (*common.ProjectMeta, error) {
+func GenerateObsPrjMeta(obs common.ObsClientInterface, git common.Git, gitea common.Gitea, pr *models.PullRequest, stagingPrj, buildPrj string, stagingMasterPrj string) (*common.ProjectMeta, error) {
 	common.LogDebug("repo content fetching ...")
 	err := FetchPrGit(git, pr)
 	if err != nil {
@@ -260,13 +267,13 @@ func GenerateObsPrjMeta(git common.Git, gitea common.Gitea, pr *models.PullReque
 	}
 	common.LogDebug("Trying first staging master project: ", stagingMasterPrj)
-	meta, err := ObsClient.GetProjectMeta(stagingMasterPrj)
+	meta, err := obs.GetProjectMeta(stagingMasterPrj)
 	if err == nil {
 		// success, so we use that staging master project as our build project
 		buildPrj = stagingMasterPrj
 	} else {
 		common.LogInfo("error fetching project meta for ", stagingMasterPrj, ". Fall Back to ", buildPrj)
-		meta, err = ObsClient.GetProjectMeta(buildPrj)
+		meta, err = obs.GetProjectMeta(buildPrj)
 	}
 	if err != nil {
 		common.LogError("error fetching project meta for", buildPrj, ". Err:", err)
@@ -330,10 +337,10 @@ func GenerateObsPrjMeta(git common.Git, gitea common.Gitea, pr *models.PullReque
 // stagingProject:$buildProject
 //   ^- stagingProject:$buildProject:$subProjectName (based on templateProject)
-func CreateQASubProject(stagingConfig *common.StagingConfig, git common.Git, gitea common.Gitea, pr *models.PullRequest, stagingProject, templateProject, subProjectName string, buildDisableRepos []string) error {
+func CreateQASubProject(obs common.ObsClientInterface, stagingConfig *common.StagingConfig, git common.Git, gitea common.Gitea, pr *models.PullRequest, stagingProject, templateProject, subProjectName string, buildDisableRepos []string) error {
 	common.LogDebug("Setup QA sub projects")
 	common.LogDebug("reading templateProject ", templateProject)
-	templateMeta, err := ObsClient.GetProjectMeta(templateProject)
+	templateMeta, err := obs.GetProjectMeta(templateProject)
 	if err != nil {
 		common.LogError("error fetching template project meta for", templateProject, ":", err)
 		return err
@@ -343,10 +350,10 @@ func CreateQASubProject(stagingConfig *common.StagingConfig, git common.Git, git
 	templateMeta.Name = stagingProject + ":" + subProjectName
 	// freeze tag for now
 	if len(templateMeta.ScmSync) > 0 {
 		repository, err := url.Parse(templateMeta.ScmSync)
 		if err != nil {
 			panic(err)
 		}
 		common.LogDebug("getting data for ", repository.EscapedPath())
 		split := strings.Split(repository.EscapedPath(), "/")
@@ -354,12 +361,12 @@ func CreateQASubProject(stagingConfig *common.StagingConfig, git common.Git, git
 		common.LogDebug("getting commit for ", org, " repo ", repo, " fragment ", repository.Fragment)
 		branch, err := gitea.GetCommit(org, repo, repository.Fragment)
 		if err != nil {
 			panic(err)
 		}
 		// set expanded commit url
 		repository.Fragment = branch.SHA
 		templateMeta.ScmSync = repository.String()
 		common.LogDebug("Setting scmsync url to ", templateMeta.ScmSync)
 	}
@@ -406,11 +413,11 @@ func CreateQASubProject(stagingConfig *common.StagingConfig, git common.Git, git
 				templateMeta.Repositories[idx].Paths[pidx].Project = templateMeta.Name
 			} else
 			// Check for path prefixes against a template project inside of template project area
-			if strings.HasPrefix(path.Project, stagingConfig.StagingProject + ":") {
+			if strings.HasPrefix(path.Project, stagingConfig.StagingProject+":") {
 				newProjectName := stagingProject
 				// find project name
 				for _, setup := range stagingConfig.QA {
 					if setup.Origin == path.Project {
 						common.LogDebug(" Match:", setup.Origin)
 						newProjectName = newProjectName + ":" + setup.Name
 						common.LogDebug(" New:", newProjectName)
@@ -418,14 +425,14 @@ func CreateQASubProject(stagingConfig *common.StagingConfig, git common.Git, git
 					}
 				}
 				templateMeta.Repositories[idx].Paths[pidx].Project = newProjectName
 				common.LogDebug(" Matched prefix")
 			}
 			common.LogDebug(" Path using project ", templateMeta.Repositories[idx].Paths[pidx].Project)
 		}
 	}
 	if !IsDryRun {
-		err = ObsClient.SetProjectMeta(templateMeta)
+		err = obs.SetProjectMeta(templateMeta)
 		if err != nil {
 			common.LogError("cannot create project:", templateMeta.Name, err)
 			x, _ := xml.MarshalIndent(templateMeta, "", " ")
@@ -439,10 +446,10 @@ func CreateQASubProject(stagingConfig *common.StagingConfig, git common.Git, git
 	return nil
 }
-func StartOrUpdateBuild(config *common.StagingConfig, git common.Git, gitea common.Gitea, pr *models.PullRequest) (RequestModification, error) {
+func StartOrUpdateBuild(obs common.ObsClientInterface, config *common.StagingConfig, git common.Git, gitea common.Gitea, pr *models.PullRequest) (RequestModification, error) {
 	common.LogDebug("fetching OBS project Meta")
-	obsPrProject := GetObsProjectAssociatedWithPr(config, ObsClient.HomeProject, pr)
-	meta, err := ObsClient.GetProjectMeta(obsPrProject)
+	obsPrProject := GetObsProjectAssociatedWithPr(config, obs.GetHomeProject(), pr)
+	meta, err := obs.GetProjectMeta(obsPrProject)
 	if err != nil {
 		common.LogError("error fetching project meta for", obsPrProject, ":", err)
 		return RequestModificationNoChange, err
@@ -467,7 +474,7 @@ func StartOrUpdateBuild(config *common.StagingConfig, git common.Git, gitea comm
 	if meta == nil {
 		// new build
 		common.LogDebug(" Staging master:", config.StagingProject)
-		meta, err = GenerateObsPrjMeta(git, gitea, pr, obsPrProject, config.ObsProject, config.StagingProject)
+		meta, err = GenerateObsPrjMeta(obs, git, gitea, pr, obsPrProject, config.ObsProject, config.StagingProject)
 		if err != nil {
 			return RequestModificationNoChange, err
 		}
@@ -479,7 +486,7 @@ func StartOrUpdateBuild(config *common.StagingConfig, git common.Git, gitea comm
 		common.LogDebug("Creating build project:")
 		common.LogDebug(" meta:", string(x))
 	} else {
-		err = ObsClient.SetProjectMeta(meta)
+		err = obs.SetProjectMeta(meta)
 		if err != nil {
 			x, _ := xml.MarshalIndent(meta, "", " ")
 			common.LogDebug(" meta:", string(x))
@@ -550,7 +557,7 @@ func ParseNotificationToPR(thread *models.NotificationThread) (org string, repo
 	return
 }
-func ProcessPullNotification(gitea common.Gitea, thread *models.NotificationThread) {
+func ProcessPullNotification(obs common.ObsClientInterface, gitea common.Gitea, thread *models.NotificationThread) {
 	defer func() {
 		err := recover()
 		if err != nil {
@@ -566,7 +573,7 @@ func ProcessPullNotification(gitea common.Gitea, thread *models.NotificationThre
 	}
 	common.LogInfo("processing PR:", org, "/", repo, "#", num)
-	done, err := ProcessPullRequest(gitea, org, repo, num)
+	done, err := ProcessPullRequest(obs, gitea, org, repo, num)
 	if !IsDryRun && err == nil && done {
 		gitea.SetNotificationRead(thread.ID)
 	} else if err != nil {
@@ -576,7 +583,7 @@ func ProcessPullNotification(gitea common.Gitea, thread *models.NotificationThre
 var CleanedUpIssues []int64 = []int64{}
-func CleanupPullNotification(gitea common.Gitea, thread *models.NotificationThread) (CleanupComplete bool) {
+func CleanupPullNotification(obs common.ObsClientInterface, gitea common.Gitea, thread *models.NotificationThread) (CleanupComplete bool) {
 	defer func() {
 		err := recover()
 		if err != nil {
@@ -643,8 +650,8 @@ func CleanupPullNotification(gitea common.Gitea, thread *models.NotificationThre
 		return false
 	}
-	stagingProject := GetObsProjectAssociatedWithPr(config, ObsClient.HomeProject, pr)
-	if prj, err := ObsClient.GetProjectMeta(stagingProject); err != nil {
+	stagingProject := GetObsProjectAssociatedWithPr(config, obs.GetHomeProject(), pr)
+	if prj, err := obs.GetProjectMeta(stagingProject); err != nil {
 		common.LogError("Failed fetching meta for project:", stagingProject, ". Not cleaning up")
 		return false
 	} else if prj == nil && err == nil {
@@ -658,13 +665,13 @@ func CleanupPullNotification(gitea common.Gitea, thread *models.NotificationThre
 		project := stagingProject + ":" + qa.Name
 		common.LogDebug("Cleaning up QA staging", project)
 		if !IsDryRun {
-			if err := ObsClient.DeleteProject(project); err != nil {
+			if err := obs.DeleteProject(project); err != nil {
 				common.LogError("Failed to cleanup QA staging", project, err)
 			}
 		}
 	}
 	if !IsDryRun {
-		if err := ObsClient.DeleteProject(stagingProject); err != nil {
+		if err := obs.DeleteProject(stagingProject); err != nil {
 			common.LogError("Failed to cleanup staging", stagingProject, err)
 		}
 	}
@@ -685,7 +692,7 @@ func SetStatus(gitea common.Gitea, org, repo, hash string, status *models.Commit
 	return err
 }
-func commentOnPackagePR(gitea common.Gitea, org string, repo string, prNum int64, msg string) {
+func CommentPROnce(gitea common.Gitea, org string, repo string, prNum int64, msg string) {
 	if IsDryRun {
 		common.LogInfo("Would comment on package PR %s/%s#%d: %s", org, repo, prNum, msg)
 		return
@@ -697,6 +704,18 @@ func commentOnPackagePR(gitea common.Gitea, org string, repo string, prNum int64
 		return
 	}
+	timeline, err := gitea.GetTimeline(org, repo, prNum)
+	if err != nil {
+		common.LogError("Failed to get timeline for PR %s/%s#%d: %v", org, repo, prNum, err)
+		return
+	}
+	for _, t := range timeline {
+		if t.User != nil && t.User.UserName == BotUser && t.Type == common.TimelineCommentType_Comment && t.Body == msg {
+			return
+		}
+	}
 	err = gitea.AddComment(pr, msg)
 	if err != nil {
 		common.LogError("Failed to comment on package PR %s/%s#%d: %v", org, repo, prNum, err)
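The renamed `CommentPROnce` now scans the PR timeline and returns early when the bot has already posted an identical comment, which keeps repeated notification processing from spamming the PR. The dedup predicate can be sketched in Python, under the simplifying assumption that comments arrive as dicts with `user` and `body` keys (a reduction of the Gitea timeline model):

```python
def should_comment(existing_comments, bot_user, msg):
    """Return True if a new comment should be posted: skip when the bot
    already posted the exact same body (mirrors the CommentPROnce scan)."""
    for comment in existing_comments:
        if comment.get("user") == bot_user and comment.get("body") == msg:
            return False  # identical bot comment already present
    return True
```

Exact-body matching means a reworded warning is still posted once; only verbatim repeats are suppressed.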
@@ -704,20 +723,21 @@ func commentOnPackagePR(gitea common.Gitea, org string, repo string, prNum int64
 }
 // Create and remove QA projects
-func ProcessQaProjects(stagingConfig *common.StagingConfig, git common.Git, gitea common.Gitea, pr *models.PullRequest, stagingProject string) []string {
+func ProcessQaProjects(obs common.ObsClientInterface, stagingConfig *common.StagingConfig, git common.Git, gitea common.Gitea, pr *models.PullRequest, stagingProject string) ([]string, string) {
 	usedQAprojects := make([]string, 0)
 	prLabelNames := make(map[string]int)
 	for _, label := range pr.Labels {
 		prLabelNames[label.Name] = 1
 	}
 	msg := ""
+	var qa_projects []string
 	for _, setup := range stagingConfig.QA {
 		QAproject := stagingProject + ":" + setup.Name
 		if len(setup.Label) > 0 {
 			if _, ok := prLabelNames[setup.Label]; !ok {
 				if !IsDryRun {
 					// blindly remove, will fail when not existing
-					ObsClient.DeleteProject(QAproject)
+					obs.DeleteProject(QAproject)
 				}
 				common.LogInfo("QA project ", setup.Name, "has no matching Label")
 				continue
@@ -726,24 +746,25 @@ func ProcessQaProjects(stagingConfig *common.StagingConfig, git common.Git, gite
usedQAprojects = append(usedQAprojects, QAproject) usedQAprojects = append(usedQAprojects, QAproject)
// check for existens first, no error, but no meta is a 404 // check for existens first, no error, but no meta is a 404
if meta, err := ObsClient.GetProjectMeta(QAproject); meta == nil && err == nil { if meta, err := obs.GetProjectMeta(QAproject); meta == nil && err == nil {
common.LogInfo("Create QA project ", QAproject) common.LogInfo("Create QA project ", QAproject)
CreateQASubProject(stagingConfig, git, gitea, pr, CreateQASubProject(obs, stagingConfig, git, gitea, pr,
stagingProject, stagingProject,
setup.Origin, setup.Origin,
setup.Name, setup.Name,
setup.BuildDisableRepos) setup.BuildDisableRepos)
msg = msg + "QA Project added: " + ObsWebHost + "/project/show/" + qa_projects = append(qa_projects, ObsWebHost+"/project/show/"+QAproject)
QAproject + "\n"
} }
} }
if len(msg) > 1 {
gitea.AddComment(pr, msg) if len(qa_projects) > 0 {
msg = "Additional QA builds:\n" + strings.Join(qa_projects, "\n")
} }
return usedQAprojects
return usedQAprojects, msg
} }
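This refactor stops commenting from inside `ProcessQaProjects` and instead returns the message to the caller; the message assembly itself reduces to a slice join. A minimal sketch of that assembly, with hypothetical URLs:

```go
package main

import (
	"fmt"
	"strings"
)

// buildQAMessage mirrors how ProcessQaProjects now assembles its comment:
// an empty slice yields an empty message, otherwise a joined URL list.
func buildQAMessage(qaProjects []string) string {
	if len(qaProjects) == 0 {
		return ""
	}
	return "Additional QA builds:\n" + strings.Join(qaProjects, "\n")
}

func main() {
	// hypothetical OBS web URLs collected while creating QA subprojects
	urls := []string{"https://build.example.org/project/show/home:bot:PR1:QA"}
	fmt.Println(buildQAMessage(urls))
}
```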
-func ProcessPullRequest(gitea common.Gitea, org, repo string, id int64) (bool, error) {
+func ProcessPullRequest(obs common.ObsClientInterface, gitea common.Gitea, org, repo string, id int64) (bool, error) {
 	dir, err := os.MkdirTemp(os.TempDir(), BotName)
 	common.PanicOnError(err)
 	if IsDryRun {
@@ -752,7 +773,7 @@ func ProcessPullRequest(gitea common.Gitea, org, repo string, id int64) (bool, e
 		defer os.RemoveAll(dir)
 	}
-	gh, err := common.AllocateGitWorkTree(dir, GitAuthor, "noaddress@suse.de")
+	gh, err := GitWorkTreeAllocate(dir, GitAuthor, "noaddress@suse.de")
 	common.PanicOnError(err)
 	git, err := gh.CreateGitHandler(org)
@@ -797,7 +818,7 @@ func ProcessPullRequest(gitea common.Gitea, org, repo string, id int64) (bool, e
 	if err != nil {
 		common.LogError("Staging config", common.StagingConfigFile, "not found in PR to the project. Aborting.")
 		if !IsDryRun {
-			_, err = gitea.AddReviewComment(pr, common.ReviewStateRequestChanges, "Cannot find project config in PR: "+common.ProjectConfigFile)
+			_, _ = gitea.AddReviewComment(pr, common.ReviewStateRequestChanges, "Cannot find project config in PR: "+common.ProjectConfigFile)
 		}
 		return true, err
 	}
@@ -817,7 +838,7 @@ func ProcessPullRequest(gitea common.Gitea, org, repo string, id int64) (bool, e
 		return true, nil
 	}
-	meta, err := ObsClient.GetProjectMeta(stagingConfig.ObsProject)
+	meta, err := obs.GetProjectMeta(stagingConfig.ObsProject)
 	if err != nil || meta == nil {
 		common.LogError("Cannot find reference project meta:", stagingConfig.ObsProject, err)
 		if !IsDryRun && err == nil {
@@ -856,8 +877,11 @@ func ProcessPullRequest(gitea common.Gitea, org, repo string, id int64) (bool, e
 		// NOTE: this is user input, so we need some limits here
 		l := len(stagingConfig.ObsProject)
 		if l >= len(stagingConfig.StagingProject) || stagingConfig.ObsProject != stagingConfig.StagingProject[0:l] {
-			common.LogError("StagingProject (", stagingConfig.StagingProject, ") is not child of target project", stagingConfig.ObsProject)
-			return true, nil
+			// TEMPORARY HACK: We remove this when Factory has switched to git
+			if ( stagingConfig.ObsProject != "openSUSE:Factory:git" && stagingConfig.StagingProject != "openSUSE:Factory:PullRequest" ) {
+				common.LogError("StagingProject (", stagingConfig.StagingProject, ") is not child of target project", stagingConfig.ObsProject)
+				return true, nil
+			}
 		}
 	}
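The guard above is a plain length-plus-prefix comparison: the staging project must be a strict child of the OBS target project (with a hard-coded Factory exception during the git migration). A standalone sketch of the check, under the assumption that string-prefix containment is the only criterion:

```go
package main

import (
	"fmt"
	"strings"
)

// isChildProject reports whether staging is a strict child of parent,
// mirroring the len()+prefix comparison in ProcessPullRequest.
func isChildProject(parent, staging string) bool {
	return len(parent) < len(staging) && strings.HasPrefix(staging, parent)
}

func main() {
	fmt.Println(isChildProject("openSUSE:Factory", "openSUSE:Factory:Staging:A")) // true
	fmt.Println(isChildProject("openSUSE:Factory", "openSUSE:Leap"))              // false
}
```

Note that, like the original, this accepts any longer string sharing the prefix, not only `parent:child` forms separated by a colon.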
@@ -946,8 +970,8 @@ func ProcessPullRequest(gitea common.Gitea, org, repo string, id int64) (bool, e
 	}
 	common.LogDebug("ObsProject:", stagingConfig.ObsProject)
-	stagingProject := GetObsProjectAssociatedWithPr(stagingConfig, ObsClient.HomeProject, pr)
-	change, err := StartOrUpdateBuild(stagingConfig, git, gitea, pr)
+	stagingProject := GetObsProjectAssociatedWithPr(stagingConfig, obs.GetHomeProject(), pr)
+	change, err := StartOrUpdateBuild(obs, stagingConfig, git, gitea, pr)
 	status := &models.CommitStatus{
 		Context:     BotName,
 		Description: "OBS Staging build",
@@ -978,11 +1002,8 @@ func ProcessPullRequest(gitea common.Gitea, org, repo string, id int64) (bool, e
 		SetStatus(gitea, org, repo, pr.Head.Sha, status)
 	}
-	if change != RequestModificationNoChange && !IsDryRun {
-		gitea.AddComment(pr, msg)
-	}
-	stagingResult, err := ObsClient.BuildStatus(stagingProject)
+	stagingResult, err := obs.BuildStatus(stagingProject)
 	if err != nil {
 		common.LogError("failed fetching stage project status for", stagingProject, ":", err)
 	}
@@ -990,7 +1011,14 @@ func ProcessPullRequest(gitea common.Gitea, org, repo string, id int64) (bool, e
 	_, packagePRs := common.ExtractDescriptionAndPRs(bufio.NewScanner(strings.NewReader(pr.Body)))
 	// always update QA projects because Labels can change
-	qaProjects := ProcessQaProjects(stagingConfig, git, gitea, pr, stagingProject)
+	qaProjects, qaProjectMsg := ProcessQaProjects(obs, stagingConfig, git, gitea, pr, stagingProject)
+	if change != RequestModificationNoChange && !IsDryRun {
+		if len(qaProjectMsg) > 0 {
+			msg += "\n" + qaProjectMsg
+		}
+		CommentPROnce(gitea, org, repo, id, msg)
+	}
 	done := false
 	overallBuildStatus := ProcessBuildStatus(stagingResult)
@@ -998,7 +1026,7 @@ func ProcessPullRequest(gitea common.Gitea, org, repo string, id int64) (bool, e
 	if len(qaProjects) > 0 && overallBuildStatus == BuildStatusSummarySuccess {
 		seperator := " in "
 		for _, qaProject := range qaProjects {
-			qaResult, err := ObsClient.BuildStatus(qaProject)
+			qaResult, err := obs.BuildStatus(qaProject)
 			if err != nil {
 				common.LogError("failed fetching stage project status for", qaProject, ":", err)
 			}
@@ -1018,6 +1046,7 @@ func ProcessPullRequest(gitea common.Gitea, org, repo string, id int64) (bool, e
 			}
 		}
 	}
+
 	switch overallBuildStatus {
 	case BuildStatusSummarySuccess:
 		status.Status = common.CommitStatus_Success
@@ -1058,7 +1087,7 @@ func ProcessPullRequest(gitea common.Gitea, org, repo string, id int64) (bool, e
 			default:
 				continue
 			}
-			commentOnPackagePR(gitea, packagePR.Org, packagePR.Repo, packagePR.Num, msg)
+			CommentPROnce(gitea, packagePR.Org, packagePR.Repo, packagePR.Num, msg)
 		}
 		if len(missingPkgs) > 0 {
@@ -1068,10 +1097,7 @@ func ProcessPullRequest(gitea common.Gitea, org, repo string, id int64) (bool, e
 				msg = msg + " - " + pkg + "\n"
 			}
 			common.LogInfo(msg)
-			err := gitea.AddComment(pr, msg)
-			if err != nil {
-				common.LogError(err)
-			}
+			CommentPROnce(gitea, org, repo, id, msg)
 		}
 	}
@@ -1090,8 +1116,7 @@ func ProcessPullRequest(gitea common.Gitea, org, repo string, id int64) (bool, e
 	return false, nil
 }
-func PollWorkNotifications(giteaUrl string) {
-	gitea := common.AllocateGiteaTransport(giteaUrl)
+func PollWorkNotifications(obs common.ObsClientInterface, gitea common.Gitea) {
 	data, err := gitea.GetNotifications(common.GiteaNotificationType_Pull, nil)
 	if err != nil {
@@ -1107,7 +1132,7 @@ func PollWorkNotifications(giteaUrl string) {
 		if !ListPullNotificationsOnly {
 			switch notification.Subject.Type {
 			case "Pull":
-				ProcessPullNotification(gitea, notification)
+				ProcessPullNotification(obs, gitea, notification)
 			default:
 				if !IsDryRun {
 					gitea.SetNotificationRead(notification.ID)
@@ -1130,7 +1155,7 @@ func PollWorkNotifications(giteaUrl string) {
 				continue
 			}
-			cleanupFinished = CleanupPullNotification(obs, gitea, n) && cleanupFinished
 		}
 	} else if err != nil {
 		common.LogError(err)
@@ -1144,7 +1169,8 @@ var ObsApiHost string
 var ObsWebHost string
 var IsDryRun bool
 var ProcessPROnly string
-var ObsClient *common.ObsClient
+var ObsClient common.ObsClientInterface
+var BotUser string

 func ObsWebHostFromApiHost(apihost string) string {
 	u, err := url.Parse(apihost)
@@ -1209,9 +1235,18 @@ func main() {
 	}
 	if len(*buildRoot) > 0 {
-		ObsClient.HomeProject = *buildRoot
+		ObsClient.SetHomeProject(*buildRoot)
 	}
+	gitea := common.AllocateGiteaTransport(GiteaUrl)
+	user, err := gitea.GetCurrentUser()
+	if err != nil {
+		common.LogError("Cannot fetch current user:", err)
+		return
+	}
+	BotUser = user.UserName
 	if len(*ProcessPROnly) > 0 {
 		rx := regexp.MustCompile("^([^/#]+)/([^/#]+)#([0-9]+)$")
 		m := rx.FindStringSubmatch(*ProcessPROnly)
@@ -1220,15 +1255,14 @@ func main() {
 			return
 		}
-		gitea := common.AllocateGiteaTransport(GiteaUrl)
 		id, _ := strconv.ParseInt(m[3], 10, 64)
-		ProcessPullRequest(gitea, m[1], m[2], id)
+		ProcessPullRequest(ObsClient, gitea, m[1], m[2], id)
 		return
 	}
 	for {
-		PollWorkNotifications(GiteaUrl)
+		PollWorkNotifications(ObsClient, gitea)
 		common.LogInfo("Poll cycle finished")
 		time.Sleep(5 * time.Minute)
 	}
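The single-PR filter in `main()` parses an `org/repo#id` spec with one regular expression. A standalone sketch of that parsing step:

```go
package main

import (
	"fmt"
	"regexp"
	"strconv"
)

// parsePRSpec splits "org/repo#id" the same way main() does,
// returning ok=false when the spec does not match the expected shape.
func parsePRSpec(spec string) (org, repo string, id int64, ok bool) {
	rx := regexp.MustCompile("^([^/#]+)/([^/#]+)#([0-9]+)$")
	m := rx.FindStringSubmatch(spec)
	if m == nil {
		return "", "", 0, false
	}
	id, _ = strconv.ParseInt(m[3], 10, 64)
	return m[1], m[2], id, true
}

func main() {
	org, repo, id, ok := parsePRSpec("pool/mypackage#42")
	fmt.Println(org, repo, id, ok) // pool mypackage 42 true
}
```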

File diff suppressed because it is too large.

View File

@@ -54,6 +54,7 @@ This is the ProjectGit config file. For runtime config file, see bottom.
 | *GitProjectName* | Repository and branch where the ProjectGit lives. | no | string | **Format**: `org/project_repo#branch` | By default assumes `_ObsPrj` with default branch in the *Organization* |
 | *ManualMergeOnly* | Merges are permitted only upon receiving a "merge ok" comment from designated maintainers in the PkgGit PR. | no | bool | true, false | false |
 | *ManualMergeProject* | Merges are permitted only upon receiving a "merge ok" comment in the ProjectGit PR from project maintainers. | no | bool | true, false | false |
+| *MergeMode* | Type of package merge accepted. See below for details. | no | string | ff-only, replace, devel | ff-only |
 | *ReviewRequired* | If submitter is a maintainer, require review from another maintainer if available. | no | bool | true, false | false |
 | *NoProjectGitPR* | Do not create PrjGit PR, but still perform other tasks. | no | bool | true, false | false |
 | *Reviewers* | PrjGit reviewers. Additional review requests are triggered for associated PkgGit PRs. PrjGit PR is merged only when all reviews are complete. | no | array of strings | | `[]` |
@@ -96,6 +97,19 @@ Package Deletion Requests
 If you wish to re-add a package, create a new PrjGit PR which adds again the submodule on the branch that has the "-removed" suffix. The bot will automatically remove this suffix from the project branch in the pool.

+Merge Modes
+-----------
+
+| Merge Mode | Description |
+|------------|-------------|
+| ff-only | Only allow `--ff-only` merges in the package branch. Best suited for devel projects and openSUSE Tumbleweed development, where history should be linear. |
+| replace | Merge is done via the `-X theirs` strategy and old files are removed in the merge. This works well for downstream codestreams, like Leap, that update their branch to the latest version. |
+| devel | No merge; the project branch is simply set to the PR HEAD. Suitable for downstream projects like Leap during the development cycle, where keeping maintenance history is not important. |
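To make the three modes concrete, here is an illustrative mapping from each *MergeMode* value to the git operation it describes. This is a sketch of the semantics only, not the bot's actual merge code:

```go
package main

import "fmt"

// mergeArgs sketches the git arguments each MergeMode corresponds to.
// Hypothetical mapping for illustration; the real implementation differs.
func mergeArgs(mode, prBranch string) []string {
	switch mode {
	case "replace":
		// merge favoring incoming files; old files are removed in the merge
		return []string{"merge", "-X", "theirs", prBranch}
	case "devel":
		// no merge: move the project branch to the PR head
		return []string{"reset", "--hard", prBranch}
	default: // "ff-only"
		return []string{"merge", "--ff-only", prBranch}
	}
}

func main() {
	fmt.Println(mergeArgs("ff-only", "pr-42"))
}
```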
 Labels
 ------
@@ -104,8 +118,6 @@ The following labels are used, when defined in Repo/Org.
 | Label Config Entry | Default label | Description
 |--------------------|----------------|----------------------------------------
 | StagingAuto | staging/Auto | Assigned to Project Git PRs when first staged
-| ReviewPending | review/Pending | Assigned to Project Git PR when package reviews are still pending
-| ReviewDone | review/Done | Assigned to Project Git PR when reviews are complete on all package PRs

 Maintainership

View File

@@ -309,14 +309,17 @@ func (pr *PRProcessor) UpdatePrjGitPR(prset *common.PRSet) error {
 	PrjGit := PrjGitPR.PR.Base.Repo
 	prjGitPRbranch := PrjGitPR.PR.Head.Name
 	if PrjGitPR.PR.Base.RepoID != PrjGitPR.PR.Head.RepoID {
-		PrjGitPR.RemoteName, err = git.GitClone(common.DefaultGitPrj, "", PrjGit.SSHURL)
-		git.GitExecOrPanic(common.DefaultGitPrj, "fetch", PrjGitPR.RemoteName, PrjGitPR.PR.Head.Sha)
-		git.GitExecOrPanic(common.DefaultGitPrj, "checkout", PrjGitPR.PR.Head.Sha)
-		common.LogInfo("Cannot update this PR as it's on another remote, not branch:", prjGitPRbranch, "Assuming this is by-design. (eg. project git PR only)")
-		return nil
+		// permission check, if submission comes from foreign repo
+		if !PrjGitPR.PR.AllowMaintainerEdit {
+			// well, wrong place...
+			// common.LogError("Warning: source and target branch are in different repositories. We may not have the right permissions...")
+			// Gitea.AddComment(PrjGitPR.PR, "This PR does not allow maintainer changes, but referenced package branch has changed!")
+			return nil
+		}
 	}
-	PrjGitPR.RemoteName, err = git.GitClone(common.DefaultGitPrj, prjGitPRbranch, PrjGit.SSHURL)
+	// PrjGitPR.RemoteName, err = git.GitClone(common.DefaultGitPrj, prjGitPRbranch, PrjGit.SSHURL)
+	PrjGitPR.RemoteName, err = git.GitClone(common.DefaultGitPrj, PrjGitPR.PR.Head.Ref, PrjGitPR.PR.Head.Repo.SSHURL)
 	common.PanicOnError(err)
 	git.GitExecOrPanic(common.DefaultGitPrj, "fetch", PrjGitPR.RemoteName, PrjGitBranch)
@@ -364,6 +367,7 @@ func (pr *PRProcessor) UpdatePrjGitPR(prset *common.PRSet) error {
 	}
 	common.PanicOnError(git.GitExec(common.DefaultGitPrj, params...))
 	PrjGitPR.PR.Head.Sha = newHeadCommit
+	Gitea.SetLabels(PrjGit.Owner.UserName, PrjGit.Name, PrjGitPR.PR.Index, []string{prset.Config.Label("PR/updated")})
 }

 // update PR
@@ -413,6 +417,12 @@ func (pr *PRProcessor) Process(req *models.PullRequest) error {
 	}
 	common.LogInfo("fetched PRSet of size:", len(prset.PRs))
+	if !prset.PrepareForMerge(git) {
+		common.LogError("PRs are NOT mergeable.")
+	} else {
+		common.LogInfo("PRs are in mergeable state.")
+	}
 	prjGitPRbranch := prGitBranchNameForPR(prRepo, prNo)
 	prjGitPR, err := prset.GetPrjGitPR()
 	if err == common.PRSet_PrjGitMissing {
@@ -466,7 +476,8 @@ func (pr *PRProcessor) Process(req *models.PullRequest) error {
 	if pr.PR.State == "open" {
 		org, repo, idx := pr.PRComponents()
 		if prjGitPR.PR.HasMerged {
-			Gitea.AddComment(pr.PR, "This PR is merged via the associated Project PR.")
+			// TODO: use timeline here because this can spam if ManualMergePR fails
+			// Gitea.AddComment(pr.PR, "This PR is merged via the associated Project PR.")
 			err = Gitea.ManualMergePR(org, repo, idx, pr.PR.Head.Sha, false)
 			if _, ok := err.(*repository.RepoMergePullRequestConflict); !ok {
 				common.PanicOnError(err)

View File

@@ -128,6 +128,7 @@ func TestOpenPR(t *testing.T) {
 	mockGit.EXPECT().GitClone(gomock.Any(), gomock.Any(), gomock.Any()).Return("origin", nil).AnyTimes()
 	mockGit.EXPECT().GitBranchHead(gomock.Any(), gomock.Any()).Return("head", nil).AnyTimes()
 	mockGit.EXPECT().GitSubmoduleList(gomock.Any(), gomock.Any()).Return(map[string]string{"testRepo": "testing"}, nil).AnyTimes()
+	mockGit.EXPECT().GitExecOrPanic(gomock.Any(), gomock.Any(), gomock.Any(), gomock.Any()).AnyTimes()
 	mockGit.EXPECT().GitExecOrPanic(gomock.Any(), gomock.Any(), gomock.Any(), gomock.Any(), gomock.Any()).AnyTimes()
 	mockGit.EXPECT().GitCatFile(gomock.Any(), gomock.Any(), gomock.Any()).Return(nil, nil).AnyTimes()
 	mockGit.EXPECT().Close().Return(nil).AnyTimes()
@@ -187,6 +188,7 @@ func TestOpenPR(t *testing.T) {
 	mockGit.EXPECT().GitClone(gomock.Any(), gomock.Any(), gomock.Any()).Return("origin", nil).AnyTimes()
 	mockGit.EXPECT().GitBranchHead(gomock.Any(), gomock.Any()).Return("head", nil).AnyTimes()
 	mockGit.EXPECT().GitSubmoduleList(gomock.Any(), gomock.Any()).Return(map[string]string{"testRepo": "testing"}, nil).AnyTimes()
+	mockGit.EXPECT().GitExecOrPanic(gomock.Any(), gomock.Any(), gomock.Any(), gomock.Any()).AnyTimes()
 	mockGit.EXPECT().GitExecOrPanic(gomock.Any(), gomock.Any(), gomock.Any(), gomock.Any(), gomock.Any()).AnyTimes()
 	mockGit.EXPECT().GitCatFile(gomock.Any(), gomock.Any(), gomock.Any()).Return(nil, nil).AnyTimes()
 	mockGit.EXPECT().Close().Return(nil).AnyTimes()
@@ -236,6 +238,7 @@ func TestOpenPR(t *testing.T) {
 	mockGit.EXPECT().GitClone(gomock.Any(), gomock.Any(), gomock.Any()).Return("origin", nil).AnyTimes()
 	mockGit.EXPECT().GitBranchHead(gomock.Any(), gomock.Any()).Return("head", nil).AnyTimes()
 	mockGit.EXPECT().GitSubmoduleList(gomock.Any(), gomock.Any()).Return(map[string]string{"testRepo": "testing"}, nil).AnyTimes()
+	mockGit.EXPECT().GitExecOrPanic(gomock.Any(), gomock.Any(), gomock.Any(), gomock.Any()).AnyTimes()
 	mockGit.EXPECT().GitExecOrPanic(gomock.Any(), gomock.Any(), gomock.Any(), gomock.Any(), gomock.Any()).AnyTimes()
 	mockGit.EXPECT().GitCatFile(gomock.Any(), gomock.Any(), gomock.Any()).Return(nil, nil).AnyTimes()
 	gitea.EXPECT().RequestReviews(gomock.Any(), gomock.Any()).Return(nil, nil).AnyTimes()
@@ -289,6 +292,7 @@ func TestOpenPR(t *testing.T) {
 	mockGit.EXPECT().GitClone(gomock.Any(), gomock.Any(), gomock.Any()).Return("origin", nil).AnyTimes()
 	mockGit.EXPECT().GitBranchHead(gomock.Any(), gomock.Any()).Return("head", nil).AnyTimes()
 	mockGit.EXPECT().GitSubmoduleList(gomock.Any(), gomock.Any()).Return(map[string]string{"testRepo": "testing"}, nil).AnyTimes()
+	mockGit.EXPECT().GitExecOrPanic(gomock.Any(), gomock.Any(), gomock.Any(), gomock.Any()).AnyTimes()
 	mockGit.EXPECT().GitExecOrPanic(gomock.Any(), gomock.Any(), gomock.Any(), gomock.Any(), gomock.Any()).AnyTimes()
 	mockGit.EXPECT().GitCatFile(gomock.Any(), gomock.Any(), gomock.Any()).Return(nil, nil).AnyTimes()
 	mockGit.EXPECT().Close().Return(nil).AnyTimes()
@@ -343,6 +347,7 @@ func TestOpenPR(t *testing.T) {
 	mockGit.EXPECT().GitClone(gomock.Any(), gomock.Any(), gomock.Any()).Return("origin", nil).AnyTimes()
 	mockGit.EXPECT().GitBranchHead(gomock.Any(), gomock.Any()).Return("head", nil).AnyTimes()
 	mockGit.EXPECT().GitSubmoduleList(gomock.Any(), gomock.Any()).Return(map[string]string{"testRepo": "testing"}, nil).AnyTimes()
+	mockGit.EXPECT().GitExecOrPanic(gomock.Any(), gomock.Any(), gomock.Any(), gomock.Any()).AnyTimes()
 	mockGit.EXPECT().GitExecOrPanic(gomock.Any(), gomock.Any(), gomock.Any(), gomock.Any(), gomock.Any()).AnyTimes()
 	mockGit.EXPECT().GitCatFile(gomock.Any(), gomock.Any(), gomock.Any()).Return(nil, nil).AnyTimes()
 	mockGit.EXPECT().Close().Return(nil).AnyTimes()

View File

@@ -396,7 +396,62 @@ func TestUpdatePrjGitPR(t *testing.T) {
 						Name:   "feature",
 						RepoID: 2, // Different RepoID
 						Sha:    "sha1",
+						Repo: &models.Repository{
+							SSHURL: "url",
+						},
 					},
+					User:                &models.User{UserName: "someone"},
+					Mergeable:           true,
+					AllowMaintainerEdit: true,
+				},
+			},
+			{
+				PR: &models.PullRequest{
+					State: "open",
+					Base: &models.PRBranchInfo{
+						Name: "other",
+						Repo: &models.Repository{
+							Name:  "other-pkg",
+							Owner: &models.User{UserName: "test-org"},
+						},
+					},
+					Head: &models.PRBranchInfo{
+						Sha: "other-sha",
+					},
+				},
+			},
+		},
+	}
+
+		mockGit.EXPECT().GitClone(common.DefaultGitPrj, gomock.Any(), gomock.Any()).Return("remote2", nil)
+		mockGit.EXPECT().GitExecOrPanic(common.DefaultGitPrj, "fetch", "remote2", "main")
+		mockGit.EXPECT().GitBranchHead(common.DefaultGitPrj, gomock.Any()).Return("sha1", nil).Times(2)
+		mockGit.EXPECT().GitSubmoduleList(common.DefaultGitPrj, "HEAD").Return(map[string]string{"other-pkg": "other-sha"}, nil)
+
+		err := processor.UpdatePrjGitPR(prset)
+		if err != nil {
+			t.Errorf("Unexpected error: %v", err)
+		}
+	})
+
+	t.Run("PR on another remote - not allowed", func(t *testing.T) {
+		prset := &common.PRSet{
+			Config: config,
+			PRs: []*common.PRInfo{
+				{
+					PR: &models.PullRequest{
+						Base: &models.PRBranchInfo{
+							Name:   "main",
+							RepoID: 1,
+							Repo: &models.Repository{
+								Name:  "test-prj",
+								Owner: &models.User{UserName: "test-org"},
+							},
+						},
+						Head: &models.PRBranchInfo{
+							Name:   "feature",
+							RepoID: 2, // Different RepoID
+						},
+						AllowMaintainerEdit: false,
+					},
 				},
 			},
 			{
@@ -412,10 +467,6 @@ func TestUpdatePrjGitPR(t *testing.T) {
 			},
 		},
 	}
-	mockGit.EXPECT().GitClone(gomock.Any(), gomock.Any(), gomock.Any()).Return("remote2", nil)
-	mockGit.EXPECT().GitExecOrPanic(gomock.Any(), "fetch", "remote2", "sha1")
-	mockGit.EXPECT().GitExecOrPanic(gomock.Any(), "checkout", "sha1")
-
 	err := processor.UpdatePrjGitPR(prset)
 	if err != nil {
 		t.Errorf("Unexpected error: %v", err)
@@ -445,6 +496,9 @@ func TestUpdatePrjGitPR(t *testing.T) {
 					Name:   "PR_branch",
 					RepoID: 1,
 					Sha:    "old-head",
+					Repo: &models.Repository{
+						SSHURL: "url",
+					},
 				},
 			},
 		},
@@ -483,6 +537,7 @@ func TestUpdatePrjGitPR(t *testing.T) {
 	mockGit.EXPECT().GitStatus(gomock.Any()).Return(nil, nil).AnyTimes()

 	// UpdatePullRequest expectation
+	gitea.EXPECT().SetLabels(gomock.Any(), gomock.Any(), gomock.Any(), gomock.Any()).Return(nil, nil).AnyTimes()
 	gitea.EXPECT().UpdatePullRequest(gomock.Any(), gomock.Any(), gomock.Any(), gomock.Any()).Return(nil, nil).AnyTimes()

 	err := processor.UpdatePrjGitPR(prset)
@@ -514,6 +569,9 @@ func TestUpdatePrjGitPR(t *testing.T) {
 					Name:   "PR_branch",
 					RepoID: 1,
 					Sha:    "head",
+					Repo: &models.Repository{
+						SSHURL: "url",
+					},
 				},
 			},
 		},
@@ -724,7 +782,10 @@ func TestPRProcessor_Process_EdgeCases(t *testing.T) {
 	gitea.EXPECT().FetchMaintainershipFile(gomock.Any(), gomock.Any(), gomock.Any()).Return(nil, "", nil).AnyTimes()
 	gitea.EXPECT().FetchMaintainershipDirFile(gomock.Any(), gomock.Any(), gomock.Any(), gomock.Any()).Return(nil, "", nil).AnyTimes()

 	// Mock expectations for the merged path
-	mockGit.EXPECT().GitClone(gomock.Any(), gomock.Any(), gomock.Any()).Return("origin", nil)
+	mockGit.EXPECT().GitClone(gomock.Any(), gomock.Any(), gomock.Any()).Return("origin", nil).AnyTimes()
+	mockGit.EXPECT().GitExecOrPanic(gomock.Any(), gomock.Any(), gomock.Any(), gomock.Any()).AnyTimes()
+	mockGit.EXPECT().GitExecOrPanic(gomock.Any(), gomock.Any(), gomock.Any(), gomock.Any(), gomock.Any()).AnyTimes()
+	mockGit.EXPECT().GitExec(gomock.Any(), gomock.Any(), gomock.Any(), gomock.Any(), gomock.Any()).Return(nil).AnyTimes()
 	mockGit.EXPECT().GitSubmoduleList(gomock.Any(), gomock.Any()).Return(map[string]string{"pkg-a": "old-sha"}, nil).AnyTimes()
 	gitea.EXPECT().GetRecentCommits(gomock.Any(), gomock.Any(), gomock.Any(), gomock.Any()).Return([]*models.Commit{{SHA: "pkg-sha"}}, nil).AnyTimes()
@@ -892,6 +953,10 @@ func TestProcessFunc(t *testing.T) {
 	mockGit.EXPECT().GetPath().Return("/tmp").AnyTimes()
 	mockGit.EXPECT().Close().Return(nil)
+	mockGit.EXPECT().GitClone(gomock.Any(), gomock.Any(), gomock.Any()).Return("origin", nil).AnyTimes()
+	mockGit.EXPECT().GitExecOrPanic(gomock.Any(), gomock.Any(), gomock.Any(), gomock.Any()).AnyTimes()
+	mockGit.EXPECT().GitExecOrPanic(gomock.Any(), gomock.Any(), gomock.Any(), gomock.Any(), gomock.Any()).AnyTimes()

 	// Expect Process calls (mocked via mockGit mostly)
 	gitea.EXPECT().GetTimeline(gomock.Any(), gomock.Any(), gomock.Any()).Return([]*models.TimelineComment{}, nil).AnyTimes()
 	gitea.EXPECT().GetPullRequestReviews(gomock.Any(), gomock.Any(), gomock.Any()).Return([]*models.PullReview{}, nil).AnyTimes()

View File

@@ -185,6 +185,9 @@ func TestDefaultStateChecker_ProcessPR(t *testing.T) {
 	mockGit.EXPECT().GetPath().Return("/tmp").AnyTimes()
 	mockGit.EXPECT().Close().Return(nil)
+	mockGit.EXPECT().GitClone(gomock.Any(), gomock.Any(), gomock.Any()).Return("origin", nil).AnyTimes()
+	mockGit.EXPECT().GitExecOrPanic(gomock.Any(), gomock.Any(), gomock.Any(), gomock.Any()).AnyTimes()

 	// Expectations for ProcesPullRequest
 	gitea.EXPECT().GetPullRequest(gomock.Any(), gomock.Any(), gomock.Any()).Return(pr, nil).AnyTimes()
 	gitea.EXPECT().GetTimeline(gomock.Any(), gomock.Any(), gomock.Any()).Return([]*models.TimelineComment{}, nil).AnyTimes()