Compare commits


91 Commits
main ... main

Author SHA1 Message Date
Dirk Müller
44b4d690db
Only stop importing when it isn't a jengelh repository 2024-12-02 09:34:49 +01:00
Dirk Müller
a69e861614
Switch the operating organization on the "pool" 2024-12-02 09:33:52 +01:00
Dirk Müller
1da740bd8b
Strip multibuild flavors from monitoring 2024-09-09 09:34:12 +02:00
Dirk Mueller
b3107ba3bf Merge pull request 'Stop importing/exporting scmsync packages/projects' (#32) from adamm/git-importer:option_for_non_factory into main
Reviewed-on: importers/git-importer#32
Reviewed-by: Dirk Mueller <dirkmueller@noreply@src.opensuse.org>
2024-09-03 12:40:00 +02:00
86f82325d8 Stop importing/exporting scmsync packages/projects
Also, allow projects other than Factory
2024-08-08 10:35:53 +02:00
Dirk Mueller
39ba616226 Merge pull request 'Add ability to specify non-Factory' (#31) from adamm/git-importer:option_for_non_factory into main
Reviewed-on: importers/git-importer#31
Reviewed-by: Dirk Mueller <dirkmueller@noreply@src.opensuse.org>
2024-08-07 18:27:11 +02:00
531dbc7c1b Add ability to specify non-Factory
This is important for devel-project-only imports;
non-Factory is still blocked by an assert
2024-08-07 16:55:05 +02:00
Dirk Müller
1318f9e0c4
Remove devel branch import
this, for a yet-undetermined reason, screws up the systemd history import
2024-08-07 09:47:54 +02:00
Dirk Müller
d563076d9e
add explicit conversion to string to fix the concatenation 2024-08-07 09:47:18 +02:00
b11b3f1adb
Add and remove literal files
git pathspecs have special characters that we should not trigger accidentally.
Treat every filespec as literal
2024-08-01 16:53:46 +02:00
Dirk Müller
479738d4b2
ruff format run 2024-07-10 10:34:20 +02:00
Adam Majer
2d04136ca5
Make sure we create devel branch, when no diff to Factory 2024-06-13 15:36:59 +02:00
Adam Majer
40ad64ddff
Ignore .osc directory 2024-06-10 18:13:51 +02:00
Adam Majer
6bd5d72100
New branch is empty
New branches must be born empty
2024-06-10 17:06:15 +02:00
Dirk Müller
022ae5ab58
remember failed tasks in a separate directory 2024-06-10 17:04:43 +02:00
Dirk Müller
2ff8ed76d0
Reconnect to the AMQP bus when the connection breaks down 2024-06-10 17:04:25 +02:00
Dirk Müller
5f228dc046
enable robust push 2024-05-17 21:47:35 +02:00
Dirk Müller
4e07d8272e
don't loop over failed packages 2024-05-17 21:47:15 +02:00
Dirk Müller
2a3475ab6e
Create with sha256 enabled 2024-05-17 20:39:55 +02:00
Dirk Müller
574bc9aa10
Avoid guessing in switch 2024-05-17 20:07:16 +02:00
Dirk Müller
0414b33206
Fix testing for origin
The previous code path was untested and not working
2024-05-17 20:06:25 +02:00
Dirk Müller
b9670821a9
Only init the repository if it doesn't exist already
harmless, but avoids a scary warning
2024-05-17 20:05:54 +02:00
Dirk Müller
073550825c
Fixups to improve the conversion process 2024-05-17 14:41:42 +02:00
Dirk Müller
5a353c98d3
Add tasks 2024-05-17 11:46:18 +02:00
Dirk Müller
1fc466d15b
Add monitor for commits 2024-05-17 11:40:19 +02:00
Dirk Müller
39fde7744a
Code cleanup 2024-05-16 15:47:45 +02:00
Dirk Müller
f5ffc83a69
Remove double quoting of url parameters
makeurl quotes by itself, so this was messing it up
2024-05-16 11:49:14 +02:00
Dirk Müller
d0ccf83684
Revert "Try to fetch the element as deleted if initial access failed"
The OBS api has been fixed to provide an automatic fallback via
https://github.com/openSUSE/open-build-service/pull/15655

This reverts commit c9e07e536f19820c4bba1f11e2edcb23069874d7.
2024-05-16 11:49:14 +02:00
Dirk Müller
b0ffb01c59
cleanups 2024-05-16 11:49:14 +02:00
Dirk Müller
28d5c6e606
Switch to psycopg rather than psycopg2
It's a bit more modern and uses dedicated C bindings
2024-05-16 11:49:14 +02:00
Dirk Mueller
1e22c2895a Merge pull request 'Switch to sha-256 git repo and use git tools again' (#23) from adamm/git-importer:main into main
Reviewed-on: importers/git-importer#23
2024-05-16 11:48:36 +02:00
Adam Majer
5da7861c2a Switch to sha-256 git repo and use git tools again 2024-04-09 11:40:26 +02:00
Dirk Müller
c9e07e536f
Try to fetch the element as deleted if initial access failed
The referenced object might already be deleted by the time the
request fails. Plus, setting deleted=0 is rejected by the API.
So try with deleted=1 if and only if the previous access failed.
2023-12-07 18:30:36 +01:00
Dirk Müller
dc0f33354e
Failing to LFS register should abort the import 2023-12-07 18:29:56 +01:00
Dirk Müller
56cbe0a125
Avoid multi-threading races on import
There seem to be races when using DB cursors from multiple threads, as
found by import issues after switching to a newer computer that has
performance and energy-efficient cores.

As this is not particularly performance critical, convert to
single-threaded use, which makes it work again
2023-11-28 23:36:44 +01:00
Dirk Müller
4353f015c8
Switch to localhost:9999 which is provided via a ssh tunnel
The port is no longer directly exposed, so we need to ssh tunnel it
2023-11-22 14:39:55 +01:00
Dirk Müller
9cbe0899bc
Remove unused import 2023-06-19 13:19:52 +02:00
Dirk Müller
9e80a64fe0
Change hostname references from gitea.opensuse.org to src.opensuse.org 2023-06-19 10:59:56 +02:00
Dirk Müller
12001b1640
Commit local changes 2023-04-18 22:31:38 +02:00
Stephan Kulow
3797ea178a Merge pull request 'Add a list of packages no longer existing' (#22) from add_gone into main
Reviewed-on: https://gitea.opensuse.org/importers/git-importer/pulls/22
2023-02-09 10:23:35 +01:00
Stephan Kulow
999dcabcfa Add a list of packages no longer existing
I made this a file and not an automatically maintained DB because I think
adding an entry there should be done manually for now - OBS being OBS,
packages might look gone for a brief moment and reappear the day after.
2022-12-02 11:00:31 +01:00
9962673eff Merge pull request 'Add force push for the devel branch' (#21) from add_force into main
Reviewed-on: https://gitea.opensuse.org/importers/git-importer/pulls/21
2022-12-02 09:35:40 +01:00
Stephan Kulow
7b20c03256 Add force push for the devel branch
As devel branches can change in case of Factory reverts, we need to force
push. The factory branch shouldn't be affected, so we don't force push there
2022-12-02 09:12:11 +01:00
Stephan Kulow
4692d47120 Make the refresh a debug output, not info 2022-11-16 09:05:36 +01:00
coolo
d311d54f26 Merge pull request 'Also treat some more mimetypes as text' (#20) from add_further_mimetypes into main
Reviewed-on: https://gitea.opensuse.org/importers/git-importer/pulls/20
2022-11-15 07:28:05 +01:00
Stephan Kulow
dddc54ab1c Remove ProcessPool from exporting
It swallows exceptions and makes debugging what's happening way too
hard to justify keeping it
2022-11-11 16:33:44 +01:00
Stephan Kulow
4d1ca8d882 Also treat some more mimetypes as text 2022-11-11 16:22:18 +01:00
Stephan Kulow
7861a7e9b0 Fix LFS register (it needs json not data)
Refactored the LFS Oid handling into a class of its own and
added a way to recheck all LFS handles (or re-register them)
2022-11-09 08:32:18 +01:00
coolo
f5b29886ae Merge pull request 'No longer rely on external service for LFS tracking' (#18) from add_lfs into main
Reviewed-on: https://gitea.opensuse.org/importers/git-importer/pulls/18
2022-11-08 11:00:34 +01:00
coolo
d1a8a3288d Merge pull request 'Push to the remote when the repo changed' (#19) from push_it_baby into main
Reviewed-on: https://gitea.opensuse.org/importers/git-importer/pulls/19
2022-11-08 09:48:56 +01:00
Stephan Kulow
9f6c8f62e7 Push to the remote when the repo changed 2022-11-08 09:32:03 +01:00
Stephan Kulow
3e1fbaa1c3 Migrate the ProxySHA256 data into the PostgreSQL DB
The calculation of the sha256 and the mimetype is done locally as a result
2022-11-07 21:50:31 +01:00
Stephan Kulow
e1b32999f0 Fix confusion about User constructor 2022-11-07 16:04:44 +01:00
coolo
86490b51dd Merge pull request 'Fix the maintenance of .gitattributes file' (#17) from fix_lfs_attributes into main
Reviewed-on: https://gitea.opensuse.org/importers/git-importer/pulls/17
2022-11-07 13:26:36 +01:00
Stephan Kulow
da5de04171 Add packages to consider 2022-11-07 07:29:20 +01:00
Stephan Kulow
be8fb2ab94 Fix fake revision creation 2022-11-06 12:27:36 +01:00
Stephan Kulow
9e895e34b6 Adding a gitea remote when creating the git repo 2022-11-06 12:18:16 +01:00
Stephan Kulow
5e495dbd95 Fancy up the git commit message 2022-11-06 11:46:04 +01:00
Stephan Kulow
5ae02a413d Store API URL in the revision table
Will be important once we get into SLE
2022-11-06 10:57:32 +01:00
Stephan Kulow
f1457e8f8e Move git commit message creation into class 2022-11-06 10:16:42 +01:00
Stephan Kulow
9114c2fff8 Change debug output for downloading files 2022-11-06 10:16:42 +01:00
Stephan Kulow
834cf61634 Use proper user info in commits 2022-11-06 09:53:52 +01:00
Stephan Kulow
a294c0f670 Readd the skipping of _staging_workflow file
A repository with 150k commits is just very hard to work with -
especially if 99% of them are worthless
2022-11-06 08:29:17 +01:00
Stephan Kulow
7bc4d6c8b1 Make downloading a little more careful for races
As we're downloading packages in parallel, it could happen that we
copy a file that isn't fully downloaded yet
2022-11-06 08:24:11 +01:00
Stephan Kulow
bd5bd5a444 Don't reset the .gitattributes file
Just change it if it existed before
2022-11-04 21:02:18 +01:00
Stephan Kulow
4e1d5b42ca Only validate the MD5 if we downloaded - trust the file system 2022-11-04 21:02:18 +01:00
Stephan Kulow
0bcc0183c9 Load the proxy data for is_text as well
Otherwise the text state changes over time
2022-11-04 21:02:18 +01:00
Stephan Kulow
7f88e0cc5c Run in the same process if there is only one package
Debugging is much easier without Process Pool
2022-11-04 21:02:18 +01:00
coolo
4cc0a23d4e Merge pull request 'Run many packages in parallel to avoid overhead and make use of CPUS' (#16) from parallize_packages into main
Reviewed-on: https://gitea.opensuse.org/importers/git-importer/pulls/16
2022-11-04 10:04:15 +01:00
Stephan Kulow
a457a16a50 Limit the workers to 8
This hard-codes the limit; we may want to make it configurable,
but for now the machines supposed to run this code are very similar
2022-11-04 10:00:28 +01:00
Stephan Kulow
60ced896a3 Fix condition for export 2022-11-04 09:58:36 +01:00
Stephan Kulow
33a5733cb9 Create the git repos in multiple processes
Threads appear to be too dangerous for this
2022-11-04 07:48:17 +01:00
Stephan Kulow
d21ce571f5 Refresh the packages in multiple threads 2022-11-03 22:04:45 +01:00
Stephan Kulow
ab38332642 Allow to import multiple packages in one go
This way we avoid duplicating all startup and SQL queries
2022-11-03 20:14:56 +01:00
coolo
651bd94771 Merge pull request 'Fix merge points creating a cross' (#15) from debug_firewalld into main
Reviewed-on: https://gitea.opensuse.org/importers/git-importer/pulls/15
2022-11-03 15:35:52 +01:00
Stephan Kulow
dd5e26b779 Clarify which of the candidates is the right one - removing assert 2022-11-03 15:29:58 +01:00
Stephan Kulow
f2019db8ff Ignore merge point candidates that create crosses
In OBS you can create submit requests for revisions that are behind
the last merge point, in git you can't - so we ignore them.

Fixes #14
2022-11-03 15:19:51 +01:00
Stephan Kulow
ef7755c771 Add firewalld test showing a broken tree 2022-11-03 15:19:21 +01:00
coolo
6b26f8ff96 Merge pull request 'Fix the import of breeze and other packages' (#13) from add_export into main
Reviewed-on: https://gitea.opensuse.org/importers/git-importer/pulls/13
2022-11-03 15:19:06 +01:00
Stephan Kulow
ed4b7367eb Reset branch if the devel branch is based on Factory
This happens in packages that change their devel project over time. Then
the commit in the devel project no longer has its parent in the devel branch
but is based on Factory
2022-11-03 15:12:07 +01:00
Stephan Kulow
f5b3e42165 Add a test case that switches devel project in its life time 2022-11-03 15:06:12 +01:00
6dd3cf3eba Merge pull request 'implement file caching' (#11) from file-cache into main
Reviewed-on: https://gitea.opensuse.org/importers/git-importer/pulls/11
2022-11-03 14:24:22 +01:00
8aed76e52a
change cached file naming pattern 2022-11-03 14:22:19 +01:00
639096b548
optimize cached file locations and add option for cache directory 2022-11-03 14:12:32 +01:00
7678967ae0
implement file caching
to prevent having to download files multiple times
2022-11-03 14:05:11 +01:00
coolo
74f5cd901e Merge pull request 'Keep a reference to the database in DBRevision' (#12) from add_export into main
Reviewed-on: https://gitea.opensuse.org/importers/git-importer/pulls/12
2022-11-03 08:16:30 +01:00
Stephan Kulow
1c54a74ecd Download the full revision 2022-11-02 20:55:09 +01:00
Stephan Kulow
c2294d6200 Add a default LFS .gitattributes for now
Otherwise some packages will fail to import
2022-11-02 18:27:17 +01:00
Stephan Kulow
ba7436f10c Keep a reference to the database in DBRevision
To avoid passing the db to all actions
2022-11-02 18:27:09 +01:00
coolo
75f9f56a57 Merge pull request 'Fix up some code after aplanas' continued review' (#10) from add_export into main
Reviewed-on: https://gitea.opensuse.org/importers/git-importer/pulls/10
2022-11-02 18:05:04 +01:00
Stephan Kulow
172242891d Fix up some code after aplanas' continued review 2022-11-02 15:22:24 +01:00
35 changed files with 41642 additions and 506 deletions


@@ -1,7 +1,13 @@
 all:
-	isort -rc .
-	autoflake -r --in-place --remove-unused-variables .
-	black .
+	isort *.py lib/*py tests/*py
+	autoflake --in-place --remove-unused-variables *.py lib/*py tests/*py
+	black *.py lib/*py tests/*py
 
 test:
 	python3 -m unittest -v tests/*.py
+
+update-packages:
+	f=$$(mktemp) ;\
+	osc api /source/openSUSE:Factory?view=info | grep -v lsrcmd5 | grep srcmd5= | sed -e 's,.*package=",,; s,".*,,' | grep -v : > $$f ;\
+	echo _project >> $$f ;\
+	mv $$f packages


@@ -1,4 +1,4 @@
-sudo zypper in python3-psycopg2
+sudo zypper in python3-psycopg
 sudo su - postgres
 # `createdb -O <LOCAL_USER> imported_git`


@@ -42,16 +42,36 @@ PROJECTS = [
 ]
 
 
+def export_package(project, package, repodir, cachedir, gc):
+    exporter = GitExporter(URL_OBS, project, package, repodir, cachedir)
+    exporter.set_gc_interval(gc)
+    exporter.export_as_git()
+
+
 def main():
     parser = argparse.ArgumentParser(description="OBS history importer into git")
-    parser.add_argument("package", help="OBS package name")
+    parser.add_argument("packages", help="OBS package names", nargs="*")
+    parser.add_argument(
+        "-p",
+        "--project",
+        default="openSUSE:Factory",
+        help="Project to import/export, default is openSUSE:Factory",
+    )
     parser.add_argument(
         "-r",
         "--repodir",
         required=False,
+        default=pathlib.Path("repos"),
         type=pathlib.Path,
         help="Local git repository directory",
     )
+    parser.add_argument(
+        "-c",
+        "--cachedir",
+        required=False,
+        type=pathlib.Path,
+        help="Local cache directory",
+    )
     parser.add_argument(
         "-g",
         "--gc",
@@ -87,17 +107,22 @@ def main():
     requests_log.propagate = True
 
     if args.export:
-        TestExporter(args.package).run()
+        if len(args.packages) != 1:
+            print("Can only export one package")
+            sys.exit(1)
+        TestExporter(args.packages[0]).run()
         return
 
-    if not args.repodir:
-        args.repodir = pathlib.Path("repos/" + args.package)
+    if not args.cachedir:
+        args.cachedir = pathlib.Path("~/.cache/git-import/").expanduser()
 
-    importer = Importer(URL_OBS, "openSUSE:Factory", args.package)
+    importer = Importer(URL_OBS, args.project, args.packages)
     importer.import_into_db()
-    exporter = GitExporter(URL_OBS, "openSUSE:Factory", args.package, args.repodir)
-    exporter.set_gc_interval(args.gc)
-    exporter.export_as_git()
+    for package in args.packages:
+        if not importer.package_with_scmsync(package):
+            export_package(args.project, package, args.repodir, args.cachedir, args.gc)
+        else:
+            logging.debug(f"{args.project}/{package} has scmsync links - skipping export")
 
 
 if __name__ == "__main__":

gone-packages.txt (new file, 1355 lines)

File diff suppressed because it is too large.


@@ -1,6 +1,10 @@
-class AbstractWalker:
+from abc import ABC, abstractmethod
+
+
+class AbstractWalker(ABC):
     """Just a duck type, most likely not needed by python, but I
     find interface classes preferable (java school)"""
 
+    @abstractmethod
     def call(self, node, is_source):
         pass
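With AbstractWalker now an abstract base class, concrete walkers have to implement call(). A minimal sketch of a conforming subclass, purely illustrative (the subclass name and the import path are assumptions, not part of this change):

import logging

from lib.abstract_walker import AbstractWalker  # assumed module path


class LoggingWalker(AbstractWalker):
    # Hypothetical walker: just log every node it is called with.
    def call(self, node, is_source):
        logging.debug(f"visited {node} (source: {is_source})")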


@@ -25,18 +25,28 @@ BINARY = {
     ".zst",
 }
 
+TEXT_MIMETYPES = {
+    "message/rfc822",
+    "application/pgp-keys",
+    "application/x-gnupg-keyring",
+}
+
+
+def is_text_mimetype(mimetype):
+    if mimetype.startswith("text/"):
+        return True
+    return mimetype.split(";")[0] in TEXT_MIMETYPES
+
 
 def is_binary_or_large(filename, size):
     """Decide if is a binary file based on the extension or size"""
     binary_suffix = BINARY
     non_binary_suffix = {
+        ".1",
+        ".8",
         ".SUSE",
         ".asc",
         ".c",
         ".cabal",
+        ".cfg",
         ".changes",
         ".conf",
         ".desktop",


@@ -14,8 +14,6 @@ def config(filename="database.ini", section="production"):
         for param in params:
             db[param[0]] = param[1]
     else:
-        raise Exception(
-            "Section {0} not found in the {1} file".format(section, filename)
-        )
+        raise Exception(f"Section {section} not found in the {filename} file")
 
     return db


@@ -1,7 +1,6 @@
 import logging
 
-import psycopg2
-from psycopg2.extras import LoggingConnection
+import psycopg
 
 from lib.config import config
 
@@ -17,22 +16,20 @@ class DB:
             # read the connection parameters
             params = config(section=self.config_section)
             # connect to the PostgreSQL server
-            self.conn = psycopg2.connect(connection_factory=LoggingConnection, **params)
-            logger = logging.getLogger(__name__)
-            self.conn.initialize(logger)
+            self.conn = psycopg.connect(conninfo=f"dbname={params['database']}")
+            logging.getLogger("psycopg.pool").setLevel(logging.INFO)
-        except (Exception, psycopg2.DatabaseError) as error:
+        except (Exception, psycopg.DatabaseError) as error:
             print(error)
             raise error
 
     def schema_version(self):
         # create a cursor
         with self.conn.cursor() as cur:
             # execute a statement
             try:
                 cur.execute("SELECT MAX(version) from scheme")
-            except psycopg2.errors.UndefinedTable as error:
+            except psycopg.errors.UndefinedTable:
                 cur.close()
                 self.close()
                 self.connect()
@@ -146,9 +143,9 @@ class DB:
         )
         schemes[10] = (
             "ALTER TABLE revisions ADD COLUMN request_id INTEGER",
             """ALTER TABLE revisions
               ADD CONSTRAINT request_id_foreign_key
               FOREIGN KEY (request_id)
               REFERENCES requests (id)""",
             "UPDATE scheme SET version=10",
         )
@@ -215,6 +212,51 @@ class DB:
             "CREATE INDEX ON linked_revs(considered)",
             "UPDATE scheme SET version=20",
         )
+        schemes[21] = (
+            "ALTER TABLE revisions ADD COLUMN api_url VARCHAR(40)",
+            "UPDATE revisions SET api_url='https://api.opensuse.org'",
+            "ALTER TABLE revisions ALTER COLUMN api_url SET NOT NULL",
+            "UPDATE scheme SET version=21",
+        )
+        schemes[22] = (
+            """DROP TABLE IF EXISTS lfs_oids""",
+            """
+            CREATE TABLE lfs_oids (
+                id SERIAL PRIMARY KEY,
+                project VARCHAR(255) NOT NULL,
+                package VARCHAR(255) NOT NULL,
+                filename VARCHAR(255) NOT NULL,
+                rev VARCHAR(40) NOT NULL,
+                sha256 VARCHAR(70) NOT NULL,
+                size INTEGER NOT NULL,
+                mimetype VARCHAR(255) NOT NULL,
+                file_md5 VARCHAR(40) NOT NULL
+            )
+            """,
+            "CREATE UNIQUE INDEX ON lfs_oids (sha256,size)",
+            "CREATE INDEX ON revisions(package)",
+            """DROP TABLE IF EXISTS text_files""",
+            """
+            CREATE TABLE text_files (
+                id SERIAL PRIMARY KEY,
+                package VARCHAR(255) NOT NULL,
+                filename VARCHAR(255) NOT NULL
+            )
+            """,
+            "CREATE UNIQUE INDEX ON text_files (package,filename)",
+            """DROP TABLE IF EXISTS lfs_oid_in_package""",
+            """
+            CREATE TABLE lfs_oid_in_package (
+                id SERIAL PRIMARY KEY,
+                lfs_oid_id INTEGER NOT NULL,
+                package VARCHAR(255) NOT NULL,
+                filename VARCHAR(255) NOT NULL
+            )
+            """,
+            "CREATE INDEX ON text_files(package)",
+            "CREATE INDEX ON lfs_oid_in_package(package)",
+            "UPDATE scheme SET version=22",
+        )
 
         schema_version = self.schema_version()
         if (schema_version + 1) not in schemes:
             return
@@ -228,7 +270,7 @@ class DB:
                     cur.execute(command)
             # commit the changes
             self.conn.commit()
-        except (Exception, psycopg2.DatabaseError) as error:
+        except (Exception, psycopg.DatabaseError) as error:
             print(error)
             self.close()
             raise error
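For context on the psycopg2 to psycopg (v3) switch above: connections are opened from a single conninfo string and cursors stay context-managed. A small sketch of that pattern (the database name is taken from the setup notes earlier; otherwise illustrative):

import psycopg

# psycopg 3 takes a libpq-style conninfo string instead of psycopg2 keyword arguments
conn = psycopg.connect(conninfo="dbname=imported_git")
with conn.cursor() as cur:
    cur.execute("SELECT MAX(version) FROM scheme")  # same query schema_version() runs
    print(cur.fetchone()[0])
conn.commit()
conn.close()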


@@ -1,15 +1,15 @@
 from __future__ import annotations
 
-import logging
 from hashlib import md5
-from typing import Optional
+from pathlib import Path
 
 from lib.db import DB
+from lib.obs_revision import OBSRevision
 from lib.request import Request
 
 
 class DBRevision:
-    def __init__(self, row):
+    def __init__(self, db: DB, row: tuple):
         # need to stay in sync with the schema creation in db.py
         (
             self.dbid,
@@ -25,9 +25,12 @@ class DBRevision:
             self.request_number,
             self.request_id,
             self.files_hash,
+            self.api_url,
         ) = row
         self.rev = float(self.rev)
         self._files = None
+        self.db = db
+        self.git_commit = None
 
     def short_string(self):
         return f"{self.project}/{self.package}/{self.rev}"
@@ -48,7 +51,29 @@ class DBRevision:
             return self.package < other.package
         return self.rev < other.rev
 
-    def as_dict(self, db):
+    def request_accept_message(self):
+        request = Request.find(self.db, self.request_id)
+        msg = f"Accepting request {request.number} from {request.source_project}\n\n"
+        msg += self.comment.strip()
+        url = self.api_url.replace("api.", "build.")
+        msg += f"\n\nOBS-URL: {url}/request/show/{self.request_number}"
+        return msg
+
+    def git_commit_message(self):
+        msg = ""
+        if self.request_id:
+            msg = self.request_accept_message()
+        else:
+            msg = self.comment.strip() + "\n"
+        url = self.api_url.replace("api.", "build.")
+        if self.rev == int(self.rev):
+            # do not link to fake revisions
+            msg += f"\nOBS-URL: {url}/package/show/{self.project}/{self.package}?expand=0&rev={int(self.rev)}"
+        else:
+            msg += f"\nOBS-URL: {url}/package/show/{self.project}/{self.package}?expand=0&rev={self.expanded_srcmd5}"
+        return msg
+
+    def as_dict(self):
         """Return a dict we can put into YAML for test cases"""
         ret = {
             "project": self.project,
@@ -60,26 +85,27 @@ class DBRevision:
             "comment": self.comment,
             "broken": self.broken,
             "expanded_srcmd5": self.expanded_srcmd5,
+            "api_url": self.api_url,
             "files_hash": self.files_hash,
-            "files": self.files_list(db),
+            "files": self.files_list(),
         }
         if self.request_id:
-            ret["request"] = Request.find(db, self.request_id).as_dict()
+            ret["request"] = Request.find(self.db, self.request_id).as_dict()
         return ret
 
-    def links_to(self, db, project, package):
-        with db.cursor() as cur:
+    def links_to(self, project: str, package: str) -> None:
+        with self.db.cursor() as cur:
             cur.execute(
                 "INSERT INTO links (revision_id, project, package) VALUES (%s,%s,%s)",
                 (self.dbid, project, package),
             )
 
-    @classmethod
-    def import_obs_rev(cls, db, revision):
+    @staticmethod
+    def import_obs_rev(db: DB, revision: OBSRevision):
         with db.cursor() as cur:
             cur.execute(
-                """INSERT INTO revisions (project, package, rev, unexpanded_srcmd5, commit_time, userid, comment, request_number)
-                VALUES(%s, %s, %s, %s, %s, %s, %s, %s)""",
+                """INSERT INTO revisions (project, package, rev, unexpanded_srcmd5, commit_time, userid, comment, request_number, api_url)
+                VALUES(%s, %s, %s, %s, %s, %s, %s, %s, %s)""",
                 (
                     revision.project,
                     revision.package,
@@ -89,12 +115,17 @@ class DBRevision:
                     revision.userid,
                     revision.comment,
                     revision.request_number,
+                    revision.obs.url,
                 ),
             )
-        return cls.fetch_revision(db, revision.project, revision.package, revision.rev)
+        return DBRevision.fetch_revision(
+            db, revision.project, revision.package, revision.rev
+        )
 
     @staticmethod
     def fetch_revision(db, project, package, rev):
+        """Technically we would need the api_url as well, but we assume projects are unique
+        (e.g. not importing SLE from obs)"""
         with db.cursor() as cur:
             cur.execute(
                 "SELECT * FROM revisions where project=%s and package=%s and rev=%s",
@@ -102,16 +133,21 @@ class DBRevision:
             )
             row = cur.fetchone()
             if row:
-                return DBRevision(row)
+                return DBRevision(db, row)
 
     @staticmethod
-    def latest_revision(db, project, package):
+    def max_rev(db, project, package):
         with db.cursor() as cur:
             cur.execute(
                 "SELECT MAX(rev) FROM revisions where project=%s and package=%s",
                 (project, package),
             )
-            max = cur.fetchone()[0]
+            return cur.fetchone()[0]
+        return None
+
+    @staticmethod
+    def latest_revision(db, project, package):
+        max = DBRevision.max_rev(db, project, package)
         if max:
             return DBRevision.fetch_revision(db, project, package, max)
         return None
@@ -125,13 +161,13 @@ class DBRevision:
             )
             ret = []
             for row in cur.fetchall():
-                ret.append(DBRevision(row))
+                ret.append(DBRevision(db, row))
             return ret
 
-    def linked_rev(self, db):
+    def linked_rev(self):
         if self.broken:
             return None
-        with db.cursor() as cur:
+        with self.db.cursor() as cur:
             cur.execute(
                 "SELECT project,package FROM links where revision_id=%s", (self.dbid,)
             )
@@ -143,26 +179,33 @@ class DBRevision:
                 "SELECT * FROM revisions where project=%s and package=%s and commit_time <= %s ORDER BY commit_time DESC LIMIT 1",
                 (project, package, self.commit_time),
             )
-            revisions = [DBRevision(row) for row in cur.fetchall()]
+            revisions = [DBRevision(self.db, row) for row in cur.fetchall()]
         if revisions:
             return revisions[0]
         else:
-            self.set_broken(db)
+            self.set_broken()
             return None
 
-    def set_broken(self, db):
-        with db.cursor() as cur:
+    def set_broken(self):
+        with self.db.cursor() as cur:
            cur.execute("UPDATE revisions SET broken=TRUE where id=%s", (self.dbid,))
 
-    def import_dir_list(self, db, xml):
-        with db.cursor() as cur:
+    def import_dir_list(self, xml):
+        with self.db.cursor() as cur:
             cur.execute(
                 "UPDATE revisions SET expanded_srcmd5=%s where id=%s",
                 (xml.get("srcmd5"), self.dbid),
             )
             for entry in xml.findall("entry"):
+                # this file creates easily 100k commits and is just useless data :(
+                # unfortunately it's stored in the same meta package as the project config
+                if (
+                    entry.get("name") == "_staging_workflow"
+                    and self.package == "_project"
+                ):
+                    continue
                 cur.execute(
                     """INSERT INTO files (name, md5, size, mtime, revision_id)
                         VALUES (%s,%s,%s,%s,%s)""",
                     (
                         entry.get("name"),
@@ -173,15 +216,19 @@ class DBRevision:
                     ),
                 )
 
-    def previous_commit(self, db):
-        return self.fetch_revision(db, self.project, self.package, int(self.rev) - 1)
+    def previous_commit(self):
+        return DBRevision.fetch_revision(
+            self.db, self.project, self.package, int(self.rev) - 1
+        )
 
-    def next_commit(self, db):
-        return self.fetch_revision(db, self.project, self.package, int(self.rev) + 1)
+    def next_commit(self):
+        return DBRevision.fetch_revision(
+            self.db, self.project, self.package, int(self.rev) + 1
+        )
 
-    def calculate_files_hash(self, db):
+    def calculate_files_hash(self):
         m = md5()
-        for file_dict in self.files_list(db):
+        for file_dict in self.files_list():
             m.update(
                 (
                     file_dict["name"]
@@ -193,10 +240,10 @@ class DBRevision:
             )
         return m.hexdigest()
 
-    def files_list(self, db):
+    def files_list(self):
         if self._files:
             return self._files
-        with db.cursor() as cur:
+        with self.db.cursor() as cur:
             cur.execute("SELECT * from files where revision_id=%s", (self.dbid,))
             self._files = []
             for row in cur.fetchall():
@@ -207,27 +254,23 @@ class DBRevision:
             self._files.sort(key=lambda x: x["name"])
             return self._files
 
-    def calc_delta(self, db: DB, current_rev: Optional[DBRevision]):
+    def calc_delta(self, current_rev: DBRevision | None):
        """Calculate the list of files to download and to delete.
        Param current_rev is the revision that's currently checked out.
        If it's None, the repository is empty.
        """
        to_download = []
-        to_delete = []
        if current_rev:
            old_files = {
-                e["name"]: f"{e['md5']}-{e['size']}" for e in current_rev.files_list(db)
+                e["name"]: f"{e['md5']}-{e['size']}" for e in current_rev.files_list()
            }
        else:
            old_files = dict()
-        for entry in self.files_list(db):
+        for entry in self.files_list():
            if old_files.get(entry["name"]) != f"{entry['md5']}-{entry['size']}":
-                logging.debug(f"Download {entry['name']}")
-                to_download.append((entry["name"], entry["md5"]))
+                to_download.append((Path(entry["name"]), entry["size"], entry["md5"]))
            old_files.pop(entry["name"], None)
-        for entry in old_files.keys():
-            logging.debug(f"Delete {entry}")
-            to_delete.append(entry)
+        to_delete = [Path(e) for e in old_files.keys()]
        return to_download, to_delete
 
    @staticmethod
@@ -245,9 +288,9 @@ class DBRevision:
        """Used in test cases to read a revision from fixtures into the test database"""
        with db.cursor() as cur:
            cur.execute(
                """INSERT INTO revisions (project, package, rev, unexpanded_srcmd5, expanded_srcmd5,
-                    commit_time, userid, comment, broken, files_hash)
-                    VALUES(%s, %s, %s, %s, %s, %s, %s, %s, %s, %s) RETURNING id""",
+                    commit_time, userid, comment, broken, files_hash, api_url)
+                    VALUES(%s, %s, %s, %s, %s, %s, %s, %s, %s, %s, %s) RETURNING id""",
                (
                    rev_dict["project"],
                    rev_dict["package"],
@@ -259,6 +302,7 @@ class DBRevision:
                    rev_dict["comment"],
                    rev_dict["broken"],
                    rev_dict["files_hash"],
+                    rev_dict.get("api_url", "https://api.opensuse.org"),
                ),
            )
            rev_id = cur.fetchone()[0]


@@ -9,12 +9,8 @@ class FlatNode:
         self.parent2 = parent2
 
     def __str__(self) -> str:
-        p1_str = ""
-        if self.parent1:
-            p1_str = f" p1:{self.parent1.short_string()}"
-        p2_str = ""
-        if self.parent2:
-            p2_str = f" p2:{self.parent2.short_string()}"
+        p1_str = f" p1:{self.parent1.short_string()}" if self.parent1 else ""
+        p2_str = f" p2:{self.parent2.short_string()}" if self.parent2 else ""
         return f"{self.branch} c:{self.commit.short_string()}{p1_str}{p2_str}"
@@ -36,8 +32,7 @@ class FlatTreeWalker(AbstractWalker):
     def handle_source_node(self, node) -> None:
         if self.rebase_devel and node.parent and node.parent.merged_into:
             self.add("devel", node.revision, node.parent.merged_into.revision)
-            return
-        if node.parent:
+        elif node.parent:
             self.add("devel", node.revision, node.parent.revision)
         elif self.last_merge:
             self.add("devel", node.revision, self.last_merge.parent.revision)


@@ -1,9 +1,10 @@
 import fnmatch
 import logging
+import os
 import pathlib
 import subprocess
 
-import pygit2
+import requests
 
 from lib.binary import BINARY
@@ -18,12 +19,6 @@ class Git:
         self.committer = committer
         self.committer_email = committer_email
 
-        self.repo = None
-
-    def is_open(self):
-        return self.repo is not None
-
-    # TODO: Extend it to packages and files
     def exists(self):
         """Check if the path is a valid git repository"""
         return (self.path / ".git").exists()
@@ -31,36 +26,72 @@ class Git:
     def create(self):
         """Create a local git repository"""
         self.path.mkdir(parents=True, exist_ok=True)
-        # Convert the path to string, to avoid some limitations in
-        # older pygit2
-        self.repo = pygit2.init_repository(str(self.path))
-        return self
+        self.open()
+
+    def git_run(self, args, **kwargs):
+        """Run a git command"""
+        if "env" in kwargs:
+            envs = kwargs["env"].copy()
+            del kwargs["env"]
+        else:
+            envs = os.environ.copy()
+        envs["GIT_LFS_SKIP_SMUDGE"] = "1"
+        envs["GIT_CONFIG_GLOBAL"] = "/dev/null"
+        return subprocess.run(
+            ["git"] + args,
+            cwd=self.path,
+            check=True,
+            env=envs,
+            **kwargs,
+        )
+
+    def open(self):
+        if not self.exists():
+            self.git_run(["init", "--object-format=sha256", "-b", "factory"])
+        self.git_run(["config", "lfs.allowincompletepush", "true"])
 
     def is_dirty(self):
         """Check if there is something to commit"""
-        assert self.is_open()
-
-        return self.repo.status()
+        status_str = self.git_run(
+            ["status", "--porcelain=2"],
+            stdout=subprocess.PIPE,
+        ).stdout.decode("utf-8")
+        return len(list(filter(None, status_str.split("\n")))) > 0
 
     def branches(self):
-        return list(self.repo.branches)
+        br = (
+            self.git_run(
+                ["for-each-ref", "--format=%(refname:short)", "refs/heads/"],
+                stdout=subprocess.PIPE,
+            )
+            .stdout.decode("utf-8")
+            .split()
+        )
+        if len(br) == 0:
+            br.append("factory")  # unborn branch?
+        return br
 
-    def branch(self, branch, commit=None):
-        if not commit:
-            commit = self.repo.head
-        else:
-            commit = self.repo.get(commit)
-        self.repo.branches.local.create(branch, commit)
+    def branch(self, branch, commit="HEAD"):
+        commit = (
+            self.git_run(
+                ["rev-parse", "--verify", "--end-of-options", commit + "^{commit}"],
+                stdout=subprocess.PIPE,
+            )
+            .stdout.decode("utf-8")
+            .strip()
+        )
+        return self.git_run(["branch", branch, commit])
 
     def checkout(self, branch):
         """Checkout into the branch HEAD"""
         new_branch = False
-        ref = f"refs/heads/{branch}"
         if branch not in self.branches():
-            self.repo.references["HEAD"].set_target(ref)
+            self.git_run(["switch", "-q", "--orphan", branch])
             new_branch = True
         else:
-            self.repo.checkout(ref)
+            ref = f"refs/heads/{branch}"
+            if (self.path / ".git" / ref).exists():
+                self.git_run(["switch", "--no-guess", "-q", branch])
         return new_branch
 
     def commit(
@@ -73,10 +104,8 @@ class Git:
         committer=None,
         committer_email=None,
         committer_time=None,
-        allow_empty=False,
     ):
         """Add all the files and create a new commit in the current HEAD"""
-        assert allow_empty or self.is_dirty()
 
         if not committer:
             committer = self.committer if self.committer else self.user
@@ -85,125 +114,80 @@ class Git:
         )
         committer_time = committer_time if committer_time else user_time
 
-        try:
-            self.repo.index.add_all()
-        except pygit2.GitError as e:
-            if not allow_empty:
-                raise e
+        if self.is_dirty():
+            self.git_run(["add", "--all", "."])
 
-        self.repo.index.write()
-        author = pygit2.Signature(user, user_email, int(user_time.timestamp()))
-        committer = pygit2.Signature(
-            committer, committer_email, int(committer_time.timestamp())
+        tree_id = (
+            self.git_run(["write-tree"], stdout=subprocess.PIPE)
+            .stdout.decode("utf-8")
+            .strip()
         )
 
-        if not parents:
-            try:
-                parents = [self.repo.head.target]
-            except pygit2.GitError as e:
-                parents = []
-                if not allow_empty:
-                    raise e
-        tree = self.repo.index.write_tree()
-        return self.repo.create_commit(
-            "HEAD", author, committer, message, tree, parents
-        )
-
-    def merge(
-        self,
-        user,
-        user_email,
-        user_time,
-        message,
-        commit,
-        committer=None,
-        committer_email=None,
-        committer_time=None,
-        clean_on_conflict=True,
-        merged=False,
-        allow_empty=False,
-    ):
-        new_branch = False
-
-        if not merged:
-            try:
-                self.repo.merge(commit)
-            except KeyError:
-                # If it is the first commit, we will have a missing
-                # "HEAD", but the files will be there. We can proceed
-                # to the commit directly.
-                new_branch = True
-
-        if not merged and self.repo.index.conflicts:
-            for conflict in self.repo.index.conflicts:
-                conflict = [c for c in conflict if c]
-                if conflict:
-                    logging.info(f"CONFLICT {conflict[0].path}")
-
-            if clean_on_conflict:
-                self.clean()
-            # Now I miss Rust enums
-            return "CONFLICT"
-
-        # Some merges are empty in OBS (no changes, not sure
-        # why), for now we signal them
-        if not allow_empty and not self.is_dirty():
-            # I really really do miss Rust enums
-            return "EMPTY"
-
-        if new_branch:
-            parents = [commit]
-        else:
-            parents = [
-                self.repo.head.target,
-                commit,
-            ]
-        commit = self.commit(
-            user,
-            user_email,
-            user_time,
-            message,
-            parents,
-            committer,
-            committer_email,
-            committer_time,
-            allow_empty=allow_empty,
-        )
-
-        return commit
-
-    def merge_abort(self):
-        self.repo.state_cleanup()
-
-    def last_commit(self):
-        try:
-            return self.repo.head.target
-        except:
-            return None
-
-    def branch_head(self, branch):
-        return self.repo.references["refs/heads/" + branch].target
+        parent_array = []
+        if isinstance(parents, list):
+            for parent in filter(None, parents):
+                parent_array = parent_array + ["-p", parent]
+        elif isinstance(parents, str):
+            parent_array = ["-p", parents]
+
+        commit_id = (
+            self.git_run(
+                ["commit-tree"] + parent_array + [tree_id],
+                env={
+                    "GIT_AUTHOR_NAME": user,
+                    "GIT_AUTHOR_EMAIL": user_email,
+                    "GIT_AUTHOR_DATE": f"{int(user_time.timestamp())} +0000",
+                    "GIT_COMMITTER_NAME": committer,
+                    "GIT_COMMITTER_EMAIL": committer_email,
+                    "GIT_COMMITTER_DATE": f"{int(committer_time.timestamp())} +0000",
+                },
+                input=message.encode("utf-8"),
+                stdout=subprocess.PIPE,
+            )
+            .stdout.decode("utf-8")
+            .rstrip()
+        )
+        self.git_run(["reset", "--soft", commit_id])
+        return commit_id
+
+    def branch_head(self, branch="HEAD"):
+        return (
+            self.git_run(
+                ["rev-parse", "--verify", "--end-of-options", branch],
+                stdout=subprocess.PIPE,
+            )
+            .stdout.decode("utf-8")
+            .strip()
+        )
+
+    def set_branch_head(self, branch, commit):
+        return self.git_run(["update-ref", f"refs/heads/{branch}", commit])
 
     def gc(self):
-        logging.info(f"Garbage recollect and repackage {self.path}")
-        subprocess.run(
-            ["git", "gc", "--auto"],
-            cwd=self.path,
+        logging.debug(f"Garbage recollect and repackage {self.path}")
+        self.git_run(
+            ["gc", "--auto"],
             stdout=subprocess.PIPE,
             stderr=subprocess.STDOUT,
         )
 
-    def clean(self):
-        for path, _ in self.repo.status().items():
-            logging.debug(f"Cleaning {path}")
-            try:
-                (self.path / path).unlink()
-                self.repo.index.remove(path)
-            except Exception as e:
-                logging.warning(f"Error removing file {path}: {e}")
+    # def clean(self):
+    #     for path, _ in self.repo.status().items():
+    #         logging.debug(f"Cleaning {path}")
+    #         try:
+    #             (self.path / path).unlink()
+    #             self.repo.index.remove(path)
+    #         except Exception as e:
+    #             logging.warning(f"Error removing file {path}: {e}")
 
     def add(self, filename):
-        self.repo.index.add(filename)
+        self.git_run(["add", ":(literal)" + str(filename)])
+
+    def add_default_gitignore(self):
+        if not (self.path / ".gitignore").exists():
+            with (self.path / ".gitignore").open("w") as f:
+                f.write(".osc\n")
+            self.add(".gitignore")
 
     def add_default_lfs_gitattributes(self, force=False):
         if not (self.path / ".gitattributes").exists() or force:
@@ -256,11 +240,49 @@ class Git:
         )
         return any(fnmatch.fnmatch(filename, line) for line in patterns)
 
-    def remove(self, filename):
-        self.repo.index.remove(filename)
-        (self.path / filename).unlink()
+    def remove(self, file: pathlib.Path):
+        self.git_run(
+            ["rm", "-q", "-f", "--ignore-unmatch", ":(literal)" + file.name],
+        )
         patterns = self.get_specific_lfs_gitattributes()
-        if filename in patterns:
-            patterns.remove(filename)
+        if file.name in patterns:
+            patterns.remove(file.name)
             self.add_specific_lfs_gitattributes(patterns)
+
+    def add_gitea_remote(self, package):
+        repo_name = package.replace("+", "_")
+        org_name = "pool"
+        if not os.getenv("GITEA_TOKEN"):
+            logging.warning("Not adding a remote due to missing $GITEA_TOKEN")
+            return
+        url = f"https://src.opensuse.org/api/v1/org/{org_name}/repos"
+        response = requests.post(
+            url,
+            data={"name": repo_name, "object_format_name": "sha256"},
+            headers={"Authorization": f"token {os.getenv('GITEA_TOKEN')}"},
+            timeout=10,
+        )
+        # 409 Conflict (Already existing)
+        # 201 Created
+        if response.status_code not in (201, 409):
+            print(response.data)
+        url = f"gitea@src.opensuse.org:{org_name}/{repo_name}.git"
+        self.git_run(
+            ["remote", "add", "origin", url],
+        )
+
+    def push(self, force=False):
+        if "origin" not in self.git_run(
+            ["remote"],
+            stdout=subprocess.PIPE,
+        ).stdout.decode("utf-8"):
+            logging.warning("Not pushing to remote because no 'origin' configured")
+            return
+        cmd = ["push"]
+        if force:
+            cmd.append("-f")
+        cmd += ["origin", "--all"]
+        self.git_run(cmd)


@@ -6,99 +6,48 @@ import yaml
 from lib.binary import is_binary_or_large
 from lib.db import DB
 from lib.git import Git
+from lib.lfs_oid import LFSOid
 from lib.obs import OBS
-from lib.proxy_sha256 import ProxySHA256, md5
+from lib.proxy_sha256 import ProxySHA256
 from lib.tree_builder import TreeBuilder
+from lib.user import User
 
 
 class GitExporter:
-    def __init__(self, api_url, project, package, repodir):
-        self.obs = OBS()
+    def __init__(self, api_url, project, package, repodir, cachedir):
+        self.obs = OBS(api_url)
         self.project = project
         self.package = package
-        # TODO: Store the api url in the revision
-        self.obs.change_url(api_url)
-        self.proxy_sha256 = ProxySHA256(self.obs, enabled=True)
+        self.db = DB()
+        self.proxy_sha256 = ProxySHA256(self.obs, self.db)
         self.git = Git(
-            repodir,
+            repodir / package,
             committer="Git OBS Bridge",
             committer_email="obsbridge@suse.de",
-        ).create()
+        )
+        if self.git.exists():
+            self.git.open()
+        else:
+            self.git.create()
+            self.git.add_gitea_remote(package)
         self.state_file = os.path.join(self.git.path, ".git", "_flat_state.yaml")
         self.gc_interval = 200
+        self.cachedir = cachedir
 
-    def download(self, revision):
-        obs_files = self.obs.files(revision.project, revision.package, revision.srcmd5)
-        git_files = {
-            (f.name, f.stat().st_size, md5(f))
-            for f in self.git.path.iterdir()
-            if f.is_file() and f.name not in (".gitattributes")
-        }
-
-        # Overwrite ".gitattributes" with the
-        self.git.add_default_lfs_gitattributes(force=True)
-
-        # Download each file in OBS if it is not a binary (or large)
-        # file
-        for (name, size, file_md5) in obs_files:
-            # this file creates easily 100k commits and is just useless data :(
-            # unfortunately it's stored in the same meta package as the project config
-            if revision.package == "_project" and name == "_staging_workflow":
-                continue
-            # have such files been detected as text mimetype before?
-            is_text = self.proxy_sha256.is_text(name)
-            if not is_text and is_binary_or_large(name, size):
-                file_sha256 = self.proxy_sha256.get_or_put(
-                    revision.project,
-                    revision.package,
-                    name,
-                    revision.srcmd5,
-                    file_md5,
-                    size,
-                )
-                self.git.add_lfs(name, file_sha256["sha256"], size)
-            else:
-                if (name, size, file_md5) not in git_files:
-                    logging.debug(f"Download {name}")
-                    self.obs.download(
-                        revision.project,
-                        revision.package,
-                        name,
-                        revision.srcmd5,
-                        self.git.path,
-                        file_md5=file_md5,
-                    )
-                    # Validate the MD5 of the downloaded file
-                    if md5(self.git.path / name) != file_md5:
-                        raise Exception(f"Download error in {name}")
-                self.git.add(name)
-
-        # Remove extra files
-        obs_names = {n for (n, _, _) in obs_files}
-        git_names = {n for (n, _, _) in git_files}
-        for name in git_names - obs_names:
-            logging.debug(f"Remove {name}")
-            self.git.remove(name)
-
     def set_gc_interval(self, gc):
         self.gc_interval = gc
 
-    def export_as_git(self):
-        db = DB()
-        tree = TreeBuilder(db).build(self.project, self.package)
-        flats = tree.as_flat_list()
-
-        branch_state = {"factory": None, "devel": None}
+    def check_repo_state(self, flats, branch_state):
         state_data = dict()
         if os.path.exists(self.state_file):
-            with open(self.state_file, "r") as f:
+            with open(self.state_file) as f:
                 state_data = yaml.safe_load(f)
-            if type(state_data) != dict:
+            if not isinstance(state_data, dict):
                 state_data = {}
         left_to_commit = []
         for flat in reversed(flats):
             found_state = False
-            for branch in ["factory", "devel"]:
+            for branch in ["factory"]:
                 if flat.commit.dbid == state_data.get(branch):
                     branch_state[branch] = flat.commit
                     flat.commit.git_commit = self.git.branch_head(branch)
@@ -109,55 +58,116 @@ class GitExporter:
                     found_state = True
             if not found_state:
                 left_to_commit.append(flat)
+        return left_to_commit
+
+    def export_as_git(self):
+        if os.getenv("CHECK_ALL_LFS"):
+            LFSOid.check_all(self.db, self.package)
+        tree = TreeBuilder(self.db).build(self.project, self.package)
+        flats = tree.as_flat_list()
+
+        branch_state = {"factory": None, "devel": None}
+        left_to_commit = self.check_repo_state(flats, branch_state)
+        if not left_to_commit:
+            return
+
+        logging.info(f"Commiting into {self.git.path}")
+        self.run_gc()
+        users = dict()
 
-        gc_cnt = self.gc_interval
-        if len(left_to_commit) > 0:
-            self.git.gc()
         for flat in left_to_commit:
-            gc_cnt -= 1
-            if gc_cnt <= 0 and self.gc_interval:
-                self.git.gc()
-                gc_cnt = self.gc_interval
+            if flat.commit.userid not in users:
+                users[flat.commit.userid] = User.find(self.db, flat.commit.userid)
+            flat.user = users[flat.commit.userid]
+            self.gc_cnt -= 1
+            if self.gc_cnt <= 0 and self.gc_interval:
+                self.run_gc()
             logging.debug(f"Committing {flat}")
-            self.commit_flat(db, flat, branch_state)
+            self.commit_flat(flat, branch_state)
 
-    def limit_download(self, file):
-        if file.endswith(".spec") or file.endswith(".changes"):
-            return True
-        return False
+        # make sure that we create devel branch
+        if not branch_state["devel"]:
+            logging.debug("force creating devel")
+            self.git.set_branch_head("devel", self.git.branch_head("factory"))
 
-    def commit_flat(self, db, flat, branch_state):
-        parents = []
-        self.git.checkout(flat.branch)
-        if flat.parent1:
-            parents.append(flat.parent1.git_commit)
-        if flat.parent2:
-            parents.append(flat.parent2.git_commit)
+        self.git.push(force=True)
 
-        to_download, to_delete = flat.commit.calc_delta(db, branch_state[flat.branch])
-        for file in to_delete:
-            if not self.limit_download(file):
-                continue
-            self.git.remove(file)
-        for file, md5 in to_download:
-            if not self.limit_download(file):
-                continue
-            self.obs.download(
-                flat.commit.project,
-                flat.commit.package,
-                file,
-                flat.commit.expanded_srcmd5,
-                self.git.path,
-                file_md5=md5,
-            )
-            self.git.add(file)
+    def run_gc(self):
+        self.gc_cnt = self.gc_interval
+        self.git.gc()
+
+    def is_lfs_file(self, package, filename, size):
+        if not is_binary_or_large(filename, size):
+            return False
+        return not self.proxy_sha256.is_text(package, filename)
+
+    def commit_file(self, flat, file, size, md5):
+        # have such files been detected as text mimetype before?
+        if self.is_lfs_file(flat.commit.package, file.name, size):
+            file_sha256 = self.proxy_sha256.get_or_put(
+                flat.commit.project,
+                flat.commit.package,
+                file.name,
+                flat.commit.expanded_srcmd5,
+                md5,
+                size,
+            )
+            # as it's newly registered, it might be a text file now, so double check
+            if not self.proxy_sha256.is_text(flat.commit.package, file.name):
+                self.git.add_lfs(file.name, file_sha256, size)
+                return
+        self.commit_non_lfs_file(flat, file, md5)
+
+    def commit_non_lfs_file(self, flat, file, md5):
+        self.obs.change_url(flat.commit.api_url)
+        self.obs.download(
+            flat.commit.project,
+            flat.commit.package,
+            file.name,
+            flat.commit.expanded_srcmd5,
+            self.git.path,
+            self.cachedir,
+            file_md5=md5,
+        )
+        self.git.add(file)
+
+    def branch_fits_parent1(self, flat, branch_state):
+        if branch_state[flat.branch] is None:
+            # everything fits nothing
+            return True
+        return flat.parent1 == branch_state[flat.branch]
+
+    def commit_flat(self, flat, branch_state):
+        parents = []
+        self.git.checkout(flat.branch)
+        if flat.parent1:
+            if not self.branch_fits_parent1(flat, branch_state):
+                logging.debug(f"Reset {flat.branch} onto {flat.parent1.short_string()}")
+                assert flat.parent1.git_commit
+                self.git.set_branch_head(flat.branch, flat.parent1.git_commit)
+                self.git.checkout(flat.branch)
+            parents.append(flat.parent1.git_commit)
+        if flat.parent2:
+            assert flat.parent2.git_commit
+            parents.append(flat.parent2.git_commit)
+
+        # create file if not existant
+        self.git.add_default_lfs_gitattributes(force=False)
+        self.git.add_default_gitignore()
+
+        to_download, to_delete = flat.commit.calc_delta(branch_state[flat.branch])
+        for file in to_delete:
+            self.git.remove(file)
+        for file, size, md5 in to_download:
+            self.commit_file(flat, file, size, md5)
 
         commit = self.git.commit(
-            f"OBS User {flat.commit.userid}",
-            "null@suse.de",
+            flat.user.realname,
+            flat.user.email,
             flat.commit.commit_time,
-            # TODO: Normalize better the commit message
-            f"{flat.commit.comment}\n\n{flat.commit}",
-            allow_empty=True,
+            flat.commit.git_commit_message(),
            parents=parents,
        )
        flat.commit.git_commit = commit

lib/hash.py (new file, 20 lines)

@@ -0,0 +1,20 @@
+import functools
+import hashlib
+
+
+def _hash(hash_alg, file_or_path):
+    h = hash_alg()
+
+    def __hash(f):
+        while chunk := f.read(1024 * 4):
+            h.update(chunk)
+
+    if hasattr(file_or_path, "read"):
+        __hash(file_or_path)
+    else:
+        with file_or_path.open("rb") as f:
+            __hash(f)
+    return h.hexdigest()
+
+
+md5 = functools.partial(_hash, hashlib.md5)
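The helper accepts either a pathlib.Path or an already-open binary file object and hashes it in 4 KiB chunks; a small usage sketch (file name illustrative):

import pathlib

from lib.hash import md5

digest = md5(pathlib.Path("example.tar.gz"))  # hash by path

with open("example.tar.gz", "rb") as f:       # or hash an open file object
    assert md5(f) == digest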


@@ -1,4 +1,5 @@
 import logging
+import pathlib
 import xml.etree.ElementTree as ET
 
 from lib.db import DB
@@ -8,46 +9,65 @@ from lib.obs_revision import OBSRevision
 from lib.user import User
 
 
+def refresh_package(importer, project, package):
+    importer.refresh_package(project, package)
+
+
+def import_request(importer, number):
+    importer.import_request(number)
+
+
+def import_rev(importer, rev):
+    importer.import_rev(rev)
+
+
 class Importer:
-    def __init__(self, api_url, project, package):
-        # Import a Factory package into the database
-        self.package = package
+    def __init__(self, api_url, project, packages):
+        # Import multiple Factory packages into the database
+        self.packages = packages
         self.project = project
+        self.scmsync_cache = dict()
+        self.packages_with_scmsync = set()
 
-        self.obs = OBS()
-        assert project == "openSUSE:Factory"
-        self.obs.change_url(api_url)
+        self.db = DB()
+        self.obs = OBS(api_url)
+        assert not self.has_scmsync(project)
         self.refreshed_packages = set()
+        self.gone_packages_set = None
 
-    def update_db_package(self, db, project, package):
+    def import_request(self, number):
+        self.obs.request(number).import_into_db(self.db)
+
+    def update_db_package(self, project, package):
         root = self.obs._history(project, package)
         if root is None:
             return
-        latest = DBRevision.latest_revision(db, project, package)
+        latest = DBRevision.max_rev(self.db, project, package)
         for r in root.findall("revision"):
             rev = OBSRevision(self.obs, project, package).parse(r)
-            if not latest or rev.rev > latest.rev:
-                dbrev = DBRevision.import_obs_rev(db, rev)
+            if not latest or rev.rev > latest:
+                dbrev = DBRevision.import_obs_rev(self.db, rev)
                 try:
                     root = rev.read_link()
                 except ET.ParseError:
-                    dbrev.set_broken(db)
+                    dbrev.set_broken()
                     continue
                 if root is not None:
                     tprj = root.get("project") or project
                     tpkg = root.get("package") or package
-                    dbrev.links_to(db, tprj, tpkg)
+                    dbrev.links_to(tprj, tpkg)
 
-    def find_linked_revs(self, db):
-        with db.cursor() as cur:
+    def find_linked_revs(self):
+        with self.db.cursor() as cur:
             cur.execute(
                 """SELECT * from revisions WHERE id in (SELECT l.revision_id FROM links l
                     LEFT JOIN linked_revs lrevs ON lrevs.revision_id=l.revision_id
                     WHERE lrevs.id IS NULL) and broken is FALSE;"""
             )
             for row in cur.fetchall():
-                rev = DBRevision(row)
-                linked_rev = rev.linked_rev(db)
+                rev = DBRevision(self.db, row)
+                linked_rev = rev.linked_rev()
                 if not linked_rev:
                     logging.debug(f"No link {rev}")
                     continue
@@ -57,8 +77,8 @@ class Importer:
                 (rev.dbid, linked_rev.dbid),
             )
 
-    def fetch_all_linked_packages(self, db, project, package):
-        with db.cursor() as cur:
+    def fetch_all_linked_packages(self, project, package):
+        with self.db.cursor() as cur:
             cur.execute(
                 """SELECT DISTINCT l.project, l.package from links l JOIN revisions r
                     on r.id=l.revision_id WHERE r.project=%s AND r.package=%s""",
@@ -67,36 +87,36 @@ class Importer:
             for row in cur.fetchall():
                 (lproject, lpackage) = row
                 # recurse
-                self.refresh_package(db, lproject, lpackage)
+                self.refresh_package(lproject, lpackage)
 
-    def find_fake_revisions(self, db):
-        with db.cursor() as cur:
+    def find_fake_revisions(self):
+        with self.db.cursor() as cur:
             cur.execute(
                 "SELECT * from revisions WHERE id in (SELECT linked_id from linked_revs WHERE considered=FALSE)"
             )
             for row in cur.fetchall():
-                self._find_fake_revision(db, DBRevision(row))
+                self._find_fake_revision(DBRevision(self.db, row))
 
-    def _find_fake_revision(self, db, rev):
-        prev = rev.previous_commit(db)
+    def _find_fake_revision(self, rev):
+        prev = rev.previous_commit()
         if not prev:
-            with db.cursor() as cur:
+            with self.db.cursor() as cur:
                 cur.execute(
                     "UPDATE linked_revs SET considered=TRUE where linked_id=%s",
                     (rev.dbid,),
                 )
            return
-        with db.cursor() as cur:
+        with self.db.cursor() as cur:
            cur.execute(
                """SELECT * FROM revisions WHERE id IN
                    (SELECT revision_id from linked_revs WHERE linked_id=%s)
                    AND commit_time <= %s ORDER BY commit_time""",
                (prev.dbid, rev.commit_time),
            )
            last_linked = None
            for linked in cur.fetchall():
-                linked = DBRevision(linked)
-                nextrev = linked.next_commit(db)
+                linked = DBRevision(self.db, linked)
+                nextrev = linked.next_commit()
                if nextrev and nextrev.commit_time < rev.commit_time:
                    continue
                last_linked = linked
@@ -107,7 +127,7 @@ class Importer:
        if not last_linked:
            return
-        with db.cursor() as cur:
+        with self.db.cursor() as cur:
            linked = last_linked
            cur.execute(
                "SELECT 1 FROM fake_revs where revision_id=%s AND linked_id=%s",
@@ -120,10 +140,10 @@ class Importer:
                )
                return
            fake_rev = linked.rev + rev.rev / 1000.0
-            comment = f"Updating link to change in {rev.project}/{rev.package} revision {rev.rev}"
+            comment = f"Updating link to change in {rev.project}/{rev.package} revision {int(rev.rev)}"
            cur.execute(
                """INSERT INTO revisions (project,package,rev,unexpanded_srcmd5,
-                    commit_time, userid, comment) VALUES(%s,%s,%s,%s,%s,%s,%s) RETURNING id""",
+                    commit_time, userid, comment, api_url) VALUES(%s,%s,%s,%s,%s,%s,%s,%s) RETURNING id""",
                (
                    linked.project,
                    linked.package,
@@ -132,6 +152,7 @@ class Importer:
                    rev.commit_time,
                    "buildservice-autocommit",
                    comment,
+                    linked.api_url,
                ),
            )
            new_id = cur.fetchone()[0]
@@ -144,70 +165,118 @@ class Importer:
                (rev.dbid, linked.dbid),
            )
 
-    def revisions_without_files(self, db):
-        with db.cursor() as cur:
+    def revisions_without_files(self, package):
+        logging.debug(f"revisions_without_files({package})")
+        with self.db.cursor() as cur:
            cur.execute(
-                "SELECT * FROM revisions WHERE broken=FALSE AND expanded_srcmd5 IS NULL"
+                "SELECT * FROM revisions WHERE package=%s AND broken=FALSE AND expanded_srcmd5 IS NULL",
(package,),
) )
return [DBRevision(row) for row in cur.fetchall()] return [DBRevision(self.db, row) for row in cur.fetchall()]
def fill_file_lists(self, db): def import_rev(self, rev):
self.find_linked_revs(db) with self.db.cursor() as cur:
cur.execute(
self.find_fake_revisions(db) """SELECT unexpanded_srcmd5 from revisions WHERE
for rev in self.revisions_without_files(db): id=(SELECT linked_id FROM linked_revs WHERE revision_id=%s)""",
with db.cursor() as cur: (rev.dbid,),
)
linked_rev = cur.fetchone()
if linked_rev:
linked_rev = linked_rev[0]
obs_dir_list = self.obs.list(
rev.project, rev.package, rev.unexpanded_srcmd5, linked_rev
)
if obs_dir_list:
rev.import_dir_list(obs_dir_list)
md5 = rev.calculate_files_hash()
with self.db.cursor() as cur:
cur.execute( cur.execute(
"""SELECT unexpanded_srcmd5 from revisions WHERE "UPDATE revisions SET files_hash=%s WHERE id=%s",
id=(SELECT linked_id FROM linked_revs WHERE revision_id=%s)""", (md5, rev.dbid),
(rev.dbid,),
) )
linked_rev = cur.fetchone() else:
if linked_rev: rev.set_broken()
linked_rev = linked_rev[0]
list = self.obs.list(
rev.project, rev.package, rev.unexpanded_srcmd5, linked_rev
)
if list:
rev.import_dir_list(db, list)
md5 = rev.calculate_files_hash(db)
with db.cursor() as cur:
cur.execute(
"UPDATE revisions SET files_hash=%s WHERE id=%s",
(md5, rev.dbid),
)
else:
rev.set_broken(db)
def refresh_package(self, db, project, package): def fill_file_lists(self):
self.find_linked_revs()
self.find_fake_revisions()
for package in self.packages:
for rev in self.revisions_without_files(package):
print(f"rev {rev} is without files")
self.import_rev(rev)
def refresh_package(self, project, package):
key = f"{project}/{package}" key = f"{project}/{package}"
if key in self.refreshed_packages: if key in self.refreshed_packages:
# refreshing once is good enough # refreshing once is good enough
return return
if self.package_gone(key):
return
logging.debug(f"Refresh {project}/{package}")
self.refreshed_packages.add(key) self.refreshed_packages.add(key)
self.update_db_package(db, project, package) if self.has_scmsync(project) or self.has_scmsync(key):
self.fetch_all_linked_packages(db, project, package) self.packages_with_scmsync.add(package)
logging.debug(f"{project}/{package} already in Git - skipping")
return
self.update_db_package(project, package)
self.fetch_all_linked_packages(project, package)
def import_into_db(self): def import_into_db(self):
db = DB() for package in self.packages:
refresh_package(self, self.project, package)
self.refresh_package(db, self.project, self.package) self.db.conn.commit()
for number in DBRevision.requests_to_fetch(db):
self.obs.request(number).import_into_db(db) for number in DBRevision.requests_to_fetch(self.db):
with db.cursor() as cur: self.import_request(number)
self.db.conn.commit()
with self.db.cursor() as cur:
cur.execute( cur.execute(
"""SELECT DISTINCT source_project,source_package FROM requests """SELECT DISTINCT source_project,source_package FROM requests
WHERE id IN (SELECT request_id FROM revisions WHERE project=%s and package=%s);""", WHERE id IN (SELECT request_id FROM revisions WHERE project=%s and package = ANY(%s));""",
(self.project, self.package), (self.project, self.packages),
) )
for project, package in cur.fetchall(): for project, package in cur.fetchall():
self.refresh_package(db, project, package) self.refresh_package(project, package)
missing_users = User.missing_users(db) self.db.conn.commit()
missing_users = User.missing_users(self.db)
for userid in missing_users: for userid in missing_users:
missing_user = self.obs.user(userid) missing_user = self.obs.user(userid)
if missing_user: if missing_user:
missing_user.import_into_db(db) missing_user.import_into_db(self.db)
self.db.conn.commit()
self.fill_file_lists()
self.db.conn.commit()
def package_gone(self, key):
if not self.gone_packages_set:
self.gone_packages_set = set()
with open(pathlib.Path(__file__).parent.parent / "gone-packages.txt") as f:
for line in f.readlines():
self.gone_packages_set.add(line.strip())
return key in self.gone_packages_set
def has_scmsync(self, key):
if key in self.scmsync_cache:
return self.scmsync_cache[key]
root = self.obs._meta(key)
scmsync = None
scmsync_exists = False
if root and root.find('scmsync') is not None:
scmsync = root.find('scmsync').text
if scmsync:
scmsync_exists = scmsync.startswith('https://src.opensuse.org/pool/')
self.scmsync_cache[key] = scmsync_exists
return scmsync_exists
def package_with_scmsync(self, package):
return package in self.packages_with_scmsync
self.fill_file_lists(db)
db.conn.commit()
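The reworked Importer is constructed with a package list and a single OBS/DB pair; a minimal usage sketch of the flow above (API URL and package names are illustrative, and an osc configuration plus database access are assumed):

# Usage sketch with illustrative values only.
importer = Importer("https://api.opensuse.org", "openSUSE:Factory", ["xz", "FastCGI"])
importer.import_into_db()
for package in importer.packages:
    if importer.package_with_scmsync(package):
        # has_scmsync() found an <scmsync> URL under https://src.opensuse.org/pool/
        # in the package or project _meta, so refresh_package() skipped it above
        continue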

lib/lfs_oid.py Normal file (194 lines)

@ -0,0 +1,194 @@
from __future__ import annotations
import logging
import os
import sys
import requests
from lib.binary import is_text_mimetype
from lib.db import DB
# no need for this class yet, so just leave the migration code here
class LFSOid:
def __init__(self, db: DB) -> None:
self.db = db
self.dbid = None
self.project = None
self.package = None
self.filename = None
self.revision = None
self.sha256 = None
self.size = None
self.mimetype = None
self.file_md5 = None
@staticmethod
def check_all(db, package):
with db.cursor() as cur:
cur.execute(
"SELECT lfs_oid_id FROM lfs_oid_in_package WHERE package=%s ORDER BY lfs_oid_id DESC limit 10 ",
(package,),
)
for row in cur.fetchall():
oid = LFSOid(db).set_from_dbid(row[0])
if not oid.check():
oid.register()
def add(
self,
project: str,
package: str,
filename: str,
revision: str,
sha256: str,
size: int,
mimetype: str,
file_md5: str,
) -> None:
with self.db.cursor() as cur:
# we UPDATE on conflict so that RETURNING id still yields a row; conflicts are likely
# because we look up by filename/md5 while the unique constraint is on sha256/size
cur.execute(
"""INSERT INTO lfs_oids (project,package,filename,rev,sha256,size,mimetype,file_md5)
VALUES (%s,%s,%s,%s,%s,%s,%s,%s)
ON CONFLICT (sha256,size) DO UPDATE SET mimetype=EXCLUDED.mimetype
RETURNING id""",
(
project,
package,
filename,
revision,
sha256,
size,
mimetype,
file_md5,
),
)
row = cur.fetchone()
lfs_oid_id = row[0]
cur.execute(
"""INSERT INTO lfs_oid_in_package (package,filename,lfs_oid_id)
VALUES (%s,%s,%s)""",
(package, filename, lfs_oid_id),
)
if is_text_mimetype(mimetype):
cur.execute(
"INSERT INTO text_files (package,filename) VALUES (%s,%s)",
(package, filename),
)
self.db.conn.commit()
self.set_from_dbid(lfs_oid_id)
if not self.check():
self.register()
def check(self):
url = f"http://localhost:9999/check/{self.sha256}/{self.size}"
response = requests.get(
url,
timeout=10,
)
return response.status_code == 200
def set_from_dbid(self, dbid: int) -> LFSOid:
with self.db.cursor() as cur:
cur.execute("SELECT * from lfs_oids where id=%s", (dbid,))
row = cur.fetchone()
self.set_from_row(row)
assert self.dbid == dbid
return self
def set_from_row(self, row: list) -> LFSOid:
(
self.dbid,
self.project,
self.package,
self.filename,
self.revision,
self.sha256,
self.size,
self.mimetype,
self.file_md5,
) = row
return self
def register(self):
if not os.getenv("GITEA_REGISTER_SECRET"):
logging.info("Not registering LFS due to missing secret")
return
data = {
"secret": os.getenv("GITEA_REGISTER_SECRET"),
"project": self.project,
"package": self.package,
"filename": self.filename,
"rev": self.revision,
"sha256": self.sha256,
"size": self.size,
}
url = "http://localhost:9999/register"
response = requests.post(
url,
json=data,
timeout=10,
)
response.raise_for_status()
logging.info(f"Register LFS returned {response.status_code}")
if __name__ == "__main__":
"""
Import the old data - it only makes sense on a DB with previously scanned revisions
curl -s https://stephan.kulow.org/git_lfs.csv.xz | xz -cd | PYTHONPATH=$PWD /usr/bin/python3 lib/lfs_oid.py
"""
db = DB()
logging.basicConfig(level=logging.DEBUG)
with db.cursor() as cur:
while True:
line = sys.stdin.readline()
if not line:
break
(
project,
package,
filename,
rev,
sha256,
size,
mimetype,
md5,
) = line.strip().split("\t")
cur.execute(
"""INSERT INTO lfs_oids (project,package,filename,rev,sha256,size,mimetype,file_md5)
VALUES (%s,%s,%s,%s,%s,%s,%s,%s) ON CONFLICT DO NOTHING""",
(project, package, filename, rev, sha256, size, mimetype, md5),
)
cur.execute(
"""
CREATE TEMPORARY TABLE lfs_oid_in_revision (
revision_id INTEGER,
lfs_oid_id INTEGER NOT NULL,
name VARCHAR(255) NOT NULL
)
"""
)
cur.execute(
"""INSERT INTO lfs_oid_in_revision (revision_id, lfs_oid_id, name)
SELECT revision_id,lfs_oids.id,files.name FROM lfs_oids JOIN files ON files.md5=lfs_oids.file_md5"""
)
cur.execute(
"""INSERT INTO text_files (package,filename)
SELECT DISTINCT r.package, lfs_oid_in_revision.name FROM lfs_oids
JOIN lfs_oid_in_revision on lfs_oid_in_revision.lfs_oid_id=lfs_oids.id
JOIN revisions r ON r.id=lfs_oid_in_revision.revision_id
WHERE lfs_oids.mimetype like 'text/%' ON CONFLICT DO NOTHING"""
)
cur.execute(
"""INSERT INTO lfs_oid_in_package (lfs_oid_id, package, filename)
SELECT DISTINCT lfs_oids.id,r.package, lfs_oid_in_revision.name FROM lfs_oids
JOIN lfs_oid_in_revision on lfs_oid_in_revision.lfs_oid_id=lfs_oids.id
JOIN revisions r ON r.id=lfs_oid_in_revision.revision_id"""
)
db.conn.commit()
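For orientation, a short sketch of driving LFSOid.add() for one downloaded file; every value below is a placeholder. add() records the object in lfs_oids and lfs_oid_in_package, notes text mimetypes in text_files, then calls check()/register() against the localhost:9999 service (register() is a no-op without GITEA_REGISTER_SECRET):

# Sketch with placeholder values only.
db = DB()
LFSOid(db).add(
    project="openSUSE:Factory",
    package="xz",
    filename="xz-5.4.1.tar.xz",
    revision="42",
    sha256="0" * 64,      # placeholder digest
    size=1234567,         # placeholder size in bytes
    mimetype="application/x-xz",
    file_md5="0" * 32,    # placeholder md5
)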


@ -1,5 +1,7 @@
import errno import errno
import logging import logging
import os
import shutil
import time import time
import urllib.parse import urllib.parse
import xml.etree.ElementTree as ET import xml.etree.ElementTree as ET
@ -7,6 +9,7 @@ from urllib.error import HTTPError
import osc.core import osc.core
from lib.hash import md5
from lib.request import Request from lib.request import Request
from lib.user import User from lib.user import User
@ -56,24 +59,25 @@ osc.core.http_GET = retry(osc.core.http_GET)
class OBS: class OBS:
def __init__(self, url=None): def __init__(self, url):
if url: self.url = None
self.change_url(url) self.change_url(url)
def change_url(self, url): def change_url(self, url):
self.url = url if url != self.url:
osc.conf.get_config(override_apiurl=url) self.url = url
osc.conf.get_config(override_apiurl=url)
def _xml(self, url_path, **params): def _xml(self, url_path, **params):
url = osc.core.makeurl(self.url, [url_path], params) url = osc.core.makeurl(self.url, [url_path], params)
logging.debug(f"GET {url}") logging.debug(f"GET {url}")
return ET.parse(osc.core.http_GET(url)).getroot() return ET.parse(osc.core.http_GET(url)).getroot()
def _meta(self, project, package, **params): def _meta(self, key, **params):
try: try:
root = self._xml(f"source/{project}/{package}/_meta", **params) root = self._xml(f"source/{key}/_meta", **params)
except HTTPError: except HTTPError:
logging.error(f"Package [{project}/{package} {params}] has no meta") logging.error(f"Project/Package [{key} {params}] has no meta")
return None return None
return root return root
@ -114,13 +118,13 @@ class OBS:
return root return root
def exists(self, project, package): def exists(self, project, package):
root = self._meta(project, package) root = self._meta(f"{project}/{package}")
if root is None: if root is None:
return False return False
return root.get("project") == project return root.get("project") == project
def devel_project(self, project, package): def devel_project(self, project, package):
root = self._meta(project, package) root = self._meta(f"{project}/{package}")
devel = root.find("devel") devel = root.find("devel")
if devel is None: if devel is None:
return None return None
@ -146,7 +150,7 @@ class OBS:
def _download(self, project, package, name, revision): def _download(self, project, package, name, revision):
url = osc.core.makeurl( url = osc.core.makeurl(
self.url, self.url,
["source", project, package, urllib.parse.quote(name)], ["source", project, package, name],
{"rev": revision, "expand": 1}, {"rev": revision, "expand": 1},
) )
return osc.core.http_GET(url) return osc.core.http_GET(url)
@ -158,10 +162,24 @@ class OBS:
name: str, name: str,
revision: str, revision: str,
dirpath: str, dirpath: str,
cachedir: str,
file_md5: str, file_md5: str,
) -> None: ) -> None:
with (dirpath / name).open("wb") as f: cached_file = self._path_from_md5(name, cachedir, file_md5)
f.write(self._download(project, package, name, revision).read()) if not self.in_cache(name, cachedir, file_md5):
with (dirpath / name).open("wb") as f:
logging.debug(f"Download {project}/{package}/{name}")
f.write(self._download(project, package, name, revision).read())
# Validate the MD5 of the downloaded file
if md5(dirpath / name) != file_md5:
raise Exception(f"Download error in {name}")
shutil.copy(dirpath / name, cached_file.with_suffix(".new"))
os.rename(cached_file.with_suffix(".new"), cached_file)
else:
shutil.copy(cached_file, dirpath / name)
logging.debug(f"Use cached {project}/{package}/{name}")
def list(self, project, package, srcmd5, linkrev): def list(self, project, package, srcmd5, linkrev):
params = {"rev": srcmd5, "expand": "1"} params = {"rev": srcmd5, "expand": "1"}
@ -179,3 +197,11 @@ class OBS:
raise e raise e
return root return root
def _path_from_md5(self, name, cachedir, md5):
filepath = cachedir / md5[:3]
filepath.mkdir(parents=True, exist_ok=True)
return filepath / md5[3:]
def in_cache(self, name, cachedir, md5):
return self._path_from_md5(name, cachedir, md5).exists()
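The new download cache keys files purely by their MD5: the first three hex characters become a bucket directory and the rest the file name. A small illustration of _path_from_md5() above (the cache directory itself is an arbitrary example):

# Illustration only; the cache location is whatever the caller passes as cachedir.
from pathlib import Path

cachedir = Path("/var/cache/git-importer")
file_md5 = "d41d8cd98f00b204e9800998ecf8427e"
cached = cachedir / file_md5[:3] / file_md5[3:]
# -> /var/cache/git-importer/d41/d8cd98f00b204e9800998ecf8427e
# The download helper above writes a fresh copy next to it as "<...>.new"
# and renames it into place, so a partially written cache entry is never used.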


@ -1,106 +1,89 @@
import functools
import hashlib import hashlib
import logging import logging
import urllib
import requests try:
import magic
except:
print("Install python3-python-magic, not python3-magic")
raise
from lib.db import DB
def _hash(hash_alg, file_or_path): from lib.lfs_oid import LFSOid
h = hash_alg() from lib.obs import OBS
def __hash(f):
while chunk := f.read(1024 * 4):
h.update(chunk)
if hasattr(file_or_path, "read"):
__hash(file_or_path)
else:
with file_or_path.open("rb") as f:
__hash(f)
return h.hexdigest()
md5 = functools.partial(_hash, hashlib.md5)
sha256 = functools.partial(_hash, hashlib.sha256)
class ProxySHA256: class ProxySHA256:
def __init__(self, obs, url=None, enabled=True): def __init__(self, obs: OBS, db: DB):
self.obs = obs self.obs = obs
self.url = url if url else "http://source.dyn.cloud.suse.de" self.db = db
self.enabled = enabled
self.hashes = None self.hashes = None
self.texts = set() self.texts = None
self.mime = None
def load_package(self, package):
# _project is unreachable for the proxy - due to being a fake package
if package == "_project":
self.enabled = False
self.texts = set(["_config", "_service"])
self.hashes = dict()
return
logging.info("Retrieve all previously defined SHA256")
response = requests.get(f"http://source.dyn.cloud.suse.de/package/{package}")
if response.status_code == 200:
json = response.json()
self.hashes = json["shas"]
self.texts = set(json["texts"])
def get(self, package, name, file_md5): def get(self, package, name, file_md5):
key = f"{file_md5}-{name}"
if self.hashes is None: if self.hashes is None:
if self.enabled: self.load_hashes(package)
self.load_package(package) key = f"{file_md5}-{name}"
else: ret = self.hashes.get(key)
self.hashes = {} return ret
return self.hashes.get(key, None)
def _proxy_put(self, project, package, name, revision, file_md5, size): def load_hashes(self, package):
quoted_name = urllib.parse.quote(name) with self.db.cursor() as cur:
url = f"{self.obs.url}/public/source/{project}/{package}/{quoted_name}?rev={revision}" cur.execute(
response = requests.put( """SELECT lfs_oids.file_md5,lop.filename,lfs_oids.sha256,lfs_oids.size
self.url, FROM lfs_oid_in_package lop
data={ JOIN lfs_oids ON lfs_oids.id=lop.lfs_oid_id
"hash": file_md5, WHERE lop.package=%s""",
"filename": name, (package,),
"url": url, )
"package": package, self.hashes = {
}, f"{row[0]}-{row[1]}": (row[2], row[3]) for row in cur.fetchall()
) }
if response.status_code != 200:
raise Exception(f"Redirector error on {self.url} for {url}")
key = (file_md5, name)
self.hashes[key] = {
"sha256": response.content.decode("utf-8"),
"fsize": size,
}
return self.hashes[key]
def _obs_put(self, project, package, name, revision, file_md5, size):
key = (file_md5, name)
self.hashes[key] = {
"sha256": sha256(self.obs._download(project, package, name, revision)),
"fsize": size,
}
return self.hashes[key]
def put(self, project, package, name, revision, file_md5, size): def put(self, project, package, name, revision, file_md5, size):
if not self.enabled: if not self.mime:
return self._obs_put(project, package, name, revision, file_md5, size) self.mime = magic.Magic(mime=True)
return self._proxy_put(project, package, name, revision, file_md5, size)
def is_text(self, filename): mimetype = None
logging.debug(f"Add LFS for {project}/{package}/{name}")
fin = self.obs._download(project, package, name, revision)
sha = hashlib.sha256()
while True:
buffer = fin.read(10000)
if not buffer:
break
sha.update(buffer)
# only guess from the first 10K
if not mimetype:
mimetype = self.mime.from_buffer(buffer)
fin.close()
LFSOid(self.db).add(
project, package, name, revision, sha.hexdigest(), size, mimetype, file_md5
)
# reset
self.hashes = None
self.texts = None
return self.get(package, name, file_md5)
def is_text(self, package, filename):
if self.texts is None:
self.load_texts(package)
return filename in self.texts return filename in self.texts
def load_texts(self, package):
self.texts = set()
with self.db.cursor() as cur:
cur.execute("SELECT filename from text_files where package=%s", (package,))
for row in cur.fetchall():
self.texts.add(row[0])
def get_or_put(self, project, package, name, revision, file_md5, size): def get_or_put(self, project, package, name, revision, file_md5, size):
result = self.get(package, name, file_md5) result = self.get(package, name, file_md5)
if not result: if not result:
result = self.put(project, package, name, revision, file_md5, size) result = self.put(project, package, name, revision, file_md5, size)
# Sanity check sha256, db_size = result
if result["fsize"] != size: assert db_size == size
raise Exception(f"Redirector has different size for {name}")
return result return sha256
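ProxySHA256 now answers from the local database instead of the external redirector: get_or_put() returns only the sha256 string and asserts that the recorded size matches. A usage sketch with illustrative identifiers and placeholder file metadata:

# Sketch only; file_md5 and size normally come from the expanded OBS file list.
proxy = ProxySHA256(OBS("https://api.opensuse.org"), DB())
file_md5 = "0" * 32   # placeholder
size = 1234567        # placeholder
sha256 = proxy.get_or_put(
    "openSUSE:Factory", "xz", "xz-5.4.1.tar.xz",  # project, package, filename
    "42",                                          # OBS revision
    file_md5, size,
)
if proxy.is_text("xz", "xz-5.4.1.tar.xz"):
    pass  # the file was also recorded in text_files by LFSOid.add()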


@ -16,11 +16,11 @@ class TestExporter:
db = DB() db = DB()
with db.cursor() as cur: with db.cursor() as cur:
cur.execute( cur.execute(
"SELECT * from revisions where package=%s ORDER BY project,rev", "SELECT * from revisions where package=%s ORDER BY commit_time",
(self.package,), (self.package,),
) )
data = {"revisions": []} data = {"revisions": []}
for row in cur.fetchall(): for row in cur.fetchall():
data["revisions"].append(DBRevision(row).as_dict(db)) data["revisions"].append(DBRevision(db, row).as_dict())
yaml.dump(data, sys.stdout, default_flow_style=False) yaml.dump(data, sys.stdout, default_flow_style=False)


@ -1,4 +1,3 @@
from typing import Dict
from xmlrpc.client import Boolean from xmlrpc.client import Boolean
from lib.db_revision import DBRevision from lib.db_revision import DBRevision
@ -104,14 +103,24 @@ class TreeBuilder:
"""For a given revision in the target, find the node in the source chain """For a given revision in the target, find the node in the source chain
that matches the files""" that matches the files"""
node = source_chain node = source_chain
candidates = []
while node: while node:
# exclude reverts happening after the merge # exclude reverts happening after the merge
if ( if (
node.revision.commit_time <= revision.commit_time node.revision.commit_time <= revision.commit_time
and node.revision.files_hash == revision.files_hash and node.revision.files_hash == revision.files_hash
): ):
return node candidates.append(node)
if node.merged_into:
# we can't have candidates that are crossing previous merges
# see https://src.opensuse.org/importers/git-importer/issues/14
candidates = []
node = node.parent node = node.parent
if candidates:
# the first candidate is the youngest one that matches the check. That's
# good enough. See the FastCGI test case for revs 36 and 38: 37 reverted 36 and
# then 38 reverted the revert before it was submitted.
return candidates[0]
def add_merge_points(self, factory_revisions): def add_merge_points(self, factory_revisions):
"""For all target revisions that accepted a request, look up the merge """For all target revisions that accepted a request, look up the merge
@ -128,7 +137,7 @@ class TreeBuilder:
self.requests.add(node.revision.request_id) self.requests.add(node.revision.request_id)
class FindMergeWalker(AbstractWalker): class FindMergeWalker(AbstractWalker):
def __init__(self, builder: TreeBuilder, requests: Dict) -> None: def __init__(self, builder: TreeBuilder, requests: dict) -> None:
super().__init__() super().__init__()
self.source_revisions = dict() self.source_revisions = dict()
self.builder = builder self.builder = builder


@ -1,3 +1,7 @@
from __future__ import annotations
from lib.db import DB
FAKE_ACCOUNTS = ( FAKE_ACCOUNTS = (
"unknown", "unknown",
"buildservice-autocommit", "buildservice-autocommit",
@ -15,6 +19,22 @@ FAKE_ACCOUNTS = (
class User: class User:
@staticmethod
def find(db: DB, userid: str) -> User:
row = User.lookup(db, userid)
self = User()
self.userid = userid
if row:
(_, _, self.email, self.realname) = row
else:
self.email = ""
self.realname = ""
if not self.email:
self.email = "null@suse.de"
if not self.realname:
self.realname = f"OBS User {userid}"
return self
def parse(self, xml, userid): def parse(self, xml, userid):
self.userid = userid self.userid = userid
self.realname = xml.find("realname").text self.realname = xml.find("realname").text

opensuse-monitor.py Executable file (61 lines)

@ -0,0 +1,61 @@
#!/usr/bin/python3
import json
from pathlib import Path
import pika
import random
import time
MY_TASKS_DIR = Path(__file__).parent / "tasks"
def listen_events():
connection = pika.BlockingConnection(
pika.URLParameters("amqps://opensuse:opensuse@rabbit.opensuse.org")
)
channel = connection.channel()
channel.exchange_declare(
exchange="pubsub", exchange_type="topic", passive=True, durable=False
)
result = channel.queue_declare("", exclusive=True)
queue_name = result.method.queue
channel.queue_bind(
exchange="pubsub", queue=queue_name, routing_key="opensuse.obs.package.commit"
)
print(" [*] Waiting for logs. To exit press CTRL+C")
def callback(ch, method, properties, body):
if method.routing_key not in ("opensuse.obs.package.commit",):
return
body = json.loads(body)
if (
"project" in body
and "package" in body
and body["project"] == "openSUSE:Factory"
):
# Strip multibuild flavors
package = body["package"].partition(':')[0]
if "/" in package:
return
(MY_TASKS_DIR / package).touch()
print(" [x] %r:%r" % (method.routing_key, body["package"]))
channel.basic_consume(queue_name, callback, auto_ack=True)
channel.start_consuming()
def main():
while True:
try:
listen_events()
except (pika.exceptions.ConnectionClosed, pika.exceptions.AMQPHeartbeatTimeout):
time.sleep(random.randint(10, 100))
if __name__ == "__main__":
main()
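The monitor only records work by touching one file per package under tasks/; draining that directory is left to whatever runs the importer and is not part of this change. A purely hypothetical consumer sketch:

# Hypothetical consumer: collect and clear the package names the monitor touched.
from pathlib import Path

MY_TASKS_DIR = Path(__file__).parent / "tasks"

def drain_tasks() -> set[str]:
    packages = set()
    for task in MY_TASKS_DIR.iterdir():
        if task.name == ".gitignore":  # tasks/.gitignore keeps the directory in git
            continue
        packages.add(task.name)
        task.unlink(missing_ok=True)
    return packages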

packages Normal file (14502 lines); file diff suppressed because it is too large

tasks/.gitignore vendored Normal file (1 line)

@ -0,0 +1 @@
*


@ -6,11 +6,14 @@ from lib.db_revision import DBRevision
from lib.obs import OBS from lib.obs import OBS
from lib.obs_revision import OBSRevision from lib.obs_revision import OBSRevision
# needs to exist in the local oscrc (a little tricky)
API_URL = "https://api.opensuse.org"
class TestDBMethods(unittest.TestCase): class TestDBMethods(unittest.TestCase):
def setUp(self): def setUp(self):
self.db = DB(section="test") self.db = DB(section="test")
self.obs = OBS() self.obs = OBS(API_URL)
def test_import(self): def test_import(self):
test_rev = OBSRevision(self.obs, "openSUSE:Factory", "xz") test_rev = OBSRevision(self.obs, "openSUSE:Factory", "xz")
@ -30,6 +33,7 @@ class TestDBMethods(unittest.TestCase):
db_rev = DBRevision.fetch_revision( db_rev = DBRevision.fetch_revision(
self.db, project="openSUSE:Factory", package="xz", rev="70" self.db, project="openSUSE:Factory", package="xz", rev="70"
) )
self.assertEqual(db_rev.api_url, API_URL)
self.assertEqual(str(test_rev), str(db_rev)) self.assertEqual(str(test_rev), str(db_rev))

tests/fixtures/FastCGI-data.yaml vendored Normal file (4528 lines); file diff suppressed because it is too large


@ -0,0 +1,33 @@
- factory c:openSUSE:Factory/FastCGI/29.0 p1:openSUSE:Factory/FastCGI/28.0 p2:devel:libraries:c_c++/FastCGI/40.0
- devel c:devel:libraries:c_c++/FastCGI/40.0 p1:devel:libraries:c_c++/FastCGI/38.0
- factory c:openSUSE:Factory/FastCGI/28.0 p1:openSUSE:Factory/FastCGI/27.0 p2:devel:libraries:c_c++/FastCGI/38.0
- devel c:devel:libraries:c_c++/FastCGI/38.0 p1:devel:libraries:c_c++/FastCGI/37.0
- devel c:devel:libraries:c_c++/FastCGI/37.0 p1:devel:libraries:c_c++/FastCGI/36.0
- devel c:devel:libraries:c_c++/FastCGI/36.0 p1:devel:libraries:c_c++/FastCGI/34.0
- factory c:openSUSE:Factory/FastCGI/27.0 p1:openSUSE:Factory/FastCGI/26.0 p2:devel:libraries:c_c++/FastCGI/34.0
- devel c:devel:libraries:c_c++/FastCGI/34.0 p1:devel:libraries:c_c++/FastCGI/32.0
- factory c:openSUSE:Factory/FastCGI/26.0 p1:openSUSE:Factory/FastCGI/23.0 p2:devel:libraries:c_c++/FastCGI/32.0
- devel c:devel:libraries:c_c++/FastCGI/32.0 p1:devel:libraries:c_c++/FastCGI/30.0
- factory c:openSUSE:Factory/FastCGI/23.0 p1:openSUSE:Factory/FastCGI/20.0 p2:devel:libraries:c_c++/FastCGI/30.0
- devel c:devel:libraries:c_c++/FastCGI/30.0 p1:devel:libraries:c_c++/FastCGI/28.0
- factory c:openSUSE:Factory/FastCGI/20.0 p1:openSUSE:Factory/FastCGI/19.0 p2:devel:libraries:c_c++/FastCGI/28.0
- devel c:devel:libraries:c_c++/FastCGI/28.0 p1:devel:libraries:c_c++/FastCGI/26.0
- factory c:openSUSE:Factory/FastCGI/19.0 p1:openSUSE:Factory/FastCGI/18.0 p2:devel:libraries:c_c++/FastCGI/26.0
- devel c:devel:libraries:c_c++/FastCGI/26.0 p1:devel:libraries:c_c++/FastCGI/24.0
- factory c:openSUSE:Factory/FastCGI/18.0 p1:openSUSE:Factory/FastCGI/16.0 p2:devel:libraries:c_c++/FastCGI/24.0
- devel c:devel:libraries:c_c++/FastCGI/24.0 p1:devel:libraries:c_c++/FastCGI/22.0
- factory c:openSUSE:Factory/FastCGI/16.0 p1:openSUSE:Factory/FastCGI/15.0 p2:devel:libraries:c_c++/FastCGI/22.0
- devel c:devel:libraries:c_c++/FastCGI/22.0 p1:devel:libraries:c_c++/FastCGI/20.0
- factory c:openSUSE:Factory/FastCGI/15.0 p1:openSUSE:Factory/FastCGI/14.0 p2:devel:libraries:c_c++/FastCGI/20.0
- devel c:devel:libraries:c_c++/FastCGI/20.0 p1:devel:libraries:c_c++/FastCGI/19.014
- devel c:devel:libraries:c_c++/FastCGI/19.014 p1:devel:libraries:c_c++/FastCGI/18.0
- factory c:openSUSE:Factory/FastCGI/14.0 p1:openSUSE:Factory/FastCGI/13.0
- factory c:openSUSE:Factory/FastCGI/13.0 p1:openSUSE:Factory/FastCGI/11.0 p2:devel:libraries:c_c++/FastCGI/18.0
- devel c:devel:libraries:c_c++/FastCGI/18.0 p1:openSUSE:Factory/FastCGI/11.0
- factory c:openSUSE:Factory/FastCGI/11.0 p1:openSUSE:Factory/FastCGI/10.0
- factory c:openSUSE:Factory/FastCGI/10.0 p1:openSUSE:Factory/FastCGI/7.0
- factory c:openSUSE:Factory/FastCGI/7.0 p1:openSUSE:Factory/FastCGI/6.0
- factory c:openSUSE:Factory/FastCGI/6.0 p1:openSUSE:Factory/FastCGI/4.0
- factory c:openSUSE:Factory/FastCGI/4.0 p1:openSUSE:Factory/FastCGI/3.0
- factory c:openSUSE:Factory/FastCGI/3.0 p1:openSUSE:Factory/FastCGI/1.0
- factory c:openSUSE:Factory/FastCGI/1.0


@ -0,0 +1,44 @@
- commit: openSUSE:Factory/FastCGI/29.0
merged:
- devel:libraries:c_c++/FastCGI/40.0
- commit: openSUSE:Factory/FastCGI/28.0
merged:
- devel:libraries:c_c++/FastCGI/38.0
- devel:libraries:c_c++/FastCGI/37.0
- devel:libraries:c_c++/FastCGI/36.0
- commit: openSUSE:Factory/FastCGI/27.0
merged:
- devel:libraries:c_c++/FastCGI/34.0
- commit: openSUSE:Factory/FastCGI/26.0
merged:
- devel:libraries:c_c++/FastCGI/32.0
- commit: openSUSE:Factory/FastCGI/23.0
merged:
- devel:libraries:c_c++/FastCGI/30.0
- commit: openSUSE:Factory/FastCGI/20.0
merged:
- devel:libraries:c_c++/FastCGI/28.0
- commit: openSUSE:Factory/FastCGI/19.0
merged:
- devel:libraries:c_c++/FastCGI/26.0
- commit: openSUSE:Factory/FastCGI/18.0
merged:
- devel:libraries:c_c++/FastCGI/24.0
- commit: openSUSE:Factory/FastCGI/16.0
merged:
- devel:libraries:c_c++/FastCGI/22.0
- commit: openSUSE:Factory/FastCGI/15.0
merged:
- devel:libraries:c_c++/FastCGI/20.0
- devel:libraries:c_c++/FastCGI/19.014
- commit: openSUSE:Factory/FastCGI/14.0
- commit: openSUSE:Factory/FastCGI/13.0
merged:
- devel:libraries:c_c++/FastCGI/18.0
- commit: openSUSE:Factory/FastCGI/11.0
- commit: openSUSE:Factory/FastCGI/10.0
- commit: openSUSE:Factory/FastCGI/7.0
- commit: openSUSE:Factory/FastCGI/6.0
- commit: openSUSE:Factory/FastCGI/4.0
- commit: openSUSE:Factory/FastCGI/3.0
- commit: openSUSE:Factory/FastCGI/1.0

tests/fixtures/breeze-data.yaml vendored Normal file (9756 lines); file diff suppressed because it is too large

tests/fixtures/breeze-expected-list.yaml vendored Normal file (171 lines)

@ -0,0 +1,171 @@
- factory c:openSUSE:Factory/breeze/43.0 p1:openSUSE:Factory/breeze/42.0 p2:KDE:Frameworks5/breeze/150.0
- devel c:KDE:Frameworks5/breeze/150.0 p1:KDE:Frameworks5/breeze/148.0
- factory c:openSUSE:Factory/breeze/42.0 p1:openSUSE:Factory/breeze/41.0 p2:KDE:Frameworks5/breeze/148.0
- devel c:KDE:Frameworks5/breeze/148.0 p1:KDE:Frameworks5/breeze/147.0
- devel c:KDE:Frameworks5/breeze/147.0 p1:KDE:Frameworks5/breeze/145.0
- factory c:openSUSE:Factory/breeze/41.0 p1:openSUSE:Factory/breeze/40.0 p2:KDE:Frameworks5/breeze/145.0
- devel c:KDE:Frameworks5/breeze/145.0 p1:KDE:Frameworks5/breeze/143.0
- factory c:openSUSE:Factory/breeze/40.0 p1:openSUSE:Factory/breeze/39.0 p2:KDE:Frameworks5/breeze/143.0
- devel c:KDE:Frameworks5/breeze/143.0 p1:KDE:Frameworks5/breeze/142.0
- devel c:KDE:Frameworks5/breeze/142.0 p1:KDE:Frameworks5/breeze/141.0
- devel c:KDE:Frameworks5/breeze/141.0 p1:KDE:Frameworks5/breeze/139.0
- factory c:openSUSE:Factory/breeze/39.0 p1:openSUSE:Factory/breeze/38.0 p2:KDE:Frameworks5/breeze/139.0
- devel c:KDE:Frameworks5/breeze/139.0 p1:KDE:Frameworks5/breeze/137.0
- factory c:openSUSE:Factory/breeze/38.0 p1:openSUSE:Factory/breeze/37.0 p2:KDE:Frameworks5/breeze/137.0
- devel c:KDE:Frameworks5/breeze/137.0 p1:KDE:Frameworks5/breeze/136.0
- devel c:KDE:Frameworks5/breeze/136.0 p1:KDE:Frameworks5/breeze/135.0
- devel c:KDE:Frameworks5/breeze/135.0 p1:KDE:Frameworks5/breeze/134.0
- devel c:KDE:Frameworks5/breeze/134.0 p1:KDE:Frameworks5/breeze/132.0
- factory c:openSUSE:Factory/breeze/37.0 p1:openSUSE:Factory/breeze/36.0 p2:KDE:Frameworks5/breeze/132.0
- devel c:KDE:Frameworks5/breeze/132.0 p1:KDE:Frameworks5/breeze/130.0
- factory c:openSUSE:Factory/breeze/36.0 p1:openSUSE:Factory/breeze/35.0 p2:KDE:Frameworks5/breeze/130.0
- devel c:KDE:Frameworks5/breeze/130.0 p1:KDE:Frameworks5/breeze/128.0
- factory c:openSUSE:Factory/breeze/35.0 p1:openSUSE:Factory/breeze/34.0 p2:KDE:Frameworks5/breeze/128.0
- devel c:KDE:Frameworks5/breeze/128.0 p1:KDE:Frameworks5/breeze/127.0
- devel c:KDE:Frameworks5/breeze/127.0 p1:KDE:Frameworks5/breeze/126.034
- devel c:KDE:Frameworks5/breeze/126.034 p1:KDE:Frameworks5/breeze/126.0
- devel c:KDE:Frameworks5/breeze/126.0 p1:KDE:Frameworks5/breeze/125.0
- devel c:KDE:Frameworks5/breeze/125.0 p1:KDE:Frameworks5/breeze/124.0
- devel c:KDE:Frameworks5/breeze/124.0 p1:KDE:Frameworks5/breeze/123.0
- devel c:KDE:Frameworks5/breeze/123.0 p1:KDE:Frameworks5/breeze/122.0
- devel c:KDE:Frameworks5/breeze/122.0 p1:KDE:Frameworks5/breeze/120.0
- factory c:openSUSE:Factory/breeze/34.0 p1:openSUSE:Factory/breeze/33.0 p2:KDE:Frameworks5:LTS/breeze/14.0
- devel c:KDE:Frameworks5:LTS/breeze/14.0 p1:KDE:Frameworks5:LTS/breeze/13.0
- devel c:KDE:Frameworks5:LTS/breeze/13.0 p1:KDE:Frameworks5:LTS/breeze/12.0
- devel c:KDE:Frameworks5:LTS/breeze/12.0 p1:KDE:Frameworks5:LTS/breeze/11.0
- factory c:openSUSE:Factory/breeze/33.0 p1:openSUSE:Factory/breeze/32.0 p2:KDE:Frameworks5:LTS/breeze/11.0
- devel c:KDE:Frameworks5:LTS/breeze/11.0 p1:KDE:Frameworks5:LTS/breeze/10.0
- devel c:KDE:Frameworks5:LTS/breeze/10.0 p1:KDE:Frameworks5:LTS/breeze/9.0
- devel c:KDE:Frameworks5:LTS/breeze/9.0 p1:KDE:Frameworks5:LTS/breeze/8.0
- devel c:KDE:Frameworks5:LTS/breeze/8.0 p1:KDE:Frameworks5:LTS/breeze/7.0
- devel c:KDE:Frameworks5:LTS/breeze/7.0 p1:KDE:Frameworks5:LTS/breeze/6.0
- devel c:KDE:Frameworks5:LTS/breeze/6.0 p1:KDE:Frameworks5:LTS/breeze/5.0
- devel c:KDE:Frameworks5:LTS/breeze/5.0 p1:KDE:Frameworks5:LTS/breeze/4.0
- devel c:KDE:Frameworks5:LTS/breeze/4.0 p1:KDE:Frameworks5:LTS/breeze/3.0
- devel c:KDE:Frameworks5:LTS/breeze/3.0 p1:KDE:Frameworks5:LTS/breeze/2.0
- devel c:KDE:Frameworks5:LTS/breeze/2.0 p1:KDE:Frameworks5:LTS/breeze/1.0
- devel c:KDE:Frameworks5:LTS/breeze/1.0 p1:openSUSE:Factory/breeze/32.0
- factory c:openSUSE:Factory/breeze/32.0 p1:openSUSE:Factory/breeze/31.0 p2:KDE:Frameworks5/breeze/120.0
- devel c:KDE:Frameworks5/breeze/120.0 p1:KDE:Frameworks5/breeze/117.0
- factory c:openSUSE:Factory/breeze/31.0 p1:openSUSE:Factory/breeze/30.0 p2:KDE:Frameworks5/breeze/117.0
- devel c:KDE:Frameworks5/breeze/117.0 p1:KDE:Frameworks5/breeze/116.0
- factory c:openSUSE:Factory/breeze/30.0 p1:openSUSE:Factory/breeze/29.0 p2:KDE:Frameworks5/breeze/116.0
- devel c:KDE:Frameworks5/breeze/116.0 p1:KDE:Frameworks5/breeze/115.0
- devel c:KDE:Frameworks5/breeze/115.0 p1:KDE:Frameworks5/breeze/113.0
- devel c:KDE:Frameworks5/breeze/113.0 p1:KDE:Frameworks5/breeze/112.0
- devel c:KDE:Frameworks5/breeze/112.0 p1:KDE:Frameworks5/breeze/111.0
- factory c:openSUSE:Factory/breeze/29.0 p1:openSUSE:Factory/breeze/28.0 p2:KDE:Frameworks5/breeze/111.0
- devel c:KDE:Frameworks5/breeze/111.0 p1:KDE:Frameworks5/breeze/110.0
- devel c:KDE:Frameworks5/breeze/110.0 p1:KDE:Frameworks5/breeze/109.0
- devel c:KDE:Frameworks5/breeze/109.0 p1:KDE:Frameworks5/breeze/108.0
- devel c:KDE:Frameworks5/breeze/108.0 p1:KDE:Frameworks5/breeze/107.0
- devel c:KDE:Frameworks5/breeze/107.0 p1:KDE:Frameworks5/breeze/105.0
- factory c:openSUSE:Factory/breeze/28.0 p1:openSUSE:Factory/breeze/27.0 p2:KDE:Frameworks5/breeze/105.0
- devel c:KDE:Frameworks5/breeze/105.0 p1:KDE:Frameworks5/breeze/103.0
- factory c:openSUSE:Factory/breeze/27.0 p1:openSUSE:Factory/breeze/26.0 p2:KDE:Frameworks5/breeze/103.0
- devel c:KDE:Frameworks5/breeze/103.0 p1:KDE:Frameworks5/breeze/100.0
- factory c:openSUSE:Factory/breeze/26.0 p1:openSUSE:Factory/breeze/25.0 p2:KDE:Frameworks5/breeze/100.0
- devel c:KDE:Frameworks5/breeze/100.0 p1:KDE:Frameworks5/breeze/99.0
- factory c:openSUSE:Factory/breeze/25.0 p1:openSUSE:Factory/breeze/24.0 p2:KDE:Frameworks5/breeze/99.0
- devel c:KDE:Frameworks5/breeze/99.0 p1:KDE:Frameworks5/breeze/98.0
- devel c:KDE:Frameworks5/breeze/98.0 p1:KDE:Frameworks5/breeze/97.0
- devel c:KDE:Frameworks5/breeze/97.0 p1:KDE:Frameworks5/breeze/95.0
- factory c:openSUSE:Factory/breeze/24.0 p1:openSUSE:Factory/breeze/23.0 p2:KDE:Frameworks5/breeze/95.0
- devel c:KDE:Frameworks5/breeze/95.0 p1:KDE:Frameworks5/breeze/93.0
- factory c:openSUSE:Factory/breeze/23.0 p1:openSUSE:Factory/breeze/22.0 p2:KDE:Frameworks5/breeze/93.0
- devel c:KDE:Frameworks5/breeze/93.0 p1:KDE:Frameworks5/breeze/91.0
- factory c:openSUSE:Factory/breeze/22.0 p1:openSUSE:Factory/breeze/21.0 p2:KDE:Frameworks5/breeze/91.0
- devel c:KDE:Frameworks5/breeze/91.0 p1:KDE:Frameworks5/breeze/88.0
- factory c:openSUSE:Factory/breeze/21.0 p1:openSUSE:Factory/breeze/20.0 p2:KDE:Frameworks5/breeze/88.0
- devel c:KDE:Frameworks5/breeze/88.0 p1:KDE:Frameworks5/breeze/87.0
- factory c:openSUSE:Factory/breeze/20.0 p1:openSUSE:Factory/breeze/19.0 p2:KDE:Frameworks5/breeze/87.0
- devel c:KDE:Frameworks5/breeze/87.0 p1:KDE:Frameworks5/breeze/86.0
- devel c:KDE:Frameworks5/breeze/86.0 p1:KDE:Frameworks5/breeze/85.0
- devel c:KDE:Frameworks5/breeze/85.0 p1:KDE:Frameworks5/breeze/84.0
- devel c:KDE:Frameworks5/breeze/84.0 p1:KDE:Frameworks5/breeze/83.0
- devel c:KDE:Frameworks5/breeze/83.0 p1:KDE:Frameworks5/breeze/82.0
- devel c:KDE:Frameworks5/breeze/82.0 p1:KDE:Frameworks5/breeze/81.0
- devel c:KDE:Frameworks5/breeze/81.0 p1:KDE:Frameworks5/breeze/80.0
- devel c:KDE:Frameworks5/breeze/80.0 p1:KDE:Frameworks5/breeze/79.0
- devel c:KDE:Frameworks5/breeze/79.0 p1:KDE:Frameworks5/breeze/78.0
- devel c:KDE:Frameworks5/breeze/78.0 p1:KDE:Frameworks5/breeze/76.0
- devel c:KDE:Frameworks5/breeze/76.0 p1:KDE:Frameworks5/breeze/75.0
- factory c:openSUSE:Factory/breeze/19.0 p1:openSUSE:Factory/breeze/18.0 p2:KDE:Frameworks5/breeze/75.0
- devel c:KDE:Frameworks5/breeze/75.0 p1:KDE:Frameworks5/breeze/74.0
- devel c:KDE:Frameworks5/breeze/74.0 p1:KDE:Frameworks5/breeze/73.0
- devel c:KDE:Frameworks5/breeze/73.0 p1:KDE:Frameworks5/breeze/71.0
- factory c:openSUSE:Factory/breeze/18.0 p1:openSUSE:Factory/breeze/17.0 p2:KDE:Frameworks5/breeze/71.0
- devel c:KDE:Frameworks5/breeze/71.0 p1:KDE:Frameworks5/breeze/70.0
- devel c:KDE:Frameworks5/breeze/70.0 p1:KDE:Frameworks5/breeze/69.0
- devel c:KDE:Frameworks5/breeze/69.0 p1:KDE:Frameworks5/breeze/68.0
- devel c:KDE:Frameworks5/breeze/68.0 p1:KDE:Frameworks5/breeze/67.0
- devel c:KDE:Frameworks5/breeze/67.0 p1:KDE:Frameworks5/breeze/65.0
- factory c:openSUSE:Factory/breeze/17.0 p1:openSUSE:Factory/breeze/16.0 p2:KDE:Frameworks5/breeze/65.0
- devel c:KDE:Frameworks5/breeze/65.0 p1:KDE:Frameworks5/breeze/64.0
- devel c:KDE:Frameworks5/breeze/64.0 p1:KDE:Frameworks5/breeze/62.0
- factory c:openSUSE:Factory/breeze/16.0 p1:openSUSE:Factory/breeze/15.0 p2:KDE:Frameworks5/breeze/62.0
- devel c:KDE:Frameworks5/breeze/62.0 p1:KDE:Frameworks5/breeze/61.0
- devel c:KDE:Frameworks5/breeze/61.0 p1:KDE:Frameworks5/breeze/60.0
- devel c:KDE:Frameworks5/breeze/60.0 p1:KDE:Frameworks5/breeze/59.0
- devel c:KDE:Frameworks5/breeze/59.0 p1:KDE:Frameworks5/breeze/58.0
- devel c:KDE:Frameworks5/breeze/58.0 p1:KDE:Frameworks5/breeze/57.0
- devel c:KDE:Frameworks5/breeze/57.0 p1:KDE:Frameworks5/breeze/55.0
- factory c:openSUSE:Factory/breeze/15.0 p1:openSUSE:Factory/breeze/14.0 p2:KDE:Frameworks5/breeze/55.0
- devel c:KDE:Frameworks5/breeze/55.0 p1:KDE:Frameworks5/breeze/53.0
- factory c:openSUSE:Factory/breeze/14.0 p1:openSUSE:Factory/breeze/13.0 p2:KDE:Frameworks5/breeze/53.0
- devel c:KDE:Frameworks5/breeze/53.0 p1:KDE:Frameworks5/breeze/51.0
- factory c:openSUSE:Factory/breeze/13.0 p1:openSUSE:Factory/breeze/12.0 p2:KDE:Frameworks5/breeze/51.0
- devel c:KDE:Frameworks5/breeze/51.0 p1:KDE:Frameworks5/breeze/50.0
- devel c:KDE:Frameworks5/breeze/50.0 p1:KDE:Frameworks5/breeze/49.0
- devel c:KDE:Frameworks5/breeze/49.0 p1:KDE:Frameworks5/breeze/48.0
- devel c:KDE:Frameworks5/breeze/48.0 p1:KDE:Frameworks5/breeze/47.0
- devel c:KDE:Frameworks5/breeze/47.0 p1:KDE:Frameworks5/breeze/46.0
- devel c:KDE:Frameworks5/breeze/46.0 p1:KDE:Frameworks5/breeze/45.0
- devel c:KDE:Frameworks5/breeze/45.0 p1:KDE:Frameworks5/breeze/44.0
- devel c:KDE:Frameworks5/breeze/44.0 p1:KDE:Frameworks5/breeze/43.0
- devel c:KDE:Frameworks5/breeze/43.0 p1:KDE:Frameworks5/breeze/41.0
- factory c:openSUSE:Factory/breeze/12.0 p1:openSUSE:Factory/breeze/11.0 p2:KDE:Frameworks5/breeze/41.0
- devel c:KDE:Frameworks5/breeze/41.0 p1:KDE:Frameworks5/breeze/40.0
- devel c:KDE:Frameworks5/breeze/40.0 p1:KDE:Frameworks5/breeze/39.0
- devel c:KDE:Frameworks5/breeze/39.0 p1:KDE:Frameworks5/breeze/38.0
- factory c:openSUSE:Factory/breeze/11.0 p1:openSUSE:Factory/breeze/10.0 p2:KDE:Frameworks5/breeze/38.0
- devel c:KDE:Frameworks5/breeze/38.0 p1:KDE:Frameworks5/breeze/36.0
- factory c:openSUSE:Factory/breeze/10.0 p1:openSUSE:Factory/breeze/9.0 p2:KDE:Frameworks5/breeze/36.0
- devel c:KDE:Frameworks5/breeze/36.0 p1:KDE:Frameworks5/breeze/35.0
- devel c:KDE:Frameworks5/breeze/35.0 p1:KDE:Frameworks5/breeze/33.0
- factory c:openSUSE:Factory/breeze/9.0 p1:openSUSE:Factory/breeze/8.0 p2:KDE:Frameworks5/breeze/33.0
- devel c:KDE:Frameworks5/breeze/33.0 p1:KDE:Frameworks5/breeze/32.0
- devel c:KDE:Frameworks5/breeze/32.0 p1:KDE:Frameworks5/breeze/31.0
- devel c:KDE:Frameworks5/breeze/31.0 p1:KDE:Frameworks5/breeze/30.0
- devel c:KDE:Frameworks5/breeze/30.0 p1:KDE:Frameworks5/breeze/28.0
- factory c:openSUSE:Factory/breeze/8.0 p1:openSUSE:Factory/breeze/7.0 p2:KDE:Frameworks5/breeze/28.0
- devel c:KDE:Frameworks5/breeze/28.0 p1:KDE:Frameworks5/breeze/27.0
- devel c:KDE:Frameworks5/breeze/27.0 p1:KDE:Frameworks5/breeze/25.0
- factory c:openSUSE:Factory/breeze/7.0 p1:openSUSE:Factory/breeze/6.0 p2:KDE:Frameworks5/breeze/25.0
- devel c:KDE:Frameworks5/breeze/25.0 p1:KDE:Frameworks5/breeze/24.0
- devel c:KDE:Frameworks5/breeze/24.0 p1:KDE:Frameworks5/breeze/22.0
- factory c:openSUSE:Factory/breeze/6.0 p1:openSUSE:Factory/breeze/5.0 p2:KDE:Frameworks5/breeze/22.0
- devel c:KDE:Frameworks5/breeze/22.0 p1:KDE:Frameworks5/breeze/21.0
- devel c:KDE:Frameworks5/breeze/21.0 p1:KDE:Frameworks5/breeze/20.0
- devel c:KDE:Frameworks5/breeze/20.0 p1:KDE:Frameworks5/breeze/19.0
- devel c:KDE:Frameworks5/breeze/19.0 p1:KDE:Frameworks5/breeze/18.0
- devel c:KDE:Frameworks5/breeze/18.0 p1:KDE:Frameworks5/breeze/17.0
- factory c:openSUSE:Factory/breeze/5.0 p1:openSUSE:Factory/breeze/4.0 p2:KDE:Frameworks5/breeze/17.0
- devel c:KDE:Frameworks5/breeze/17.0 p1:KDE:Frameworks5/breeze/16.0
- devel c:KDE:Frameworks5/breeze/16.0 p1:KDE:Frameworks5/breeze/15.0
- devel c:KDE:Frameworks5/breeze/15.0 p1:KDE:Frameworks5/breeze/14.0
- devel c:KDE:Frameworks5/breeze/14.0 p1:KDE:Frameworks5/breeze/13.0
- devel c:KDE:Frameworks5/breeze/13.0 p1:KDE:Frameworks5/breeze/12.0
- devel c:KDE:Frameworks5/breeze/12.0 p1:KDE:Frameworks5/breeze/11.0
- factory c:openSUSE:Factory/breeze/4.0 p1:openSUSE:Factory/breeze/2.0 p2:KDE:Frameworks5/breeze/11.0
- devel c:KDE:Frameworks5/breeze/11.0 p1:KDE:Frameworks5/breeze/10.0
- devel c:KDE:Frameworks5/breeze/10.0 p1:KDE:Frameworks5/breeze/9.0
- devel c:KDE:Frameworks5/breeze/9.0 p1:KDE:Frameworks5/breeze/8.0
- devel c:KDE:Frameworks5/breeze/8.0 p1:KDE:Frameworks5/breeze/6.0
- factory c:openSUSE:Factory/breeze/2.0 p1:openSUSE:Factory/breeze/1.0 p2:KDE:Frameworks5/breeze/6.0
- devel c:KDE:Frameworks5/breeze/6.0 p1:openSUSE:Factory/breeze/1.0
- factory c:openSUSE:Factory/breeze/1.0 p1:KDE:Frameworks5/breeze/4.0
- factory c:KDE:Frameworks5/breeze/4.0 p1:KDE:Frameworks5/breeze/3.0
- factory c:KDE:Frameworks5/breeze/3.0 p1:KDE:Frameworks5/breeze/2.0
- factory c:KDE:Frameworks5/breeze/2.0 p1:KDE:Frameworks5/breeze/1.0
- factory c:KDE:Frameworks5/breeze/1.0

tests/fixtures/breeze-expected-tree.yaml vendored Normal file (212 lines)

@ -0,0 +1,212 @@
- commit: openSUSE:Factory/breeze/43.0
merged:
- KDE:Frameworks5/breeze/150.0
- commit: openSUSE:Factory/breeze/42.0
merged:
- KDE:Frameworks5/breeze/148.0
- KDE:Frameworks5/breeze/147.0
- commit: openSUSE:Factory/breeze/41.0
merged:
- KDE:Frameworks5/breeze/145.0
- commit: openSUSE:Factory/breeze/40.0
merged:
- KDE:Frameworks5/breeze/143.0
- KDE:Frameworks5/breeze/142.0
- KDE:Frameworks5/breeze/141.0
- commit: openSUSE:Factory/breeze/39.0
merged:
- KDE:Frameworks5/breeze/139.0
- commit: openSUSE:Factory/breeze/38.0
merged:
- KDE:Frameworks5/breeze/137.0
- KDE:Frameworks5/breeze/136.0
- KDE:Frameworks5/breeze/135.0
- KDE:Frameworks5/breeze/134.0
- commit: openSUSE:Factory/breeze/37.0
merged:
- KDE:Frameworks5/breeze/132.0
- commit: openSUSE:Factory/breeze/36.0
merged:
- KDE:Frameworks5/breeze/130.0
- commit: openSUSE:Factory/breeze/35.0
merged:
- KDE:Frameworks5/breeze/128.0
- KDE:Frameworks5/breeze/127.0
- KDE:Frameworks5/breeze/126.034
- KDE:Frameworks5/breeze/126.0
- KDE:Frameworks5/breeze/125.0
- KDE:Frameworks5/breeze/124.0
- KDE:Frameworks5/breeze/123.0
- KDE:Frameworks5/breeze/122.0
- commit: openSUSE:Factory/breeze/34.0
merged:
- KDE:Frameworks5:LTS/breeze/14.0
- KDE:Frameworks5:LTS/breeze/13.0
- KDE:Frameworks5:LTS/breeze/12.0
- commit: openSUSE:Factory/breeze/33.0
merged:
- KDE:Frameworks5:LTS/breeze/11.0
- KDE:Frameworks5:LTS/breeze/10.0
- KDE:Frameworks5:LTS/breeze/9.0
- KDE:Frameworks5:LTS/breeze/8.0
- KDE:Frameworks5:LTS/breeze/7.0
- KDE:Frameworks5:LTS/breeze/6.0
- KDE:Frameworks5:LTS/breeze/5.0
- KDE:Frameworks5:LTS/breeze/4.0
- KDE:Frameworks5:LTS/breeze/3.0
- KDE:Frameworks5:LTS/breeze/2.0
- KDE:Frameworks5:LTS/breeze/1.0
- commit: openSUSE:Factory/breeze/32.0
merged:
- KDE:Frameworks5/breeze/120.0
- commit: openSUSE:Factory/breeze/31.0
merged:
- KDE:Frameworks5/breeze/117.0
- commit: openSUSE:Factory/breeze/30.0
merged:
- KDE:Frameworks5/breeze/116.0
- KDE:Frameworks5/breeze/115.0
- KDE:Frameworks5/breeze/113.0
- KDE:Frameworks5/breeze/112.0
- commit: openSUSE:Factory/breeze/29.0
merged:
- KDE:Frameworks5/breeze/111.0
- KDE:Frameworks5/breeze/110.0
- KDE:Frameworks5/breeze/109.0
- KDE:Frameworks5/breeze/108.0
- KDE:Frameworks5/breeze/107.0
- commit: openSUSE:Factory/breeze/28.0
merged:
- KDE:Frameworks5/breeze/105.0
- commit: openSUSE:Factory/breeze/27.0
merged:
- KDE:Frameworks5/breeze/103.0
- commit: openSUSE:Factory/breeze/26.0
merged:
- KDE:Frameworks5/breeze/100.0
- commit: openSUSE:Factory/breeze/25.0
merged:
- KDE:Frameworks5/breeze/99.0
- KDE:Frameworks5/breeze/98.0
- KDE:Frameworks5/breeze/97.0
- commit: openSUSE:Factory/breeze/24.0
merged:
- KDE:Frameworks5/breeze/95.0
- commit: openSUSE:Factory/breeze/23.0
merged:
- KDE:Frameworks5/breeze/93.0
- commit: openSUSE:Factory/breeze/22.0
merged:
- KDE:Frameworks5/breeze/91.0
- commit: openSUSE:Factory/breeze/21.0
merged:
- KDE:Frameworks5/breeze/88.0
- commit: openSUSE:Factory/breeze/20.0
merged:
- KDE:Frameworks5/breeze/87.0
- KDE:Frameworks5/breeze/86.0
- KDE:Frameworks5/breeze/85.0
- KDE:Frameworks5/breeze/84.0
- KDE:Frameworks5/breeze/83.0
- KDE:Frameworks5/breeze/82.0
- KDE:Frameworks5/breeze/81.0
- KDE:Frameworks5/breeze/80.0
- KDE:Frameworks5/breeze/79.0
- KDE:Frameworks5/breeze/78.0
- KDE:Frameworks5/breeze/76.0
- commit: openSUSE:Factory/breeze/19.0
merged:
- KDE:Frameworks5/breeze/75.0
- KDE:Frameworks5/breeze/74.0
- KDE:Frameworks5/breeze/73.0
- commit: openSUSE:Factory/breeze/18.0
merged:
- KDE:Frameworks5/breeze/71.0
- KDE:Frameworks5/breeze/70.0
- KDE:Frameworks5/breeze/69.0
- KDE:Frameworks5/breeze/68.0
- KDE:Frameworks5/breeze/67.0
- commit: openSUSE:Factory/breeze/17.0
merged:
- KDE:Frameworks5/breeze/65.0
- KDE:Frameworks5/breeze/64.0
- commit: openSUSE:Factory/breeze/16.0
merged:
- KDE:Frameworks5/breeze/62.0
- KDE:Frameworks5/breeze/61.0
- KDE:Frameworks5/breeze/60.0
- KDE:Frameworks5/breeze/59.0
- KDE:Frameworks5/breeze/58.0
- KDE:Frameworks5/breeze/57.0
- commit: openSUSE:Factory/breeze/15.0
merged:
- KDE:Frameworks5/breeze/55.0
- commit: openSUSE:Factory/breeze/14.0
merged:
- KDE:Frameworks5/breeze/53.0
- commit: openSUSE:Factory/breeze/13.0
merged:
- KDE:Frameworks5/breeze/51.0
- KDE:Frameworks5/breeze/50.0
- KDE:Frameworks5/breeze/49.0
- KDE:Frameworks5/breeze/48.0
- KDE:Frameworks5/breeze/47.0
- KDE:Frameworks5/breeze/46.0
- KDE:Frameworks5/breeze/45.0
- KDE:Frameworks5/breeze/44.0
- KDE:Frameworks5/breeze/43.0
- commit: openSUSE:Factory/breeze/12.0
merged:
- KDE:Frameworks5/breeze/41.0
- KDE:Frameworks5/breeze/40.0
- KDE:Frameworks5/breeze/39.0
- commit: openSUSE:Factory/breeze/11.0
merged:
- KDE:Frameworks5/breeze/38.0
- commit: openSUSE:Factory/breeze/10.0
merged:
- KDE:Frameworks5/breeze/36.0
- KDE:Frameworks5/breeze/35.0
- commit: openSUSE:Factory/breeze/9.0
merged:
- KDE:Frameworks5/breeze/33.0
- KDE:Frameworks5/breeze/32.0
- KDE:Frameworks5/breeze/31.0
- KDE:Frameworks5/breeze/30.0
- commit: openSUSE:Factory/breeze/8.0
merged:
- KDE:Frameworks5/breeze/28.0
- KDE:Frameworks5/breeze/27.0
- commit: openSUSE:Factory/breeze/7.0
merged:
- KDE:Frameworks5/breeze/25.0
- KDE:Frameworks5/breeze/24.0
- commit: openSUSE:Factory/breeze/6.0
merged:
- KDE:Frameworks5/breeze/22.0
- KDE:Frameworks5/breeze/21.0
- KDE:Frameworks5/breeze/20.0
- KDE:Frameworks5/breeze/19.0
- KDE:Frameworks5/breeze/18.0
- commit: openSUSE:Factory/breeze/5.0
merged:
- KDE:Frameworks5/breeze/17.0
- KDE:Frameworks5/breeze/16.0
- KDE:Frameworks5/breeze/15.0
- KDE:Frameworks5/breeze/14.0
- KDE:Frameworks5/breeze/13.0
- KDE:Frameworks5/breeze/12.0
- commit: openSUSE:Factory/breeze/4.0
merged:
- KDE:Frameworks5/breeze/11.0
- KDE:Frameworks5/breeze/10.0
- KDE:Frameworks5/breeze/9.0
- KDE:Frameworks5/breeze/8.0
- commit: openSUSE:Factory/breeze/2.0
merged:
- KDE:Frameworks5/breeze/6.0
- commit: openSUSE:Factory/breeze/1.0
- commit: KDE:Frameworks5/breeze/4.0
- commit: KDE:Frameworks5/breeze/3.0
- commit: KDE:Frameworks5/breeze/2.0
- commit: KDE:Frameworks5/breeze/1.0

tests/fixtures/firewalld-data.yaml vendored Normal file (9551 lines); file diff suppressed because it is too large


@ -0,0 +1,173 @@
- factory c:openSUSE:Factory/firewalld/73.0 p1:openSUSE:Factory/firewalld/72.0 p2:security:netfilter/firewalld/131.0
- devel c:security:netfilter/firewalld/131.0 p1:security:netfilter/firewalld/129.0
- factory c:openSUSE:Factory/firewalld/72.0 p1:openSUSE:Factory/firewalld/71.0 p2:security:netfilter/firewalld/129.0
- devel c:security:netfilter/firewalld/129.0 p1:security:netfilter/firewalld/128.0
- devel c:security:netfilter/firewalld/128.0 p1:security:netfilter/firewalld/127.0
- factory c:openSUSE:Factory/firewalld/71.0 p1:openSUSE:Factory/firewalld/70.0 p2:security:netfilter/firewalld/127.0
- devel c:security:netfilter/firewalld/127.0 p1:security:netfilter/firewalld/126.0
- factory c:openSUSE:Factory/firewalld/70.0 p1:openSUSE:Factory/firewalld/69.0
- factory c:openSUSE:Factory/firewalld/69.0 p1:openSUSE:Factory/firewalld/68.0 p2:security:netfilter/firewalld/126.0
- devel c:security:netfilter/firewalld/126.0 p1:security:netfilter/firewalld/125.0
- factory c:openSUSE:Factory/firewalld/68.0 p1:openSUSE:Factory/firewalld/67.0 p2:security:netfilter/firewalld/125.0
- devel c:security:netfilter/firewalld/125.0 p1:security:netfilter/firewalld/124.0
- factory c:openSUSE:Factory/firewalld/67.0 p1:openSUSE:Factory/firewalld/66.0 p2:security:netfilter/firewalld/124.0
- devel c:security:netfilter/firewalld/124.0 p1:security:netfilter/firewalld/123.0
- factory c:openSUSE:Factory/firewalld/66.0 p1:openSUSE:Factory/firewalld/65.0 p2:security:netfilter/firewalld/123.0
- devel c:security:netfilter/firewalld/123.0 p1:security:netfilter/firewalld/122.0
- factory c:openSUSE:Factory/firewalld/65.0 p1:openSUSE:Factory/firewalld/64.0 p2:security:netfilter/firewalld/122.0
- devel c:security:netfilter/firewalld/122.0 p1:security:netfilter/firewalld/121.0
- factory c:openSUSE:Factory/firewalld/64.0 p1:openSUSE:Factory/firewalld/63.0 p2:security:netfilter/firewalld/121.0
- devel c:security:netfilter/firewalld/121.0 p1:security:netfilter/firewalld/120.0
- factory c:openSUSE:Factory/firewalld/63.0 p1:openSUSE:Factory/firewalld/62.0 p2:security:netfilter/firewalld/120.0
- devel c:security:netfilter/firewalld/120.0 p1:security:netfilter/firewalld/119.0
- factory c:openSUSE:Factory/firewalld/62.0 p1:openSUSE:Factory/firewalld/61.0 p2:security:netfilter/firewalld/119.0
- devel c:security:netfilter/firewalld/119.0 p1:security:netfilter/firewalld/118.0
- factory c:openSUSE:Factory/firewalld/61.0 p1:openSUSE:Factory/firewalld/60.0 p2:security:netfilter/firewalld/118.0
- devel c:security:netfilter/firewalld/118.0 p1:security:netfilter/firewalld/117.0
- factory c:openSUSE:Factory/firewalld/60.0 p1:openSUSE:Factory/firewalld/59.0 p2:security:netfilter/firewalld/117.0
- devel c:security:netfilter/firewalld/117.0 p1:security:netfilter/firewalld/116.0
- factory c:openSUSE:Factory/firewalld/59.0 p1:openSUSE:Factory/firewalld/58.0 p2:security:netfilter/firewalld/116.0
- devel c:security:netfilter/firewalld/116.0 p1:security:netfilter/firewalld/115.0
- factory c:openSUSE:Factory/firewalld/58.0 p1:openSUSE:Factory/firewalld/57.0 p2:security:netfilter/firewalld/115.0
- devel c:security:netfilter/firewalld/115.0 p1:security:netfilter/firewalld/114.0
- factory c:openSUSE:Factory/firewalld/57.0 p1:openSUSE:Factory/firewalld/56.0 p2:security:netfilter/firewalld/114.0
- devel c:security:netfilter/firewalld/114.0 p1:security:netfilter/firewalld/113.0
- factory c:openSUSE:Factory/firewalld/56.0 p1:openSUSE:Factory/firewalld/55.0 p2:security:netfilter/firewalld/113.0
- devel c:security:netfilter/firewalld/113.0 p1:security:netfilter/firewalld/112.0
- factory c:openSUSE:Factory/firewalld/55.0 p1:openSUSE:Factory/firewalld/54.0 p2:security:netfilter/firewalld/112.0
- devel c:security:netfilter/firewalld/112.0 p1:security:netfilter/firewalld/111.0
- devel c:security:netfilter/firewalld/111.0 p1:security:netfilter/firewalld/110.0
- devel c:security:netfilter/firewalld/110.0 p1:security:netfilter/firewalld/109.0
- devel c:security:netfilter/firewalld/109.0 p1:security:netfilter/firewalld/108.0
- factory c:openSUSE:Factory/firewalld/54.0 p1:openSUSE:Factory/firewalld/53.0 p2:security:netfilter/firewalld/108.0
- devel c:security:netfilter/firewalld/108.0 p1:security:netfilter/firewalld/107.0
- factory c:openSUSE:Factory/firewalld/53.0 p1:openSUSE:Factory/firewalld/52.0 p2:security:netfilter/firewalld/107.0
- devel c:security:netfilter/firewalld/107.0 p1:security:netfilter/firewalld/106.0
- factory c:openSUSE:Factory/firewalld/52.0 p1:openSUSE:Factory/firewalld/51.0
- factory c:openSUSE:Factory/firewalld/51.0 p1:openSUSE:Factory/firewalld/50.0
- factory c:openSUSE:Factory/firewalld/50.0 p1:openSUSE:Factory/firewalld/49.0 p2:security:netfilter/firewalld/106.0
- devel c:security:netfilter/firewalld/106.0 p1:security:netfilter/firewalld/105.0
- factory c:openSUSE:Factory/firewalld/49.0 p1:openSUSE:Factory/firewalld/48.0 p2:security:netfilter/firewalld/105.0
- devel c:security:netfilter/firewalld/105.0 p1:security:netfilter/firewalld/104.0
- devel c:security:netfilter/firewalld/104.0 p1:security:netfilter/firewalld/103.0
- devel c:security:netfilter/firewalld/103.0 p1:security:netfilter/firewalld/102.0
- factory c:openSUSE:Factory/firewalld/48.0 p1:openSUSE:Factory/firewalld/47.0 p2:security:netfilter/firewalld/102.0
- devel c:security:netfilter/firewalld/102.0 p1:security:netfilter/firewalld/101.0
- factory c:openSUSE:Factory/firewalld/47.0 p1:openSUSE:Factory/firewalld/46.0 p2:security:netfilter/firewalld/101.0
- devel c:security:netfilter/firewalld/101.0 p1:security:netfilter/firewalld/100.0
- factory c:openSUSE:Factory/firewalld/46.0 p1:openSUSE:Factory/firewalld/45.0 p2:security:netfilter/firewalld/100.0
- devel c:security:netfilter/firewalld/100.0 p1:security:netfilter/firewalld/99.0
- factory c:openSUSE:Factory/firewalld/45.0 p1:openSUSE:Factory/firewalld/44.0 p2:security:netfilter/firewalld/99.0
- devel c:security:netfilter/firewalld/99.0 p1:security:netfilter/firewalld/98.0
- factory c:openSUSE:Factory/firewalld/44.0 p1:openSUSE:Factory/firewalld/43.0 p2:security:netfilter/firewalld/98.0
- devel c:security:netfilter/firewalld/98.0 p1:security:netfilter/firewalld/97.0
- factory c:openSUSE:Factory/firewalld/43.0 p1:openSUSE:Factory/firewalld/42.0 p2:security:netfilter/firewalld/97.0
- devel c:security:netfilter/firewalld/97.0 p1:security:netfilter/firewalld/96.0
- devel c:security:netfilter/firewalld/96.0 p1:security:netfilter/firewalld/95.0
- devel c:security:netfilter/firewalld/95.0 p1:security:netfilter/firewalld/94.0
- devel c:security:netfilter/firewalld/94.0 p1:security:netfilter/firewalld/93.0
- factory c:openSUSE:Factory/firewalld/42.0 p1:openSUSE:Factory/firewalld/41.0 p2:security:netfilter/firewalld/93.0
- devel c:security:netfilter/firewalld/93.0 p1:security:netfilter/firewalld/92.0
- factory c:openSUSE:Factory/firewalld/41.0 p1:openSUSE:Factory/firewalld/40.0 p2:security:netfilter/firewalld/92.0
- devel c:security:netfilter/firewalld/92.0 p1:security:netfilter/firewalld/91.0
- factory c:openSUSE:Factory/firewalld/40.0 p1:openSUSE:Factory/firewalld/39.0 p2:security:netfilter/firewalld/91.0
- devel c:security:netfilter/firewalld/91.0 p1:security:netfilter/firewalld/90.0
- factory c:openSUSE:Factory/firewalld/39.0 p1:openSUSE:Factory/firewalld/38.0 p2:security:netfilter/firewalld/90.0
- devel c:security:netfilter/firewalld/90.0 p1:security:netfilter/firewalld/89.0
- factory c:openSUSE:Factory/firewalld/38.0 p1:openSUSE:Factory/firewalld/37.0 p2:security:netfilter/firewalld/89.0
- devel c:security:netfilter/firewalld/89.0 p1:security:netfilter/firewalld/88.0
- factory c:openSUSE:Factory/firewalld/37.0 p1:openSUSE:Factory/firewalld/36.0 p2:security:netfilter/firewalld/88.0
- devel c:security:netfilter/firewalld/88.0 p1:security:netfilter/firewalld/87.0
- devel c:security:netfilter/firewalld/87.0 p1:security:netfilter/firewalld/86.0
- devel c:security:netfilter/firewalld/86.0 p1:security:netfilter/firewalld/85.0
- devel c:security:netfilter/firewalld/85.0 p1:security:netfilter/firewalld/84.0
- factory c:openSUSE:Factory/firewalld/36.0 p1:openSUSE:Factory/firewalld/35.0 p2:security:netfilter/firewalld/84.0
- devel c:security:netfilter/firewalld/84.0 p1:security:netfilter/firewalld/83.0
- devel c:security:netfilter/firewalld/83.0 p1:security:netfilter/firewalld/82.0
- factory c:openSUSE:Factory/firewalld/35.0 p1:openSUSE:Factory/firewalld/34.0 p2:security:netfilter/firewalld/82.0
- devel c:security:netfilter/firewalld/82.0 p1:security:netfilter/firewalld/81.0
- devel c:security:netfilter/firewalld/81.0 p1:security:netfilter/firewalld/80.0
- devel c:security:netfilter/firewalld/80.0 p1:security:netfilter/firewalld/79.0
- devel c:security:netfilter/firewalld/79.0 p1:security:netfilter/firewalld/78.0
- devel c:security:netfilter/firewalld/78.0 p1:security:netfilter/firewalld/77.0
- factory c:openSUSE:Factory/firewalld/34.0 p1:openSUSE:Factory/firewalld/33.0 p2:security:netfilter/firewalld/77.0
- devel c:security:netfilter/firewalld/77.0 p1:security:netfilter/firewalld/76.0
- devel c:security:netfilter/firewalld/76.0 p1:security:netfilter/firewalld/75.0
- devel c:security:netfilter/firewalld/75.0 p1:security:netfilter/firewalld/74.0
- factory c:openSUSE:Factory/firewalld/33.0 p1:openSUSE:Factory/firewalld/32.0
- factory c:openSUSE:Factory/firewalld/32.0 p1:openSUSE:Factory/firewalld/31.0 p2:security:netfilter/firewalld/74.0
- devel c:security:netfilter/firewalld/74.0 p1:security:netfilter/firewalld/71.0
- factory c:openSUSE:Factory/firewalld/31.0 p1:openSUSE:Factory/firewalld/30.0
- factory c:openSUSE:Factory/firewalld/30.0 p1:openSUSE:Factory/firewalld/29.0 p2:security:netfilter/firewalld/71.0
- devel c:security:netfilter/firewalld/71.0 p1:security:netfilter/firewalld/69.0
- factory c:openSUSE:Factory/firewalld/29.0 p1:openSUSE:Factory/firewalld/28.0 p2:security:netfilter/firewalld/69.0
- devel c:security:netfilter/firewalld/69.0 p1:security:netfilter/firewalld/68.0
- factory c:openSUSE:Factory/firewalld/28.0 p1:openSUSE:Factory/firewalld/27.0 p2:security:netfilter/firewalld/68.0
- devel c:security:netfilter/firewalld/68.0 p1:security:netfilter/firewalld/67.0
- devel c:security:netfilter/firewalld/67.0 p1:security:netfilter/firewalld/65.0
- factory c:openSUSE:Factory/firewalld/27.0 p1:openSUSE:Factory/firewalld/26.0 p2:security:netfilter/firewalld/65.0
- devel c:security:netfilter/firewalld/65.0 p1:security:netfilter/firewalld/63.0
- factory c:openSUSE:Factory/firewalld/26.0 p1:openSUSE:Factory/firewalld/25.0 p2:security:netfilter/firewalld/63.0
- devel c:security:netfilter/firewalld/63.0 p1:security:netfilter/firewalld/61.0
- factory c:openSUSE:Factory/firewalld/25.0 p1:openSUSE:Factory/firewalld/24.0 p2:security:netfilter/firewalld/61.0
- devel c:security:netfilter/firewalld/61.0 p1:security:netfilter/firewalld/60.0
- devel c:security:netfilter/firewalld/60.0 p1:security:netfilter/firewalld/59.0
- devel c:security:netfilter/firewalld/59.0 p1:security:netfilter/firewalld/57.0
- factory c:openSUSE:Factory/firewalld/24.0 p1:openSUSE:Factory/firewalld/23.0 p2:security:netfilter/firewalld/57.0
- devel c:security:netfilter/firewalld/57.0 p1:security:netfilter/firewalld/55.0
- factory c:openSUSE:Factory/firewalld/23.0 p1:openSUSE:Factory/firewalld/22.0 p2:security:netfilter/firewalld/55.0
- devel c:security:netfilter/firewalld/55.0 p1:security:netfilter/firewalld/54.0
- devel c:security:netfilter/firewalld/54.0 p1:security:netfilter/firewalld/53.0
- devel c:security:netfilter/firewalld/53.0 p1:security:netfilter/firewalld/51.0
- factory c:openSUSE:Factory/firewalld/22.0 p1:openSUSE:Factory/firewalld/21.0 p2:security:netfilter/firewalld/51.0
- devel c:security:netfilter/firewalld/51.0 p1:security:netfilter/firewalld/50.0
- devel c:security:netfilter/firewalld/50.0 p1:security:netfilter/firewalld/48.0
- factory c:openSUSE:Factory/firewalld/21.0 p1:openSUSE:Factory/firewalld/20.0 p2:security:netfilter/firewalld/48.0
- devel c:security:netfilter/firewalld/48.0 p1:security:netfilter/firewalld/47.0
- devel c:security:netfilter/firewalld/47.0 p1:security:netfilter/firewalld/45.0
- factory c:openSUSE:Factory/firewalld/20.0 p1:openSUSE:Factory/firewalld/19.0 p2:security:netfilter/firewalld/45.0
- devel c:security:netfilter/firewalld/45.0 p1:security:netfilter/firewalld/43.0
- factory c:openSUSE:Factory/firewalld/19.0 p1:openSUSE:Factory/firewalld/18.0 p2:security:netfilter/firewalld/43.0
- devel c:security:netfilter/firewalld/43.0 p1:security:netfilter/firewalld/41.0
- factory c:openSUSE:Factory/firewalld/18.0 p1:openSUSE:Factory/firewalld/17.0 p2:security:netfilter/firewalld/41.0
- devel c:security:netfilter/firewalld/41.0 p1:security:netfilter/firewalld/39.0
- factory c:openSUSE:Factory/firewalld/17.0 p1:openSUSE:Factory/firewalld/16.0 p2:security:netfilter/firewalld/39.0
- devel c:security:netfilter/firewalld/39.0 p1:security:netfilter/firewalld/38.0
- devel c:security:netfilter/firewalld/38.0 p1:security:netfilter/firewalld/36.0
- factory c:openSUSE:Factory/firewalld/16.0 p1:openSUSE:Factory/firewalld/15.0 p2:security:netfilter/firewalld/36.0
- devel c:security:netfilter/firewalld/36.0 p1:security:netfilter/firewalld/34.0
- factory c:openSUSE:Factory/firewalld/15.0 p1:openSUSE:Factory/firewalld/14.0 p2:security:netfilter/firewalld/34.0
- devel c:security:netfilter/firewalld/34.0 p1:security:netfilter/firewalld/32.0
- factory c:openSUSE:Factory/firewalld/14.0 p1:openSUSE:Factory/firewalld/13.0 p2:security:netfilter/firewalld/32.0
- devel c:security:netfilter/firewalld/32.0 p1:security:netfilter/firewalld/30.0
- factory c:openSUSE:Factory/firewalld/13.0 p1:openSUSE:Factory/firewalld/12.0 p2:security:netfilter/firewalld/30.0
- devel c:security:netfilter/firewalld/30.0 p1:security:netfilter/firewalld/28.0
- factory c:openSUSE:Factory/firewalld/12.0 p1:openSUSE:Factory/firewalld/11.0 p2:security:netfilter/firewalld/28.0
- devel c:security:netfilter/firewalld/28.0 p1:security:netfilter/firewalld/26.0
- factory c:openSUSE:Factory/firewalld/11.0 p1:openSUSE:Factory/firewalld/10.0 p2:security:netfilter/firewalld/26.0
- devel c:security:netfilter/firewalld/26.0 p1:security:netfilter/firewalld/24.0
- factory c:openSUSE:Factory/firewalld/10.0 p1:openSUSE:Factory/firewalld/9.0 p2:security:netfilter/firewalld/24.0
- devel c:security:netfilter/firewalld/24.0 p1:security:netfilter/firewalld/22.0
- factory c:openSUSE:Factory/firewalld/9.0 p1:openSUSE:Factory/firewalld/8.0 p2:security:netfilter/firewalld/22.0
- devel c:security:netfilter/firewalld/22.0 p1:security:netfilter/firewalld/21.0
- devel c:security:netfilter/firewalld/21.0 p1:security:netfilter/firewalld/19.0
- factory c:openSUSE:Factory/firewalld/8.0 p1:openSUSE:Factory/firewalld/7.0 p2:security:netfilter/firewalld/19.0
- devel c:security:netfilter/firewalld/19.0 p1:security:netfilter/firewalld/17.0
- factory c:openSUSE:Factory/firewalld/7.0 p1:openSUSE:Factory/firewalld/6.0 p2:security:netfilter/firewalld/17.0
- devel c:security:netfilter/firewalld/17.0 p1:security:netfilter/firewalld/15.0
- factory c:openSUSE:Factory/firewalld/6.0 p1:openSUSE:Factory/firewalld/5.0 p2:security:netfilter/firewalld/15.0
- devel c:security:netfilter/firewalld/15.0 p1:security:netfilter/firewalld/13.0
- factory c:openSUSE:Factory/firewalld/5.0 p1:openSUSE:Factory/firewalld/4.0 p2:security:netfilter/firewalld/13.0
- devel c:security:netfilter/firewalld/13.0 p1:security:netfilter/firewalld/11.0
- factory c:openSUSE:Factory/firewalld/4.0 p1:openSUSE:Factory/firewalld/3.0 p2:security:netfilter/firewalld/11.0
- devel c:security:netfilter/firewalld/11.0 p1:security:netfilter/firewalld/9.0
- factory c:openSUSE:Factory/firewalld/3.0 p1:openSUSE:Factory/firewalld/2.0 p2:security:netfilter/firewalld/9.0
- devel c:security:netfilter/firewalld/9.0 p1:security:netfilter/firewalld/8.0
- devel c:security:netfilter/firewalld/8.0 p1:security:netfilter/firewalld/6.0
- factory c:openSUSE:Factory/firewalld/2.0 p1:openSUSE:Factory/firewalld/1.0 p2:security:netfilter/firewalld/6.0
- devel c:security:netfilter/firewalld/6.0 p1:security:netfilter/firewalld/5.0
- devel c:security:netfilter/firewalld/5.0 p1:security:netfilter/firewalld/4.0
- devel c:security:netfilter/firewalld/4.0 p1:openSUSE:Factory/firewalld/1.0
- factory c:openSUSE:Factory/firewalld/1.0 p1:security:netfilter/firewalld/2.0
- factory c:security:netfilter/firewalld/2.0 p1:security:netfilter/firewalld/1.0
- factory c:security:netfilter/firewalld/1.0
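Each line in the listing above encodes one node of the imported revision graph: the branch the commit belongs to (devel or factory), the commit itself after "c:", its first parent after "p1:", and, for merge commits, a second parent after "p2:". A minimal parsing sketch for this notation follows; the TreeEntry dataclass and parse_tree_line helper are hypothetical illustrations, not part of git-importer:

import re
from dataclasses import dataclass
from typing import Optional

@dataclass
class TreeEntry:
    branch: str              # "devel" or "factory"
    commit: str              # e.g. "security:netfilter/firewalld/122.0"
    parent1: Optional[str]   # first parent revision, if any
    parent2: Optional[str]   # merge parent revision, if any

LINE_RE = re.compile(
    r"^- (?P<branch>\S+) c:(?P<commit>\S+)"
    r"(?: p1:(?P<p1>\S+))?(?: p2:(?P<p2>\S+))?$"
)

def parse_tree_line(line):
    # Hypothetical helper: turn one fixture line into a structured record.
    m = LINE_RE.match(line.strip())
    if not m:
        raise ValueError(f"unrecognized fixture line: {line!r}")
    return TreeEntry(m["branch"], m["commit"], m["p1"], m["p2"])

# e.g. parse_tree_line("- devel c:security:netfilter/firewalld/5.0 p1:security:netfilter/firewalld/4.0")
# -> TreeEntry(branch="devel", commit="security:netfilter/firewalld/5.0",
#              parent1="security:netfilter/firewalld/4.0", parent2=None)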


@@ -0,0 +1,240 @@
- commit: openSUSE:Factory/firewalld/73.0
merged:
- security:netfilter/firewalld/131.0
- commit: openSUSE:Factory/firewalld/72.0
merged:
- security:netfilter/firewalld/129.0
- security:netfilter/firewalld/128.0
- commit: openSUSE:Factory/firewalld/71.0
merged:
- security:netfilter/firewalld/127.0
- commit: openSUSE:Factory/firewalld/70.0
- commit: openSUSE:Factory/firewalld/69.0
merged:
- security:netfilter/firewalld/126.0
- commit: openSUSE:Factory/firewalld/68.0
merged:
- security:netfilter/firewalld/125.0
- commit: openSUSE:Factory/firewalld/67.0
merged:
- security:netfilter/firewalld/124.0
- commit: openSUSE:Factory/firewalld/66.0
merged:
- security:netfilter/firewalld/123.0
- commit: openSUSE:Factory/firewalld/65.0
merged:
- security:netfilter/firewalld/122.0
- commit: openSUSE:Factory/firewalld/64.0
merged:
- security:netfilter/firewalld/121.0
- commit: openSUSE:Factory/firewalld/63.0
merged:
- security:netfilter/firewalld/120.0
- commit: openSUSE:Factory/firewalld/62.0
merged:
- security:netfilter/firewalld/119.0
- commit: openSUSE:Factory/firewalld/61.0
merged:
- security:netfilter/firewalld/118.0
- commit: openSUSE:Factory/firewalld/60.0
merged:
- security:netfilter/firewalld/117.0
- commit: openSUSE:Factory/firewalld/59.0
merged:
- security:netfilter/firewalld/116.0
- commit: openSUSE:Factory/firewalld/58.0
merged:
- security:netfilter/firewalld/115.0
- commit: openSUSE:Factory/firewalld/57.0
merged:
- security:netfilter/firewalld/114.0
- commit: openSUSE:Factory/firewalld/56.0
merged:
- security:netfilter/firewalld/113.0
- commit: openSUSE:Factory/firewalld/55.0
merged:
- security:netfilter/firewalld/112.0
- security:netfilter/firewalld/111.0
- security:netfilter/firewalld/110.0
- security:netfilter/firewalld/109.0
- commit: openSUSE:Factory/firewalld/54.0
merged:
- security:netfilter/firewalld/108.0
- commit: openSUSE:Factory/firewalld/53.0
merged:
- security:netfilter/firewalld/107.0
- commit: openSUSE:Factory/firewalld/52.0
- commit: openSUSE:Factory/firewalld/51.0
- commit: openSUSE:Factory/firewalld/50.0
merged:
- security:netfilter/firewalld/106.0
- commit: openSUSE:Factory/firewalld/49.0
merged:
- security:netfilter/firewalld/105.0
- security:netfilter/firewalld/104.0
- security:netfilter/firewalld/103.0
- commit: openSUSE:Factory/firewalld/48.0
merged:
- security:netfilter/firewalld/102.0
- commit: openSUSE:Factory/firewalld/47.0
merged:
- security:netfilter/firewalld/101.0
- commit: openSUSE:Factory/firewalld/46.0
merged:
- security:netfilter/firewalld/100.0
- commit: openSUSE:Factory/firewalld/45.0
merged:
- security:netfilter/firewalld/99.0
- commit: openSUSE:Factory/firewalld/44.0
merged:
- security:netfilter/firewalld/98.0
- commit: openSUSE:Factory/firewalld/43.0
merged:
- security:netfilter/firewalld/97.0
- security:netfilter/firewalld/96.0
- security:netfilter/firewalld/95.0
- security:netfilter/firewalld/94.0
- commit: openSUSE:Factory/firewalld/42.0
merged:
- security:netfilter/firewalld/93.0
- commit: openSUSE:Factory/firewalld/41.0
merged:
- security:netfilter/firewalld/92.0
- commit: openSUSE:Factory/firewalld/40.0
merged:
- security:netfilter/firewalld/91.0
- commit: openSUSE:Factory/firewalld/39.0
merged:
- security:netfilter/firewalld/90.0
- commit: openSUSE:Factory/firewalld/38.0
merged:
- security:netfilter/firewalld/89.0
- commit: openSUSE:Factory/firewalld/37.0
merged:
- security:netfilter/firewalld/88.0
- security:netfilter/firewalld/87.0
- security:netfilter/firewalld/86.0
- security:netfilter/firewalld/85.0
- commit: openSUSE:Factory/firewalld/36.0
merged:
- security:netfilter/firewalld/84.0
- security:netfilter/firewalld/83.0
- commit: openSUSE:Factory/firewalld/35.0
merged:
- security:netfilter/firewalld/82.0
- security:netfilter/firewalld/81.0
- security:netfilter/firewalld/80.0
- security:netfilter/firewalld/79.0
- security:netfilter/firewalld/78.0
- commit: openSUSE:Factory/firewalld/34.0
merged:
- security:netfilter/firewalld/77.0
- security:netfilter/firewalld/76.0
- security:netfilter/firewalld/75.0
- commit: openSUSE:Factory/firewalld/33.0
- commit: openSUSE:Factory/firewalld/32.0
merged:
- security:netfilter/firewalld/74.0
- commit: openSUSE:Factory/firewalld/31.0
- commit: openSUSE:Factory/firewalld/30.0
merged:
- security:netfilter/firewalld/71.0
- commit: openSUSE:Factory/firewalld/29.0
merged:
- security:netfilter/firewalld/69.0
- commit: openSUSE:Factory/firewalld/28.0
merged:
- security:netfilter/firewalld/68.0
- security:netfilter/firewalld/67.0
- commit: openSUSE:Factory/firewalld/27.0
merged:
- security:netfilter/firewalld/65.0
- commit: openSUSE:Factory/firewalld/26.0
merged:
- security:netfilter/firewalld/63.0
- commit: openSUSE:Factory/firewalld/25.0
merged:
- security:netfilter/firewalld/61.0
- security:netfilter/firewalld/60.0
- security:netfilter/firewalld/59.0
- commit: openSUSE:Factory/firewalld/24.0
merged:
- security:netfilter/firewalld/57.0
- commit: openSUSE:Factory/firewalld/23.0
merged:
- security:netfilter/firewalld/55.0
- security:netfilter/firewalld/54.0
- security:netfilter/firewalld/53.0
- commit: openSUSE:Factory/firewalld/22.0
merged:
- security:netfilter/firewalld/51.0
- security:netfilter/firewalld/50.0
- commit: openSUSE:Factory/firewalld/21.0
merged:
- security:netfilter/firewalld/48.0
- security:netfilter/firewalld/47.0
- commit: openSUSE:Factory/firewalld/20.0
merged:
- security:netfilter/firewalld/45.0
- commit: openSUSE:Factory/firewalld/19.0
merged:
- security:netfilter/firewalld/43.0
- commit: openSUSE:Factory/firewalld/18.0
merged:
- security:netfilter/firewalld/41.0
- commit: openSUSE:Factory/firewalld/17.0
merged:
- security:netfilter/firewalld/39.0
- security:netfilter/firewalld/38.0
- commit: openSUSE:Factory/firewalld/16.0
merged:
- security:netfilter/firewalld/36.0
- commit: openSUSE:Factory/firewalld/15.0
merged:
- security:netfilter/firewalld/34.0
- commit: openSUSE:Factory/firewalld/14.0
merged:
- security:netfilter/firewalld/32.0
- commit: openSUSE:Factory/firewalld/13.0
merged:
- security:netfilter/firewalld/30.0
- commit: openSUSE:Factory/firewalld/12.0
merged:
- security:netfilter/firewalld/28.0
- commit: openSUSE:Factory/firewalld/11.0
merged:
- security:netfilter/firewalld/26.0
- commit: openSUSE:Factory/firewalld/10.0
merged:
- security:netfilter/firewalld/24.0
- commit: openSUSE:Factory/firewalld/9.0
merged:
- security:netfilter/firewalld/22.0
- security:netfilter/firewalld/21.0
- commit: openSUSE:Factory/firewalld/8.0
merged:
- security:netfilter/firewalld/19.0
- commit: openSUSE:Factory/firewalld/7.0
merged:
- security:netfilter/firewalld/17.0
- commit: openSUSE:Factory/firewalld/6.0
merged:
- security:netfilter/firewalld/15.0
- commit: openSUSE:Factory/firewalld/5.0
merged:
- security:netfilter/firewalld/13.0
- commit: openSUSE:Factory/firewalld/4.0
merged:
- security:netfilter/firewalld/11.0
- commit: openSUSE:Factory/firewalld/3.0
merged:
- security:netfilter/firewalld/9.0
- security:netfilter/firewalld/8.0
- commit: openSUSE:Factory/firewalld/2.0
merged:
- security:netfilter/firewalld/6.0
- security:netfilter/firewalld/5.0
- security:netfilter/firewalld/4.0
- commit: openSUSE:Factory/firewalld/1.0
- commit: security:netfilter/firewalld/2.0
- commit: security:netfilter/firewalld/1.0
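This second file records, for every openSUSE:Factory commit, which security:netfilter submissions were merged into it, and is presumably consumed by the tree tests added below. A sketch of loading it, assuming the file carries the usual nested YAML indentation (flattened in this rendering) and that PyYAML is available; the helper name and fixture path are hypothetical:

import yaml  # PyYAML, assumed to be available in the test environment

def load_expected_merges(path):
    # Hypothetical helper: map each Factory commit to the devel submissions
    # recorded as merged into it (entries without a "merged" key map to []).
    with open(path, encoding="utf-8") as fh:
        entries = yaml.safe_load(fh) or []
    return {entry["commit"]: entry.get("merged", []) for entry in entries}

# Usage sketch (the fixture path is an assumption, not taken from the diff):
# merges = load_expected_merges("tests/fixtures/firewalld-expected.yaml")
# assert merges["openSUSE:Factory/firewalld/73.0"] == ["security:netfilter/firewalld/131.0"]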


@@ -65,6 +65,15 @@ class TestTreeMethods(unittest.TestCase):
    def test_000update_repos_tree(self):
        self.verify_package("000update-repos")

    def test_breeze_tree(self):
        self.verify_package("breeze")

    def test_firewalld_tree(self):
        self.verify_package("firewalld")

    def test_FastCGI_tree(self):
        self.verify_package("FastCGI")


if __name__ == "__main__":
    unittest.main()

update-tasks.sh Executable file

@@ -0,0 +1,19 @@
#!/bin/bash
#
cd /space/dmueller/git-importer
source credentials.sh

while true; do
    for i in $PWD/tasks/*; do
        if test -f "$i"; then
            echo "$(date): Importing $(basename $i)"
            if ! python3 ./git-importer.py -c repos/.cache $(basename $i); then
                mkdir -p $PWD/failed-tasks
                mv -f $i $PWD/failed-tasks
            fi
            rm -f $i
        fi
    done
    inotifywait -q -e create $PWD/tasks
done
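The loop first drains any files already queued under tasks/, passing each file's basename to git-importer.py (presumably the package name) and moving failed tasks into failed-tasks/, then blocks on inotifywait until a new task file is created. Queuing an import therefore amounts to creating a file in the watched directory; a minimal sketch, using the path from the script and firewalld only as an example package:

from pathlib import Path

# Hypothetical trigger: an empty file named after the package queues it for
# the next iteration of the update-tasks.sh loop.
Path("/space/dmueller/git-importer/tasks/firewalld").touch()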