Merge pull request #2613 from ancorgs/drop_tools1

Drop obsolete tools
Ancor Gonzalez Sosa 2021-09-23 12:42:33 +02:00 committed by GitHub
commit 8f4795d135
No known key found for this signature in database
GPG Key ID: 4AEE18F83AFDEB23
21 changed files with 13 additions and 2365 deletions


@@ -98,27 +98,6 @@ Releases distribution snapshots to openQA and publishes if the result is positive
* Package: openSUSE-release-tools
* Usage: [gocd](https://github.com/openSUSE/openSUSE-release-tools/search?q=path%3A%2Fgocd+totest-manager)
#### unmaintained
Finds unmaintained binaries sourced from SLE.
* Sources: [unmaintained.py](unmaintained.py)
* Documentation: --
* Package: openSUSE-release-tools
* Usage: obsolete
#### sync-rebuild
Syncs openSUSE:Factory and openSUSE:Factory:Rebuild. This feature was already
merged into the [accept
command](https://github.com/openSUSE/openSUSE-release-tools/commit/87c891662015f14421c2315210c248e712e697c8)
of the staging projects plug-in.
* Sources: [sync-rebuild.py](sync-rebuild.py)
* Documentation: --
* Package: openSUSE-release-tools
* Usage: obsolete
#### bugowner
Manages bugowner information
@@ -182,15 +161,6 @@ Allows retrieving requests from OBS with fairly elaborate queries.
* Package: openSUSE-release-tools
* Usage: ?
#### update_crawler (obsoleted by [origin-manager](#origin-manager))
Creates SRs for Leap.
* Sources: [update_crawler.py](update_crawler.py) and [script](script).
* Documentation: --
* Package: openSUSE-release-tools-leaper
* Usage: obsolete (by origin-manager)
#### create_staging
Scripts and templates to create staging projects.
@@ -211,15 +181,6 @@ Handles maintenance incident requests
* Package: openSUSE-release-tools-maintenance
* Usage: obsolete (by origin-manager)
#### leaper
Implements Leap-style services for non-Factory projects (whatever that means).
* Sources: [leaper.py](leaper.py)
* Documentation: --
* Package: openSUSE-release-tools-leaper
* Usage: obsolete
#### origin-manager
Keeps track of which project a package originates from, submits updates, reviews requests to detect origin changes, and enforces origin-specific policies like adding appropriate reviews
@@ -269,12 +230,13 @@ Checks ABI compatibility in OBS requests.
#### check_source_in_factory
Checks if the sources of a submission are either in Factory or a request for Factory with the same
sources exist.
sources exist. Not used as a standalone bot anymore, but called internally from
check_tags_in_requests.
* Sources: [check_source_in_factory.py](check_source_in_factory.py)
* Documentation: [docs/factory-source.asciidoc](docs/factory-source.asciidoc)
* Package: openSUSE-release-tools
* Usage: obsolete
* Usage: used from other bots (check_tags_in_requests)
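Review bots of this kind typically return a tri-state result: accept, decline, or leave the review open while a related Factory request is still pending. A minimal sketch of that decision logic, with hypothetical names rather than the actual check_source_in_factory implementation:

```python
# Tri-state review decision: True = accept, False = decline,
# None = leave the review open (e.g. a Factory request is still pending).
# All names here are illustrative, not the real bot's API.
def review_result(sources_in_factory, factory_request_pending):
    if sources_in_factory:
        return True   # same sources already accepted in Factory
    if factory_request_pending:
        return None   # wait for the Factory request to be decided
    return False      # sources unknown to Factory: decline

def state_for(result):
    # Mirrors the 'seen'/'done' comment states used by the review bots.
    return 'seen' if result is None else 'done'
```

The `None` branch is what lets a review stay open instead of bouncing between accept and decline while the Factory side is undecided.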
#### openqa-maintenance
@@ -297,15 +259,6 @@ Inspects built RPMs from staging projects.
* Package: openSUSE-release-tools-repo-checker
* Usage: gocd ([project-installcheck.py](https://github.com/openSUSE/openSUSE-release-tools/search?q=path%3A%2Fgocd+project-installcheck), [staging-installcheck](https://github.com/openSUSE/openSUSE-release-tools/search?q=path%3A%2Fgocd+staging-installcheck) and [maintenance-installcheck.py](https://github.com/openSUSE/openSUSE-release-tools/search?q=path%3A%2Fgocd+maintenance-installcheck)
#### manager_42.py
Maintains `00Meta/lookup.yml`.
* Sources: [manager_42.py](manager_42.py)
* Documentation: --
* Package: openSUSE-release-tools-leaper
* Usage: obsolete (by origin-manager)
### OSC Plugins
#### osc-check_source.py
@@ -356,15 +309,6 @@ Manages staging projects.
* Package: osc-plugin-staging
* Usage: staging projects management
#### status.py
Checks the status of the staging workflow bots.
* Sources: [status.py](status.py)
* Documentation: --
* Package: openSUSE-release-tools
* Usage: obsolete (it still checks for the status of some bots that are already retired, like leaper)
#### fcc_submitter.py
The FactoryCandidates projects are used to determine whether a new package in Factory does build in
@@ -388,15 +332,6 @@ changes to allow whitelisting before creating Bugzilla entries.
* Package: openSUSE-release-tools
* Usage: ???
#### obs-clone.py
Clones projects and dependencies between OBS instances.
* Sources: [obs_clone.py](obs_clone.py)
* Documentation: --
* Package: openSUSE-release-tools
* Usage: obsolete (initially added for testing, but it was replaced with a container-based approach)
#### obs-operator
Performs staging operations as a service instead of requiring the osc staging plugin to be utilized
@@ -407,15 +342,6 @@ directly.
* Package: openSUSE-release-tools
* Usage: obsolete
#### scan_baselibs.py
Verifies that 32-bit binaries were imported properly into a project.
* Sources: [scan_baselibs.py](scan_baselibs.py)
* Documentation: --
* Package: openSUSE-release-tools
* Usage: obsolete (after https://github.com/openSUSE/open-build-service/pull/7662 was introduced in OBS)
#### k8s-secret.py
Applies Kubernetes secrets for OSRT tool osc configuration.


@@ -21,7 +21,6 @@ install:
sed -i "s/VERSION = .*/VERSION = '$(VERSION)'/" \
$(DESTDIR)$(pkgdatadir)/osclib/common.py
for i in $(pkgdata_BINS); do ln -s $(pkgdatadir)/$$i $(DESTDIR)$(bindir)/osrt-$${i%.*}; done
install -m 755 script/* $(DESTDIR)$(bindir)
ln -s $(pkgdatadir)/metrics/access/aggregate.php $(DESTDIR)$(bindir)/osrt-metrics-access-aggregate
ln -s $(pkgdatadir)/metrics/access/ingest.php $(DESTDIR)$(bindir)/osrt-metrics-access-ingest
cp -R config/* $(DESTDIR)$(sysconfdir)/$(package_name)
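The `osrt-$${i%.*}` expansion in the install rule above strips the script's extension when naming the symlinks, so `leaper.py` becomes `osrt-leaper`. The same name mangling can be sketched in Python (illustrative only, not part of the build):

```python
# Reproduce the shell ${i%.*} expansion used to name the osrt-* symlinks:
# drop the last extension, then prefix with 'osrt-'.
def symlink_name(script):
    stem = script.rsplit('.', 1)[0]  # 'leaper.py' -> 'leaper'
    return 'osrt-' + stem
```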


@@ -78,10 +78,6 @@ Then you can use the new `local` alias to access this new instance.
osc -A local api /about
A facsimile of `openSUSE:Factory` in the form of a subset of the related data can be quickly created in a local OBS instance using the `obs_clone` tool.
./obs_clone.py --debug --apiurl-target local
Some tests will attempt to run against the local OBS, but not all.
nosetests


@@ -88,7 +88,6 @@ class ToolBase(object):
root = ET.fromstring(self._meta_get_packagelist(prj, deleted, expand))
return [ node.get('name') for node in root.findall('entry') if not node.get('name') == '000product' and not node.get('name').startswith('patchinfo.') ]
# FIXME: duplicated from manager_42
def latest_packages(self, project):
data = self.cached_GET(self.makeurl(['project', 'latest_commits', project]))
lc = ET.fromstring(data)


@@ -132,8 +132,6 @@ class BugownerTool(ToolBase.ToolBase):
root = ET.fromstring(self.cached_GET(url))
for node in root.findall('.//person[@userid]'):
self.release_managers.add(node.get('userid'))
# XXX: hardcoded bot
self.release_managers.add('leaper')
logger.debug("release managers %s", self.release_managers)
return name in self.release_managers
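The release-manager lookup above collects `userid` attributes from an XML listing of persons. Stripped of the OBS plumbing, the parsing step looks roughly like this (the XML payload is a made-up example):

```python
from xml.etree import ElementTree as ET

# Collect userids the same way BugownerTool builds its release_managers set:
# every <person> element carrying a userid attribute, anywhere in the tree.
def collect_userids(xml_text):
    root = ET.fromstring(xml_text)
    return {node.get('userid') for node in root.findall('.//person[@userid]')}
```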


@@ -22,10 +22,15 @@ import ReviewBot
class FactorySourceChecker(ReviewBot.ReviewBot):
""" this review bot checks if the sources of a submission are
either in Factory or a request for Factory with the same sources
exist. If the latter a request is only accepted if the Factory
request is reviewed positive."""
""" This review bot is obsolete since the introduction of better
alternatives like origin-manager. But it's kept because other bots like
TagChecker (check_tags_in_request) still call this bot as part of their
implementation.
This review bot was used in the past to check whether the sources of a
submission are either in Factory or a request for Factory with the same
sources exists. In the latter case, a request is only accepted if the
Factory request is reviewed positively."""
def __init__(self, *args, **kwargs):
ReviewBot.ReviewBot.__init__(self, *args, **kwargs)


@@ -1 +0,0 @@
Create an OSRT:OriginManager config instead.


@@ -20,7 +20,6 @@
</repository>
<packages type="image">
<package name="openSUSE-release-tools-check-source"/>
<package name="openSUSE-release-tools-leaper"/>
<package name="openSUSE-release-tools-obs-operator"/>
<package name="openSUSE-release-tools-origin-manager"/>
<package name="openSUSE-release-tools-pkglistgen"/>


@@ -128,17 +128,6 @@ Requires(pre): shadow
%description check-source
Check source review bot that performs basic source analysis and assigns reviews.
%package leaper
Summary: Leap-style services
Group: Development/Tools/Other
BuildArch: noarch
Requires: %{name} = %{version}
Requires: osclib = %{version}
Requires(pre): shadow
%description leaper
Leap-style services for non-Factory projects.
%package maintenance
Summary: Maintenance related services
Group: Development/Tools/Other
@@ -315,7 +304,7 @@ make %{?_smp_mflags}
%pre announcer
getent passwd osrt-announcer > /dev/null || \
useradd -r -m -s /sbin/nologin -c "user for openSUSE-release-tools-leaper" osrt-announcer
useradd -r -m -s /sbin/nologin -c "user for openSUSE-release-tools-announcer" osrt-announcer
exit 0
%postun announcer
@@ -329,14 +318,6 @@ exit 0
%postun check-source
%systemd_postun
%pre leaper
getent passwd osrt-leaper > /dev/null || \
useradd -r -m -s /sbin/nologin -c "user for openSUSE-release-tools-leaper" osrt-leaper
exit 0
%postun leaper
%systemd_postun
%pre maintenance
getent passwd osrt-maintenance > /dev/null || \
useradd -r -m -s /sbin/nologin -c "user for openSUSE-release-tools-maintenance" osrt-maintenance
@@ -412,13 +393,8 @@ exit 0
%{_bindir}/osrt-issue-diff
%{_bindir}/osrt-k8s-secret
%{_bindir}/osrt-legal-auto
%{_bindir}/osrt-obs_clone
%{_bindir}/osrt-openqa-maintenance
%{_bindir}/osrt-requestfinder
%{_bindir}/osrt-scan_baselibs
%{_bindir}/osrt-status
%{_bindir}/osrt-sync-rebuild
%{_bindir}/osrt-unmaintained
%{_bindir}/osrt-totest-manager
%{_datadir}/%{source_dir}
%exclude %{_datadir}/%{source_dir}/abichecker
@@ -427,8 +403,6 @@ exit 0
%exclude %{_datadir}/%{source_dir}/check_source.pl
%exclude %{_datadir}/%{source_dir}/check_source.py
%exclude %{_datadir}/%{source_dir}/devel-project.py
%exclude %{_datadir}/%{source_dir}/leaper.py
%exclude %{_datadir}/%{source_dir}/manager_42.py
%exclude %{_datadir}/%{source_dir}/metrics
%exclude %{_datadir}/%{source_dir}/metrics.py
%exclude %{_datadir}/%{source_dir}/metrics_release.py
@@ -444,7 +418,6 @@ exit 0
%exclude %{_datadir}/%{source_dir}/osc-cycle.py
%exclude %{_datadir}/%{source_dir}/osc-origin.py
%exclude %{_datadir}/%{source_dir}/osc-staging.py
%exclude %{_datadir}/%{source_dir}/update_crawler.py
%exclude %{_datadir}/%{source_dir}/findfileconflicts
%exclude %{_datadir}/%{source_dir}/write_repo_susetags_file.pl
%dir %{_sysconfdir}/openSUSE-release-tools
@@ -475,17 +448,6 @@ exit 0
%{_datadir}/%{source_dir}/check_source.pl
%{_datadir}/%{source_dir}/check_source.py
%files leaper
%defattr(-,root,root,-)
%{_bindir}/osrt-leaper
%{_bindir}/osrt-leaper-crawler-*
%{_bindir}/osrt-manager_42
%{_bindir}/osrt-update_crawler
%{_datadir}/%{source_dir}/leaper.py
%{_datadir}/%{source_dir}/manager_42.py
%{_datadir}/%{source_dir}/update_crawler.py
%config(noreplace) %{_sysconfdir}/openSUSE-release-tools/manager_42
%files maintenance
%defattr(-,root,root,-)
%{_bindir}/osrt-check_maintenance_incidents


@@ -106,27 +106,6 @@ pipelines:
- staging-bot
tasks:
- script: ./check_maintenance_incidents.py --verbose review
openSUSE.Leaper.Maintenance:
group: openSUSE.Checkers
lock_behavior: unlockWhenFinished
timer:
spec: 0 */5 * ? * *
environment_variables:
OSC_CONFIG: /home/go/config/oscrc-leaper
materials:
git:
git: https://github.com/openSUSE/openSUSE-release-tools.git
stages:
- Run:
approval:
type: manual
jobs:
Run:
timeout: 30
resources:
- staging-bot
tasks:
- script: ./leaper.py -A https://api.opensuse.org --verbose project openSUSE:Maintenance maintenance_incident
Factory.Staging.Report:
group: openSUSE.Checkers
lock_behavior: unlockWhenFinished
@@ -148,45 +127,6 @@ pipelines:
- staging-bot
tasks:
- script: ./staging-report.py --debug -A https://api.opensuse.org -p openSUSE:Factory
Source.In.Factory:
group: openSUSE.Checkers
lock_behavior: unlockWhenFinished
timer:
spec: 0 */30 * ? * *
environment_variables:
OSC_CONFIG: /home/go/config/oscrc-factory-source
materials:
git:
git: https://github.com/openSUSE/openSUSE-release-tools.git
stages:
- Run:
approval:
type: manual
jobs:
Run:
timeout: 30
resources:
- staging-bot
tasks:
- script: ./check_source_in_factory.py --factory openSUSE:Factory --factory openSUSE:Leap:15.0:Update --factory openSUSE:Leap:15.0 --factory openSUSE:Leap:15.0:Update --factory openSUSE:Leap:15.0 --review-mode=fallback-onfail --fallback-group=backports-reviewers --verbose review
Leaper:
group: openSUSE.Checkers
lock_behavior: unlockWhenFinished
environment_variables:
OSC_CONFIG: /home/go/config/oscrc-leaper
materials:
script:
git: https://github.com/openSUSE/openSUSE-release-tools.git
timer:
spec: 0 */5 * ? * *
only_on_changes: false
stages:
- Run:
approval: manual
resources:
- staging-bot
tasks:
- script: ./leaper.py -A https://api.opensuse.org --verbose --manual-version-updates --manual-maintenance-updates review
OS.Origin.Manager:
group: openSUSE.Checkers
lock_behavior: unlockWhenFinished

leaper.py (627 lines deleted)

@@ -1,627 +0,0 @@
#!/usr/bin/python3
from pprint import pprint
import os
import sys
import re
import logging
from optparse import OptionParser
import cmdln
try:
from xml.etree import cElementTree as ET
except ImportError:
import cElementTree as ET
import osc.core
from osclib.conf import Config
from osclib.core import devel_project_get
from urllib.error import HTTPError
import yaml
import ReviewBot
from check_source_in_factory import FactorySourceChecker
class Leaper(ReviewBot.ReviewBot):
def __init__(self, *args, **kwargs):
ReviewBot.ReviewBot.__init__(self, *args, **kwargs)
# ReviewBot options.
self.request_default_return = True
self.comment_handler = True
self.do_comments = True
# for FactorySourceChecker
self.factory = FactorySourceChecker(*args, **kwargs)
self.needs_legal_review = False
self.needs_reviewteam = False
self.pending_factory_submission = False
self.source_in_factory = None
self.needs_release_manager = False
self.release_manager_group = None
self.review_team_group = None
self.legal_review_group = None
self.must_approve_version_updates = False
self.must_approve_maintenance_updates = False
self.needs_check_source = False
self.check_source_group = None
self.automatic_submission = False
# project => package list
self.packages = {}
def prepare_review(self):
# update lookup information on every run
self.lookup.reset()
def get_source_packages(self, project, expand=False):
"""Return the list of packages in a project."""
query = {'expand': 1} if expand else {}
try:
root = ET.parse(osc.core.http_GET(osc.core.makeurl(self.apiurl, ['source', project],
query=query))).getroot()
packages = [i.get('name') for i in root.findall('entry')]
except HTTPError as e:
# in case the project doesn't exist yet (like sle update)
if e.code != 404:
raise e
packages = []
return packages
def is_package_in_project(self, project, package):
if project not in self.packages:
self.packages[project] = self.get_source_packages(project)
return True if package in self.packages[project] else False
# this is so bad
def _webui_from_api(self, apiurl):
return apiurl.replace('//api.', '//build.')
# OBS is not smart enough to redirect to the mapped instance and just
# displays 404 so we have to do it ourselves
def package_link(self, project, package):
apiurl = self.apiurl
for prefix, url, srprefix in self.config.project_namespace_api_map:
if project.startswith(prefix):
apiurl = url
project = project[len(prefix):]
break
return '[%(project)s/%(package)s](%(url)s/package/show/%(project)s/%(package)s)' % {
'url': self._webui_from_api(apiurl),
'project': project,
'package': package,
}
def rdiff_link(self, src_project, src_package, src_rev, target_project, target_package = None):
if target_package is None:
target_package = src_package
apiurl = self.apiurl
for prefix, url, srprefix in self.config.project_namespace_api_map:
# can't rdiff if both sides are remote
# https://github.com/openSUSE/open-build-service/issues/4601
if target_project.startswith(prefix) and src_project.startswith(prefix):
apiurl = url
src_project = src_project[len(prefix):]
target_project = target_project[len(prefix):]
break
return self.package_link(target_project, target_package) + ' ([diff](%(url)s/package/rdiff/%(src_project)s/%(src_package)s?opackage=%(target_package)s&oproject=%(target_project)s&rev=%(src_rev)s))' % {
'url': self._webui_from_api(apiurl),
'src_project': src_project,
'src_package': src_package,
'src_rev': src_rev,
'target_project': target_project,
'target_package': target_package,
}
def _check_same_origin(self, origin, project):
if origin == 'FORK':
return True
if origin.startswith('Devel;'):
(dummy, origin, dummy) = origin.split(';')
# FIXME: to make the rest of the code easier this should probably check
# if the srcmd5 matches the origin project. That way it doesn't really
# matter from where something got submitted as long as the sources match.
return project.startswith(origin)
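The origin strings handled above come from the lookup file and use a `Devel;<project>;<package>` convention. A standalone rendition of the same check, extracted purely for illustration:

```python
# Standalone version of Leaper._check_same_origin, for illustration only.
def check_same_origin(origin, project):
    if origin == 'FORK':
        return True
    if origin.startswith('Devel;'):
        # lookup entries look like 'Devel;<project>;<package>'
        _, origin, _ = origin.split(';')
    return project.startswith(origin)
```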
def check_source_submission(self, src_project, src_package, src_rev, target_project, target_package):
ret = self.check_source_submission_inner(src_project, src_package, src_rev, target_project, target_package)
# The layout of this code is just plain wrong and awkward. What is
# really desired a "post check source submission" not
# check_one_request() which applies to all action types and all actions
# at once. The follow-ups need the same context that the main method
# determining to flip the switch had. For maintenance incidents the
# ReviewBot code does a fair bit of mangling to the package and project
# values passed in which is not present during check_one_request().
# Currently the only feature used by maintenance is
# do_check_maintainer_review and the rest of the checks apply in the
# single action workflow so this should work fine, but really all the
# processing should be done per-action instead of per-request.
if self.do_check_maintainer_review:
self.devel_project_review_ensure(self.request, target_project, target_package)
return ret
def check_source_submission_inner(self, src_project, src_package, src_rev, target_project, target_package):
super(Leaper, self).check_source_submission(src_project, src_package, src_rev, target_project, target_package)
self.automatic_submission = False
if src_project == target_project and src_package == target_package:
self.logger.info('self submission detected')
self.needs_release_manager = True
return True
src_srcinfo = self.get_sourceinfo(src_project, src_package, src_rev)
package = target_package
origin = self.lookup.get(target_project, package)
origin_same = True
if origin:
origin_same = self._check_same_origin(origin, src_project)
if src_srcinfo is None:
# source package does not exist?
# handle here to avoid crashing on the next line
self.logger.warning("Could not get source info for %s/%s@%s" % (src_project, src_package, src_rev))
return False
if self.ibs and target_project.startswith('SUSE:SLE'):
review_result = None
prj = 'openSUSE.org:openSUSE:Factory'
# True or None (open request) are acceptable for SLE.
in_factory = self._check_factory(package, src_srcinfo, prj)
if in_factory:
review_result = True
self.source_in_factory = True
elif in_factory is None:
self.pending_factory_submission = True
else:
if not self.is_package_in_project(prj, package):
self.logger.info('the package is not in Factory, nor submitted there')
else:
self.logger.info('different sources in {}'.format(self.rdiff_link(src_project, src_package, src_rev, prj, package)))
if review_result == None and not self.pending_factory_submission:
other_projects_to_check = []
m = re.match(r'SUSE:SLE-(\d+)(?:-SP(\d+)):', target_project)
if m:
sle_version = int(m.group(1))
sp_version = int(m.group(2))
versions_to_check = []
# yeah, too much hardcoding here
if sle_version == 12:
versions_to_check = [ '42.3' ]
elif sle_version == 15:
versions_to_check = [ '15.%d' % i for i in range(sp_version + 1) ]
else:
self.logger.error("can't handle %d.%d", sle_version, sp_version)
for version in versions_to_check:
leap = 'openSUSE.org:openSUSE:Leap:%s' % (version)
other_projects_to_check += [ leap, leap + ':Update', leap + ':NonFree', leap + ':NonFree:Update' ]
for prj in other_projects_to_check:
if self.is_package_in_project(prj, package):
self.logger.debug('checking {}'.format(prj))
if self._check_factory(package, src_srcinfo, prj) is True:
self.logger.info('found source match in {}'.format(prj))
devel_project, devel_package = devel_project_get(self.apiurl, 'openSUSE.org:openSUSE:Factory', package)
if devel_project is not None:
# specifying devel package is optional
if devel_package is None:
devel_package = package
if self.is_package_in_project(devel_project, devel_package):
if self._check_matching_srcmd5(devel_project, devel_package, src_srcinfo.verifymd5) == True:
self.logger.info('matching sources in {}/{}'.format(devel_project, devel_package))
return True
else:
self.logger.info('different sources in devel project {}'.format(self.rdiff_link(src_project, src_package, src_rev, devel_project, devel_package)))
else:
self.logger.info('no devel project found for {}/{}'.format('openSUSE.org:openSUSE:Factory', package))
self.logger.info('no matching sources found anywhere. Needs a human to decide whether that is ok. Please provide some justification to help that person.')
else:
leap = 'openSUSE.org:openSUSE:Leap:15.1'
if not self.is_package_in_project(target_project, package) \
and self.is_package_in_project(leap, package) \
and self._check_factory(package, src_srcinfo, leap) is False:
self.logger.info('different sources in {}'.format(self.rdiff_link(src_project, src_package, src_rev, leap, package)))
if not review_result and origin is not None:
review_result = origin_same
if origin_same or origin == 'openSUSE.org:openSUSE:Factory' and self.pending_factory_submission:
self.logger.info("ok, origin %s unchanged", origin)
else:
# only log origin state if it's taken into consideration for the review result
self.logger.info("Submitted from a different origin than expected ('%s')", origin)
self.needs_release_manager = True
# no result so far and also no factory submission to wait
# for. So just pass to avoid requiring too many overrides
if not self.pending_factory_submission:
review_result = True
if not review_result and self.override_allow:
# Rather than decline, leave review open in-case of change and
# ask release manager for input via override comment.
self.logger.info('Comment `(at){} override accept` to force accept.'.format(self.review_user))
self.needs_release_manager = True
review_result = None
return review_result
if target_project.endswith(':Update'):
self.logger.info("expected origin is '%s' (%s)", origin,
"unchanged" if origin_same else "changed")
# Only when not from current product should request require maintainer review.
self.do_check_maintainer_review = False
if origin_same:
return True
good = self._check_matching_srcmd5(origin, target_package, src_srcinfo.verifymd5)
if good:
self.logger.info('submission source found in origin ({})'.format(origin))
return good
good = self.factory._check_requests(origin, target_package, src_srcinfo.verifymd5)
if good or good == None:
self.logger.info('found pending submission against origin ({})'.format(origin))
return good
# TODO #1662: Uncomment once maintbot has been superseded and leaper
# is no longer run in comment-only mode.
# self.do_check_maintainer_review = True
return None
elif self.action.type == 'maintenance_incident':
self.logger.debug('unhandled incident pattern (targeting non :Update project)')
return True
# obviously
if src_project in ('openSUSE:Factory', 'openSUSE:Factory:NonFree'):
self.source_in_factory = True
is_fine_if_factory = False
not_in_factory_okish = False
if origin:
self.logger.info("expected origin is '%s' (%s)", origin,
"unchanged" if origin_same else "changed")
if origin.startswith('Devel;'):
if origin_same == False:
self.logger.debug("not submitted from devel project")
return False
is_fine_if_factory = True
not_in_factory_okish = True
if self.must_approve_version_updates:
self.needs_release_manager = True
# fall through to check history and requests
elif origin.startswith('openSUSE:Factory'):
# A large number of requests are created by hand that leaper
# would have created via update_crawler.py. This applies to
# other origins, but primary looking to let Factory submitters
# know that there is no need to make manual submissions to both.
# Since it has a lookup entry it is not a new package.
self.automatic_submission = False
if self.must_approve_version_updates:
self.needs_release_manager = True
if origin == src_project:
self.source_in_factory = True
# no need to approve submissions from Factory if
# the lookup file points to Factory. Just causes
# spam for many maintainers #1393
self.do_check_maintainer_review = False
is_fine_if_factory = True
# fall through to check history and requests
elif origin == 'FORK':
is_fine_if_factory = True
if not src_project.startswith('SUSE:SLE-'):
not_in_factory_okish = True
self.needs_check_source = True
self.needs_release_manager = True
# fall through to check history and requests
# TODO Ugly save for 15.1 (n-1).
elif origin.startswith('openSUSE:Leap:15.0'):
if self.must_approve_maintenance_updates:
self.needs_release_manager = True
if src_project.startswith('openSUSE:Leap'):
self.do_check_maintainer_review = False
# submitted from :Update
if origin_same:
self.logger.debug("submission from 15.0 ok")
return True
# switching to sle package might make sense
if src_project.startswith('SUSE:SLE-15'):
self.needs_release_manager = True
self.do_check_maintainer_review = False
return True
# submitted from elsewhere but is in :Update
else:
good = self._check_matching_srcmd5('openSUSE:Leap:15.0:Update', target_package, src_srcinfo.verifymd5)
if good:
self.logger.info("submission found in 15.0")
return good
# check release requests too
good = self.factory._check_requests('openSUSE:Leap:15.0:Update', target_package, src_srcinfo.verifymd5)
if good or good == None:
self.logger.debug("found request")
return good
# let's see where it came from before
oldorigin = self.lookup.get('openSUSE:Leap:15.0', target_package)
if oldorigin:
self.logger.debug("oldorigin {}".format(oldorigin))
# Factory. So it's ok to keep upgrading it to Factory
# TODO: whitelist packages where this is ok and block others?
self.logger.info("Package was from %s in 15.0", oldorigin)
if oldorigin.startswith('openSUSE:Factory'):
# check if an attempt to switch to SLE package is made
for sp in ('SP1:GA', 'SP1:Update'):
good = self._check_matching_srcmd5('SUSE:SLE-15-{}'.format(sp), target_package, src_srcinfo.verifymd5)
if good:
self.logger.info("request sources come from SLE")
self.needs_release_manager = True
return good
# TODO Ugly save for 15.2 (n-2).
elif False and oldorigin.startswith('openSUSE:Leap:15.0'):
self.logger.info("Package was from %s in 15.0", oldorigin)
# the release manager needs to review attempts to upgrade to Factory
is_fine_if_factory = True
self.needs_release_manager = True
elif origin.startswith('SUSE:SLE-15'):
if self.must_approve_maintenance_updates:
self.needs_release_manager = True
if src_project.startswith('SUSE:SLE-15'):
self.do_check_maintainer_review = False
for v in ('15.0', '15.1'):
prj = 'openSUSE:Leap:{}:SLE-workarounds'.format(v)
if self.is_package_in_project( prj, target_package):
self.logger.info("found package in %s", prj)
if not self._check_matching_srcmd5(prj,
target_package,
src_srcinfo.verifymd5):
self.logger.info("sources in %s are NOT identical",
self.rdiff_link(src_project, src_package, src_rev, prj, package))
self.needs_release_manager = True
# submitted from :Update
if origin == src_project:
self.logger.debug("submission origin ok")
return True
elif origin.endswith(':GA') \
and src_project == origin[:-2] + 'Update':
self.logger.debug("sle update submission")
return True
# check if submitted from higher SP
priolist = ['SUSE:SLE-15:', 'SUSE:SLE-15-SP1:', 'SUSE:SLE-15-SP2:', 'SUSE:SLE-15-SP3:']
for i in range(len(priolist) - 1):
if origin.startswith(priolist[i]):
for prj in priolist[i + 1:]:
if src_project.startswith(prj):
self.logger.info("submission from higher service pack %s:* ok", prj)
return True
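The nested loop above accepts a submission when the source project belongs to a later service pack than the recorded origin. The same ordering check, pulled out as a standalone helper for clarity (illustrative, not part of the tool):

```python
# Extracted version of the service-pack priority check: a submission is fine
# if src_project matches a later entry in the priority list than origin.
PRIOLIST = ['SUSE:SLE-15:', 'SUSE:SLE-15-SP1:', 'SUSE:SLE-15-SP2:', 'SUSE:SLE-15-SP3:']

def from_higher_service_pack(origin, src_project, priolist=PRIOLIST):
    for i, prefix in enumerate(priolist[:-1]):
        if origin.startswith(prefix):
            # any later service pack in the list counts as "higher"
            return any(src_project.startswith(p) for p in priolist[i + 1:])
    return False
```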
in_sle_origin = self._check_factory(target_package, src_srcinfo, origin)
if in_sle_origin:
self.logger.info('parallel submission, also in {}'.format(origin))
return True
self.needs_release_manager = True
# the release manager needs to review attempts to upgrade to Factory
is_fine_if_factory = True
else:
self.logger.error("unhandled origin %s", origin)
return False
else: # no origin
# submission from SLE is ok
if src_project.startswith('SUSE:SLE-15'):
self.do_check_maintainer_review = False
return True
# new package submitted from Factory. Check if it was in
# 42.3 before and skip maintainer review if so.
subprj = src_project[len('openSUSE:Factory'):]
# disabled for reference. Needed again for 16.0 probably
if False and self.source_in_factory and target_project.startswith('openSUSE:Leap:15.0') \
and self.is_package_in_project('openSUSE:Leap:42.3' + subprj, package):
self.logger.info('package was in 42.3')
self.do_check_maintainer_review = False
return True
is_fine_if_factory = True
self.needs_release_manager = True
if origin is None or not origin.startswith('SUSE:SLE-'):
for p in (':Update', ':GA'):
prj = 'SUSE:SLE-15' + p
if self.is_package_in_project(prj, package):
self.logger.info('Package is in {}'.format(
self.rdiff_link(src_project, src_package, src_rev, prj, package)))
break
is_in_factory = self.source_in_factory
# we came here because none of the above checks find it good, so
# let's see if the package is in Factory at least
if is_in_factory is None:
is_in_factory = self._check_factory(package, src_srcinfo)
if is_in_factory:
self.source_in_factory = True
self.needs_reviewteam = False
self.needs_legal_review = False
elif is_in_factory is None:
self.pending_factory_submission = True
self.needs_reviewteam = False
self.needs_legal_review = False
else:
if src_project.startswith('SUSE:SLE-15') \
or src_project.startswith('openSUSE:Leap:15.'):
self.needs_reviewteam = False
self.needs_legal_review = False
else:
self.needs_reviewteam = True
self.needs_legal_review = True
self.source_in_factory = False
if is_fine_if_factory:
if self.source_in_factory:
return True
elif self.pending_factory_submission:
return None
elif not_in_factory_okish:
self.needs_reviewteam = True
self.needs_legal_review = True
return True
if self.override_allow:
# Rather than decline, leave review open and ask release
# manager for input via override comment.
self.logger.info('Comment `(at){} override accept` to force accept.'.format(self.review_user))
self.needs_release_manager = True
return None
return False
def _check_factory(self, target_package, src_srcinfo, target_project='openSUSE:Factory'):
for subprj in ('', ':NonFree', ':Live'):
prj = ''.join((target_project, subprj))
good = self._check_matching_srcmd5(prj, target_package, src_srcinfo.verifymd5)
if good:
return good
good = self.factory._check_requests(prj, target_package, src_srcinfo.verifymd5)
if good or good == None:
self.logger.debug("found request to %s", prj)
return good
return False
def _check_project_and_request(self, project, target_package, src_srcinfo):
good = self._check_matching_srcmd5(project, target_package, src_srcinfo.verifymd5)
if good:
return good
good = self.factory._check_requests(project, target_package, src_srcinfo.verifymd5)
if good or good == None:
return good
return False
def check_one_request(self, req):
config = Config.get(self.apiurl, req.actions[0].tgt_project)
self.needs_legal_review = False
self.needs_reviewteam = False
self.needs_release_manager = False
self.pending_factory_submission = False
self.source_in_factory = None
self.do_check_maintainer_review = not self.ibs
self.packages = {}
request_ok = ReviewBot.ReviewBot.check_one_request(self, req)
self.logger.debug("review result: %s", request_ok)
if self.pending_factory_submission:
self.logger.info("submission is waiting for a Factory request to complete")
creator = req.get_creator()
bot_name = self.bot_name.lower()
if self.automatic_submission and creator != bot_name:
self.logger.info('@{}: this request would have been automatically created by {} after the Factory submission was accepted in order to alleviate the need to manually create requests for packages sourced from Factory'.format(creator, bot_name))
elif self.source_in_factory:
self.logger.info("perfect. the submitted sources are in or accepted for Factory")
elif self.source_in_factory == False:
self.logger.warning("the submitted sources are NOT in Factory")
if request_ok == False:
self.logger.info("NOTE: if you think the automated review was wrong here, please talk to the release team before reopening the request")
if self.do_comments:
result = None
if request_ok is None:
state = 'seen'
elif request_ok:
state = 'done'
result = 'accepted'
else:
state = 'done'
result = 'declined'
self.comment_write(state, result)
add_review_groups = []
if self.needs_release_manager:
add_review_groups.append(self.release_manager_group or
config.get(self.override_group_key))
if self.needs_reviewteam:
add_review_groups.append(self.review_team_group or
config.get('review-team'))
if self.needs_legal_review:
add_review_groups.append(self.legal_review_group or
config.get('legal-review-group'))
if self.needs_check_source and self.check_source_group is not None:
add_review_groups.append(self.check_source_group)
for group in add_review_groups:
if group is None:
continue
self.logger.info("{0} needs review by [{1}](/group/show/{1})".format(req.reqid, group))
self.add_review(req, by_group=group)
return request_ok
def check_action__default(self, req, a):
self.needs_release_manager = True
return super(Leaper, self).check_action__default(req, a)
class CommandLineInterface(ReviewBot.CommandLineInterface):
def __init__(self, *args, **kwargs):
ReviewBot.CommandLineInterface.__init__(self, args, kwargs)
self.clazz = Leaper
def get_optparser(self):
parser = ReviewBot.CommandLineInterface.get_optparser(self)
parser.add_option("--no-comment", dest='comment', action="store_false", default=True, help="don't actually post comments to obs")
parser.add_option("--manual-version-updates", action="store_true", help="release manager must approve version updates")
parser.add_option("--manual-maintenance-updates", action="store_true", help="release manager must approve maintenance updates")
parser.add_option("--check-source-group", dest="check_source_group", metavar="GROUP", help="group used by check_source.py bot which will be added as a reviewer should leaper checks pass")
parser.add_option("--review-team-group", dest="review_team_group", metavar="GROUP", help="group used for package reviews")
parser.add_option("--release-manager-group", dest="release_manager_group", metavar="GROUP", help="group used for release manager reviews")
parser.add_option("--legal-review-group", dest="legal_review_group", metavar="GROUP", help="group used for legal reviews")
return parser
def setup_checker(self):
bot = ReviewBot.CommandLineInterface.setup_checker(self)
if self.options.manual_version_updates:
bot.must_approve_version_updates = True
if self.options.manual_maintenance_updates:
bot.must_approve_maintenance_updates = True
if self.options.check_source_group:
bot.check_source_group = self.options.check_source_group
if self.options.review_team_group:
bot.review_team_group = self.options.review_team_group
if self.options.legal_review_group:
bot.legal_review_group = self.options.legal_review_group
if self.options.release_manager_group:
bot.release_manager_group = self.options.release_manager_group
bot.do_comments = self.options.comment
return bot
if __name__ == "__main__":
app = CommandLineInterface()
sys.exit( app.main() )


@@ -1,432 +0,0 @@
#!/usr/bin/python3

import argparse
from dateutil.parser import parse as date_parse
from datetime import datetime
import itertools
import logging
import sys
from xml.etree import cElementTree as ET

import osc.conf
import osc.core
from osc.core import get_commitlog
from osc.core import get_request_list
from urllib.error import HTTPError

import subprocess
import time
import yaml
from collections import namedtuple

from osclib.memoize import memoize

logger = logging.getLogger()

makeurl = osc.core.makeurl
http_GET = osc.core.http_GET
http_DELETE = osc.core.http_DELETE
http_PUT = osc.core.http_PUT
http_POST = osc.core.http_POST


class Manager42(object):
    config_defaults = {
        'ignored_packages': [
            '00Meta',
            '00aggregates',
            '000product',
            '000package-groups',
            '000release-packages',
        ],
        'project_preference_order': [],
        'drop_if_vanished_from': [],
        'from_prj': 'openSUSE:Leap:42.3',
        'factory': 'openSUSE:Factory',
    }

    def __init__(self, caching=True, configfh=None):
        self.caching = caching
        self.apiurl = osc.conf.config['apiurl']
        self.config = self._load_config(configfh)
        self.force = False

        self.parse_lookup(self.config.from_prj)
        self.fill_package_meta()
        self.packages = dict()
        self.sle_workarounds = None
        for project in [self.config.from_prj] + self.config.project_preference_order:
            self._fill_package_list(project)
            if project.endswith(':SLE-workarounds'):
                self.sle_workarounds = project

    # FIXME: add to ToolBase and rebase Manager42 on that
    def _load_config(self, handle=None):
        d = self.__class__.config_defaults
        y = yaml.safe_load(handle) if handle is not None else {}
        return namedtuple('BotConfig', sorted(d.keys()))(*[y.get(p, d[p]) for p in sorted(d.keys())])

    def latest_packages(self):
        data = self.cached_GET(makeurl(self.apiurl,
                                       ['project', 'latest_commits',
                                        self.config.from_prj]))
        lc = ET.fromstring(data)
        packages = set()
        for entry in lc.findall('{http://www.w3.org/2005/Atom}entry'):
            title = entry.find('{http://www.w3.org/2005/Atom}title').text
            if title.startswith('In '):
                packages.add(title[3:].split(' ')[0])
        return sorted(packages)

    def all_packages(self):
        return self.packages[self.config.from_prj]

    def parse_lookup(self, project):
        self.lookup_changes = 0
        self.lookup = {}
        try:
            self.lookup = yaml.safe_load(self._load_lookup_file(project))
        except HTTPError as e:
            if e.code != 404:
                raise

    def _load_lookup_file(self, prj):
        return self.cached_GET(makeurl(self.apiurl,
                                       ['source', prj, '00Meta', 'lookup.yml']))

    def _put_lookup_file(self, prj, data):
        return http_PUT(makeurl(self.apiurl,
                                ['source', prj, '00Meta', 'lookup.yml']), data=data)

    def store_lookup(self):
        if self.lookup_changes == 0:
            logger.info('no change to lookup.yml')
            return
        data = yaml.dump(self.lookup, default_flow_style=False, explicit_start=True)
        self._put_lookup_file(self.config.from_prj, data)
        self.lookup_changes = 0

    @memoize()
    def _cached_GET(self, url):
        return self.retried_GET(url).read()

    def cached_GET(self, url):
        if self.caching:
            return self._cached_GET(url)
        return self.retried_GET(url).read()

    def retried_GET(self, url):
        try:
            return http_GET(url)
        except HTTPError as e:
            if 500 <= e.code <= 599:
                logger.warning('Retrying {}'.format(url))
                time.sleep(1)
                return self.retried_GET(url)
            raise e

    def get_source_packages(self, project, expand=False):
        """Return the list of packages in a project."""
        query = {'expand': 1} if expand else {}
        try:
            root = ET.fromstring(
                self.cached_GET(makeurl(self.apiurl,
                                        ['source', project],
                                        query=query)))
            packages = [i.get('name') for i in root.findall('entry')]
        except HTTPError as e:
            if e.code == 404:
                logger.error("{}: {}".format(project, e))
                packages = []
            else:
                raise
        return packages

    def _get_source_package(self, project, package, revision):
        opts = {'view': 'info'}
        if revision:
            opts['rev'] = revision
        return self.cached_GET(makeurl(self.apiurl,
                                       ['source', project, package], opts))

    def crawl(self, packages):
        """Main method of the class that runs the crawler."""
        for package in sorted(packages):
            try:
                self.check_one_package(package)
            except HTTPError as e:
                logger.error("Failed to check {}: {}".format(package, e))
            # avoid losing too much work
            if self.lookup_changes > 50:
                self.store_lookup()
            self.sle_workarounds_unneeded_check(package)
        if self.lookup_changes:
            self.store_lookup()

    def sle_workarounds_unneeded_check(self, package):
        # If SLE-workarounds project and package was not sourced from
        # SLE-workarounds, but it does exist in SLE-workarounds.
        if (self.sle_workarounds and not self.sle_workarounds_sourced and
                package in self.packages[self.sle_workarounds]):
            # Determine how recently the package was updated.
            root = ET.fromstringlist(
                get_commitlog(self.apiurl, self.sle_workarounds, package, None, format='xml'))
            updated_last = date_parse(root.find('logentry/date').text)
            age = datetime.now() - updated_last
            if age.total_seconds() < 3600 * 24:
                logger.debug('skip removal of {}/{} since updated within 24 hours'.format(
                    self.sle_workarounds, package))
                return

            requests = get_request_list(self.apiurl, self.sle_workarounds, package, req_type='submit')
            if len(requests):
                logger.debug('existing submit request involving {}/{}'.format(self.sle_workarounds, package))
                return

            self.delete_request(self.sle_workarounds, package,
                                'sourced from {}'.format(self.lookup.get(package)))

    def delete_request(self, project, package, message):
        requests = get_request_list(self.apiurl, project, package, req_type='delete')
        if len(requests):
            logger.debug('existing delete request for {}/{}'.format(project, package))
            return

        logger.info('creating delete request for {}/{}'.format(project, package))
        # No proper API function to perform the same operation.
        message = '"{}"'.format(message)
        print(subprocess.check_output(
            ' '.join(['osc', 'dr', '-m', message, project, package]), shell=True))

    def get_inconsistent(self):
        known = set(self.lookup.keys())
        stale = known - set(self.packages[self.config.from_prj])
        unknown = set(self.packages[self.config.from_prj]) - known
        if stale:
            logger.info("stale packages: %s", ', '.join(stale))
        if unknown:
            logger.info("unknown packages: %s", ', '.join(unknown))
        return (stale | unknown)

    def get_package_history(self, project, package, deleted=False):
        try:
            query = {}
            if deleted:
                query['deleted'] = 1
            return self.cached_GET(makeurl(self.apiurl,
                                           ['source', project, package, '_history'], query))
        except HTTPError as e:
            if e.code == 404:
                return None
            raise

    def _is_ignored(self, package):
        if package in self.config.ignored_packages:
            logger.debug("%s in ignore list", package)
            return True
        return False

    def _fill_package_list(self, project):
        if project not in self.packages:
            self.packages[project] = [p for p in self.get_source_packages(project) if not self._is_ignored(p)]

    def check_source_in_project(self, project, package, verifymd5, deleted=False):
        self._fill_package_list(project)

        if not deleted and package not in self.packages[project]:
            return None, None

        his = self.get_package_history(project, package, deleted)
        if his is None:
            return None, None

        his = ET.fromstring(his)
        historyrevs = dict()
        revs = list()
        for rev in his.findall('revision'):
            historyrevs[rev.find('srcmd5').text] = rev.get('rev')
            revs.append(rev.find('srcmd5').text)
        revs.reverse()
        for i in range(min(len(revs), 5)):  # check last commits
            srcmd5 = revs.pop(0)
            root = self.cached_GET(makeurl(self.apiurl,
                                           ['source', project, package], {'rev': srcmd5, 'view': 'info'}))
            root = ET.fromstring(root)
            if root.get('verifymd5') == verifymd5:
                return srcmd5, historyrevs[srcmd5]
        return None, None

    # check if we can find the srcmd5 in any of our underlay
    # projects
    def check_one_package(self, package):
        self.sle_workarounds_sourced = False
        lproject = self.lookup.get(package, None)
        if package not in self.packages[self.config.from_prj]:
            if not self._is_ignored(package):
                logger.info("{} vanished".format(package))
                if self.lookup.get(package):
                    del self.lookup[package]
                    self.lookup_changes += 1
            return

        root = ET.fromstring(self._get_source_package(self.config.from_prj, package, None))
        linked = root.find('linked')
        if linked is not None and linked.get('package') != package:
            lstring = 'subpackage of {}'.format(linked.get('package'))
            if lstring != lproject:
                logger.warning("{} links to {} (was {})".format(package, linked.get('package'), lproject))
                self.lookup[package] = lstring
                self.lookup_changes += 1
            else:
                logger.debug("{} correctly marked as subpackage of {}".format(package, linked.get('package')))
            return

        pm = self.package_metas[package]
        devel = pm.find('devel')
        if devel is not None or (lproject is not None and lproject.startswith('Devel;')):
            develprj = None
            develpkg = None
            if devel is None:
                (dummy, develprj, develpkg) = lproject.split(';')
                logger.warning('{} lacks devel project setting {}/{}'.format(package, develprj, develpkg))
            else:
                develprj = devel.get('project')
                develpkg = devel.get('package')
            srcmd5, rev = self.check_source_in_project(develprj, develpkg,
                                                       root.get('verifymd5'))
            if srcmd5:
                lstring = 'Devel;{};{}'.format(develprj, develpkg)
                if package not in self.lookup or lstring != self.lookup[package]:
                    logger.debug("{} from devel {}/{} (was {})".format(package, develprj, develpkg, lproject))
                    self.lookup[package] = lstring
                    self.lookup_changes += 1
                else:
                    logger.debug("{} lookup from {}/{} is correct".format(package, develprj, develpkg))
                return
        elif lproject and lproject != 'FORK' and not lproject.startswith('subpackage '):
            srcmd5, rev = self.check_source_in_project(lproject, package, root.get('verifymd5'))
            if srcmd5:
                logger.debug("{} lookup from {} is correct".format(package, lproject))
                # if it's from Factory we check if the package can be found elsewhere meanwhile
                if not self.force and lproject != self.config.factory:
                    return
            elif lproject == self.config.factory and package not in self.packages[lproject]:
                his = self.get_package_history(lproject, package, deleted=True)
                if his:
                    logger.debug("{} got dropped from {}".format(package, lproject))

        logger.debug("check where %s came from", package)
        foundit = False
        for project in self.config.project_preference_order:
            srcmd5, rev = self.check_source_in_project(project, package, root.get('verifymd5'))
            if srcmd5:
                if project != lproject:
                    if project.endswith(':SLE-workarounds'):
                        logger.info('{} is from {} but should come from {}'.format(package, project, lproject))
                        self.sle_workarounds_sourced = True
                    else:
                        logger.info('{} -> {} (was {})'.format(package, project, lproject))
                        self.lookup[package] = project
                        self.lookup_changes += 1
                else:
                    logger.debug('{} still coming from {}'.format(package, project))
                foundit = True
                break

        if not foundit:
            if lproject == 'FORK':
                logger.debug("{}: lookup is correctly marked as fork".format(package))
            elif lproject in self.config.drop_if_vanished_from:
                logger.info('{} dropped from {}'.format(package, lproject))
            else:
                logger.info('{} is a fork (was {})'.format(package, lproject))
                self.lookup[package] = 'FORK'
                self.lookup_changes += 1

    def get_link(self, project, package):
        try:
            link = self.cached_GET(makeurl(self.apiurl,
                                           ['source', project, package, '_link']))
        except HTTPError:
            return None
        return ET.fromstring(link)

    def fill_package_meta(self):
        self.package_metas = dict()
        url = makeurl(self.apiurl, ['search', 'package'],
                      "match=[@project='%s']" % self.config.from_prj)
        root = ET.fromstring(self.cached_GET(url))
        for p in root.findall('package'):
            name = p.attrib['name']
            self.package_metas[name] = p


def main(args):
    # Configure OSC
    osc.conf.get_config(override_apiurl=args.apiurl)
    osc.conf.config['debug'] = args.debug

    uc = Manager42(caching=args.cache_requests, configfh=args.config)

    given_packages = set(args.packages)
    if args.all:
        given_packages = set(uc.all_packages())
    elif not given_packages:
        given_packages = set(uc.latest_packages())

    if args.check_inconsistent:
        given_packages |= uc.get_inconsistent()

    if args.force:
        uc.force = True

    uc.crawl(given_packages)


if __name__ == '__main__':
    description = 'maintain 00Meta/lookup.yml'
    parser = argparse.ArgumentParser(description=description)
    parser.add_argument('-A', '--apiurl', metavar='URL', help='API URL')
    parser.add_argument('-d', '--debug', action='store_true',
                        help='print info useful for debugging')
    parser.add_argument('-a', '--all', action='store_true',
                        help='check all packages')
    parser.add_argument('-c', '--config', dest='config', metavar='FILE',
                        type=argparse.FileType('r'), required=True,
                        help='read config file FILE')
    parser.add_argument('-n', '--dry', action='store_true',
                        help='dry run, no POST, PUT, DELETE')
    parser.add_argument('--force', action='store_true',
                        help="don't take previous lookup information into consideration")
    parser.add_argument('--cache-requests', action='store_true', default=False,
                        help='cache GET requests. Not recommended for daily use.')
    parser.add_argument('--check-inconsistent', action='store_true', default=False,
                        help='also check inconsistent lookup entries')
    parser.add_argument("packages", nargs='*', help="packages to check")

    args = parser.parse_args()

    # Set logging configuration
    logging.basicConfig(level=logging.DEBUG if args.debug
                        else logging.INFO,
                        format='%(asctime)s - %(module)s:%(lineno)d - %(levelname)s - %(message)s')

    if args.dry:
        def dryrun(t, *args, **kwargs):
            return lambda *args, **kwargs: logger.debug("dryrun %s %s %s", t, args, str(kwargs)[:200])

        http_POST = dryrun('POST')
        http_PUT = dryrun('PUT')
        http_DELETE = dryrun('DELETE')

    sys.exit(main(args))


@@ -1,227 +0,0 @@
#!/usr/bin/python3

from copy import deepcopy
from lxml import etree as ET
from osc.core import copy_pac as copy_package
from osc.core import get_commitlog
from osc.core import http_GET
from osc.core import http_POST
from osc.core import http_PUT
from osc.core import makeurl
from osc.core import show_upstream_rev
from osclib.core import project_pseudometa_package
from urllib.error import HTTPError
import argparse
import osc.conf
import sys


def project_fence(project):
    if ((project.startswith('openSUSE:') and project_fence.project.startswith('openSUSE:')) and
            not project.startswith(project_fence.project)):
        # Exclude other openSUSE:* projects while cloning a specific one.
        return False
    if project.startswith('openSUSE:Factory:ARM'):
        # Troublesome.
        return False
    # Perhaps use devel project list as filter, but for now quick exclude.
    if project.startswith('SUSE:') or project.startswith('Ubuntu:'):
        return False

    return True


def entity_clone(apiurl_source, apiurl_target, path, sanitize=None, clone=None, after=None):
    if not hasattr(entity_clone, 'cloned'):
        entity_clone.cloned = []

    if path[0] == 'source' and not project_fence(path[1]):
        # Skip projects outside of fence by marking as cloned.
        if path not in entity_clone.cloned:
            entity_clone.cloned.append(path)

    if path in entity_clone.cloned:
        print('skip {}'.format('/'.join(path)))
        return

    print('clone {}'.format('/'.join(path)))
    entity_clone.cloned.append(path)

    url = makeurl(apiurl_source, path)
    entity = ET.parse(http_GET(url)).getroot()

    if sanitize:
        sanitize(entity)
    if clone:
        clone(apiurl_source, apiurl_target, entity)

    url = makeurl(apiurl_target, path)
    http_PUT(url, data=ET.tostring(entity))

    if after:
        after(apiurl_source, apiurl_target, entity)


def users_clone(apiurl_source, apiurl_target, entity):
    for person in entity.findall('person'):
        path = ['person', person.get('userid')]
        entity_clone(apiurl_source, apiurl_target, path, person_sanitize, after=person_clone_after)

    for group in entity.findall('group'):
        path = ['group', group.get('groupid')]
        entity_clone(apiurl_source, apiurl_target, path, clone=group_clone)


def project_references_remove(project):
    # Remove links that reference other projects.
    for link in project.xpath('link[@project]'):
        link.getparent().remove(link)

    # Remove repositories that reference other projects.
    for repository in project.xpath('repository[releasetarget or path]'):
        repository.getparent().remove(repository)


# clone(Factory)
# - stripped
# - after
#   - clone(Factory:ToTest)
#     - stripped
#     - after
#       - clone(Factory)...skip
#     - write real
# - write real
def project_clone(apiurl_source, apiurl_target, project):
    users_clone(apiurl_source, apiurl_target, project)
    project_workaround(project)

    # Write stripped version that does not include repos with path references.
    url = makeurl(apiurl_target, ['source', project.get('name'), '_meta'])
    stripped = deepcopy(project)
    project_references_remove(stripped)
    http_PUT(url, data=ET.tostring(stripped))

    for link in project.xpath('link[@project]'):
        if not project_fence(link.get('project')):
            project.remove(link)
            break

        # Valid reference to project and thus should be cloned.
        path = ['source', link.get('project'), '_meta']
        entity_clone(apiurl_source, apiurl_target, path, clone=project_clone)

    # Clone projects referenced in repository paths.
    for repository in project.findall('repository'):
        for target in repository.xpath('./path') + repository.xpath('./releasetarget'):
            if not project_fence(target.get('project')):
                project.remove(repository)
                break

            # Valid reference to project and thus should be cloned.
            path = ['source', target.get('project'), '_meta']
            entity_clone(apiurl_source, apiurl_target, path, clone=project_clone)


def project_workaround(project):
    if project.get('name') == 'openSUSE:Factory':
        # See #1335 for details about temporary workaround in revision 429, but
        # suffice it to say that over-complicates clone with multiple loops and
        # may be introduced from time to time when Factory repo is hosed.
        scariness = project.xpath('repository[@name="standard"]/path[contains(@project, ":0-Bootstrap")]')
        if len(scariness):
            scariness[0].getparent().remove(scariness[0])


def package_clone(apiurl_source, apiurl_target, package):
    # Clone project that contains the package.
    path = ['source', package.get('project'), '_meta']
    entity_clone(apiurl_source, apiurl_target, path, clone=project_clone)

    # Clone the dependencies of package.
    users_clone(apiurl_source, apiurl_target, package)

    # Clone devel project referenced by package.
    devel = package.find('devel')
    if devel is not None:
        path = ['source', devel.get('project'), devel.get('package'), '_meta']
        entity_clone(apiurl_source, apiurl_target, path, clone=package_clone, after=package_clone_after)


def package_clone_after(apiurl_source, apiurl_target, package):
    copy_package(apiurl_source, package.get('project'), package.get('name'),
                 apiurl_target, package.get('project'), package.get('name'),
                 # TODO Would be preferable to preserve links, but need to
                 # recreate them since they do not match with copied package.
                 expand=True,
                 # TODO Can copy server-side if inner-connect is setup, but not
                 # clear how to trigger the equivalent of save in admin UI.
                 client_side_copy=True)


def person_sanitize(person):
    person.find('email').text = person.find('email').text.split('@')[0] + '@example.com'
    watchlist = person.find('watchlist')
    if watchlist is not None:
        person.remove(watchlist)


def person_clone_after(apiurl_source, apiurl_target, person):
    url = makeurl(apiurl_target, ['person', person.find('login').text], {'cmd': 'change_password'})
    http_POST(url, data='opensuse')


def group_clone(apiurl_source, apiurl_target, group):
    for person in group.findall('maintainer') + group.findall('person/person'):
        path = ['person', person.get('userid')]
        entity_clone(apiurl_source, apiurl_target, path, person_sanitize, after=person_clone_after)


def clone_do(apiurl_source, apiurl_target, project):
    print('clone {} from {} to {}'.format(project, apiurl_source, apiurl_target))

    try:
        # TODO Decide how to choose what to clone via args.

        # Rather than handle the self-referencing craziness with a proper solver
        # the leaf can simply be used to start the chain and works as desired.
        # Disable this when running clone repeatedly during developing as the
        # projects cannot be cleanly re-created without more work.
        entity_clone(apiurl_source, apiurl_target, ['source', project + ':Rings:1-MinimalX', '_meta'],
                     clone=project_clone)

        pseudometa_project, pseudometa_package = project_pseudometa_package(apiurl_source, project)
        entity_clone(apiurl_source, apiurl_target, ['source', pseudometa_project, pseudometa_package, '_meta'],
                     clone=package_clone, after=package_clone_after)

        entity_clone(apiurl_source, apiurl_target, ['source', project, 'drush', '_meta'],
                     clone=package_clone, after=package_clone_after)

        entity_clone(apiurl_source, apiurl_target, ['group', 'opensuse-review-team'],
                     clone=group_clone)
    except HTTPError as e:
        # Print full output for any errors since message can be cryptic.
        print(e.read())
        return 1


if __name__ == '__main__':
    parser = argparse.ArgumentParser(description='Clone projects and dependencies between OBS instances.')
    parser.set_defaults(func=clone_do)
    parser.add_argument('-S', '--apiurl-source', metavar='URL', help='source API URL')
    parser.add_argument('-T', '--apiurl-target', metavar='URL', help='target API URL')
    parser.add_argument('-c', '--cache', action='store_true', help='cache source queries for 24 hours')
    parser.add_argument('-d', '--debug', action='store_true', help='print info useful for debugging')
    parser.add_argument('-p', '--project', default='openSUSE:Factory', help='project from which to clone')
    args = parser.parse_args()

    osc.conf.get_config(override_apiurl=args.apiurl_target)
    apiurl_target = osc.conf.config['apiurl']
    osc.conf.get_config(override_apiurl=args.apiurl_source)
    apiurl_source = osc.conf.config['apiurl']

    if apiurl_target == apiurl_source:
        print('target APIURL must not be the same as source APIURL')
        sys.exit(1)

    if args.cache:
        from osclib.cache import Cache
        Cache.PATTERNS = {}
        # Prevent caching source information from local clone.
        Cache.PATTERNS['/source/[^/]+/[^/]+/[^/]+?rev'] = 0
        Cache.PATTERNS['.*'] = Cache.TTL_LONG * 2
        Cache.init('clone')

    osc.conf.config['debug'] = args.debug
    project_fence.project = args.project
    sys.exit(args.func(apiurl_source, apiurl_target, args.project))


@@ -77,7 +77,6 @@ DEFAULT = {
        'openqa': 'https://openqa.opensuse.org',
        'lock': 'openSUSE:%(project)s:Staging',
        'lock-ns': 'openSUSE',
-        'leaper-override-group': 'leap-reviewers',
        'main-repo': 'standard',
        'pseudometa_package': 'openSUSE:%(project)s:Staging/dashboard',
        'download-baseurl': 'http://download.opensuse.org/distribution/leap/%(version)s/',
@@ -118,7 +117,6 @@ DEFAULT = {
    },
    r'openSUSE:(?P<project>Leap:(?P<version>[\d.]+)?:Update)$': {
        'main-repo': 'standard',
-        'leaper-override-group': 'leap-reviewers',
        'repo_checker-arch-whitelist': 'x86_64',
        'repo_checker-no-filter': 'True',
        'repo_checker-package-comment-devel': 'True',
@@ -132,7 +130,6 @@ DEFAULT = {
        'lock': 'openSUSE:%(project)s:Staging',
        'lock-ns': 'openSUSE',
        'onlyadi': 'True',
-        'leaper-override-group': 'backports-reviewers',
        'review-team': 'opensuse-review-team',
        'legal-review-group': 'legal-auto',
        # review-team optionally added by leaper.py.


@@ -1,138 +0,0 @@
#!/usr/bin/python3

import argparse
import logging
import os
import re
import sys
from xml.etree import cElementTree as ET
from urllib.error import HTTPError

import osc.conf
import osc.core
from osclib.conf import Config

OPENSUSE = 'openSUSE:Factory'
PACKAGEFILE = 'packagelist_without_32bitRPMs_imported'

makeurl = osc.core.makeurl
http_GET = osc.core.http_GET
http_POST = osc.core.http_POST


class ScanBaselibs(object):
    def __init__(self, project, repository, verbose, wipebinaries):
        self.project = project
        self.verbose = verbose
        self.repo = repository
        self.wipebinaries = wipebinaries
        self.apiurl = osc.conf.config['apiurl']
        self.debug = osc.conf.config['debug']

        Config(self.apiurl, OPENSUSE)
        self.config = osc.conf.config[OPENSUSE]
        # TODO: would be better to parse baselibs.conf to know which arch was blocked
        self.package_whitelist = list(set(self.config.get('allowed-missing-32bit-binaries-importing', '').split(' ')))

    def get_packages(self, project):
        """Return the list of packages in a project."""
        query = {'expand': 1}
        root = ET.parse(http_GET(makeurl(self.apiurl, ['source', project], query=query))).getroot()
        packages = [i.get('name') for i in root.findall('entry')]
        return packages

    def package_has_baselibs(self, project, package):
        query = {'expand': 1}
        root = ET.parse(http_GET(makeurl(self.apiurl, ['source', project, package], query=query))).getroot()
        files = [i.get('name') for i in root.findall('entry') if i.get('name') == 'baselibs.conf']
        if len(files):
            return True
        return False

    def package_has_32bit_binaries(self, project, repo, package):
        query = {'package': package,
                 'repository': repo,
                 'arch': 'x86_64',
                 'multibuild': 1,
                 'view': 'binarylist'}
        root = ET.parse(http_GET(makeurl(self.apiurl, ['build', project, '_result'], query=query))).getroot()
        # assume 32bit importing RPMs can appear in a multibuild-ed package
        for binarylist in root.findall('./result/binarylist'):
            binaries = [i.get('filename') for i in binarylist.findall('binary') if i.get('filename').startswith('::import::i586::')]
            if len(binaries):
                return True
        return False

    def check_package_baselibs(self, project, repo, wipebinaries):
        """Main method"""
        # get source packages from target
        if self.verbose:
            print('Gathering the package list from %s' % project)

        packages = self.get_packages(project)
        with open(os.getcwd() + '/' + PACKAGEFILE, "a") as f:
            for pkg in packages:
                if self.package_has_baselibs(project, pkg) and pkg not in self.package_whitelist:
                    if not self.package_has_32bit_binaries(project, repo, pkg):
                        f.write("%s\n" % pkg)
                        if self.verbose:
                            print("%s has baselibs.conf but 32bit RPMs do not exist in the 64bit build result." % pkg)
                        if wipebinaries:
                            http_POST(makeurl(self.apiurl, ['build', project], {
                                'cmd': 'wipe',
                                'repository': repo,
                                'package': pkg,
                                'arch': 'i586'}))

    def scan(self):
        """Main method"""
        try:
            osc.core.show_project_meta(self.apiurl, self.project)
        except HTTPError as e:
            if e.code == 404:
                print("Project %s does not exist!" % self.project)
                return

        print('Scanning...')
        if os.path.isfile(os.getcwd() + '/' + PACKAGEFILE):
            os.remove(os.getcwd() + '/' + PACKAGEFILE)
        self.check_package_baselibs(self.project, self.repo, self.wipebinaries)
        print('Done')


def main(args):
    # Configure OSC
    osc.conf.get_config(override_apiurl=args.apiurl)
    osc.conf.config['debug'] = args.debug

    uc = ScanBaselibs(args.project, args.repository, args.verbose, args.wipebinaries)
    uc.scan()


if __name__ == '__main__':
    description = 'Verify that 32bit binaries have been imported properly into a project; ' \
                  'if the 32bit binaries do not exist, wipe the 32bit build result. ' \
                  'This script currently only works on x86_64/i586.'
    parser = argparse.ArgumentParser(description=description)
    parser.add_argument('-A', '--apiurl', metavar='URL', help='API URL')
    parser.add_argument('-d', '--debug', action='store_true',
                        help='print the information useful for debugging')
    parser.add_argument('-p', '--project', dest='project', metavar='PROJECT',
                        help='the project to check (default: %s)' % OPENSUSE,
                        default=OPENSUSE)
    parser.add_argument('-r', '--repository', dest='repository', metavar='REPOSITORY',
                        help='the repository of binaries (default: %s)' % 'standard',
                        default='standard')
    parser.add_argument('-v', '--verbose', action='store_true',
                        help='show the verbose information')
    parser.add_argument('-w', '--wipebinaries', action='store_true', default=False,
                        help='wipe binaries found without imported 32bit RPMs')

    args = parser.parse_args()

    # Set logging configuration
    logging.basicConfig(level=logging.DEBUG if args.debug
                        else logging.INFO)

    sys.exit(main(args))


@@ -1,14 +0,0 @@
#!/bin/bash
TO=openSUSE:Leap:15.0
osrt-update_crawler --to $TO \
--from SUSE:SLE-15:GA \
--only-from SUSE:SLE-15:GA \
--only-from SUSE:SLE-15:Update \
"$@"
osrt-update_crawler --to $TO \
--from openSUSE:Factory \
--only-from openSUSE:Factory \
"$@"


@@ -1,40 +0,0 @@
#!/bin/bash
TO=openSUSE:Leap:42.3
osrt-update_crawler --to $TO \
--from SUSE:SLE-12-SP3:GA \
--only-from SUSE:SLE-12:GA \
--only-from SUSE:SLE-12:Update \
--only-from SUSE:SLE-12-SP1:GA \
--only-from SUSE:SLE-12-SP1:Update \
--only-from SUSE:SLE-12-SP2:GA \
--only-from SUSE:SLE-12-SP2:Update \
--only-from SUSE:SLE-12-SP3:GA \
--only-from SUSE:SLE-12-SP3:Update \
"$@"
osrt-update_crawler --to $TO \
--from openSUSE:Leap:42.3:Update \
--only-from openSUSE:Leap:42.1 \
--only-from openSUSE:Leap:42.1:Update \
--only-from openSUSE:Leap:42.2 \
--only-from openSUSE:Leap:42.2:Update \
--only-from openSUSE:Leap:42.3 \
--only-from openSUSE:Leap:42.3:Update \
"$@"
osrt-update_crawler --to $TO \
--from openSUSE:Leap:42.3:NonFree:Update \
--only-from openSUSE:Leap:42.1:NonFree \
--only-from openSUSE:Leap:42.1:NonFree:Update \
--only-from openSUSE:Leap:42.2:NonFree \
--only-from openSUSE:Leap:42.2:NonFree:Update \
--only-from openSUSE:Leap:42.3:NonFree \
--only-from openSUSE:Leap:42.3:NonFree:Update \
"$@"
osrt-update_crawler --to $TO \
--from openSUSE:Factory \
--only-from openSUSE:Factory \
"$@"

status.py

@@ -1,138 +0,0 @@
#!/usr/bin/python3

import argparse
from datetime import datetime
from osc import conf
from osc.core import ET
from osc.core import search
from osc.core import xpath_join
from osclib.comments import CommentAPI
from osclib.core import request_age
from osclib.memoize import memoize
import sys


def print_debug(message):
    if conf.config['debug']:
        print(message)


def request_debug(request, age, threshold):
    print_debug('{}: {} {} [{}]'.format(request.get('id'), age, threshold, age <= threshold))


@memoize(session=True)
def check_comment(apiurl, bot, **kwargs):
    if not len(kwargs):
        return False

    api = CommentAPI(apiurl)
    comments = api.get_comments(**kwargs)
    comment = api.comment_find(comments, bot)[0]
    if comment:
        return (datetime.utcnow() - comment['when']).total_seconds()

    return False


def check(apiurl, entity, entity_type='group', comment=False, bot=None,
          threshold=2 * 3600, threshold_require=True):
    queries = {'request': {'limit': 1000, 'withfullhistory': 1}}
    xpath = 'state[@name="new"] or state[@name="review"]'

    if entity == 'staging-bot':
        xpath = xpath_join(
            xpath, 'review[starts-with(@by_project, "openSUSE:") and @state="new"]', op='and')
        xpath = xpath_join(
            xpath, 'history/@who="{}"'.format(entity), op='and')

        requests = search(apiurl, queries, request=xpath)['request']
        for request in requests:
            age = request_age(request).total_seconds()
            request_debug(request, age, threshold)

            if age <= threshold:
                return True

        return False

    xpath = xpath_join(
        xpath, 'review[@by_{}="{}" and @state="new"]'.format(entity_type, entity), op='and')
    requests = search(apiurl, queries, request=xpath)['request']
    print_debug('{:,} requests'.format(len(requests)))

    if not len(requests):
        # Could check to see that a review has been performed in the last week.
        return True

    all_comment = True
    for request in requests:
        kwargs = {}
        if comment == 'project':
            # Would be a lot easier with lxml, but short of reparsing or monkey.
            for review in request.findall('review[@by_project]'):
                if review.get('by_project').startswith('openSUSE:'):
                    kwargs['project_name'] = review.get('by_project')
            # TODO repo-checker will miss stagings where delete only problem so
            # comment on request, but should be fixed by #1084.
        elif comment:
            kwargs['request_id'] = request.get('id')

        age = request_age(request).total_seconds()
        request_debug(request, age, threshold)

        comment_age = check_comment(apiurl, bot, **kwargs)
        if comment_age:
            if comment_age <= threshold:
                print_debug('comment found below threshold')
                return True
        elif age > threshold:
            print_debug('no comment found and above threshold')
            all_comment = False
            if threshold_require:
                return False
            else:
                continue
        else:
            print_debug('no comment found, but below threshold')

    print_debug('all comments: {}'.format(all_comment))
    return all_comment


def status(apiurl):
    # TODO If request ordering via api (openSUSE/open-build-service#4108) is
    # provided this can be implemented much more cleanly by looking for positive
    # activity (review changes) in threshold. Without sorting, some sampling of
    # all requests accepted are returned which is not useful.

    # TODO legal-auto, does not make comments so pending the above.
    bots = [
        # No open requests older than 2 hours.
        ['factory-auto'],
        # No open requests older than 2 hours or all old requests have comment.
        ['leaper', 'user', True, 'Leaper'],
        # As long as some comment made in last 6 hours.
        ['repo-checker', 'user', 'project', 'RepoChecker', 6 * 3600, False],
        # Different algorithm, any staging in last 24 hours.
        ['staging-bot', 'user', False, None, 24 * 3600],
    ]

    all_alive = True
    for bot in bots:
        result = check(apiurl, *bot)
        if not result:
            all_alive = False
        print('{} = {}'.format(bot[0], result))

    return all_alive


def main(args):
    conf.get_config(override_apiurl=args.apiurl)
    conf.config['debug'] = args.debug
    apiurl = conf.config['apiurl']
    return not status(apiurl)


if __name__ == '__main__':
    description = 'Check the status of the staging workflow bots.'
    parser = argparse.ArgumentParser(description=description)
    parser.add_argument('-A', '--apiurl', help='OBS instance API URL')
    parser.add_argument('-d', '--debug', action='store_true', help='print useful debugging info')
    parser.add_argument('-p', '--project', default='openSUSE:Factory', help='OBS project')
    args = parser.parse_args()

    sys.exit(main(args))


@@ -1,103 +0,0 @@
#!/usr/bin/python3

import sys
import os
import osc
import osc.core
import osc.conf
import xml.etree.ElementTree as ET
import re

results = []
repo = ""
architectures = ["x86_64", "i586"]
pkg = ""
projects = ['openSUSE:Factory', 'openSUSE:Factory:Rebuild']

# initialize osc config
osc.conf.get_config()


def get_prj_results(prj, arch):
    url = osc.core.makeurl(osc.conf.config['apiurl'],
                           ['build', prj, 'standard', arch, "_jobhistory?code=lastfailures"])
    f = osc.core.http_GET(url)
    xml = f.read()
    results = []

    root = ET.fromstring(xml)
    xmllines = root.findall("./jobhist")

    for pkg in xmllines:
        if pkg.attrib['code'] == 'failed':
            results.append(pkg.attrib['package'])

    return results


def compare_results(factory, rebuild, testmode):
    com_res = set(rebuild).symmetric_difference(set(factory))
    if testmode != False:
        print(com_res)
    return com_res


def check_pkgs(rebuild_list):
    url = osc.core.makeurl(osc.conf.config['apiurl'], ['source', 'openSUSE:Factory'])
    f = osc.core.http_GET(url)
    xml = f.read()
    pkglist = []

    root = ET.fromstring(xml)
    xmllines = root.findall("./entry")

    for pkg in xmllines:
        if pkg.attrib['name'] in rebuild_list:
            pkglist.append(pkg.attrib['name'])

    return pkglist


def rebuild_pkg_in_factory(package, prj, arch, testmode, code=None):
    query = {'cmd': 'rebuild', 'arch': arch}
    if package:
        query['package'] = package
        pkg = query['package']
    u = osc.core.makeurl(osc.conf.config['apiurl'], ['build', prj], query=query)
    if testmode != False:
        print("Trigger rebuild for this package: " + u)
    else:
        try:
            print('tried to trigger rebuild for project \'%s\' package \'%s\'' % (prj, pkg))
            f = osc.core.http_POST(u)
        except:
            print('could not trigger rebuild for project \'%s\' package \'%s\'' % (prj, pkg))


testmode = False

try:
    if sys.argv[1] != None:
        if sys.argv[1] == '-test':
            testmode = True
            print("testmode: " + str(testmode))
        else:
            testmode = False
except:
    pass

for arch in architectures:
    fact_result = get_prj_results('openSUSE:Factory', arch)
    rebuild_result = get_prj_results('openSUSE:Factory:Rebuild', arch)
    rebuild_result = check_pkgs(rebuild_result)
    fact_result = check_pkgs(fact_result)
    result = compare_results(fact_result, rebuild_result, testmode)
    print(sorted(result))
    for package in result:
        rebuild_pkg_in_factory(package, 'openSUSE:Factory', arch, testmode, None)
        rebuild_pkg_in_factory(package, 'openSUSE:Factory:Rebuild', arch, testmode, None)


@@ -1,81 +0,0 @@
#!/usr/bin/python3

import argparse
from lxml import etree as ET
from osc import conf
from osc.core import meta_get_filelist
from osclib.core import package_binary_list
from osclib.core import source_file_load
import sys
import yaml


def kiwi_binaries(apiurl, project):
    binaries = set()
    for filename in meta_get_filelist(apiurl, project, '000product'):
        if not filename.endswith('.kiwi'):
            continue

        kiwi = ET.fromstring(source_file_load(
            apiurl, project, '000product', filename))
        binaries.update(kiwi.xpath('//instsource/repopackages/repopackage/@name'))

    return binaries


def unmaintained(apiurl, project_target):
    lookup = yaml.safe_load(source_file_load(
        apiurl, project_target, '00Meta', 'lookup.yml'))
    lookup_total = len(lookup)
    # dict.iteritems() no longer exists on Python 3; use items() instead.
    lookup = {k: v for k, v in lookup.items() if v.startswith('SUSE:SLE')}

    package_binaries, _ = package_binary_list(
        apiurl, project_target, 'standard', 'x86_64', exclude_src_debug=True)
    package_binaries_total = len(package_binaries)
    package_binaries = [pb for pb in package_binaries if pb.package in lookup]

    # Determine max length possible for each column.
    maxes = [
        len(max([b.name for b in package_binaries], key=len)),
        len(max(lookup.keys(), key=len)),
        len(max(lookup.values(), key=len)),
    ]
    line_format = ' '.join(['{:<' + str(m) + '}' for m in maxes])
    print(line_format.format('binary', 'package', 'source project'))

    project_sources = {}
    binaries_unmaintained = 0
    packages_unmaintained = set()
    for package_binary in sorted(package_binaries, key=lambda pb: pb.name):
        project_source = lookup[package_binary.package]
        if project_source not in project_sources:
            # Load binaries referenced in kiwi the first time source encountered.
            project_sources[project_source] = kiwi_binaries(apiurl, project_source)

        if package_binary.name not in project_sources[project_source]:
            print(line_format.format(
                package_binary.name, package_binary.package, project_source))
            binaries_unmaintained += 1
            packages_unmaintained.add(package_binary.package)

    print('{:,} of {:,} binaries ({:,} packages) unmaintained from SLE of {:,} total binaries ({:,} packages) in project'.format(
        binaries_unmaintained, len(package_binaries), len(packages_unmaintained), package_binaries_total, lookup_total))


def main(args):
    conf.get_config(override_apiurl=args.apiurl)
    conf.config['debug'] = args.debug
    apiurl = conf.config['apiurl']
    return not unmaintained(apiurl, args.project_target)


if __name__ == '__main__':
    description = 'Review each binary in target project sourced from SLE to see if utilized in kiwi files.'
    parser = argparse.ArgumentParser(description=description)
    parser.add_argument('-A', '--apiurl', help='OBS instance API URL')
    parser.add_argument('-d', '--debug', action='store_true', help='print useful debugging info')
    parser.add_argument('project_target', help='target project to search')
    args = parser.parse_args()

    sys.exit(main(args))


@@ -1,372 +0,0 @@
#!/usr/bin/python3

import argparse
import itertools
import logging
import sys
import time
from urllib.error import HTTPError
from xml.etree import cElementTree as ET

import osc.conf
import osc.core
import rpm
import yaml
import re
# On Python 3, quote_plus lives in urllib.parse rather than urllib.
from urllib.parse import quote_plus

from osclib.memoize import memoize
from osclib.conf import Config
from osclib.core import devel_project_get
from osclib.stagingapi import StagingAPI

OPENSUSE = 'openSUSE:Leap:42.3'
FACTORY = 'openSUSE:Factory'
SLE = 'SUSE:SLE-12-SP2:Update'

makeurl = osc.core.makeurl
http_GET = osc.core.http_GET


# http://stackoverflow.com/questions/312443/how-do-you-split-a-list-into-evenly-sized-chunks-in-python
def chunks(l, n):
    """Yield successive n-sized chunks from l."""
    for i in range(0, len(l), n):
        yield l[i:i + n]


class UpdateCrawler(object):
    def __init__(self, from_prj, to_prj):
        self.from_prj = from_prj
        self.to_prj = to_prj
        self.apiurl = osc.conf.config['apiurl']
        self.debug = osc.conf.config['debug']
        self.filter_lookup = set()
        self.caching = False
        self.dryrun = False
        self.skipped = {}
        self.submit_new = {}
        self.api = StagingAPI(
            osc.conf.config['apiurl'], project=to_prj)

        self.parse_lookup()

    # FIXME: duplicated from manager_42
    def latest_packages(self):
        apiurl = self.apiurl
        prj = self.from_prj
        if prj.startswith('openSUSE.org:'):
            apiurl = 'https://api.opensuse.org'
            prj = prj[len('openSUSE.org:'):]
        data = self.cached_GET(makeurl(apiurl,
                                       ['project', 'latest_commits', prj]))
        lc = ET.fromstring(data)
        packages = set()
        for entry in lc.findall('{http://www.w3.org/2005/Atom}entry'):
            title = entry.find('{http://www.w3.org/2005/Atom}title').text
            if title.startswith('In '):
                packages.add(title[3:].split(' ')[0])
        return sorted(packages)

    @memoize()
    def _cached_GET(self, url):
        return self.retried_GET(url).read()

    def cached_GET(self, url):
        if self.caching:
            return self._cached_GET(url)
        return self.retried_GET(url).read()

    def retried_GET(self, url):
        try:
            return http_GET(url)
        except HTTPError as e:
            if 500 <= e.code <= 599:
                print('Retrying {}'.format(url))
                time.sleep(1)
                return self.retried_GET(url)
            raise e

    def get_project_meta(self, prj):
        url = makeurl(self.apiurl, ['source', prj, '_meta'])
        return self.cached_GET(url)

    def is_maintenance_project(self, prj):
        root = ET.fromstring(self.get_project_meta(prj))
        return root.get('kind', None) == 'maintenance_release'

    def _meta_get_packagelist(self, prj, deleted=None, expand=False):
        query = {}
        if deleted:
            query['deleted'] = 1
        if expand:
            query['expand'] = 1

        u = osc.core.makeurl(self.apiurl, ['source', prj], query)
        return self.cached_GET(u)

    def meta_get_packagelist(self, prj, deleted=None, expand=False):
        root = ET.fromstring(self._meta_get_packagelist(prj, deleted, expand))
        return [node.get('name') for node in root.findall('entry')
                if not node.get('name') == '000product'
                and not node.get('name').startswith('patchinfo.')]

    def _get_source_infos(self, project, packages):
        query = ['view=info']
        if packages:
            query += ['package=%s' % quote_plus(p) for p in packages]
        return self.cached_GET(makeurl(self.apiurl,
                                       ['source', project],
                                       query))

    def get_source_infos(self, project, packages):
        ret = dict()
        for pkg_chunks in chunks(sorted(packages), 50):
            root = ET.fromstring(self._get_source_infos(project, pkg_chunks))
            for package in root.findall('sourceinfo'):
                if package.findall('error'):
                    continue
                ret[package.get('package')] = package
        return ret

    def _get_source_package(self, project, package, revision):
        opts = {'view': 'info'}
        if revision:
            opts['rev'] = revision
        return self.cached_GET(makeurl(self.apiurl,
                                       ['source', project, package], opts))

    def _find_existing_request(self, src_project, src_package, rev, dst_project,
                               dst_package):
        """Check whether a submit request for this revision already exists."""
        states = ['new', 'review', 'declined', 'revoked', 'superseded']
        reqs = osc.core.get_exact_request_list(self.apiurl,
                                               src_project,
                                               dst_project,
                                               src_package,
                                               dst_package,
                                               req_type='submit',
                                               req_state=states)
        foundrev = False
        for r in reqs:
            for a in r.actions:
                srcrev = a.src_rev
                # sometimes requests only contain the decimal revision
                if re.match(r'^\d+$', srcrev) is not None:
                    xml = ET.fromstring(self._get_source_package(src_project, src_package, srcrev))
                    srcrev = xml.get('verifymd5')
                logging.debug('rev {}'.format(srcrev))
                if srcrev == rev:
                    logging.debug('{}: found existing request {} {}/{}'.format(
                        dst_package, r.reqid, a.src_project, src_project))
                    foundrev = True
        return foundrev

    def _submitrequest(self, src_project, src_package, rev, dst_project,
                       dst_package, msg):
        res = 0
        print("creating submit request", src_project, src_package, rev, dst_project, dst_package)
        if not self.dryrun:
            res = osc.core.create_submit_request(self.apiurl,
                                                 src_project,
                                                 src_package,
                                                 dst_project,
                                                 dst_package,
                                                 orev=rev,
                                                 message=msg)
        return res

    def submitrequest(self, src_project, src_package, rev, dst_package, origin):
        """Create a submit request using the osc.commandline.Osc class."""
        dst_project = self.to_prj
        msg = 'Automatic request from %s by UpdateCrawler' % src_project
        if not self._find_existing_request(src_project, src_package, rev, dst_project, dst_package):
            return self._submitrequest(src_project, src_package, rev, dst_project,
                                       dst_package, msg)
        return 0

    def is_source_innerlink(self, project, package):
        try:
            root = ET.fromstring(
                self.cached_GET(makeurl(self.apiurl,
                                        ['source', project, package, '_link'])))
            if root.get('project') is None and root.get('cicount'):
                return True
        except HTTPError as err:
            # if there is no link, it can't be a link
            if err.code == 404:
                return False
            raise

    def parse_lookup(self):
        self.lookup = yaml.safe_load(self._load_lookup_file())

    def _load_lookup_file(self):
        prj = self.to_prj
        return self.cached_GET(makeurl(self.apiurl,
                                       ['source', prj, '00Meta', 'lookup.yml']))

    def follow_link(self, project, package, rev, verifymd5):
        # print "follow", project, package, rev
        # verify it's still the same package
        xml = ET.fromstring(self._get_source_package(project, package, rev))
        if xml.get('verifymd5') != verifymd5:
            return None
        xml = ET.fromstring(self.cached_GET(makeurl(self.apiurl,
                                                    ['source', project, package],
                                                    {'rev': rev})))
        linkinfo = xml.find('linkinfo')
        if linkinfo is not None:
            ret = self.follow_link(linkinfo.get('project'), linkinfo.get('package'),
                                   linkinfo.get('srcmd5'), verifymd5)
            if ret:
                project, package, rev = ret
        return (project, package, rev)

    def update_targets(self, targets, sources):
        # special case maintenance project. Only consider main
        # package names. The code later follows the link in the
        # source project then.
        if self.is_maintenance_project(self.from_prj):
            mainpacks = set()
            for package, sourceinfo in sources.items():
                if package.startswith('patchinfo.'):
                    continue
                files = set([node.text for node in sourceinfo.findall('filename')])
                if '{}.spec'.format(package) in files:
                    mainpacks.add(package)

            # dict.iteritems() no longer exists on Python 3; use items() instead.
            sources = {package: sourceinfo for package, sourceinfo in sources.items()
                       if package in mainpacks}

        for package, sourceinfo in sources.items():
            origin = self.lookup.get(package, '')
            if origin.startswith('Devel;'):
                (dummy, origin, dummy) = origin.split(';')

            if self.filter_lookup and origin not in self.filter_lookup:
                if not origin.startswith('subpackage of'):
                    self.skipped.setdefault(origin, set()).add(package)
                continue

            if package not in targets:
                if not self.submit_new:
                    logging.info('Package %s not found in targets' % (package))
                    continue

                if self.is_source_innerlink(self.from_prj, package):
                    logging.debug('Package %s is sub package' % (package))
                    continue
            else:
                targetinfo = targets[package]

                # XXX: make more generic :-)
                devel_prj = devel_project_get(self.apiurl, FACTORY, package)
                if devel_prj == 'devel:languages:haskell':
                    logging.info('skipping haskell package %s' % package)
                    continue

                # Compare verifymd5
                md5_from = sourceinfo.get('verifymd5')
                md5_to = targetinfo.get('verifymd5')
                if md5_from == md5_to:
                    # logging.info('Package %s not marked for update' % package)
                    continue

                if self.is_source_innerlink(self.to_prj, package):
                    logging.debug('Package %s is sub package' % (package))
                    continue

            # this makes only sense if we look at the expanded view
            # and want to submit from proper project
            # originproject = default_origin
            # if not sourceinfo.find('originproject') is None:
            #     originproject = sourceinfo.find('originproject').text
            #     logging.warn('changed originproject for {} to {}'.format(package, originproject))

            src_project, src_package, src_rev = self.follow_link(self.from_prj, package,
                                                                 sourceinfo.get('srcmd5'),
                                                                 sourceinfo.get('verifymd5'))

            res = self.submitrequest(src_project, src_package, src_rev, package, origin)
            if res:
                logging.info('Created request %s for %s' % (res, package))
            elif res != 0:
                logging.error('Error creating the request for %s' % package)

    def crawl(self, packages):
        """Main method of the class that runs the crawler."""
        targets = self.get_source_infos(self.to_prj, packages)
        sources = self.get_source_infos(self.from_prj, packages)
        self.update_targets(targets, sources)


def main(args):
    # Configure OSC
    osc.conf.get_config(override_apiurl=args.apiurl)
    osc.conf.config['debug'] = args.osc_debug

    # initialize stagingapi config
    Config(osc.conf.config['apiurl'], args.to_prj)

    uc = UpdateCrawler(args.from_prj, args.to_prj)
    uc.caching = args.cache_requests
    uc.dryrun = args.dry
    uc.submit_new = args.new
    if args.only_from:
        for prj in args.only_from:
            uc.filter_lookup.add(prj)

    given_packages = args.packages
    if not given_packages:
        if args.all:
            given_packages = uc.meta_get_packagelist(args.from_prj)
        else:
            given_packages = uc.latest_packages()
    uc.crawl(given_packages)

    if uc.skipped:
        from pprint import pformat
        logging.debug("skipped packages: %s", pformat(uc.skipped))


if __name__ == '__main__':
    description = 'Create update SRs for Leap.'
    parser = argparse.ArgumentParser(description=description)
    parser.add_argument('-A', '--apiurl', metavar='URL', help='API URL')
    parser.add_argument('-d', '--debug', action='store_true',
                        help='print info useful for debugging')
    parser.add_argument('-a', '--all', action='store_true',
                        help='check all packages')
    parser.add_argument('-n', '--dry', action='store_true',
                        help='dry run, no POST, PUT, DELETE')
    parser.add_argument('-f', '--from', dest='from_prj', metavar='PROJECT',
                        help='project where to get the updates (default: %s)' % SLE,
                        default=SLE)
    parser.add_argument('-t', '--to', dest='to_prj', metavar='PROJECT',
                        help='project where to submit the updates to (default: %s)' % OPENSUSE,
                        default=OPENSUSE)
    parser.add_argument('--only-from', dest='only_from', metavar='PROJECT', action='append',
                        help='only submit packages that came from PROJECT')
    parser.add_argument("--osc-debug", action="store_true", help="osc debug output")
    parser.add_argument("--new", action="store_true", help="also submit new packages")
    parser.add_argument('--cache-requests', action='store_true', default=False,
                        help='cache GET requests. Not recommended for daily use.')
    parser.add_argument("packages", nargs='*', help="packages to check")
    args = parser.parse_args()

    # Set logging configuration
    logging.basicConfig(level=logging.DEBUG if args.debug
                        else logging.INFO)

    if args.dry:
        def dryrun(t, *args, **kwargs):
            return lambda *args, **kwargs: logging.debug("dryrun %s %s %s", t, args, str(kwargs)[:200])

        http_POST = dryrun('POST')
        http_PUT = dryrun('PUT')
        http_DELETE = dryrun('DELETE')

    sys.exit(main(args))