Accepting request 391560 from systemsmanagement:saltstack:testing
- Prevent crash if pygit2 package requests recompilation.
  Add:
  * 0013-Prevent-crash-if-pygit2-package-is-requesting-re-com.patch
- Align OS grains from older SLES with the current one (bsc#975757)
  Add:
  * 0014-align-OS-grains-from-older-SLES-with-current-one-326.patch
- Remove patches which produce duplicate functions.
  Remove:
  * 0004-implement-version_cmp-for-zypper.patch
  * 0005-pylint-changes.patch
  * 0006-Check-if-rpm-python-can-be-imported.patch
- Remove patches which add and revert the same file.
  Remove:
  * 0007-Initial-Zypper-Unit-Tests-and-bugfixes.patch
  * 0009-Bugfix-on-SLE11-series-base-product-reported-as-addi.patch
- Rename patches:
  * 0008-do-not-generate-a-date-in-a-comment-to-prevent-rebui.patch
    to 0004-do-not-generate-a-date-in-a-comment-to-prevent-rebui.patch
  * 0010-Use-SHA256-hash-type-by-default.patch
    to 0005-Use-SHA256-hash-type-by-default.patch
  * 0011-Update-to-2015.8.8.2.patch
    to 0006-Update-to-2015.8.8.2.patch
  * 0012-Force-sort-the-RPM-output-to-ensure-latest-version-o.patch
    to 0007-Force-sort-the-RPM-output-to-ensure-latest-version-o.patch
  * 0013-Cleaner-deprecation-process-with-decorators.patch
    to 0008-Cleaner-deprecation-process-with-decorators.patch
- Fix sorting by latest package.
  Add:
  * 0009-fix-sorting-by-latest-version-when-called-with-an-at.patch
- Prevent metadata download when getting installed products.
  Add:
  * 0010-Prevent-metadata-download-when-getting-installed-pro.patch
- Check if EOL is available in a particular product (bsc#975093)
  Add:
  * 0011-Check-if-EOL-is-available-in-a-particular-product-bs.patch
- Bugfix: salt-key crashes if it tries to generate keys in a directory without write access (bsc#969320)
  Add:
  * 0012-Bugfix-salt-key-crashes-if-tries-to-generate-keys-to.patch
- Deprecation process using decorators and re-implementation of the status.uptime function.
  Add:
  * 0013-Cleaner-deprecation-process-with-decorators.patch
- Reverted the fake 2015.8.8.2 patch and replaced it with the right one; this patch only contains:
  - https://github.com/saltstack/salt/pull/32135
  - https://github.com/saltstack/salt/pull/32023
  - https://github.com/saltstack/salt/pull/32117
- Ensure that, when multiple versions of a package are installed on the system, the latest one is
  reported by pkg.info_installed (bsc#972490)
  Add:
  * 0012-Force-sort-the-RPM-output-to-ensure-latest-version-o.patch
- Update to the fake 2015.8.8.2 release: upstream released a bunch of fixes on top of 2015.8.8
  without creating a new tag and a proper release. This commit includes all the changes between
  tag v2015.8.8 and commit 596444e2b447b7378dbcdfeb9fc9610b90057745, which introduces the fake
  2015.8.8.2 release.
  See https://docs.saltstack.com/en/latest/topics/releases/2015.8.8.html#salt-2015-8-8-2
- Update to 2015.8.8
  See https://docs.saltstack.com/en/latest/topics/releases/2015.8.8.html
  Patches renamed:
  * 0004-implement-version_cmp-for-zypper.patch
  * 0005-pylint-changes.patch
  * 0006-Check-if-rpm-python-can-be-imported.patch
  * 0007-Initial-Zypper-Unit-Tests-and-bugfixes.patch
  * 0008-do-not-generate-a-date-in-a-comment-to-prevent-rebui.patch
  * 0009-Bugfix-on-SLE11-series-base-product-reported-as-addi.patch
  * 0010-Use-SHA256-hash-type-by-default.patch
  Patches removed:
  * 0004-Fix-pkg.latest-prevent-crash-on-multiple-package-ins.patch
  * 0005-Fix-package-status-filtering-on-latest-version-and-i.patch
  * 0006-add_key-reject_key-do-not-crash-w-Permission-denied-.patch
  * 0007-Force-kill-websocket-s-child-processes-faster-than-d.patch
  * 0008-Fix-types-in-the-output-data-and-return-just-a-list-.patch
  * 0009-The-functions-in-the-state-module-that-return-a-retc.patch
  * 0010-add-handling-for-OEM-products.patch
  * 0011-improve-doc-for-list_pkgs.patch
  * 0012-implement-version_cmp-for-zypper.patch
  * 0013-pylint-changes.patch
  * 0014-Check-if-rpm-python-can-be-imported.patch
  * 0015-call-zypper-with-option-non-interactive-everywhere.patch
  * 0016-write-a-zypper-command-builder-function.patch
  * 0017-Fix-crash-with-scheduler-and-runners-31106.patch
  * 0018-unify-behavior-of-refresh.patch
  * 0019-add-refresh-option-to-more-functions.patch
  * 0020-simplify-checking-the-refresh-paramater.patch
  * 0021-do-not-change-kwargs-in-refresh-while-checking-a-val.patch
  * 0022-fix-argument-handling-for-pkg.download.patch
  * 0023-Initial-Zypper-Unit-Tests-and-bugfixes.patch
  * 0024-proper-checking-if-zypper-exit-codes-and-handling-of.patch
  * 0025-adapt-tests-to-new-zypper_check_result-output.patch
  * 0026-do-not-generate-a-date-in-a-comment-to-prevent-rebui.patch
  * 0027-make-suse-check-consistent-with-rh_service.patch
  * 0028-fix-numerical-check-of-osrelease.patch
  * 0029-Make-use-of-checksum-configurable-defaults-to-MD5-SH.patch
  * 0030-Bugfix-on-SLE11-series-base-product-reported-as-addi.patch
  * 0031-Only-use-LONGSIZE-in-rpm.info-if-available.-Otherwis.patch
  * 0032-Add-error-check-when-retcode-is-0-but-stderr-is-pres.patch
  * 0033-fixing-init-system-dectection-on-sles-11-refs-31617.patch
  * 0034-Fix-git_pillar-race-condition.patch
  * 0035-Fix-the-always-false-behavior-on-checking-state.patch
  * 0036-Use-SHA256-hash-type-by-default.patch

OBS-URL: https://build.opensuse.org/request/show/391560
OBS-URL: https://build.opensuse.org/package/show/systemsmanagement:saltstack/salt?expand=0&rev=66
Parent: eac5c0b1cc
Commit: 56c946e70f
@@ -1,7 +1,7 @@
-From f6534519ed6dcd443e9b5b8f7dc6dfe1fb508ab3 Mon Sep 17 00:00:00 2001
+From f9dbfde1c3e7782d78f6b0b2b6b564f61749941f Mon Sep 17 00:00:00 2001
 From: =?UTF-8?q?Klaus=20K=C3=A4mpf?= <kkaempf@suse.de>
 Date: Wed, 20 Jan 2016 11:00:15 +0100
-Subject: [PATCH 01/22] tserong@suse.com -- We don't have python-systemd, so
+Subject: [PATCH 01/12] tserong@suse.com -- We don't have python-systemd, so
  notify can't work

 ---
@@ -1,7 +1,7 @@
-From cd60b85c9e6bfd8ebf3505e5ff05e7fdec6211d6 Mon Sep 17 00:00:00 2001
+From af193a109fcae502c4cdd47507aea9f67d809b4b Mon Sep 17 00:00:00 2001
 From: =?UTF-8?q?Klaus=20K=C3=A4mpf?= <kkaempf@suse.de>
 Date: Wed, 20 Jan 2016 11:01:06 +0100
-Subject: [PATCH 02/22] Run salt master as dedicated salt user
+Subject: [PATCH 02/12] Run salt master as dedicated salt user

 ---
 conf/master | 3 ++-
@@ -9,7 +9,7 @@ Subject: [PATCH 02/22] Run salt master as dedicated salt user
 2 files changed, 5 insertions(+), 1 deletion(-)

 diff --git a/conf/master b/conf/master
-index 643b5f4..36657e8 100644
+index aae46ef..064828a 100644
 --- a/conf/master
 +++ b/conf/master
 @@ -25,7 +25,8 @@
@@ -1,7 +1,7 @@
-From 9dc25e7dfb08a7cd583215d0206f18b15a44ccb1 Mon Sep 17 00:00:00 2001
+From 6035aef0c80ae12a068bee7613c5b7f7f48aa9d3 Mon Sep 17 00:00:00 2001
 From: Bo Maryniuk <bo@suse.de>
 Date: Mon, 18 Jan 2016 16:28:48 +0100
-Subject: [PATCH 03/22] Check if byte strings are properly encoded in UTF-8
+Subject: [PATCH 03/12] Check if byte strings are properly encoded in UTF-8

 Rename keywords arguments variable to a default name.
 ---
@@ -9,10 +9,10 @@ Rename keywords arguments variable to a default name.
 1 file changed, 6 insertions(+), 5 deletions(-)

 diff --git a/salt/modules/zypper.py b/salt/modules/zypper.py
-index 0d62c68..dddcc2f 100644
+index fecb671..27b00d5 100644
 --- a/salt/modules/zypper.py
 +++ b/salt/modules/zypper.py
-@@ -112,9 +112,9 @@ def info_installed(*names, **kwargs):
+@@ -164,9 +164,9 @@ def info_installed(*names, **kwargs):
      summary, description.

      :param errors:
@@ -25,7 +25,7 @@ index 0d62c68..dddcc2f 100644

      Valid attributes are:
          ignore, report
-@@ -127,7 +127,8 @@ def info_installed(*names, **kwargs):
+@@ -179,7 +179,8 @@ def info_installed(*names, **kwargs):
          salt '*' pkg.info_installed <package1> <package2> <package3> ...
          salt '*' pkg.info_installed <package1> attr=version,vendor
          salt '*' pkg.info_installed <package1> <package2> <package3> ... attr=version,vendor
@@ -35,7 +35,7 @@ index 0d62c68..dddcc2f 100644
     '''
     ret = dict()
     for pkg_name, pkg_nfo in __salt__['lowpkg.info'](*names, **kwargs).items():
-@@ -138,7 +139,7 @@ def info_installed(*names, **kwargs):
+@@ -190,7 +191,7 @@ def info_installed(*names, **kwargs):
          # Check, if string is encoded in a proper UTF-8
          value_ = value.decode('UTF-8', 'ignore').encode('UTF-8', 'ignore')
      if value != value_:
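
The encoding check touched by the hunk above boils down to a decode/encode round-trip with
errors ignored. A minimal standalone sketch of that idea (the function name is made up for
illustration; the real check is inline in zypper.info_installed):

    def is_proper_utf8(value):
        # Bytes that survive a decode/encode round-trip with 'ignore'
        # unchanged are considered properly encoded UTF-8.
        value_ = value.decode('UTF-8', 'ignore').encode('UTF-8', 'ignore')
        return value == value_

    print(is_proper_utf8(b'sch\xc3\xb6n'))  # True  -- valid UTF-8 bytes
    print(is_proper_utf8(b'sch\xf6n'))      # False -- Latin-1 bytes, not UTF-8
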
@@ -1,255 +0,0 @@
|
||||
From e21ddab93f22d1b2e1ad94368527f41102b69f16 Mon Sep 17 00:00:00 2001
|
||||
From: Bo Maryniuk <bo@suse.de>
|
||||
Date: Fri, 22 Jan 2016 18:37:12 +0100
|
||||
Subject: [PATCH 04/22] Fix pkg.latest - prevent crash on multiple package
|
||||
installation
|
||||
|
||||
Fix PEP8: line continuation
|
||||
|
||||
Replace old fashion string memcopy with the list
|
||||
|
||||
Fix PEP8: line continuation
|
||||
|
||||
Bugfix: crash on "key not found" error
|
||||
|
||||
Fix formatting
|
||||
|
||||
Check the version of the package, instead of the package name
|
||||
|
||||
Get version as an explicit parameter
|
||||
|
||||
Use regexp type for the string.
|
||||
|
||||
Avoid backslashes where they are not needed
|
||||
|
||||
Remove unnecessary complexity and string increment
|
||||
|
||||
Add a new line before the last return
|
||||
|
||||
Add error handling
|
||||
|
||||
Cleanup formatting
|
||||
|
||||
Bugfix: do not treat SLS id as a package name if an empty 'pkgs' list specified.
|
||||
|
||||
Put 'kwargs' on its own line according to the common pattern
|
||||
---
|
||||
salt/modules/zypper.py | 43 +++++++++++++---------------------
|
||||
salt/states/pkg.py | 62 ++++++++++++++++++++++++--------------------------
|
||||
2 files changed, 46 insertions(+), 59 deletions(-)
|
||||
|
||||
diff --git a/salt/modules/zypper.py b/salt/modules/zypper.py
|
||||
index dddcc2f..63b38af 100644
|
||||
--- a/salt/modules/zypper.py
|
||||
+++ b/salt/modules/zypper.py
|
||||
@@ -598,6 +598,7 @@ def install(name=None,
|
||||
pkgs=None,
|
||||
sources=None,
|
||||
downloadonly=None,
|
||||
+ version=None,
|
||||
**kwargs):
|
||||
'''
|
||||
Install the passed package(s), add refresh=True to run 'zypper refresh'
|
||||
@@ -666,23 +667,20 @@ def install(name=None,
|
||||
'new': '<new-version>'}}
|
||||
'''
|
||||
try:
|
||||
- pkg_params, pkg_type = __salt__['pkg_resource.parse_targets'](
|
||||
- name, pkgs, sources, **kwargs
|
||||
- )
|
||||
+ pkg_params, pkg_type = __salt__['pkg_resource.parse_targets'](name, pkgs, sources, **kwargs)
|
||||
except MinionError as exc:
|
||||
raise CommandExecutionError(exc)
|
||||
|
||||
if pkg_params is None or len(pkg_params) == 0:
|
||||
return {}
|
||||
|
||||
- version_num = kwargs.get('version')
|
||||
+ version_num = version
|
||||
if version_num:
|
||||
if pkgs is None and sources is None:
|
||||
# Allow "version" to work for single package target
|
||||
pkg_params = {name: version_num}
|
||||
else:
|
||||
- log.warning('\'version\' parameter will be ignored for multiple '
|
||||
- 'package targets')
|
||||
+ log.warning("'version' parameter will be ignored for multiple package targets")
|
||||
|
||||
if pkg_type == 'repository':
|
||||
targets = []
|
||||
@@ -691,18 +689,13 @@ def install(name=None,
|
||||
if version_num is None:
|
||||
targets.append(param)
|
||||
else:
|
||||
- match = re.match('^([<>])?(=)?([^<>=]+)$', version_num)
|
||||
+ match = re.match(r'^([<>])?(=)?([^<>=]+)$', version_num)
|
||||
if match:
|
||||
gt_lt, equal, verstr = match.groups()
|
||||
- prefix = gt_lt or ''
|
||||
- prefix += equal or ''
|
||||
- # If no prefix characters were supplied, use '='
|
||||
- prefix = prefix or '='
|
||||
- targets.append('{0}{1}{2}'.format(param, prefix, verstr))
|
||||
+ targets.append('{0}{1}{2}'.format(param, ((gt_lt or '') + (equal or '')) or '=', verstr))
|
||||
log.debug(targets)
|
||||
else:
|
||||
- msg = ('Invalid version string {0!r} for package '
|
||||
- '{1!r}'.format(version_num, name))
|
||||
+ msg = ('Invalid version string {0!r} for package {1!r}'.format(version_num, name))
|
||||
problems.append(msg)
|
||||
if problems:
|
||||
for problem in problems:
|
||||
@@ -731,19 +724,14 @@ def install(name=None,
|
||||
while targets:
|
||||
cmd = cmd_install + targets[:500]
|
||||
targets = targets[500:]
|
||||
-
|
||||
- out = __salt__['cmd.run'](
|
||||
- cmd,
|
||||
- output_loglevel='trace',
|
||||
- python_shell=False
|
||||
- )
|
||||
- for line in out.splitlines():
|
||||
- match = re.match(
|
||||
- "^The selected package '([^']+)'.+has lower version",
|
||||
- line
|
||||
- )
|
||||
- if match:
|
||||
- downgrades.append(match.group(1))
|
||||
+ call = __salt__['cmd.run_all'](cmd, output_loglevel='trace', python_shell=False)
|
||||
+ if call['retcode'] != 0:
|
||||
+ raise CommandExecutionError(call['stderr']) # Fixme: This needs a proper report mechanism.
|
||||
+ else:
|
||||
+ for line in call['stdout'].splitlines():
|
||||
+ match = re.match(r"^The selected package '([^']+)'.+has lower version", line)
|
||||
+ if match:
|
||||
+ downgrades.append(match.group(1))
|
||||
|
||||
while downgrades:
|
||||
cmd = cmd_install + ['--force'] + downgrades[:500]
|
||||
@@ -752,6 +740,7 @@ def install(name=None,
|
||||
__salt__['cmd.run'](cmd, output_loglevel='trace', python_shell=False)
|
||||
__context__.pop('pkg.list_pkgs', None)
|
||||
new = list_pkgs()
|
||||
+
|
||||
return salt.utils.compare_dicts(old, new)
|
||||
|
||||
|
||||
diff --git a/salt/states/pkg.py b/salt/states/pkg.py
|
||||
index 65ba779..d7c4503 100644
|
||||
--- a/salt/states/pkg.py
|
||||
+++ b/salt/states/pkg.py
|
||||
@@ -1373,8 +1373,7 @@ def latest(
|
||||
'''
|
||||
rtag = __gen_rtag()
|
||||
refresh = bool(
|
||||
- salt.utils.is_true(refresh)
|
||||
- or (os.path.isfile(rtag) and refresh is not False)
|
||||
+ salt.utils.is_true(refresh) or (os.path.isfile(rtag) and refresh is not False)
|
||||
)
|
||||
|
||||
if kwargs.get('sources'):
|
||||
@@ -1392,7 +1391,15 @@ def latest(
|
||||
'comment': 'Invalidly formatted "pkgs" parameter. See '
|
||||
'minion log.'}
|
||||
else:
|
||||
- desired_pkgs = [name]
|
||||
+ if isinstance(pkgs, list) and len(pkgs) == 0:
|
||||
+ return {
|
||||
+ 'name': name,
|
||||
+ 'changes': {},
|
||||
+ 'result': True,
|
||||
+ 'comment': 'No packages to install provided'
|
||||
+ }
|
||||
+ else:
|
||||
+ desired_pkgs = [name]
|
||||
|
||||
cur = __salt__['pkg.version'](*desired_pkgs, **kwargs)
|
||||
try:
|
||||
@@ -1431,33 +1438,29 @@ def latest(
|
||||
log.error(msg)
|
||||
problems.append(msg)
|
||||
else:
|
||||
- if salt.utils.compare_versions(ver1=cur[pkg],
|
||||
- oper='!=',
|
||||
- ver2=avail[pkg],
|
||||
- cmp_func=cmp_func):
|
||||
+ if salt.utils.compare_versions(ver1=cur[pkg], oper='!=', ver2=avail[pkg], cmp_func=cmp_func):
|
||||
targets[pkg] = avail[pkg]
|
||||
else:
|
||||
if not cur[pkg] or __salt__['portage_config.is_changed_uses'](pkg):
|
||||
targets[pkg] = avail[pkg]
|
||||
else:
|
||||
for pkg in desired_pkgs:
|
||||
- if not avail[pkg]:
|
||||
- if not cur[pkg]:
|
||||
+ if pkg not in avail:
|
||||
+ if not cur.get(pkg):
|
||||
msg = 'No information found for \'{0}\'.'.format(pkg)
|
||||
log.error(msg)
|
||||
problems.append(msg)
|
||||
- elif not cur[pkg] \
|
||||
- or salt.utils.compare_versions(ver1=cur[pkg],
|
||||
- oper='<',
|
||||
- ver2=avail[pkg],
|
||||
- cmp_func=cmp_func):
|
||||
+ elif not cur.get(pkg) \
|
||||
+ or salt.utils.compare_versions(ver1=cur[pkg], oper='<', ver2=avail[pkg], cmp_func=cmp_func):
|
||||
targets[pkg] = avail[pkg]
|
||||
|
||||
if problems:
|
||||
- return {'name': name,
|
||||
- 'changes': {},
|
||||
- 'result': False,
|
||||
- 'comment': ' '.join(problems)}
|
||||
+ return {
|
||||
+ 'name': name,
|
||||
+ 'changes': {},
|
||||
+ 'result': False,
|
||||
+ 'comment': ' '.join(problems)
|
||||
+ }
|
||||
|
||||
if targets:
|
||||
# Find up-to-date packages
|
||||
@@ -1471,9 +1474,7 @@ def latest(
|
||||
|
||||
if __opts__['test']:
|
||||
to_be_upgraded = ', '.join(sorted(targets))
|
||||
- comment = 'The following packages are set to be ' \
|
||||
- 'installed/upgraded: ' \
|
||||
- '{0}'.format(to_be_upgraded)
|
||||
+ comment = ['The following packages are set to be installed/upgraded: {0}'.format(to_be_upgraded)]
|
||||
if up_to_date:
|
||||
up_to_date_nb = len(up_to_date)
|
||||
if up_to_date_nb <= 10:
|
||||
@@ -1482,19 +1483,16 @@ def latest(
|
||||
'{0} ({1})'.format(name, cur[name])
|
||||
for name in up_to_date_sorted
|
||||
)
|
||||
- comment += (
|
||||
- ' The following packages are already '
|
||||
- 'up-to-date: {0}'
|
||||
- ).format(up_to_date_details)
|
||||
+ comment.append('The following packages are already up-to-date: {0}'.format(up_to_date_details))
|
||||
else:
|
||||
- comment += ' {0} packages are already up-to-date'.format(
|
||||
- up_to_date_nb
|
||||
- )
|
||||
+ comment.append('{0} packages are already up-to-date'.format(up_to_date_nb))
|
||||
|
||||
- return {'name': name,
|
||||
- 'changes': {},
|
||||
- 'result': None,
|
||||
- 'comment': comment}
|
||||
+ return {
|
||||
+ 'name': name,
|
||||
+ 'changes': {},
|
||||
+ 'result': None,
|
||||
+ 'comment': ' '.join(comment)
|
||||
+ }
|
||||
|
||||
# Build updated list of pkgs to exclude non-targeted ones
|
||||
targeted_pkgs = list(targets.keys()) if pkgs else None
|
||||
--
|
||||
2.1.4
|
||||
|
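
The deleted hunk above also shows how install() turns a version argument into a zypper target
string. A minimal standalone sketch of that constraint handling (build_target is a hypothetical
name; the real logic sits inline in zypper.install()):

    import re

    def build_target(name, version_spec):
        # '>=1.2' -> ('>', '=', '1.2'); no operator at all defaults to '='.
        match = re.match(r'^([<>])?(=)?([^<>=]+)$', version_spec)
        if not match:
            raise ValueError('Invalid version string {0!r} for package {1!r}'
                             .format(version_spec, name))
        gt_lt, equal, verstr = match.groups()
        prefix = ((gt_lt or '') + (equal or '')) or '='
        return '{0}{1}{2}'.format(name, prefix, verstr)

    print(build_target('vim', '7.4'))    # vim=7.4
    print(build_target('vim', '>=7.4'))  # vim>=7.4
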
@@ -1,7 +1,7 @@
|
||||
From 80784a70e90d16c5d8290fcc6bf8a0f4ec657ec0 Mon Sep 17 00:00:00 2001
|
||||
From a2ffa8e54f3cd8dba3c4b73cad086a6b93fb3a41 Mon Sep 17 00:00:00 2001
|
||||
From: Michael Calmer <mc@suse.de>
|
||||
Date: Fri, 4 Mar 2016 09:51:22 +0100
|
||||
Subject: [PATCH 26/26] do not generate a date in a comment to prevent rebuilds
|
||||
Subject: [PATCH 04/12] do not generate a date in a comment to prevent rebuilds
|
||||
(bsc#969407)
|
||||
|
||||
---
|
||||
@ -9,10 +9,10 @@ Subject: [PATCH 26/26] do not generate a date in a comment to prevent rebuilds
|
||||
1 file changed, 1 insertion(+), 2 deletions(-)
|
||||
|
||||
diff --git a/setup.py b/setup.py
|
||||
index 8caa45e..dd76c64 100755
|
||||
index 742eae5..d2dd8f7 100755
|
||||
--- a/setup.py
|
||||
+++ b/setup.py
|
||||
@@ -600,8 +600,7 @@ class Clean(clean):
|
||||
@@ -605,8 +605,7 @@ class Clean(clean):
|
||||
|
||||
|
||||
INSTALL_VERSION_TEMPLATE = '''\
|
@@ -1,47 +0,0 @@
|
||||
From 2747b83d8009fb7386986ada1640456de2fe5304 Mon Sep 17 00:00:00 2001
|
||||
From: Bo Maryniuk <bo@suse.de>
|
||||
Date: Wed, 27 Jan 2016 12:37:06 +0100
|
||||
Subject: [PATCH 05/22] Fix package status filtering on latest version and
|
||||
implement epoch support
|
||||
|
||||
---
|
||||
salt/modules/zypper.py | 9 ++++++---
|
||||
1 file changed, 6 insertions(+), 3 deletions(-)
|
||||
|
||||
diff --git a/salt/modules/zypper.py b/salt/modules/zypper.py
|
||||
index 63b38af..4699904 100644
|
||||
--- a/salt/modules/zypper.py
|
||||
+++ b/salt/modules/zypper.py
|
||||
@@ -245,7 +245,8 @@ def latest_version(*names, **kwargs):
|
||||
package_info = info_available(*names)
|
||||
for name in names:
|
||||
pkg_info = package_info.get(name, {})
|
||||
- if pkg_info.get('status', '').lower() in ['not installed', 'out-of-date']:
|
||||
+ status = pkg_info.get('status', '').lower()
|
||||
+ if status.find('not installed') > -1 or status.find('out-of-date') > -1:
|
||||
ret[name] = pkg_info.get('version')
|
||||
|
||||
# Return a string if only one package name passed
|
||||
@@ -314,7 +315,7 @@ def list_pkgs(versions_as_list=False, **kwargs):
|
||||
__salt__['pkg_resource.stringify'](ret)
|
||||
return ret
|
||||
|
||||
- cmd = ['rpm', '-qa', '--queryformat', '%{NAME}_|-%{VERSION}_|-%{RELEASE}\\n']
|
||||
+ cmd = ['rpm', '-qa', '--queryformat', '%{NAME}_|-%{VERSION}_|-%{RELEASE}_|-%|EPOCH?{%{EPOCH}}:{}|\\n']
|
||||
ret = {}
|
||||
out = __salt__['cmd.run'](
|
||||
cmd,
|
||||
@@ -322,7 +323,9 @@ def list_pkgs(versions_as_list=False, **kwargs):
|
||||
python_shell=False
|
||||
)
|
||||
for line in out.splitlines():
|
||||
- name, pkgver, rel = line.split('_|-')
|
||||
+ name, pkgver, rel, epoch = line.split('_|-')
|
||||
+ if epoch:
|
||||
+ pkgver = '{0}:{1}'.format(epoch, pkgver)
|
||||
if rel:
|
||||
pkgver += '-{0}'.format(rel)
|
||||
__salt__['pkg_resource.add_pkg'](ret, name, pkgver)
|
||||
--
|
||||
2.1.4
|
||||
|
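
The rpm query format in the dropped hunk above appends the epoch as a fourth field. A minimal
sketch of how such output is parsed (the sample package lines and versions are invented for
illustration; the real code feeds the result into pkg_resource.add_pkg):

    sample = (
        "kernel-default_|-4.1.12_|-1.2_|-\n"   # no epoch -> empty fourth field
        "bind_|-9.9.9_|-5.1_|-2\n"             # epoch 2
    )
    pkgs = {}
    for line in sample.splitlines():
        name, pkgver, rel, epoch = line.split('_|-')
        if epoch:
            pkgver = '{0}:{1}'.format(epoch, pkgver)  # epoch goes in front
        if rel:
            pkgver += '-{0}'.format(rel)              # release goes at the end
        pkgs.setdefault(name, []).append(pkgver)
    print(pkgs)  # e.g. {'kernel-default': ['4.1.12-1.2'], 'bind': ['2:9.9.9-5.1']}
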
@@ -1,7 +1,7 @@
|
||||
From eea48a283a184a02223fc440fec54a47a5b47b62 Mon Sep 17 00:00:00 2001
|
||||
From d5fc00efc2f73018c4c6bf3bea03648dfd1340fc Mon Sep 17 00:00:00 2001
|
||||
From: Bo Maryniuk <bo@suse.de>
|
||||
Date: Thu, 17 Mar 2016 12:30:23 +0100
|
||||
Subject: [PATCH 36/36] Use SHA256 hash type by default
|
||||
Subject: [PATCH 05/12] Use SHA256 hash type by default
|
||||
|
||||
---
|
||||
conf/master | 2 +-
|
||||
@ -10,7 +10,7 @@ Subject: [PATCH 36/36] Use SHA256 hash type by default
|
||||
3 files changed, 3 insertions(+), 3 deletions(-)
|
||||
|
||||
diff --git a/conf/master b/conf/master
|
||||
index cf05ec4..5f6beaa 100644
|
||||
index 064828a..5e75b15 100644
|
||||
--- a/conf/master
|
||||
+++ b/conf/master
|
||||
@@ -474,7 +474,7 @@ syndic_user: salt
|
||||
@ -23,10 +23,10 @@ index cf05ec4..5f6beaa 100644
|
||||
# The buffer size in the file server can be adjusted here:
|
||||
#file_buffer_size: 1048576
|
||||
diff --git a/conf/minion b/conf/minion
|
||||
index e17ec61..ba4111a 100644
|
||||
index b408942..32b0d0a 100644
|
||||
--- a/conf/minion
|
||||
+++ b/conf/minion
|
||||
@@ -447,7 +447,7 @@
|
||||
@@ -451,7 +451,7 @@
|
||||
#
|
||||
# Warning: Prior to changing this value, the minion should be stopped and all
|
||||
# Salt caches should be cleared.
|
||||
@ -36,7 +36,7 @@ index e17ec61..ba4111a 100644
|
||||
# The Salt pillar is searched for locally if file_client is set to local. If
|
||||
# this is the case, and pillar data is defined, then the pillar_roots need to
|
||||
diff --git a/conf/proxy b/conf/proxy
|
||||
index 0de6af8..77ecf3b 100644
|
||||
index e6ca631..e697357 100644
|
||||
--- a/conf/proxy
|
||||
+++ b/conf/proxy
|
||||
@@ -427,7 +427,7 @@
|
||||
@ -49,5 +49,5 @@ index 0de6af8..77ecf3b 100644
|
||||
# The Salt pillar is searched for locally if file_client is set to local. If
|
||||
# this is the case, and pillar data is defined, then the pillar_roots need to
|
||||
--
|
||||
2.7.3
|
||||
2.1.4
|
||||
|
0006-Update-to-2015.8.8.2.patch (new file, 168 lines)
@@ -0,0 +1,168 @@
|
||||
From 00600229ac41ae618bf01e8af6e2c0183d924204 Mon Sep 17 00:00:00 2001
|
||||
From: Theo Chatzimichos <tampakrap@gmail.com>
|
||||
Date: Sat, 2 Apr 2016 12:29:04 +0200
|
||||
Subject: [PATCH 06/12] Update to 2015.8.8.2
|
||||
|
||||
upstream released a bunch of fixes on top of 2015.8.8, without creating a new
|
||||
tag and proper release. This commit includes:
|
||||
- https://github.com/saltstack/salt/pull/32135
|
||||
- https://github.com/saltstack/salt/pull/32023
|
||||
- https://github.com/saltstack/salt/pull/32117
|
||||
see https://docs.saltstack.com/en/latest/topics/releases/2015.8.8.html#salt-2015-8-8-2
|
||||
---
|
||||
salt/config.py | 63 ++++++++++++++++++++++++++++--------------------
|
||||
salt/modules/win_dacl.py | 7 +++---
|
||||
2 files changed, 41 insertions(+), 29 deletions(-)
|
||||
|
||||
diff --git a/salt/config.py b/salt/config.py
|
||||
index fe1f572..929e094 100644
|
||||
--- a/salt/config.py
|
||||
+++ b/salt/config.py
|
||||
@@ -63,7 +63,7 @@ FLO_DIR = os.path.join(
|
||||
|
||||
VALID_OPTS = {
|
||||
# The address of the salt master. May be specified as IP address or hostname
|
||||
- 'master': str,
|
||||
+ 'master': (str, list),
|
||||
|
||||
# The TCP/UDP port of the master to connect to in order to listen to publications
|
||||
'master_port': int,
|
||||
@@ -541,7 +541,7 @@ VALID_OPTS = {
|
||||
'file_recv': bool,
|
||||
'file_recv_max_size': int,
|
||||
'file_ignore_regex': list,
|
||||
- 'file_ignore_glob': bool,
|
||||
+ 'file_ignore_glob': list,
|
||||
'fileserver_backend': list,
|
||||
'fileserver_followsymlinks': bool,
|
||||
'fileserver_ignoresymlinks': bool,
|
||||
@@ -833,7 +833,7 @@ DEFAULT_MINION_OPTS = {
|
||||
'file_recv': False,
|
||||
'file_recv_max_size': 100,
|
||||
'file_ignore_regex': [],
|
||||
- 'file_ignore_glob': None,
|
||||
+ 'file_ignore_glob': [],
|
||||
'fileserver_backend': ['roots'],
|
||||
'fileserver_followsymlinks': True,
|
||||
'fileserver_ignoresymlinks': False,
|
||||
@@ -1348,26 +1348,30 @@ def _validate_opts(opts):
|
||||
Check that all of the types of values passed into the config are
|
||||
of the right types
|
||||
'''
|
||||
+ def format_multi_opt(valid_type):
|
||||
+ try:
|
||||
+ num_types = len(valid_type)
|
||||
+ except TypeError:
|
||||
+ # Bare type name won't have a length, return the name of the type
|
||||
+ # passed.
|
||||
+ return valid_type.__name__
|
||||
+ else:
|
||||
+ if num_types == 1:
|
||||
+ return valid_type.__name__
|
||||
+ elif num_types > 1:
|
||||
+ ret = ', '.join(x.__name__ for x in valid_type[:-1])
|
||||
+ ret += ' or ' + valid_type[-1].__name__
|
||||
+
|
||||
errors = []
|
||||
- err = ('Key {0} with value {1} has an invalid type of {2}, a {3} is '
|
||||
+
|
||||
+ err = ('Key \'{0}\' with value {1} has an invalid type of {2}, a {3} is '
|
||||
'required for this value')
|
||||
for key, val in six.iteritems(opts):
|
||||
if key in VALID_OPTS:
|
||||
- if isinstance(VALID_OPTS[key](), list):
|
||||
- if isinstance(val, VALID_OPTS[key]):
|
||||
- continue
|
||||
- else:
|
||||
- errors.append(
|
||||
- err.format(key, val, type(val).__name__, 'list')
|
||||
- )
|
||||
- if isinstance(VALID_OPTS[key](), dict):
|
||||
- if isinstance(val, VALID_OPTS[key]):
|
||||
- continue
|
||||
- else:
|
||||
- errors.append(
|
||||
- err.format(key, val, type(val).__name__, 'dict')
|
||||
- )
|
||||
- else:
|
||||
+ if isinstance(val, VALID_OPTS[key]):
|
||||
+ continue
|
||||
+
|
||||
+ if hasattr(VALID_OPTS[key], '__call__'):
|
||||
try:
|
||||
VALID_OPTS[key](val)
|
||||
if isinstance(val, (list, dict)):
|
||||
@@ -1384,14 +1388,21 @@ def _validate_opts(opts):
|
||||
VALID_OPTS[key].__name__
|
||||
)
|
||||
)
|
||||
- except ValueError:
|
||||
+ except (TypeError, ValueError):
|
||||
errors.append(
|
||||
- err.format(key, val, type(val).__name__, VALID_OPTS[key])
|
||||
- )
|
||||
- except TypeError:
|
||||
- errors.append(
|
||||
- err.format(key, val, type(val).__name__, VALID_OPTS[key])
|
||||
+ err.format(key,
|
||||
+ val,
|
||||
+ type(val).__name__,
|
||||
+ VALID_OPTS[key].__name__)
|
||||
)
|
||||
+ continue
|
||||
+
|
||||
+ errors.append(
|
||||
+ err.format(key,
|
||||
+ val,
|
||||
+ type(val).__name__,
|
||||
+ format_multi_opt(VALID_OPTS[key].__name__))
|
||||
+ )
|
||||
|
||||
# RAET on Windows uses 'win32file.CreateMailslot()' for IPC. Due to this,
|
||||
# sock_dirs must start with '\\.\mailslot\' and not contain any colons.
|
||||
@@ -1404,7 +1415,7 @@ def _validate_opts(opts):
|
||||
'\\\\.\\mailslot\\' + opts['sock_dir'].replace(':', ''))
|
||||
|
||||
for error in errors:
|
||||
- log.warning(error)
|
||||
+ log.debug(error)
|
||||
if errors:
|
||||
return False
|
||||
return True
|
||||
diff --git a/salt/modules/win_dacl.py b/salt/modules/win_dacl.py
|
||||
index d57bb7b..d9ee27a 100644
|
||||
--- a/salt/modules/win_dacl.py
|
||||
+++ b/salt/modules/win_dacl.py
|
||||
@@ -44,9 +44,10 @@ class daclConstants(object):
|
||||
# in ntsecuritycon has the extra bits 0x200 enabled.
|
||||
# Note that you when you set this permission what you'll generally get back is it
|
||||
# ORed with 0x200 (SI_NO_ACL_PROTECT), which is what ntsecuritycon incorrectly defines.
|
||||
- FILE_ALL_ACCESS = (ntsecuritycon.STANDARD_RIGHTS_REQUIRED | ntsecuritycon.SYNCHRONIZE | 0x1ff)
|
||||
|
||||
def __init__(self):
|
||||
+ self.FILE_ALL_ACCESS = (ntsecuritycon.STANDARD_RIGHTS_REQUIRED | ntsecuritycon.SYNCHRONIZE | 0x1ff)
|
||||
+
|
||||
self.hkeys_security = {
|
||||
'HKEY_LOCAL_MACHINE': 'MACHINE',
|
||||
'HKEY_USERS': 'USERS',
|
||||
@@ -88,7 +89,7 @@ class daclConstants(object):
|
||||
ntsecuritycon.DELETE,
|
||||
'TEXT': 'modify'},
|
||||
'FULLCONTROL': {
|
||||
- 'BITS': daclConstants.FILE_ALL_ACCESS,
|
||||
+ 'BITS': self.FILE_ALL_ACCESS,
|
||||
'TEXT': 'full control'}
|
||||
}
|
||||
}
|
||||
@@ -368,7 +369,7 @@ def add_ace(path, objectType, user, permission, acetype, propagation):
|
||||
path: path to the object (i.e. c:\\temp\\file, HKEY_LOCAL_MACHINE\\SOFTWARE\\KEY, etc)
|
||||
user: user to add
|
||||
permission: permissions for the user
|
||||
- acetypes: either allow/deny for each user/permission (ALLOW, DENY)
|
||||
+ acetype: either allow/deny for each user/permission (ALLOW, DENY)
|
||||
propagation: how the ACE applies to children for Registry Keys and Directories(KEY, KEY&SUBKEYS, SUBKEYS)
|
||||
|
||||
CLI Example:
|
||||
--
|
||||
2.1.4
|
||||
|
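
The config changes above widen 'master' to accept either a string or a list, and the reworked
_validate_opts relies on isinstance() accepting a tuple of types. A minimal sketch of that idea
(the option subset and the validate() helper are illustrative only, not Salt's actual API):

    VALID_OPTS = {
        'master': (str, list),    # single address or a list of addresses
        'master_port': int,
        'file_ignore_glob': list,
    }

    def validate(opts):
        errors = []
        for key, val in opts.items():
            spec = VALID_OPTS.get(key)
            if spec is None or isinstance(val, spec):
                continue
            errors.append("Key '{0}' with value {1!r} has an invalid type of {2}"
                          .format(key, val, type(val).__name__))
        return errors

    print(validate({'master': ['m1', 'm2'], 'master_port': '4506'}))
    # ["Key 'master_port' with value '4506' has an invalid type of str"]
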
@@ -1,29 +0,0 @@
|
||||
From 697e42284fdd1e18fef1d1747f64cb75be1e0bef Mon Sep 17 00:00:00 2001
|
||||
From: Duncan Mac-Vicar P <dmacvicar@suse.de>
|
||||
Date: Wed, 10 Feb 2016 09:24:57 +0100
|
||||
Subject: [PATCH 06/22] add_key/reject_key: do not crash w/Permission denied:
|
||||
'/var/cache/salt/master/.dfn' (#27796)
|
||||
|
||||
already upstream
|
||||
https://github.com/saltstack/salt/pull/30998
|
||||
---
|
||||
salt/crypt.py | 3 +++
|
||||
1 file changed, 3 insertions(+)
|
||||
|
||||
diff --git a/salt/crypt.py b/salt/crypt.py
|
||||
index ce27d9f..907ec0c 100644
|
||||
--- a/salt/crypt.py
|
||||
+++ b/salt/crypt.py
|
||||
@@ -55,6 +55,9 @@ def dropfile(cachedir, user=None):
|
||||
mask = os.umask(191)
|
||||
try:
|
||||
log.info('Rotating AES key')
|
||||
+ if os.path.isfile(dfn):
|
||||
+ log.info('AES key rotation already requested')
|
||||
+ return
|
||||
|
||||
with salt.utils.fopen(dfn, 'wb+') as fp_:
|
||||
fp_.write('')
|
||||
--
|
||||
2.1.4
|
||||
|
@@ -1,25 +0,0 @@
|
||||
From a1782a9c76f0af20e88fa913dd2ac6dcb20c9c37 Mon Sep 17 00:00:00 2001
|
||||
From: Bo Maryniuk <bo@suse.de>
|
||||
Date: Thu, 11 Feb 2016 15:16:43 +0100
|
||||
Subject: [PATCH 07/22] Force-kill websocket's child processes faster than
|
||||
default two minutes.
|
||||
|
||||
---
|
||||
pkg/salt-api.service | 1 +
|
||||
1 file changed, 1 insertion(+)
|
||||
|
||||
diff --git a/pkg/salt-api.service b/pkg/salt-api.service
|
||||
index ccf3d34..72379ba 100644
|
||||
--- a/pkg/salt-api.service
|
||||
+++ b/pkg/salt-api.service
|
||||
@@ -6,6 +6,7 @@ After=network.target
|
||||
Type=simple
|
||||
LimitNOFILE=8192
|
||||
ExecStart=/usr/bin/salt-api
|
||||
+TimeoutStopSec=3
|
||||
|
||||
[Install]
|
||||
WantedBy=multi-user.target
|
||||
--
|
||||
2.1.4
|
||||
|
0007-Force-sort-the-RPM-output-to-ensure-latest-version-o.patch (new file, 350 lines)
@@ -0,0 +1,350 @@
|
||||
From e3a599712daafb88b6b77ebf6c7684fdd10ffedf Mon Sep 17 00:00:00 2001
|
||||
From: Bo Maryniuk <bo@suse.de>
|
||||
Date: Wed, 30 Mar 2016 12:14:21 +0200
|
||||
Subject: [PATCH 07/12] Force-sort the RPM output to ensure latest version of
|
||||
the multi-package on top of the list.
|
||||
|
||||
- Remove version_cmp from the yumpkg and use just a lowpkg alias
|
||||
- Remove version_cmp from Zypper module and use just lowpkg alias
|
||||
- Merge yumpkg's and zypper's version_cmp for a common use
|
||||
- Sort installed pkgs data by version_cmp
|
||||
- Move "string to EVR" function to the utilities
|
||||
- Remove suse/redhat checks, refactor code.
|
||||
- Fix condition from returning None on 0
|
||||
- Remove tests from the zypper_test that belongs to rpm_test
|
||||
- Add lowpkg tests for version comparison
|
||||
- Fix lint
|
||||
- Fix the documentation
|
||||
---
|
||||
salt/modules/rpm.py | 60 +++++++++++++++++++++++++++++++++++++--
|
||||
salt/modules/yumpkg.py | 28 ++----------------
|
||||
salt/modules/zypper.py | 58 +------------------------------------
|
||||
salt/utils/__init__.py | 35 +++++++++++++++++++++++
|
||||
tests/unit/modules/rpm_test.py | 21 ++++++++++++++
|
||||
tests/unit/modules/zypper_test.py | 22 --------------
|
||||
6 files changed, 117 insertions(+), 107 deletions(-)
|
||||
|
||||
diff --git a/salt/modules/rpm.py b/salt/modules/rpm.py
|
||||
index 5d60dd2..6026f18 100644
|
||||
--- a/salt/modules/rpm.py
|
||||
+++ b/salt/modules/rpm.py
|
||||
@@ -17,6 +17,19 @@ import salt.utils.pkg.rpm
|
||||
# pylint: disable=import-error,redefined-builtin
|
||||
from salt.ext.six.moves import shlex_quote as _cmd_quote
|
||||
from salt.ext.six.moves import zip
|
||||
+
|
||||
+try:
|
||||
+ import rpm
|
||||
+ HAS_RPM = True
|
||||
+except ImportError:
|
||||
+ HAS_RPM = False
|
||||
+
|
||||
+try:
|
||||
+ import rpmUtils.miscutils
|
||||
+ HAS_RPMUTILS = True
|
||||
+except ImportError:
|
||||
+ HAS_RPMUTILS = False
|
||||
+
|
||||
# pylint: enable=import-error,redefined-builtin
|
||||
from salt.exceptions import CommandExecutionError, SaltInvocationError
|
||||
|
||||
@@ -491,7 +504,7 @@ def info(*packages, **attr):
|
||||
else:
|
||||
out = call['stdout']
|
||||
|
||||
- ret = dict()
|
||||
+ _ret = list()
|
||||
for pkg_info in re.split(r"----*", out):
|
||||
pkg_info = pkg_info.strip()
|
||||
if not pkg_info:
|
||||
@@ -538,6 +551,49 @@ def info(*packages, **attr):
|
||||
if attr and 'description' in attr or not attr:
|
||||
pkg_data['description'] = os.linesep.join(descr)
|
||||
if pkg_name:
|
||||
- ret[pkg_name] = pkg_data
|
||||
+ pkg_data['name'] = pkg_name
|
||||
+ _ret.append(pkg_data)
|
||||
+
|
||||
+ # Force-sort package data by version,
|
||||
+ # pick only latest versions
|
||||
+ # (in case multiple packages installed, e.g. kernel)
|
||||
+ ret = dict()
|
||||
+ for pkg_data in reversed(sorted(_ret, cmp=lambda a_vrs, b_vrs: version_cmp(a_vrs['version'], b_vrs['version']))):
|
||||
+ pkg_name = pkg_data.pop('name')
|
||||
+ if pkg_name not in ret:
|
||||
+ ret[pkg_name] = pkg_data.copy()
|
||||
|
||||
return ret
|
||||
+
|
||||
+
|
||||
+def version_cmp(ver1, ver2):
|
||||
+ '''
|
||||
+ .. versionadded:: 2015.8.9
|
||||
+
|
||||
+ Do a cmp-style comparison on two packages. Return -1 if ver1 < ver2, 0 if
|
||||
+ ver1 == ver2, and 1 if ver1 > ver2. Return None if there was a problem
|
||||
+ making the comparison.
|
||||
+
|
||||
+ CLI Example:
|
||||
+
|
||||
+ .. code-block:: bash
|
||||
+
|
||||
+ salt '*' pkg.version_cmp '0.2-001' '0.2.0.1-002'
|
||||
+ '''
|
||||
+ try:
|
||||
+ if HAS_RPM:
|
||||
+ cmp_func = rpm.labelCompare
|
||||
+ elif HAS_RPMUTILS:
|
||||
+ cmp_func = rpmUtils.miscutils.compareEVR
|
||||
+ else:
|
||||
+ cmp_func = None
|
||||
+ cmp_result = cmp_func is None and 2 or cmp_func(salt.utils.str_version_to_evr(ver1),
|
||||
+ salt.utils.str_version_to_evr(ver2))
|
||||
+ if cmp_result not in (-1, 0, 1):
|
||||
+ raise Exception("Comparison result '{0}' is invalid".format(cmp_result))
|
||||
+
|
||||
+ return cmp_result
|
||||
+ except Exception as exc:
|
||||
+ log.warning("Failed to compare version '{0}' to '{1}' using RPM: {2}".format(ver1, ver2, exc))
|
||||
+
|
||||
+ return salt.utils.version_cmp(ver1, ver2)
|
||||
diff --git a/salt/modules/yumpkg.py b/salt/modules/yumpkg.py
|
||||
index 1bfc38d..1cde676 100644
|
||||
--- a/salt/modules/yumpkg.py
|
||||
+++ b/salt/modules/yumpkg.py
|
||||
@@ -40,12 +40,6 @@ try:
|
||||
except ImportError:
|
||||
from salt.ext.six.moves import configparser
|
||||
HAS_YUM = False
|
||||
-
|
||||
-try:
|
||||
- import rpmUtils.miscutils
|
||||
- HAS_RPMUTILS = True
|
||||
-except ImportError:
|
||||
- HAS_RPMUTILS = False
|
||||
# pylint: enable=import-error,redefined-builtin
|
||||
|
||||
# Import salt libs
|
||||
@@ -665,26 +659,8 @@ def version_cmp(pkg1, pkg2):
|
||||
|
||||
salt '*' pkg.version_cmp '0.2-001' '0.2.0.1-002'
|
||||
'''
|
||||
- if HAS_RPMUTILS:
|
||||
- try:
|
||||
- cmp_result = rpmUtils.miscutils.compareEVR(
|
||||
- rpmUtils.miscutils.stringToVersion(pkg1),
|
||||
- rpmUtils.miscutils.stringToVersion(pkg2)
|
||||
- )
|
||||
- if cmp_result not in (-1, 0, 1):
|
||||
- raise Exception(
|
||||
- 'cmp result \'{0}\' is invalid'.format(cmp_result)
|
||||
- )
|
||||
- return cmp_result
|
||||
- except Exception as exc:
|
||||
- log.warning(
|
||||
- 'Failed to compare version \'%s\' to \'%s\' using '
|
||||
- 'rpmUtils: %s', pkg1, pkg2, exc
|
||||
- )
|
||||
- # Fall back to distutils.version.LooseVersion (should only need to do
|
||||
- # this for RHEL5, or if an exception is raised when attempting to compare
|
||||
- # using rpmUtils)
|
||||
- return salt.utils.version_cmp(pkg1, pkg2)
|
||||
+
|
||||
+ return __salt__['lowpkg.version_cmp'](pkg1, pkg2)
|
||||
|
||||
|
||||
def list_pkgs(versions_as_list=False, **kwargs):
|
||||
diff --git a/salt/modules/zypper.py b/salt/modules/zypper.py
|
||||
index 27b00d5..63c473c 100644
|
||||
--- a/salt/modules/zypper.py
|
||||
+++ b/salt/modules/zypper.py
|
||||
@@ -17,12 +17,6 @@ import os
|
||||
import salt.ext.six as six
|
||||
from salt.ext.six.moves import configparser
|
||||
from salt.ext.six.moves.urllib.parse import urlparse as _urlparse
|
||||
-
|
||||
-try:
|
||||
- import rpm
|
||||
- HAS_RPM = True
|
||||
-except ImportError:
|
||||
- HAS_RPM = False
|
||||
# pylint: enable=import-error,redefined-builtin,no-name-in-module
|
||||
|
||||
from xml.dom import minidom as dom
|
||||
@@ -347,40 +341,6 @@ def version(*names, **kwargs):
|
||||
return __salt__['pkg_resource.version'](*names, **kwargs) or {}
|
||||
|
||||
|
||||
-def _string_to_evr(verstring):
|
||||
- '''
|
||||
- Split the version string into epoch, version and release and
|
||||
- return this as tuple.
|
||||
-
|
||||
- epoch is always not empty.
|
||||
- version and release can be an empty string if such a component
|
||||
- could not be found in the version string.
|
||||
-
|
||||
- "2:1.0-1.2" => ('2', '1.0', '1.2)
|
||||
- "1.0" => ('0', '1.0', '')
|
||||
- "" => ('0', '', '')
|
||||
- '''
|
||||
- if verstring in [None, '']:
|
||||
- return ('0', '', '')
|
||||
- idx_e = verstring.find(':')
|
||||
- if idx_e != -1:
|
||||
- try:
|
||||
- epoch = str(int(verstring[:idx_e]))
|
||||
- except ValueError:
|
||||
- # look, garbage in the epoch field, how fun, kill it
|
||||
- epoch = '0' # this is our fallback, deal
|
||||
- else:
|
||||
- epoch = '0'
|
||||
- idx_r = verstring.find('-')
|
||||
- if idx_r != -1:
|
||||
- version = verstring[idx_e + 1:idx_r]
|
||||
- release = verstring[idx_r + 1:]
|
||||
- else:
|
||||
- version = verstring[idx_e + 1:]
|
||||
- release = ''
|
||||
- return (epoch, version, release)
|
||||
-
|
||||
-
|
||||
def version_cmp(ver1, ver2):
|
||||
'''
|
||||
.. versionadded:: 2015.5.4
|
||||
@@ -395,23 +355,7 @@ def version_cmp(ver1, ver2):
|
||||
|
||||
salt '*' pkg.version_cmp '0.2-001' '0.2.0.1-002'
|
||||
'''
|
||||
- if HAS_RPM:
|
||||
- try:
|
||||
- cmp_result = rpm.labelCompare(
|
||||
- _string_to_evr(ver1),
|
||||
- _string_to_evr(ver2)
|
||||
- )
|
||||
- if cmp_result not in (-1, 0, 1):
|
||||
- raise Exception(
|
||||
- 'cmp result \'{0}\' is invalid'.format(cmp_result)
|
||||
- )
|
||||
- return cmp_result
|
||||
- except Exception as exc:
|
||||
- log.warning(
|
||||
- 'Failed to compare version \'{0}\' to \'{1}\' using '
|
||||
- 'rpmUtils: {2}'.format(ver1, ver2, exc)
|
||||
- )
|
||||
- return salt.utils.version_cmp(ver1, ver2)
|
||||
+ return __salt__['lowpkg.version_cmp'](ver1, ver2)
|
||||
|
||||
|
||||
def list_pkgs(versions_as_list=False, **kwargs):
|
||||
diff --git a/salt/utils/__init__.py b/salt/utils/__init__.py
|
||||
index f83a677..8956a15 100644
|
||||
--- a/salt/utils/__init__.py
|
||||
+++ b/salt/utils/__init__.py
|
||||
@@ -2881,3 +2881,38 @@ def split_input(val):
|
||||
return [x.strip() for x in val.split(',')]
|
||||
except AttributeError:
|
||||
return [x.strip() for x in str(val).split(',')]
|
||||
+
|
||||
+
|
||||
+def str_version_to_evr(verstring):
|
||||
+ '''
|
||||
+ Split the package version string into epoch, version and release.
|
||||
+ Return this as tuple.
|
||||
+
|
||||
+ The epoch is always not empty. The version and the release can be an empty
|
||||
+ string if such a component could not be found in the version string.
|
||||
+
|
||||
+ "2:1.0-1.2" => ('2', '1.0', '1.2)
|
||||
+ "1.0" => ('0', '1.0', '')
|
||||
+ "" => ('0', '', '')
|
||||
+ '''
|
||||
+ if verstring in [None, '']:
|
||||
+ return '0', '', ''
|
||||
+
|
||||
+ idx_e = verstring.find(':')
|
||||
+ if idx_e != -1:
|
||||
+ try:
|
||||
+ epoch = str(int(verstring[:idx_e]))
|
||||
+ except ValueError:
|
||||
+ # look, garbage in the epoch field, how fun, kill it
|
||||
+ epoch = '0' # this is our fallback, deal
|
||||
+ else:
|
||||
+ epoch = '0'
|
||||
+ idx_r = verstring.find('-')
|
||||
+ if idx_r != -1:
|
||||
+ version = verstring[idx_e + 1:idx_r]
|
||||
+ release = verstring[idx_r + 1:]
|
||||
+ else:
|
||||
+ version = verstring[idx_e + 1:]
|
||||
+ release = ''
|
||||
+
|
||||
+ return epoch, version, release
|
||||
diff --git a/tests/unit/modules/rpm_test.py b/tests/unit/modules/rpm_test.py
|
||||
index 8bfce9b..f180736 100644
|
||||
--- a/tests/unit/modules/rpm_test.py
|
||||
+++ b/tests/unit/modules/rpm_test.py
|
||||
@@ -95,6 +95,27 @@ class RpmTestCase(TestCase):
|
||||
self.assertDictEqual(rpm.owner('/usr/bin/python', '/usr/bin/vim'),
|
||||
ret)
|
||||
|
||||
+ @patch('salt.modules.rpm.HAS_RPM', True)
|
||||
+ def test_version_cmp_rpm(self):
|
||||
+ '''
|
||||
+ Test package version is called RPM version if RPM-Python is installed
|
||||
+
|
||||
+ :return:
|
||||
+ '''
|
||||
+ rpm.rpm = MagicMock(return_value=MagicMock)
|
||||
+ with patch('salt.modules.rpm.rpm.labelCompare', MagicMock(return_value=0)):
|
||||
+ self.assertEqual(0, rpm.version_cmp('1', '2')) # mock returns 0, which means RPM was called
|
||||
+
|
||||
+ @patch('salt.modules.rpm.HAS_RPM', False)
|
||||
+ def test_version_cmp_fallback(self):
|
||||
+ '''
|
||||
+ Test package version is called RPM version if RPM-Python is installed
|
||||
+
|
||||
+ :return:
|
||||
+ '''
|
||||
+ rpm.rpm = MagicMock(return_value=MagicMock)
|
||||
+ with patch('salt.modules.rpm.rpm.labelCompare', MagicMock(return_value=0)):
|
||||
+ self.assertEqual(-1, rpm.version_cmp('1', '2')) # mock returns -1, a python implementation was called
|
||||
|
||||
if __name__ == '__main__':
|
||||
from integration import run_tests
|
||||
diff --git a/tests/unit/modules/zypper_test.py b/tests/unit/modules/zypper_test.py
|
||||
index 5c4eb67..67cf52a 100644
|
||||
--- a/tests/unit/modules/zypper_test.py
|
||||
+++ b/tests/unit/modules/zypper_test.py
|
||||
@@ -301,28 +301,6 @@ class ZypperTestCase(TestCase):
|
||||
self.assertFalse(zypper.upgrade_available(pkg_name))
|
||||
self.assertTrue(zypper.upgrade_available('vim'))
|
||||
|
||||
- @patch('salt.modules.zypper.HAS_RPM', True)
|
||||
- def test_version_cmp_rpm(self):
|
||||
- '''
|
||||
- Test package version is called RPM version if RPM-Python is installed
|
||||
-
|
||||
- :return:
|
||||
- '''
|
||||
- with patch('salt.modules.zypper.rpm', MagicMock(return_value=MagicMock)):
|
||||
- with patch('salt.modules.zypper.rpm.labelCompare', MagicMock(return_value=0)):
|
||||
- self.assertEqual(0, zypper.version_cmp('1', '2')) # mock returns 0, which means RPM was called
|
||||
-
|
||||
- @patch('salt.modules.zypper.HAS_RPM', False)
|
||||
- def test_version_cmp_fallback(self):
|
||||
- '''
|
||||
- Test package version is called RPM version if RPM-Python is installed
|
||||
-
|
||||
- :return:
|
||||
- '''
|
||||
- with patch('salt.modules.zypper.rpm', MagicMock(return_value=MagicMock)):
|
||||
- with patch('salt.modules.zypper.rpm.labelCompare', MagicMock(return_value=0)):
|
||||
- self.assertEqual(-1, zypper.version_cmp('1', '2')) # mock returns -1, a python implementation was called
|
||||
-
|
||||
def test_list_pkgs(self):
|
||||
'''
|
||||
Test packages listing.
|
||||
--
|
||||
2.1.4
|
||||
|
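
The new rpm.version_cmp() above prefers the librpm bindings, then rpmUtils, and only then Salt's
pure-Python comparison. A minimal sketch of that fallback selection (pick_cmp_func is a
hypothetical helper; the real code inlines this choice in rpm.version_cmp()):

    try:
        import rpm                      # librpm Python bindings
        HAS_RPM = True
    except ImportError:
        HAS_RPM = False

    try:
        import rpmUtils.miscutils       # yum-era fallback on older RHEL
        HAS_RPMUTILS = True
    except ImportError:
        HAS_RPMUTILS = False

    def pick_cmp_func():
        # Both callables take two (epoch, version, release) tuples, as produced
        # by salt.utils.str_version_to_evr(), and return -1, 0 or 1.
        if HAS_RPM:
            return rpm.labelCompare
        if HAS_RPMUTILS:
            return rpmUtils.miscutils.compareEVR
        return None  # caller then falls back to salt.utils.version_cmp()
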
0008-Cleaner-deprecation-process-with-decorators.patch (new file, 922 lines)
@@ -0,0 +1,922 @@
|
||||
From 2dcc979ab2897619baebfef5779120a98284d408 Mon Sep 17 00:00:00 2001
|
||||
From: Bo Maryniuk <bo@maryniuk.net>
|
||||
Date: Wed, 6 Apr 2016 20:55:45 +0200
|
||||
Subject: [PATCH 08/12] Cleaner deprecation process with decorators
|
||||
|
||||
* Add deprecation decorator scaffold
|
||||
|
||||
* Capture type error and unhandled exceptions while function calls
|
||||
|
||||
* Aware of the current and future version of deprecation
|
||||
|
||||
* Implement initially is_deprecated decorator
|
||||
|
||||
* Add an alias for the capitalization
|
||||
|
||||
* Fix capitalization easier way
|
||||
|
||||
* Remove an extra line
|
||||
|
||||
* Add successor name to the deprecation decorator.
|
||||
|
||||
* Granulate logging and error messages.
|
||||
|
||||
* Implement function swapper
|
||||
|
||||
* Raise later the caught exception
|
||||
|
||||
* Clarify exception message
|
||||
|
||||
* Save function original name
|
||||
|
||||
* Remove an extra line
|
||||
|
||||
* Hide an alternative hidden function name in the error message, preserving the error itself
|
||||
|
||||
* Rename variable as private
|
||||
|
||||
* Add a method to detect if a function is using its previous version
|
||||
|
||||
* Message to the log and/or raise an exception accordingly to the status of used function
|
||||
|
||||
* Log an error along with the exception
|
||||
|
||||
* Add internal method documentation
|
||||
|
||||
* Add documentation and usage process for decorator "is_deprecated"
|
||||
|
||||
* Add documentation and process usage for the decorator "with_deprecated"
|
||||
|
||||
* Hide private method name
|
||||
|
||||
* Fix PEP8, re-word the error message
|
||||
|
||||
* Deprecate basic uptime function
|
||||
|
||||
* Add initial decorator unit test
|
||||
|
||||
* Rename old/new functions, mock versions
|
||||
|
||||
* Move frequent data to the test setup
|
||||
|
||||
* Add logging on EOL exception
|
||||
|
||||
* Rename and document high to low version test on is_deprecated
|
||||
|
||||
* Implement a test on low to high version of is_deprecated decorator
|
||||
|
||||
* Add a correction to the test description
|
||||
|
||||
* Remove a dead code
|
||||
|
||||
* Implement a test for high to low version on is_deprecated, using with_successor param
|
||||
|
||||
* Correct typso adn mistaeks
|
||||
|
||||
* Implement high to low version with successor param on is_deprecated
|
||||
|
||||
* Setup a virtual name for the module
|
||||
|
||||
* Implement test for with_deprecated should raise an exception if same deprecated function not found
|
||||
|
||||
* Implement test for with_deprecated an old function is picked up if configured
|
||||
|
||||
* Correct test description purpose
|
||||
|
||||
* Implement test with_deprecated when no deprecation is requested
|
||||
|
||||
* Add logging test to the configured deprecation request
|
||||
|
||||
* Add logging testing when deprecated version wasn't requested
|
||||
|
||||
* Implement test EOL for with_deprecated decorator
|
||||
|
||||
* Correct test explanation
|
||||
|
||||
* Rename the test
|
||||
|
||||
* Implement with_deprecated no EOL, deprecated other function name
|
||||
|
||||
* Implement with_deprecated, deprecated other function name, EOL reached
|
||||
|
||||
* Add test description for the with_deprecated + with_name + EOL
|
||||
|
||||
* Fix confusing test names
|
||||
|
||||
* Add logging test to the is_deprecated decorator when function as not found.
|
||||
|
||||
* Add more test point to each test, remove empty lines
|
||||
|
||||
* Bugfix: at certain conditions a wrong alias name is reported to the log
|
||||
|
||||
* Fix a typo in a comment
|
||||
|
||||
* Add test for the logging
|
||||
|
||||
* Disable a pylint: None will _never_ be raised
|
||||
|
||||
* Fix test for the deprecated "status.uptime" version
|
||||
|
||||
* Bugfix: Do not yank raised exceptions
|
||||
|
||||
* Remove unnecessary decorator
|
||||
|
||||
* Add test for the new uptime
|
||||
|
||||
* Add test for the new uptime fails when /proc/uptime does not exists
|
||||
|
||||
* Rename old test case
|
||||
|
||||
* Skip test for the UTC time, unless freeze time is used.
|
||||
|
||||
* Fix pylint
|
||||
|
||||
* Fix documentation
|
||||
|
||||
* Bugfix: proxy-pass the docstring of the decorated function
|
||||
|
||||
* Lint fix
|
||||
---
|
||||
salt/modules/status.py | 40 ++++-
|
||||
salt/utils/decorators/__init__.py | 345 +++++++++++++++++++++++++++++++++++-
|
||||
tests/unit/modules/status_test.py | 48 ++++-
|
||||
tests/unit/utils/decorators_test.py | 232 ++++++++++++++++++++++++
|
||||
4 files changed, 649 insertions(+), 16 deletions(-)
|
||||
create mode 100644 tests/unit/utils/decorators_test.py
|
||||
|
||||
diff --git a/salt/modules/status.py b/salt/modules/status.py
|
||||
index 1e80b36..04c6204 100644
|
||||
--- a/salt/modules/status.py
|
||||
+++ b/salt/modules/status.py
|
||||
@@ -11,6 +11,8 @@ import os
|
||||
import re
|
||||
import fnmatch
|
||||
import collections
|
||||
+import time
|
||||
+import datetime
|
||||
|
||||
# Import 3rd-party libs
|
||||
import salt.ext.six as six
|
||||
@@ -23,6 +25,8 @@ import salt.utils.event
|
||||
from salt.utils.network import host_to_ip as _host_to_ip
|
||||
from salt.utils.network import remote_port_tcp as _remote_port_tcp
|
||||
from salt.ext.six.moves import zip
|
||||
+from salt.utils.decorators import with_deprecated
|
||||
+from salt.exceptions import CommandExecutionError
|
||||
|
||||
__virtualname__ = 'status'
|
||||
__opts__ = {}
|
||||
@@ -30,7 +34,8 @@ __opts__ = {}
|
||||
|
||||
def __virtual__():
|
||||
if salt.utils.is_windows():
|
||||
- return (False, 'Cannot load status module on windows')
|
||||
+ return False, 'Windows platform is not supported by this module'
|
||||
+
|
||||
return __virtualname__
|
||||
|
||||
|
||||
@@ -120,7 +125,38 @@ def custom():
|
||||
return ret
|
||||
|
||||
|
||||
-def uptime(human_readable=True):
|
||||
+@with_deprecated(globals(), "Boron")
|
||||
+def uptime():
|
||||
+ '''
|
||||
+ Return the uptime for this system.
|
||||
+
|
||||
+ CLI Example:
|
||||
+
|
||||
+ .. code-block:: bash
|
||||
+
|
||||
+ salt '*' status.uptime
|
||||
+ '''
|
||||
+ ut_path = "/proc/uptime"
|
||||
+ if not os.path.exists(ut_path):
|
||||
+ raise CommandExecutionError("File {ut_path} was not found.".format(ut_path=ut_path))
|
||||
+
|
||||
+ ut_ret = {
|
||||
+ 'seconds': int(float(open(ut_path).read().strip().split()[0]))
|
||||
+ }
|
||||
+
|
||||
+ utc_time = datetime.datetime.utcfromtimestamp(time.time() - ut_ret['seconds'])
|
||||
+ ut_ret['since_iso'] = utc_time.isoformat()
|
||||
+ ut_ret['since_t'] = time.mktime(utc_time.timetuple())
|
||||
+ ut_ret['days'] = ut_ret['seconds'] / 60 / 60 / 24
|
||||
+ hours = (ut_ret['seconds'] - (ut_ret['days'] * 24 * 60 * 60)) / 60 / 60
|
||||
+ minutes = ((ut_ret['seconds'] - (ut_ret['days'] * 24 * 60 * 60)) / 60) - hours * 60
|
||||
+ ut_ret['time'] = '{0}:{1}'.format(hours, minutes)
|
||||
+ ut_ret['users'] = len(__salt__['cmd.run']("who -s").split(os.linesep))
|
||||
+
|
||||
+ return ut_ret
|
||||
+
|
||||
+
|
||||
+def _uptime(human_readable=True):
|
||||
'''
|
||||
Return the uptime for this minion
|
||||
|
||||
diff --git a/salt/utils/decorators/__init__.py b/salt/utils/decorators/__init__.py
|
||||
index 45d3bd6..3b43504 100644
|
||||
--- a/salt/utils/decorators/__init__.py
|
||||
+++ b/salt/utils/decorators/__init__.py
|
||||
@@ -13,7 +13,8 @@ from collections import defaultdict
|
||||
|
||||
# Import salt libs
|
||||
import salt.utils
|
||||
-from salt.exceptions import CommandNotFoundError
|
||||
+from salt.exceptions import CommandNotFoundError, CommandExecutionError
|
||||
+from salt.version import SaltStackVersion, __saltstack_version__
|
||||
from salt.log import LOG_LEVELS
|
||||
|
||||
# Import 3rd-party libs
|
||||
@@ -144,10 +145,7 @@ class Depends(object):
|
||||
continue
|
||||
|
||||
|
||||
-class depends(Depends): # pylint: disable=C0103
|
||||
- '''
|
||||
- Wrapper of Depends for capitalization
|
||||
- '''
|
||||
+depends = Depends
|
||||
|
||||
|
||||
def timing(function):
|
||||
@@ -248,3 +246,340 @@ def memoize(func):
|
||||
cache[args] = func(*args)
|
||||
return cache[args]
|
||||
return _memoize
|
||||
+
|
||||
+
|
||||
+class _DeprecationDecorator(object):
|
||||
+ '''
|
||||
+ Base mix-in class for the deprecation decorator.
|
||||
+ Takes care of a common functionality, used in its derivatives.
|
||||
+ '''
|
||||
+
|
||||
+ def __init__(self, globals, version):
|
||||
+ '''
|
||||
+ Constructor.
|
||||
+
|
||||
+ :param globals: Module globals. Important for finding out replacement functions
|
||||
+ :param version: Expiration version
|
||||
+ :return:
|
||||
+ '''
|
||||
+
|
||||
+ self._globals = globals
|
||||
+ self._exp_version_name = version
|
||||
+ self._exp_version = SaltStackVersion.from_name(self._exp_version_name)
|
||||
+ self._curr_version = __saltstack_version__.info
|
||||
+ self._options = self._globals['__opts__']
|
||||
+ self._raise_later = None
|
||||
+ self._function = None
|
||||
+ self._orig_f_name = None
|
||||
+
|
||||
+ def _get_args(self, kwargs):
|
||||
+ '''
|
||||
+ Extract function-specific keywords from all of the kwargs.
|
||||
+
|
||||
+ :param kwargs:
|
||||
+ :return:
|
||||
+ '''
|
||||
+ _args = list()
|
||||
+ _kwargs = dict()
|
||||
+
|
||||
+ for arg_item in kwargs.get('__pub_arg', list()):
|
||||
+ if type(arg_item) == dict:
|
||||
+ _kwargs.update(arg_item.copy())
|
||||
+ else:
|
||||
+ _args.append(arg_item)
|
||||
+ return _args, _kwargs
|
||||
+
|
||||
+ def _call_function(self, kwargs):
|
||||
+ '''
|
||||
+ Call target function that has been decorated.
|
||||
+
|
||||
+ :return:
|
||||
+ '''
|
||||
+ if self._raise_later:
|
||||
+ raise self._raise_later # pylint: disable=E0702
|
||||
+
|
||||
+ if self._function:
|
||||
+ args, kwargs = self._get_args(kwargs)
|
||||
+ try:
|
||||
+ return self._function(*args, **kwargs)
|
||||
+ except TypeError as error:
|
||||
+ error = str(error).replace(self._function.func_name, self._orig_f_name) # Hide hidden functions
|
||||
+ log.error('Function "{f_name}" was not properly called: {error}'.format(f_name=self._orig_f_name,
|
||||
+ error=error))
|
||||
+ return self._function.__doc__
|
||||
+ except Exception as error:
|
||||
+ log.error('Unhandled exception occurred in '
|
||||
+ 'function "{f_name}: {error}'.format(f_name=self._function.func_name,
|
||||
+ error=error))
|
||||
+ raise error
|
||||
+ else:
|
||||
+ raise CommandExecutionError("Function is deprecated, but the successor function was not found.")
|
||||
+
|
||||
+ def __call__(self, function):
|
||||
+ '''
|
||||
+ Callable method of the decorator object when
|
||||
+ the decorated function is gets called.
|
||||
+
|
||||
+ :param function:
|
||||
+ :return:
|
||||
+ '''
|
||||
+ self._function = function
|
||||
+ self._orig_f_name = self._function.func_name
|
||||
+
|
||||
+
|
||||
+class _IsDeprecated(_DeprecationDecorator):
|
||||
+ '''
|
||||
+ This decorator should be used only with the deprecated functions
|
||||
+ to mark them as deprecated and alter its behavior a corresponding way.
|
||||
+ The usage is only suitable if deprecation process is renaming
|
||||
+ the function from one to another. In case function name or even function
|
||||
+ signature stays the same, please use 'with_deprecated' decorator instead.
|
||||
+
|
||||
+ It has the following functionality:
|
||||
+
|
||||
+ 1. Put a warning level message to the log, informing that
|
||||
+ the deprecated function has been in use.
|
||||
+
|
||||
+ 2. Raise an exception, if deprecated function is being called,
|
||||
+ but the lifetime of it already expired.
|
||||
+
|
||||
+ 3. Point to the successor of the deprecated function in the
|
||||
+ log messages as well during the blocking it, once expired.
|
||||
+
|
||||
+ Usage of this decorator as follows. In this example no successor
|
||||
+ is mentioned, hence the function "foo()" will be logged with the
|
||||
+ warning each time is called and blocked completely, once EOF of
|
||||
+ it is reached:
|
||||
+
|
||||
+ from salt.util.decorators import is_deprecated
|
||||
+
|
||||
+ @is_deprecated(globals(), "Beryllium")
|
||||
+ def foo():
|
||||
+ pass
|
||||
+
|
||||
+ In the following example a successor function is mentioned, hence
|
||||
+ every time the function "bar()" is called, message will suggest
|
||||
+ to use function "baz()" instead. Once EOF is reached of the function
|
||||
+ "bar()", an exception will ask to use function "baz()", in order
|
||||
+ to continue:
|
||||
+
|
||||
+ from salt.util.decorators import is_deprecated
|
||||
+
|
||||
+ @is_deprecated(globals(), "Beryllium", with_successor="baz")
|
||||
+ def bar():
|
||||
+ pass
|
||||
+
|
||||
+ def baz():
|
||||
+ pass
|
||||
+ '''
|
||||
+
|
||||
+ def __init__(self, globals, version, with_successor=None):
|
||||
+ '''
|
||||
+ Constructor of the decorator 'is_deprecated'.
|
||||
+
|
||||
+ :param globals: Module globals
|
||||
+ :param version: Version to be deprecated
|
||||
+ :param with_successor: Successor function (optional)
|
||||
+ :return:
|
||||
+ '''
|
||||
+ _DeprecationDecorator.__init__(self, globals, version)
|
||||
+ self._successor = with_successor
|
||||
+
|
||||
+ def __call__(self, function):
|
||||
+ '''
|
||||
+ Callable method of the decorator object when
|
||||
+ the decorated function gets called.
|
||||
+
|
||||
+ :param function:
|
||||
+ :return:
|
||||
+ '''
|
||||
+ _DeprecationDecorator.__call__(self, function)
|
||||
+
|
||||
+ def _decorate(*args, **kwargs):
|
||||
+ '''
|
||||
+ Decorator function.
|
||||
+
|
||||
+ :param args:
|
||||
+ :param kwargs:
|
||||
+ :return:
|
||||
+ '''
|
||||
+ if self._curr_version < self._exp_version:
|
||||
+ msg = ['The function "{f_name}" is deprecated and will '
|
||||
+ 'expire in version "{version_name}".'.format(f_name=self._function.func_name,
|
||||
+ version_name=self._exp_version_name)]
|
||||
+ if self._successor:
|
||||
+ msg.append('Use successor "{successor}" instead.'.format(successor=self._successor))
|
||||
+ log.warning(' '.join(msg))
|
||||
+ else:
|
||||
+ msg = ['The lifetime of the function "{f_name}" expired.'.format(f_name=self._function.func_name)]
|
||||
+ if self._successor:
|
||||
+ msg.append('Please use its successor "{successor}" instead.'.format(successor=self._successor))
|
||||
+ log.warning(' '.join(msg))
|
||||
+ raise CommandExecutionError(' '.join(msg))
|
||||
+ return self._call_function(kwargs)
|
||||
+ return _decorate
|
||||
+
|
||||
+
|
||||
+is_deprecated = _IsDeprecated
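For orientation, a minimal sketch of how a module would consume the is_deprecated decorator defined above; the function and version names here are made up, and the behaviour shown (a warning before the expiration version, a CommandExecutionError after it) is the one described in the class docstring:

    # Hypothetical module using the decorator added in this patch.
    from salt.utils.decorators import is_deprecated
    from salt.exceptions import CommandExecutionError

    @is_deprecated(globals(), "Beryllium", with_successor="status")
    def old_status():
        return {"uptime": "unknown"}

    try:
        old_status()   # before "Beryllium": logs a warning and still runs
    except CommandExecutionError:
        # once the current Salt version reaches "Beryllium", the call is
        # blocked and the error message points to the successor instead
        pass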
|
||||
+
|
||||
+
|
||||
+class _WithDeprecated(_DeprecationDecorator):
|
||||
+ '''
|
||||
+ This decorator should be used with the successor functions
|
||||
+ to mark them as new and alter their behavior in a corresponding way.
|
||||
+ It is used alone if a function content or function signature
|
||||
+ needs to be replaced, leaving the name of the function the same.
|
||||
+ In case the function needs to be renamed or dropped, it has
|
||||
+ to be used together with the 'is_deprecated' decorator.
|
||||
+
|
||||
+ It has the following functionality:
|
||||
+
|
||||
+ 1. Put a warning level message to the log, in case a component
|
||||
+ is using its deprecated version.
|
||||
+
|
||||
+ 2. Switch between old and new function in case an older version
|
||||
+ is configured for the desired use.
|
||||
+
|
||||
+ 3. Raise an exception, if deprecated version reached EOL and
|
||||
+ point to the new version.
|
||||
+
|
||||
+ Usage of this decorator is as follows. If 'with_name' is not specified,
|
||||
+ then the name of the deprecated function is assumed with the "_" prefix.
|
||||
+ In this case, in order to deprecate a function, it is required:
|
||||
+
|
||||
+ - Add a prefix "_" to an existing function. E.g.: "foo()" to "_foo()".
|
||||
+
|
||||
+ - Implement a new function with exactly the same name, just without
|
||||
+ the prefix "_".
|
||||
+
|
||||
+ Example:
|
||||
+
|
||||
+ from salt.util.decorators import with_deprecated
|
||||
+
|
||||
+ @with_deprecated(globals(), "Beryllium")
|
||||
+ def foo():
|
||||
+ "This is a new function"
|
||||
+
|
||||
+ def _foo():
|
||||
+ "This is a deprecated function"
|
||||
+
|
||||
+
|
||||
+ In case there is a need to deprecate a function and rename it,
|
||||
+ the decorator should be used with the 'with_name' parameter. This
|
||||
+ parameter is pointing to the existing deprecated function. In this
|
||||
+ case the deprecation process is as follows:
|
||||
+
|
||||
+ - Leave a deprecated function without changes, as is.
|
||||
+
|
||||
+ - Implement a new function and decorate it with this decorator.
|
||||
+
|
||||
+ - Set a parameter 'with_name' to the deprecated function.
|
||||
+
|
||||
+ - If a new function has a different name than the deprecated one,
|
||||
+ decorate a deprecated function with the 'is_deprecated' decorator
|
||||
+ in order to let the function have a deprecated behavior.
|
||||
+
|
||||
+ Example:
|
||||
+
|
||||
+ from salt.util.decorators import with_deprecated
|
||||
+
|
||||
+ @with_deprecated(globals(), "Beryllium", with_name="an_old_function")
|
||||
+ def a_new_function():
|
||||
+ "This is a new function"
|
||||
+
|
||||
+ @is_deprecated(globals(), "Beryllium", with_successor="a_new_function")
|
||||
+ def an_old_function():
|
||||
+ "This is a deprecated function"
|
||||
+
|
||||
+ '''
|
||||
+ MODULE_NAME = '__virtualname__'
|
||||
+ CFG_KEY = 'use_deprecated'
|
||||
+
|
||||
+ def __init__(self, globals, version, with_name=None):
|
||||
+ '''
|
||||
+ Constructor of the decorator 'with_deprecated'
|
||||
+
|
||||
+ :param globals:
|
||||
+ :param version:
|
||||
+ :param with_name:
|
||||
+ :return:
|
||||
+ '''
|
||||
+ _DeprecationDecorator.__init__(self, globals, version)
|
||||
+ self._with_name = with_name
|
||||
+
|
||||
+ def _set_function(self, function):
|
||||
+ '''
|
||||
+ Based on the configuration, set to execute an old or a new function.
|
||||
+ :return:
|
||||
+ '''
|
||||
+ full_name = "{m_name}.{f_name}".format(m_name=self._globals.get(self.MODULE_NAME, ''),
|
||||
+ f_name=function.func_name)
|
||||
+ if full_name.startswith("."):
|
||||
+ self._raise_later = CommandExecutionError('Module not found for function "{f_name}"'.format(
|
||||
+ f_name=function.func_name))
|
||||
+
|
||||
+ if full_name in self._options.get(self.CFG_KEY, list()):
|
||||
+ self._function = self._globals.get(self._with_name or "_{0}".format(function.func_name))
|
||||
+
|
||||
+ def _is_used_deprecated(self):
|
||||
+ '''
|
||||
+ Returns True if a component configuration is explicitly
|
||||
+ asking to use an old version of the deprecated function.
|
||||
+
|
||||
+ :return:
|
||||
+ '''
|
||||
+ return "{m_name}.{f_name}".format(m_name=self._globals.get(self.MODULE_NAME, ''),
|
||||
+ f_name=self._orig_f_name) in self._options.get(self.CFG_KEY, list())
|
||||
+
|
||||
+ def __call__(self, function):
|
||||
+ '''
|
||||
+ Callable method of the decorator object when
|
||||
+ the decorated function gets called.
|
||||
+
|
||||
+ :param function:
|
||||
+ :return:
|
||||
+ '''
|
||||
+ _DeprecationDecorator.__call__(self, function)
|
||||
+
|
||||
+ def _decorate(*args, **kwargs):
|
||||
+ '''
|
||||
+ Decorator function.
|
||||
+
|
||||
+ :param args:
|
||||
+ :param kwargs:
|
||||
+ :return:
|
||||
+ '''
|
||||
+ self._set_function(function)
|
||||
+ if self._is_used_deprecated():
|
||||
+ if self._curr_version < self._exp_version:
|
||||
+ msg = list()
|
||||
+ if self._with_name:
|
||||
+ msg.append('The function "{f_name}" is deprecated and will '
|
||||
+ 'expire in version "{version_name}".'.format(
|
||||
+ f_name=self._with_name.startswith("_") and self._orig_f_name or self._with_name,
|
||||
+ version_name=self._exp_version_name))
|
||||
+ else:
|
||||
+ msg.append('The function is using its deprecated version and will '
|
||||
+ 'expire in version "{version_name}".'.format(version_name=self._exp_version_name))
|
||||
+ msg.append('Use its successor "{successor}" instead.'.format(successor=self._orig_f_name))
|
||||
+ log.warning(' '.join(msg))
|
||||
+ else:
|
||||
+ msg_patt = 'The lifetime of the function "{f_name}" expired.'
|
||||
+ if '_' + self._orig_f_name == self._function.func_name:
|
||||
+ msg = [msg_patt.format(f_name=self._orig_f_name),
|
||||
+ 'Please turn off its deprecated version in the configuration']
|
||||
+ else:
|
||||
+ msg = ['Although function "{f_name}" is called, an alias "{f_alias}" '
|
||||
+ 'is configured as its deprecated version.'.format(
|
||||
+ f_name=self._orig_f_name, f_alias=self._with_name or self._orig_f_name),
|
||||
+ msg_patt.format(f_name=self._with_name or self._orig_f_name),
|
||||
+ 'Please use its successor "{successor}" instead.'.format(successor=self._orig_f_name)]
|
||||
+ log.error(' '.join(msg))
|
||||
+ raise CommandExecutionError(' '.join(msg))
|
||||
+ return self._call_function(kwargs)
|
||||
+
|
||||
+ _decorate.__doc__ = self._function.__doc__
|
||||
+ return _decorate
|
||||
+
|
||||
+
|
||||
+with_deprecated = _WithDeprecated
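A minimal sketch of the renaming pattern that the with_deprecated docstring describes, together with the configuration key that _set_function() and _is_used_deprecated() consult; the virtual module name 'test' and the option layout mirror the unit tests below, everything else is illustrative only:

    # Hypothetical execution module: the new implementation keeps the
    # public name, the deprecated body is kept under a "_" prefix.
    from salt.utils.decorators import with_deprecated

    __virtualname__ = 'test'

    @with_deprecated(globals(), "Beryllium")
    def uptime():
        '''New implementation, used by default.'''
        return {'seconds': 773865}

    def _uptime():
        '''Old implementation, only used when explicitly requested.'''
        return 'very often'

    # Selecting the deprecated code path happens via the minion options,
    # keyed by "<__virtualname__>.<function>":
    #
    #   use_deprecated:
    #     - test.uptime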
|
||||
diff --git a/tests/unit/modules/status_test.py b/tests/unit/modules/status_test.py
|
||||
index 191da09..b5cee4f 100644
|
||||
--- a/tests/unit/modules/status_test.py
|
||||
+++ b/tests/unit/modules/status_test.py
|
||||
@@ -5,15 +5,14 @@ from __future__ import absolute_import
|
||||
|
||||
# Import Salt Libs
|
||||
from salt.modules import status
|
||||
+from salt.exceptions import CommandExecutionError
|
||||
|
||||
# Import Salt Testing Libs
|
||||
-from salttesting import skipIf, TestCase
|
||||
+from salttesting import TestCase
|
||||
from salttesting.helpers import ensure_in_syspath
|
||||
from salttesting.mock import (
|
||||
MagicMock,
|
||||
patch,
|
||||
- NO_MOCK,
|
||||
- NO_MOCK_REASON
|
||||
)
|
||||
|
||||
ensure_in_syspath('../../')
|
||||
@@ -22,36 +21,67 @@ ensure_in_syspath('../../')
|
||||
status.__salt__ = {}
|
||||
|
||||
|
||||
-@skipIf(NO_MOCK, NO_MOCK_REASON)
|
||||
class StatusTestCase(TestCase):
|
||||
'''
|
||||
test modules.status functions
|
||||
'''
|
||||
+
|
||||
def test_uptime(self):
|
||||
'''
|
||||
- test modules.status.uptime function
|
||||
+ Test modules.status.uptime function, new version
|
||||
+ :return:
|
||||
+ '''
|
||||
+ class ProcUptime(object):
|
||||
+ def __init__(self, *args, **kwargs):
|
||||
+ self.data = "773865.18 1003405.46"
|
||||
+
|
||||
+ def read(self):
|
||||
+ return self.data
|
||||
+
|
||||
+ with patch.dict(status.__salt__, {'cmd.run': MagicMock(return_value="1\n2\n3")}):
|
||||
+ with patch('os.path.exists', MagicMock(return_value=True)):
|
||||
+ with patch('time.time', MagicMock(return_value=1458821523.72)):
|
||||
+ status.open = ProcUptime
|
||||
+ u_time = status.uptime()
|
||||
+ self.assertEqual(u_time['users'], 3)
|
||||
+ self.assertEqual(u_time['seconds'], 773865)
|
||||
+ self.assertEqual(u_time['days'], 8)
|
||||
+ self.assertEqual(u_time['time'], '22:57')
|
||||
+
|
||||
+ def test_uptime_failure(self):
|
||||
+ '''
|
||||
+ Test modules.status.uptime function should raise an exception if /proc/uptime does not exist.
|
||||
+ :return:
|
||||
+ '''
|
||||
+ with patch('os.path.exists', MagicMock(return_value=False)):
|
||||
+ with self.assertRaises(CommandExecutionError):
|
||||
+ status.uptime()
|
||||
+
|
||||
+ def test_deprecated_uptime(self):
|
||||
+ '''
|
||||
+ test modules.status.uptime function, deprecated version
|
||||
'''
|
||||
mock_uptime = 'very often'
|
||||
mock_run = MagicMock(return_value=mock_uptime)
|
||||
with patch.dict(status.__salt__, {'cmd.run': mock_run}):
|
||||
- self.assertEqual(status.uptime(), mock_uptime)
|
||||
+ self.assertEqual(status._uptime(), mock_uptime)
|
||||
|
||||
mock_uptime = 'very idle'
|
||||
mock_run = MagicMock(return_value=mock_uptime)
|
||||
with patch.dict(status.__salt__, {'cmd.run': mock_run}):
|
||||
with patch('os.path.exists', MagicMock(return_value=True)):
|
||||
- self.assertEqual(status.uptime(human_readable=False), mock_uptime.split()[0])
|
||||
+ self.assertEqual(status._uptime(human_readable=False), mock_uptime.split()[0])
|
||||
|
||||
mock_uptime = ''
|
||||
mock_return = 'unexpected format in /proc/uptime'
|
||||
mock_run = MagicMock(return_value=mock_uptime)
|
||||
with patch.dict(status.__salt__, {'cmd.run': mock_run}):
|
||||
with patch('os.path.exists', MagicMock(return_value=True)):
|
||||
- self.assertEqual(status.uptime(human_readable=False), mock_return)
|
||||
+ self.assertEqual(status._uptime(human_readable=False), mock_return)
|
||||
|
||||
mock_return = 'cannot find /proc/uptime'
|
||||
with patch('os.path.exists', MagicMock(return_value=False)):
|
||||
- self.assertEqual(status.uptime(human_readable=False), mock_return)
|
||||
+ self.assertEqual(status._uptime(human_readable=False), mock_return)
|
||||
|
||||
|
||||
if __name__ == '__main__':
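The assertions in test_uptime above imply roughly the following shape for the reworked status.uptime; this is a sketch reconstructed only from the mocked test data (773865.18 seconds in /proc/uptime, three lines from cmd.run, presumably a who-style user listing), not the actual module code:

    # Sketch of the values the new status.uptime is expected to return.
    def uptime_sketch(proc_uptime="773865.18 1003405.46",
                      who_output="1\n2\n3"):
        seconds = int(float(proc_uptime.split()[0]))           # 773865
        return {
            'seconds': seconds,
            'days': seconds // 86400,                          # 8
            'time': '{0}:{1:02d}'.format(seconds // 3600 % 24,
                                         seconds // 60 % 60),  # '22:57'
            'users': len(who_output.split('\n')),              # 3
        }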
|
||||
diff --git a/tests/unit/utils/decorators_test.py b/tests/unit/utils/decorators_test.py
|
||||
new file mode 100644
|
||||
index 0000000..4078340
|
||||
--- /dev/null
|
||||
+++ b/tests/unit/utils/decorators_test.py
|
||||
@@ -0,0 +1,232 @@
|
||||
+# -*- coding: utf-8 -*-
|
||||
+'''
|
||||
+ :codeauthor: :email:`Bo Maryniuk (bo@suse.de)`
|
||||
+ unit.utils.decorators_test
|
||||
+'''
|
||||
+
|
||||
+# Import Python libs
|
||||
+from __future__ import absolute_import
|
||||
+
|
||||
+# Import Salt Testing libs
|
||||
+from salttesting import TestCase
|
||||
+from salttesting.helpers import ensure_in_syspath
|
||||
+from salt.utils import decorators
|
||||
+from salt.version import SaltStackVersion
|
||||
+from salt.exceptions import CommandExecutionError
|
||||
+
|
||||
+ensure_in_syspath('../../')
|
||||
+
|
||||
+
|
||||
+class DummyLogger(object):
|
||||
+ '''
|
||||
+ Dummy logger accepts everything and simply logs
|
||||
+ '''
|
||||
+ def __init__(self, messages):
|
||||
+ self._messages = messages
|
||||
+
|
||||
+ def __getattr__(self, item):
|
||||
+ return self._log
|
||||
+
|
||||
+ def _log(self, msg):
|
||||
+ self._messages.append(msg)
|
||||
+
|
||||
+
|
||||
+class DecoratorsTest(TestCase):
|
||||
+ '''
|
||||
+ Testing decorators.
|
||||
+ '''
|
||||
+ def old_function(self):
|
||||
+ return "old"
|
||||
+
|
||||
+ def new_function(self):
|
||||
+ return "new"
|
||||
+
|
||||
+ def _mk_version(self, name):
|
||||
+ '''
|
||||
+ Make a version
|
||||
+
|
||||
+ :return:
|
||||
+ '''
|
||||
+ return name, SaltStackVersion.from_name(name)
|
||||
+
|
||||
+ def setUp(self):
|
||||
+ '''
|
||||
+ Setup a test
|
||||
+ :return:
|
||||
+ '''
|
||||
+ self.globs = {
|
||||
+ '__virtualname__': 'test',
|
||||
+ '__opts__': {},
|
||||
+ 'old_function': self.old_function,
|
||||
+ 'new_function': self.new_function,
|
||||
+ }
|
||||
+ self.messages = list()
|
||||
+ decorators.log = DummyLogger(self.messages)
|
||||
+
|
||||
+ def test_is_deprecated_version_eol(self):
|
||||
+ '''
|
||||
+ Use of is_deprecated will result in an exception,
|
||||
+ if the expiration version is lower than the current version.
|
||||
+ A successor function is not pointed out.
|
||||
+
|
||||
+ :return:
|
||||
+ '''
|
||||
+ depr = decorators.is_deprecated(self.globs, "Helium")
|
||||
+ depr._curr_version = self._mk_version("Beryllium")[1]
|
||||
+ with self.assertRaises(CommandExecutionError):
|
||||
+ depr(self.old_function)()
|
||||
+ self.assertEqual(self.messages,
|
||||
+ ['The lifetime of the function "old_function" expired.'])
|
||||
+
|
||||
+ def test_is_deprecated_with_successor_eol(self):
|
||||
+ '''
|
||||
+ Use of is_deprecated will result in an exception,
|
||||
+ if the expiration version is lower than the current version.
|
||||
+ A successor function is pointed out.
|
||||
+
|
||||
+ :return:
|
||||
+ '''
|
||||
+ depr = decorators.is_deprecated(self.globs, "Helium", with_successor="new_function")
|
||||
+ depr._curr_version = self._mk_version("Beryllium")[1]
|
||||
+ with self.assertRaises(CommandExecutionError):
|
||||
+ depr(self.old_function)()
|
||||
+ self.assertEqual(self.messages,
|
||||
+ ['The lifetime of the function "old_function" expired. '
|
||||
+ 'Please use its successor "new_function" instead.'])
|
||||
+
|
||||
+ def test_is_deprecated(self):
|
||||
+ '''
|
||||
+ Use of is_deprecated will result in a log message,
|
||||
+ if the expiration version is higher than the current version.
|
||||
+ A successor function is not pointed out.
|
||||
+
|
||||
+ :return:
|
||||
+ '''
|
||||
+ depr = decorators.is_deprecated(self.globs, "Beryllium")
|
||||
+ depr._curr_version = self._mk_version("Helium")[1]
|
||||
+ self.assertEqual(depr(self.old_function)(), self.old_function())
|
||||
+ self.assertEqual(self.messages,
|
||||
+ ['The function "old_function" is deprecated '
|
||||
+ 'and will expire in version "Beryllium".'])
|
||||
+
|
||||
+ def test_is_deprecated_with_successor(self):
|
||||
+ '''
|
||||
+ Use of is_deprecated will result in a log message,
|
||||
+ if the expiration version is higher than the current version.
|
||||
+ A successor function is pointed out.
|
||||
+
|
||||
+ :return:
|
||||
+ '''
|
||||
+ depr = decorators.is_deprecated(self.globs, "Beryllium", with_successor="old_function")
|
||||
+ depr._curr_version = self._mk_version("Helium")[1]
|
||||
+ self.assertEqual(depr(self.old_function)(), self.old_function())
|
||||
+ self.assertEqual(self.messages,
|
||||
+ ['The function "old_function" is deprecated '
|
||||
+ 'and will expire in version "Beryllium". '
|
||||
+ 'Use successor "old_function" instead.'])
|
||||
+
|
||||
+ def test_with_deprecated_notfound(self):
|
||||
+ '''
|
||||
+ Test with_deprecated should raise an exception, if a same name
|
||||
+ function with the "_" prefix not implemented.
|
||||
+
|
||||
+ :return:
|
||||
+ '''
|
||||
+ self.globs['__opts__']['use_deprecated'] = ['test.new_function']
|
||||
+ depr = decorators.with_deprecated(self.globs, "Beryllium")
|
||||
+ depr._curr_version = self._mk_version("Helium")[1]
|
||||
+ with self.assertRaises(CommandExecutionError):
|
||||
+ depr(self.new_function)()
|
||||
+ self.assertEqual(self.messages,
|
||||
+ ['The function is using its deprecated version and will expire in version "Beryllium". '
|
||||
+ 'Use its successor "new_function" instead.'])
|
||||
+
|
||||
+ def test_with_deprecated_found(self):
|
||||
+ '''
|
||||
+ Test with_deprecated should not raise an exception, if a same name
|
||||
+ function with the "_" prefix is implemented, but should use
|
||||
+ an old version instead, if "use_deprecated" is requested.
|
||||
+
|
||||
+ :return:
|
||||
+ '''
|
||||
+ self.globs['__opts__']['use_deprecated'] = ['test.new_function']
|
||||
+ self.globs['_new_function'] = self.old_function
|
||||
+ depr = decorators.with_deprecated(self.globs, "Beryllium")
|
||||
+ depr._curr_version = self._mk_version("Helium")[1]
|
||||
+ self.assertEqual(depr(self.new_function)(), self.old_function())
|
||||
+ log_msg = ['The function is using its deprecated version and will expire in version "Beryllium". '
|
||||
+ 'Use its successor "new_function" instead.']
|
||||
+ self.assertEqual(self.messages, log_msg)
|
||||
+
|
||||
+ def test_with_deprecated_found_eol(self):
|
||||
+ '''
|
||||
+ Test with_deprecated should raise an exception, if a same name
|
||||
+ function with the "_" prefix is implemented, "use_deprecated" is requested
|
||||
+ and EOL is reached.
|
||||
+
|
||||
+ :return:
|
||||
+ '''
|
||||
+ self.globs['__opts__']['use_deprecated'] = ['test.new_function']
|
||||
+ self.globs['_new_function'] = self.old_function
|
||||
+ depr = decorators.with_deprecated(self.globs, "Helium")
|
||||
+ depr._curr_version = self._mk_version("Beryllium")[1]
|
||||
+ with self.assertRaises(CommandExecutionError):
|
||||
+ depr(self.new_function)()
|
||||
+ self.assertEqual(self.messages,
|
||||
+ ['Although function "new_function" is called, an alias "new_function" '
|
||||
+ 'is configured as its deprecated version. The lifetime of the function '
|
||||
+ '"new_function" expired. Please use its successor "new_function" instead.'])
|
||||
+
|
||||
+ def test_with_deprecated_no_conf(self):
|
||||
+ '''
|
||||
+ Test with_deprecated should not raise an exception, if a same name
|
||||
+ function with the "_" prefix is implemented, but should use
|
||||
+ a new version instead, if "use_deprecated" is not requested.
|
||||
+
|
||||
+ :return:
|
||||
+ '''
|
||||
+ self.globs['_new_function'] = self.old_function
|
||||
+ depr = decorators.with_deprecated(self.globs, "Beryllium")
|
||||
+ depr._curr_version = self._mk_version("Helium")[1]
|
||||
+ self.assertEqual(depr(self.new_function)(), self.new_function())
|
||||
+ self.assertFalse(self.messages)
|
||||
+
|
||||
+ def test_with_deprecated_with_name(self):
|
||||
+ '''
|
||||
+ Test with_deprecated should not raise an exception, if a different name
|
||||
+ function is implemented and specified with the "with_name" parameter,
|
||||
+ but should use an old version instead and log a warning log message.
|
||||
+
|
||||
+ :return:
|
||||
+ '''
|
||||
+ self.globs['__opts__']['use_deprecated'] = ['test.new_function']
|
||||
+ depr = decorators.with_deprecated(self.globs, "Beryllium", with_name="old_function")
|
||||
+ depr._curr_version = self._mk_version("Helium")[1]
|
||||
+ self.assertEqual(depr(self.new_function)(), self.old_function())
|
||||
+ self.assertEqual(self.messages,
|
||||
+ ['The function "old_function" is deprecated and will expire in version "Beryllium". '
|
||||
+ 'Use its successor "new_function" instead.'])
|
||||
+
|
||||
+ def test_with_deprecated_with_name_eol(self):
|
||||
+ '''
|
||||
+ Test with_deprecated should raise an exception, if a different name
|
||||
+ function is implemented and specified with the "with_name" parameter
|
||||
+ and EOL is reached.
|
||||
+
|
||||
+ :return:
|
||||
+ '''
|
||||
+ self.globs['__opts__']['use_deprecated'] = ['test.new_function']
|
||||
+ depr = decorators.with_deprecated(self.globs, "Helium", with_name="old_function")
|
||||
+ depr._curr_version = self._mk_version("Beryllium")[1]
|
||||
+ with self.assertRaises(CommandExecutionError):
|
||||
+ depr(self.new_function)()
|
||||
+ self.assertEqual(self.messages,
|
||||
+ ['Although function "new_function" is called, '
|
||||
+ 'an alias "old_function" is configured as its deprecated version. '
|
||||
+ 'The lifetime of the function "old_function" expired. '
|
||||
+ 'Please use its successor "new_function" instead.'])
|
||||
+
|
||||
+
|
||||
+if __name__ == '__main__':
|
||||
+ from integration import run_tests
|
||||
+ run_tests(DecoratorsTest, needs_daemon=False)
|
||||
--
|
||||
2.1.4
|
||||
|
@ -1,44 +0,0 @@
|
||||
From 1f6af694ef7296f4a32d4adcb658f66865a4c38a Mon Sep 17 00:00:00 2001
|
||||
From: Bo Maryniuk <bo@suse.de>
|
||||
Date: Thu, 11 Feb 2016 18:26:51 +0100
|
||||
Subject: [PATCH 08/22] Fix types in the output data and return just a list of
|
||||
products
|
||||
|
||||
---
|
||||
salt/modules/zypper.py | 20 ++++++++++++--------
|
||||
1 file changed, 12 insertions(+), 8 deletions(-)
|
||||
|
||||
diff --git a/salt/modules/zypper.py b/salt/modules/zypper.py
|
||||
index 4699904..76170e6 100644
|
||||
--- a/salt/modules/zypper.py
|
||||
+++ b/salt/modules/zypper.py
|
||||
@@ -1216,14 +1216,18 @@ def list_products(all=False):
|
||||
doc = dom.parseString(__salt__['cmd.run'](("zypper -x products{0}".format(not all and ' -i' or '')),
|
||||
output_loglevel='trace'))
|
||||
for prd in doc.getElementsByTagName('product-list')[0].getElementsByTagName('product'):
|
||||
- p_data = dict()
|
||||
- p_nfo = dict(prd.attributes.items())
|
||||
- p_name = p_nfo.pop('name')
|
||||
- p_data[p_name] = p_nfo
|
||||
- p_data[p_name]['eol'] = prd.getElementsByTagName('endoflife')[0].getAttribute('text')
|
||||
- descr = _get_first_aggregate_text(prd.getElementsByTagName('description'))
|
||||
- p_data[p_name]['description'] = " ".join([line.strip() for line in descr.split(os.linesep)])
|
||||
- ret.append(p_data)
|
||||
+ p_nfo = dict()
|
||||
+ for k_p_nfo, v_p_nfo in prd.attributes.items():
|
||||
+ p_nfo[k_p_nfo] = k_p_nfo not in ['isbase', 'installed'] and v_p_nfo or v_p_nfo == 'true'
|
||||
+ p_nfo['eol'] = prd.getElementsByTagName('endoflife')[0].getAttribute('text')
|
||||
+ p_nfo['eol_t'] = int(prd.getElementsByTagName('endoflife')[0].getAttribute('time_t'))
|
||||
+ p_nfo['description'] = " ".join(
|
||||
+ [line.strip() for line in _get_first_aggregate_text(
|
||||
+ prd.getElementsByTagName('description')
|
||||
+ ).split(os.linesep)]
|
||||
+ )
|
||||
+
|
||||
+ ret.append(p_nfo)
|
||||
|
||||
return ret
|
||||
|
||||
--
|
||||
2.1.4
|
||||
|
@ -1,59 +0,0 @@
|
||||
From 8d77e22c0c570a0a725216f70c41d4fe00a184ca Mon Sep 17 00:00:00 2001
|
||||
From: "Gareth J. Greenaway" <gareth@wiked.org>
|
||||
Date: Sat, 6 Feb 2016 15:52:17 -0800
|
||||
Subject: [PATCH 09/22] The functions in the state module that return a retcode
|
||||
when something goes wrong, eg. a 1 or a 2, do not return a 0 when things go
|
||||
the way they're supposed to go. With the recent changes to the scheduler to
|
||||
ensure that the retcode is returned this is problematic and results in
|
||||
exceptions when a state function is run from the schedule. This simple fix
|
||||
ensures a default retcode of 0 exists; it is then overridden in the
|
||||
_set_retcode function if there is an issue with the run
|
||||
|
||||
---
|
||||
salt/modules/state.py | 9 +++++----
|
||||
1 file changed, 5 insertions(+), 4 deletions(-)
|
||||
|
||||
diff --git a/salt/modules/state.py b/salt/modules/state.py
|
||||
index 9cb195b..27e588c 100644
|
||||
--- a/salt/modules/state.py
|
||||
+++ b/salt/modules/state.py
|
||||
@@ -70,6 +70,10 @@ def _set_retcode(ret):
|
||||
'''
|
||||
Set the return code based on the data back from the state system
|
||||
'''
|
||||
+
|
||||
+ # Set default retcode to 0
|
||||
+ __context__['retcode'] = 0
|
||||
+
|
||||
if isinstance(ret, list):
|
||||
__context__['retcode'] = 1
|
||||
return
|
||||
@@ -576,7 +580,6 @@ def highstate(test=None,
|
||||
|
||||
serial = salt.payload.Serial(__opts__)
|
||||
cache_file = os.path.join(__opts__['cachedir'], 'highstate.p')
|
||||
-
|
||||
_set_retcode(ret)
|
||||
# Work around Windows multiprocessing bug, set __opts__['test'] back to
|
||||
# value from before this function was run.
|
||||
@@ -770,7 +773,6 @@ def sls(mods,
|
||||
except (IOError, OSError):
|
||||
msg = 'Unable to write to SLS cache file {0}. Check permission.'
|
||||
log.error(msg.format(cache_file))
|
||||
-
|
||||
_set_retcode(ret)
|
||||
# Work around Windows multiprocessing bug, set __opts__['test'] back to
|
||||
# value from before this function was run.
|
||||
@@ -876,8 +878,7 @@ def show_highstate(queue=False, **kwargs):
|
||||
ret = st_.compile_highstate()
|
||||
finally:
|
||||
st_.pop_active()
|
||||
- if isinstance(ret, list):
|
||||
- __context__['retcode'] = 1
|
||||
+ _set_retcode(ret)
|
||||
return ret
|
||||
|
||||
|
||||
--
|
||||
2.1.4
|
||||
|
@ -0,0 +1,48 @@
|
||||
From cb588505919b6c74ed824d26a184eec0f47a585b Mon Sep 17 00:00:00 2001
|
||||
From: Michael Calmer <mc@suse.de>
|
||||
Date: Mon, 4 Apr 2016 09:49:31 +0200
|
||||
Subject: [PATCH 09/12] fix sorting by latest version when called with an
|
||||
attribute
|
||||
|
||||
---
|
||||
salt/modules/rpm.py | 7 ++++++-
|
||||
1 file changed, 6 insertions(+), 1 deletion(-)
|
||||
|
||||
diff --git a/salt/modules/rpm.py b/salt/modules/rpm.py
|
||||
index 6026f18..1469368 100644
|
||||
--- a/salt/modules/rpm.py
|
||||
+++ b/salt/modules/rpm.py
|
||||
@@ -471,6 +471,7 @@ def info(*packages, **attr):
|
||||
"url": "%|URL?{url: %{URL}\\n}|",
|
||||
"summary": "summary: %{SUMMARY}\\n",
|
||||
"description": "description:\\n%{DESCRIPTION}\\n",
|
||||
+ "edition": "edition: %|EPOCH?{%{EPOCH}:}|%{VERSION}-%{RELEASE}\\n",
|
||||
}
|
||||
|
||||
attr = attr.get('attr', None) and attr['attr'].split(",") or None
|
||||
@@ -484,6 +485,9 @@ def info(*packages, **attr):
|
||||
if 'name' not in attr:
|
||||
attr.append('name')
|
||||
query.append(attr_map['name'])
|
||||
+ if 'edition' not in attr:
|
||||
+ attr.append('edition')
|
||||
+ query.append(attr_map['edition'])
|
||||
else:
|
||||
for attr_k, attr_v in attr_map.iteritems():
|
||||
if attr_k != 'description':
|
||||
@@ -558,10 +562,11 @@ def info(*packages, **attr):
|
||||
# pick only latest versions
|
||||
# (in case multiple packages installed, e.g. kernel)
|
||||
ret = dict()
|
||||
- for pkg_data in reversed(sorted(_ret, cmp=lambda a_vrs, b_vrs: version_cmp(a_vrs['version'], b_vrs['version']))):
|
||||
+ for pkg_data in reversed(sorted(_ret, cmp=lambda a_vrs, b_vrs: version_cmp(a_vrs['edition'], b_vrs['edition']))):
|
||||
pkg_name = pkg_data.pop('name')
|
||||
if pkg_name not in ret:
|
||||
ret[pkg_name] = pkg_data.copy()
|
||||
+ del ret[pkg_name]['edition']
|
||||
|
||||
return ret
|
||||
|
||||
--
|
||||
2.1.4
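Put differently, the temporary 'edition' field carries the full epoch:version-release string so that the newest of several installed instances of a package (e.g. kernel) wins the sort; a standalone sketch of that selection logic, with version_cmp standing in for pkg.version_cmp from the module:

    # Keep only the latest entry per package name, ordered by the
    # temporary 'edition' (epoch:version-release) field, then drop it.
    def pick_latest(entries, version_cmp):
        ret = {}
        for pkg in sorted(entries,
                          cmp=lambda a, b: version_cmp(a['edition'], b['edition']),
                          reverse=True):
            name = pkg.pop('name')
            if name not in ret:
                ret[name] = pkg.copy()
                del ret[name]['edition']
        return ret

Called with entries such as {'name': 'kernel-default', 'edition': '1:4.1.12-1.1', ...}, only the highest edition survives and the helper field never appears in the returned data.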
|
||||
|
@ -0,0 +1,29 @@
|
||||
From 336929a4cadca55b00dbf1cd33eb35d19f420c73 Mon Sep 17 00:00:00 2001
|
||||
From: Michael Calmer <mc@suse.de>
|
||||
Date: Tue, 5 Apr 2016 12:06:29 +0200
|
||||
Subject: [PATCH 10/12] Prevent metadata download when getting installed
|
||||
products
|
||||
|
||||
---
|
||||
salt/modules/zypper.py | 5 ++++-
|
||||
1 file changed, 4 insertions(+), 1 deletion(-)
|
||||
|
||||
diff --git a/salt/modules/zypper.py b/salt/modules/zypper.py
|
||||
index 63c473c..9702f42 100644
|
||||
--- a/salt/modules/zypper.py
|
||||
+++ b/salt/modules/zypper.py
|
||||
@@ -1309,7 +1309,10 @@ def list_products(all=False, refresh=False):
|
||||
|
||||
ret = list()
|
||||
OEM_PATH = "/var/lib/suseRegister/OEM"
|
||||
- cmd = _zypper('-x', 'products')
|
||||
+ cmd = _zypper()
|
||||
+ if not all:
|
||||
+ cmd.append('--disable-repos')
|
||||
+ cmd.extend(['-x', 'products'])
|
||||
if not all:
|
||||
cmd.append('-i')
|
||||
|
||||
--
|
||||
2.1.4
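The change above only affects how the zypper argument list is assembled; a small sketch of the resulting command, under the assumption that _zypper() yields the base ['zypper', '--non-interactive'] list builder:

    # Sketch of the command built by list_products() after this patch.
    def build_products_cmd(all=False):
        cmd = ['zypper', '--non-interactive']
        if not all:
            cmd.append('--disable-repos')   # skip repository metadata download
        cmd.extend(['-x', 'products'])
        if not all:
            cmd.append('-i')                # only installed products
        return cmd

    # build_products_cmd() -> ['zypper', '--non-interactive',
    #                          '--disable-repos', '-x', 'products', '-i']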
|
||||
|
@ -1,49 +0,0 @@
|
||||
From a7a0b80b0ca22a0a898ac4a4f671c08337f8f996 Mon Sep 17 00:00:00 2001
|
||||
From: Michael Calmer <mc@suse.de>
|
||||
Date: Tue, 16 Feb 2016 12:56:24 +0100
|
||||
Subject: [PATCH 10/22] add handling for OEM products
|
||||
|
||||
---
|
||||
salt/modules/zypper.py | 12 +++++++++++-
|
||||
1 file changed, 11 insertions(+), 1 deletion(-)
|
||||
|
||||
diff --git a/salt/modules/zypper.py b/salt/modules/zypper.py
|
||||
index 76170e6..6930f1a 100644
|
||||
--- a/salt/modules/zypper.py
|
||||
+++ b/salt/modules/zypper.py
|
||||
@@ -1205,6 +1205,9 @@ def list_products(all=False):
|
||||
all
|
||||
List all products available or only installed. Default is False.
|
||||
|
||||
+ Includes handling for OEM products, which read the OEM productline file
|
||||
+ and overwrite the release value.
|
||||
+
|
||||
CLI Examples:
|
||||
|
||||
.. code-block:: bash
|
||||
@@ -1213,6 +1216,7 @@ def list_products(all=False):
|
||||
salt '*' pkg.list_products all=True
|
||||
'''
|
||||
ret = list()
|
||||
+ OEM_PATH = "/var/lib/suseRegister/OEM"
|
||||
doc = dom.parseString(__salt__['cmd.run'](("zypper -x products{0}".format(not all and ' -i' or '')),
|
||||
output_loglevel='trace'))
|
||||
for prd in doc.getElementsByTagName('product-list')[0].getElementsByTagName('product'):
|
||||
@@ -1226,7 +1230,13 @@ def list_products(all=False):
|
||||
prd.getElementsByTagName('description')
|
||||
).split(os.linesep)]
|
||||
)
|
||||
-
|
||||
+ if 'productline' in p_nfo and p_nfo['productline']:
|
||||
+ oem_file = os.path.join(OEM_PATH, p_nfo['productline'])
|
||||
+ if os.path.isfile(oem_file):
|
||||
+ with salt.utils.fopen(oem_file, 'r') as rfile:
|
||||
+ oem_release = rfile.readline().strip()
|
||||
+ if oem_release:
|
||||
+ p_nfo['release'] = oem_release
|
||||
ret.append(p_nfo)
|
||||
|
||||
return ret
|
||||
--
|
||||
2.1.4
|
||||
|
@ -0,0 +1,146 @@
|
||||
From aae1c09957eab3c89a6c8f78a579cdf9dcfbe188 Mon Sep 17 00:00:00 2001
|
||||
From: Bo Maryniuk <bo@suse.de>
|
||||
Date: Tue, 12 Apr 2016 13:52:35 +0200
|
||||
Subject: [PATCH 11/12] Check if EOL is available in a particular product
|
||||
(bsc#975093)
|
||||
|
||||
Update SLE11 SP3 data
|
||||
|
||||
Update SLE12 SP1 data
|
||||
|
||||
Adjust test values according to the testing data
|
||||
---
|
||||
salt/modules/zypper.py | 13 +++++++--
|
||||
.../unit/modules/zypp/zypper-products-sle11sp3.xml | 10 +++++++
|
||||
.../unit/modules/zypp/zypper-products-sle12sp1.xml | 8 ++++++
|
||||
tests/unit/modules/zypper_test.py | 32 ++++++++++++----------
|
||||
4 files changed, 45 insertions(+), 18 deletions(-)
|
||||
|
||||
diff --git a/salt/modules/zypper.py b/salt/modules/zypper.py
|
||||
index 9702f42..4ce5853 100644
|
||||
--- a/salt/modules/zypper.py
|
||||
+++ b/salt/modules/zypper.py
|
||||
@@ -1318,12 +1318,19 @@ def list_products(all=False, refresh=False):
|
||||
|
||||
call = __salt__['cmd.run_all'](cmd, output_loglevel='trace')
|
||||
doc = dom.parseString(_zypper_check_result(call, xml=True))
|
||||
- for prd in doc.getElementsByTagName('product-list')[0].getElementsByTagName('product'):
|
||||
+ product_list = doc.getElementsByTagName('product-list')
|
||||
+ if not product_list:
|
||||
+ return ret # No products found
|
||||
+
|
||||
+ for prd in product_list[0].getElementsByTagName('product'):
|
||||
p_nfo = dict()
|
||||
for k_p_nfo, v_p_nfo in prd.attributes.items():
|
||||
p_nfo[k_p_nfo] = k_p_nfo not in ['isbase', 'installed'] and v_p_nfo or v_p_nfo in ['true', '1']
|
||||
- p_nfo['eol'] = prd.getElementsByTagName('endoflife')[0].getAttribute('text')
|
||||
- p_nfo['eol_t'] = int(prd.getElementsByTagName('endoflife')[0].getAttribute('time_t'))
|
||||
+
|
||||
+ eol = prd.getElementsByTagName('endoflife')
|
||||
+ if eol:
|
||||
+ p_nfo['eol'] = eol[0].getAttribute('text')
|
||||
+ p_nfo['eol_t'] = int(eol[0].getAttribute('time_t') or 0)
|
||||
p_nfo['description'] = " ".join(
|
||||
[line.strip() for line in _get_first_aggregate_text(
|
||||
prd.getElementsByTagName('description')
|
||||
diff --git a/tests/unit/modules/zypp/zypper-products-sle11sp3.xml b/tests/unit/modules/zypp/zypper-products-sle11sp3.xml
|
||||
index 89a85e3..99444fe 100644
|
||||
--- a/tests/unit/modules/zypp/zypper-products-sle11sp3.xml
|
||||
+++ b/tests/unit/modules/zypp/zypper-products-sle11sp3.xml
|
||||
@@ -31,7 +31,17 @@
|
||||
offers common management tools and technology
|
||||
certifications across the platform, and
|
||||
each product is enterprise-class.</description></product>
|
||||
+<product name="SUSE_SLES" version="11.3" release="1.201" epoch="0" arch="x86_64" productline="" registerrelease="" vendor="SUSE LINUX Products GmbH, Nuernberg, Germany" summary="SUSE Linux Enterprise Server 11 SP3 No EOL" shortname="" flavor="" isbase="0" repo="nu_novell_com:SLES11-SP3-Updates" installed="0">0x7ffdb538e948<description>SUSE Linux Enterprise offers a comprehensive
|
||||
+ suite of products built on a single code base.
|
||||
+ The platform addresses business needs from
|
||||
+ the smallest thin-client devices to the world’s
|
||||
+ most powerful high-performance computing
|
||||
+ and mainframe servers. SUSE Linux Enterprise
|
||||
+ offers common management tools and technology
|
||||
+ certifications across the platform, and
|
||||
+ each product is enterprise-class.</description></product>
|
||||
<product name="SUSE-Manager-Server" version="2.1" release="1.2" epoch="0" arch="x86_64" productline="" registerrelease="" vendor="SUSE LINUX Products GmbH, Nuernberg, Germany" summary="SUSE Manager Server" shortname="" flavor="cd" isbase="0" repo="nu_novell_com:SUSE-Manager-Server-2.1-Pool" installed="0"><endoflife time_t="0" text="1970-01-01T01:00:00+0100"/>0x7ffdb538e948<description>SUSE Manager Server appliance</description></product>
|
||||
<product name="SUSE-Manager-Server" version="2.1" release="1.2" epoch="0" arch="x86_64" productline="manager" registerrelease="" vendor="SUSE LINUX Products GmbH, Nuernberg, Germany" summary="SUSE Manager Server" shortname="" flavor="cd" isbase="1" repo="@System" installed="1"><endoflife time_t="0" text="1970-01-01T01:00:00+0100"/>0x7ffdb538e948<description>SUSE Manager Server appliance</description></product>
|
||||
+<product name="SUSE-Manager-Server-Broken-EOL" version="2.1" release="1.2" epoch="0" arch="x86_64" productline="manager" registerrelease="" vendor="SUSE LINUX Products GmbH, Nuernberg, Germany" summary="SUSE Manager Server" shortname="" flavor="cd" isbase="1" repo="@System" installed="1"><endoflife wrong="attribute"/>0x7ffdb538e948<description>SUSE Manager Server appliance</description></product>
|
||||
</product-list>
|
||||
</stream>
|
||||
diff --git a/tests/unit/modules/zypp/zypper-products-sle12sp1.xml b/tests/unit/modules/zypp/zypper-products-sle12sp1.xml
|
||||
index 1a50363..a086058 100644
|
||||
--- a/tests/unit/modules/zypp/zypper-products-sle12sp1.xml
|
||||
+++ b/tests/unit/modules/zypp/zypper-products-sle12sp1.xml
|
||||
@@ -24,6 +24,14 @@ provisioning.</description></product>
|
||||
SUSE Manager Tools provide packages required to connect to a
|
||||
SUSE Manager Server.
|
||||
<p></description></product>
|
||||
+<product name="sle-manager-tools-beta-no-eol" version="12" release="0" epoch="0" arch="x86_64" vendor="obs://build.suse.de/Devel:Galaxy:Manager:Head" summary="SUSE Manager Tools" repo="SUSE-Manager-Head" productline="" registerrelease="" shortname="Manager-Tools" flavor="POOL" isbase="false" installed="false"><registerflavor>extension</registerflavor><description><p>
|
||||
+ SUSE Manager Tools provide packages required to connect to a
|
||||
+ SUSE Manager Server.
|
||||
+ <p></description></product>
|
||||
+<product name="sle-manager-tools-beta-broken-eol" version="12" release="0" epoch="0" arch="x86_64" vendor="obs://build.suse.de/Devel:Galaxy:Manager:Head" summary="SUSE Manager Tools" repo="SUSE-Manager-Head" productline="" registerrelease="" shortname="Manager-Tools" flavor="POOL" isbase="false" installed="false"><endoflife wrong="attribute"/><registerflavor>extension</registerflavor><description><p>
|
||||
+ SUSE Manager Tools provide packages required to connect to a
|
||||
+ SUSE Manager Server.
|
||||
+ <p></description></product>
|
||||
<product name="SLES" version="12.1" release="0" epoch="0" arch="x86_64" vendor="SUSE" summary="SUSE Linux Enterprise Server 12 SP1" repo="@System" productline="sles" registerrelease="" shortname="SLES12-SP1" flavor="DVD" isbase="true" installed="true"><endoflife time_t="1730332800" text="2024-10-31T01:00:00+01"/><registerflavor/><description>SUSE Linux Enterprise offers a comprehensive
|
||||
suite of products built on a single code base.
|
||||
The platform addresses business needs from
|
||||
diff --git a/tests/unit/modules/zypper_test.py b/tests/unit/modules/zypper_test.py
|
||||
index 67cf52a..97e42ef 100644
|
||||
--- a/tests/unit/modules/zypper_test.py
|
||||
+++ b/tests/unit/modules/zypper_test.py
|
||||
@@ -153,24 +153,26 @@ class ZypperTestCase(TestCase):
|
||||
for filename, test_data in {
|
||||
'zypper-products-sle12sp1.xml': {
|
||||
'name': ['SLES', 'SLES', 'SUSE-Manager-Proxy',
|
||||
- 'SUSE-Manager-Server', 'sle-manager-tools-beta'],
|
||||
+ 'SUSE-Manager-Server', 'sle-manager-tools-beta',
|
||||
+ 'sle-manager-tools-beta-broken-eol', 'sle-manager-tools-beta-no-eol'],
|
||||
'vendor': 'SUSE LLC <https://www.suse.com/>',
|
||||
- 'release': ['0', '0', '0', '0', '0'],
|
||||
- 'productline': [False, False, False, False, 'sles'],
|
||||
- 'eol_t': [1509408000, 1522454400, 1522454400, 1730332800, 1730332800],
|
||||
- 'isbase': [False, False, False, False, True],
|
||||
- 'installed': [False, False, False, False, True],
|
||||
+ 'release': ['0', '0', '0', '0', '0', '0', '0'],
|
||||
+ 'productline': [False, False, False, False, False, False, 'sles'],
|
||||
+ 'eol_t': [None, 0, 1509408000, 1522454400, 1522454400, 1730332800, 1730332800],
|
||||
+ 'isbase': [False, False, False, False, False, False, True],
|
||||
+ 'installed': [False, False, False, False, False, False, True],
|
||||
},
|
||||
'zypper-products-sle11sp3.xml': {
|
||||
- 'name': ['SUSE-Manager-Server', 'SUSE-Manager-Server',
|
||||
- 'SUSE_SLES', 'SUSE_SLES', 'SUSE_SLES-SP4-migration'],
|
||||
+ 'name': ['SUSE-Manager-Server', 'SUSE-Manager-Server', 'SUSE-Manager-Server-Broken-EOL',
|
||||
+ 'SUSE_SLES', 'SUSE_SLES', 'SUSE_SLES', 'SUSE_SLES-SP4-migration'],
|
||||
'vendor': 'SUSE LINUX Products GmbH, Nuernberg, Germany',
|
||||
- 'release': ['1.138', '1.2', '1.2', '1.201', '1.4'],
|
||||
- 'productline': [False, False, False, False, 'manager'],
|
||||
- 'eol_t': [0, 0, 0, 0, 0],
|
||||
- 'isbase': [False, False, False, False, True],
|
||||
- 'installed': [False, False, False, False, True],
|
||||
+ 'release': ['1.138', '1.2', '1.2', '1.2', '1.201', '1.201', '1.4'],
|
||||
+ 'productline': [False, False, False, False, False, 'manager', 'manager'],
|
||||
+ 'eol_t': [None, 0, 0, 0, 0, 0, 0],
|
||||
+ 'isbase': [False, False, False, False, False, True, True],
|
||||
+ 'installed': [False, False, False, False, False, True, True],
|
||||
}}.items():
|
||||
+
|
||||
ref_out = {
|
||||
'retcode': 0,
|
||||
'stdout': get_test_data(filename)
|
||||
@@ -178,10 +180,10 @@ class ZypperTestCase(TestCase):
|
||||
|
||||
with patch.dict(zypper.__salt__, {'cmd.run_all': MagicMock(return_value=ref_out)}):
|
||||
products = zypper.list_products()
|
||||
- self.assertEqual(len(products), 5)
|
||||
+ self.assertEqual(len(products), 7)
|
||||
self.assertIn(test_data['vendor'], [product['vendor'] for product in products])
|
||||
for kwd in ['name', 'isbase', 'installed', 'release', 'productline', 'eol_t']:
|
||||
- self.assertEqual(test_data[kwd], sorted([prod[kwd] for prod in products]))
|
||||
+ self.assertEqual(test_data[kwd], sorted([prod.get(kwd) for prod in products]))
|
||||
|
||||
def test_refresh_db(self):
|
||||
'''
|
||||
--
|
||||
2.1.4
|
||||
|
@ -1,40 +0,0 @@
|
||||
From 13fd4a3becab7fd991ae2c6a8ca1c52a51048cef Mon Sep 17 00:00:00 2001
|
||||
From: Michael Calmer <mc@suse.de>
|
||||
Date: Wed, 10 Feb 2016 14:20:34 +0100
|
||||
Subject: [PATCH 11/22] improve doc for list_pkgs
|
||||
|
||||
---
|
||||
salt/modules/zypper.py | 16 ++++++++++++++--
|
||||
1 file changed, 14 insertions(+), 2 deletions(-)
|
||||
|
||||
diff --git a/salt/modules/zypper.py b/salt/modules/zypper.py
|
||||
index 6930f1a..56d9ffb 100644
|
||||
--- a/salt/modules/zypper.py
|
||||
+++ b/salt/modules/zypper.py
|
||||
@@ -291,9 +291,21 @@ def version(*names, **kwargs):
|
||||
|
||||
def list_pkgs(versions_as_list=False, **kwargs):
|
||||
'''
|
||||
- List the packages currently installed as a dict::
|
||||
+ List the packages currently installed as a dict with versions
|
||||
+ as a comma separated string::
|
||||
|
||||
- {'<package_name>': '<version>'}
|
||||
+ {'<package_name>': '<version>[,<version>...]'}
|
||||
+
|
||||
+ versions_as_list:
|
||||
+ If set to true, the versions are provided as a list
|
||||
+
|
||||
+ {'<package_name>': ['<version>', '<version>']}
|
||||
+
|
||||
+ removed:
|
||||
+ not supported
|
||||
+
|
||||
+ purge_desired:
|
||||
+ not supported
|
||||
|
||||
CLI Example:
|
||||
|
||||
--
|
||||
2.1.4
|
||||
|
@ -0,0 +1,69 @@
|
||||
From 5e99ee2bec1139b1944284975454c716d477f3e0 Mon Sep 17 00:00:00 2001
|
||||
From: Bo Maryniuk <bo@maryniuk.net>
|
||||
Date: Wed, 13 Apr 2016 16:15:37 +0200
|
||||
Subject: [PATCH 12/12] Bugfix: salt-key crashes if tries to generate keys to
|
||||
the directory w/o write access (#32436)
|
||||
|
||||
* Raise an exception if an attempt is made to write keys to a directory without write permissions
|
||||
|
||||
* Show a reasonable error message instead of a traceback crash.
|
||||
|
||||
* Fix the unit tests
|
||||
---
|
||||
salt/crypt.py | 6 ++++++
|
||||
salt/scripts.py | 2 ++
|
||||
tests/unit/crypt_test.py | 1 +
|
||||
3 files changed, 9 insertions(+)
|
||||
|
||||
diff --git a/salt/crypt.py b/salt/crypt.py
|
||||
index 573a3c1..e5f3317 100644
|
||||
--- a/salt/crypt.py
|
||||
+++ b/salt/crypt.py
|
||||
@@ -15,6 +15,7 @@ import logging
|
||||
import traceback
|
||||
import binascii
|
||||
import weakref
|
||||
+import getpass
|
||||
from salt.ext.six.moves import zip # pylint: disable=import-error,redefined-builtin
|
||||
|
||||
# Import third party libs
|
||||
@@ -94,6 +95,11 @@ def gen_keys(keydir, keyname, keysize, user=None):
|
||||
# Between first checking and the generation another process has made
|
||||
# a key! Use the winner's key
|
||||
return priv
|
||||
+
|
||||
+ # Do not try writing anything, if directory has no permissions.
|
||||
+ if not os.access(keydir, os.W_OK):
|
||||
+ raise IOError('Write access denied to "{0}" for user "{1}".'.format(os.path.abspath(keydir), getpass.getuser()))
|
||||
+
|
||||
cumask = os.umask(191)
|
||||
with salt.utils.fopen(priv, 'wb+') as f:
|
||||
f.write(gen.exportKey('PEM'))
|
||||
diff --git a/salt/scripts.py b/salt/scripts.py
|
||||
index 7da79bf..38b100d 100644
|
||||
--- a/salt/scripts.py
|
||||
+++ b/salt/scripts.py
|
||||
@@ -297,6 +297,8 @@ def salt_key():
|
||||
SystemExit('\nExiting gracefully on Ctrl-c'),
|
||||
err,
|
||||
hardcrash, trace=trace)
|
||||
+ except Exception as err:
|
||||
+ sys.stderr.write("Error: {0}\n".format(err.message))
|
||||
|
||||
|
||||
def salt_cp():
|
||||
diff --git a/tests/unit/crypt_test.py b/tests/unit/crypt_test.py
|
||||
index 3ff3b09..f548820 100644
|
||||
--- a/tests/unit/crypt_test.py
|
||||
+++ b/tests/unit/crypt_test.py
|
||||
@@ -86,6 +86,7 @@ class CryptTestCase(TestCase):
|
||||
@patch('os.umask', MagicMock())
|
||||
@patch('os.chmod', MagicMock())
|
||||
@patch('os.chown', MagicMock())
|
||||
+ @patch('os.access', MagicMock(return_value=True))
|
||||
def test_gen_keys(self):
|
||||
with patch('salt.utils.fopen', mock_open()):
|
||||
open_priv_wb = call('/keydir/keyname.pem', 'wb+')
|
||||
--
|
||||
2.1.4
|
||||
|
@ -1,95 +0,0 @@
|
||||
From 82a9f07f27cf95a7dcff32c8434af9b5d7cf55ad Mon Sep 17 00:00:00 2001
|
||||
From: Michael Calmer <mc@suse.de>
|
||||
Date: Wed, 10 Feb 2016 11:47:12 +0100
|
||||
Subject: [PATCH 12/22] implement version_cmp for zypper
|
||||
|
||||
---
|
||||
salt/modules/zypper.py | 65 ++++++++++++++++++++++++++++++++++++++++++++++++++
|
||||
1 file changed, 65 insertions(+)
|
||||
|
||||
diff --git a/salt/modules/zypper.py b/salt/modules/zypper.py
|
||||
index 56d9ffb..bd9c30a 100644
|
||||
--- a/salt/modules/zypper.py
|
||||
+++ b/salt/modules/zypper.py
|
||||
@@ -11,6 +11,7 @@ import copy
|
||||
import logging
|
||||
import re
|
||||
import os
|
||||
+import rpm
|
||||
|
||||
# Import 3rd-party libs
|
||||
# pylint: disable=import-error,redefined-builtin,no-name-in-module
|
||||
@@ -288,6 +289,70 @@ def version(*names, **kwargs):
|
||||
'''
|
||||
return __salt__['pkg_resource.version'](*names, **kwargs) or {}
|
||||
|
||||
+def _stringToEVR(verstring):
|
||||
+ '''
|
||||
+ Split the version string into epoch, version and release and
|
||||
+ return this as tuple.
|
||||
+
|
||||
+ epoch is always not empty.
|
||||
+ version and release can be an empty string if such a component
|
||||
+ could not be found in the version string.
|
||||
+
|
||||
+ "2:1.0-1.2" => ('2', '1.0', '1.2)
|
||||
+ "1.0" => ('0', '1.0', '')
|
||||
+ "" => ('0', '', '')
|
||||
+ '''
|
||||
+ if verstring in [None, '']:
|
||||
+ return ('0', '', '')
|
||||
+ i = verstring.find(':')
|
||||
+ if i != -1:
|
||||
+ try:
|
||||
+ epoch = str(long(verstring[:i]))
|
||||
+ except ValueError:
|
||||
+ # look, garbage in the epoch field, how fun, kill it
|
||||
+ epoch = '0' # this is our fallback, deal
|
||||
+ else:
|
||||
+ epoch = '0'
|
||||
+ j = verstring.find('-')
|
||||
+ if j != -1:
|
||||
+ version = verstring[i + 1:j]
|
||||
+ release = verstring[j + 1:]
|
||||
+ else:
|
||||
+ version = verstring[i + 1:]
|
||||
+ release = ''
|
||||
+ return (epoch, version, release)
|
||||
+
|
||||
+def version_cmp(ver1, ver2):
|
||||
+ '''
|
||||
+ .. versionadded:: 2015.5.4
|
||||
+
|
||||
+ Do a cmp-style comparison on two packages. Return -1 if ver1 < ver2, 0 if
|
||||
+ ver1 == ver2, and 1 if ver1 > ver2. Return None if there was a problem
|
||||
+ making the comparison.
|
||||
+
|
||||
+ CLI Example:
|
||||
+
|
||||
+ .. code-block:: bash
|
||||
+
|
||||
+ salt '*' pkg.version_cmp '0.2-001' '0.2.0.1-002'
|
||||
+ '''
|
||||
+ try:
|
||||
+ cmp_result = rpm.labelCompare(
|
||||
+ _stringToEVR(ver1),
|
||||
+ _stringToEVR(ver2)
|
||||
+ )
|
||||
+ if cmp_result not in (-1, 0, 1):
|
||||
+ raise Exception(
|
||||
+ 'cmp result \'{0}\' is invalid'.format(cmp_result)
|
||||
+ )
|
||||
+ return cmp_result
|
||||
+ except Exception as exc:
|
||||
+ log.warning(
|
||||
+ 'Failed to compare version \'{0}\' to \'{1}\' using '
|
||||
+ 'rpmUtils: {2}'.format(ver1, ver2, exc)
|
||||
+ )
|
||||
+ return salt.utils.version_cmp(ver1, ver2)
|
||||
+
|
||||
|
||||
def list_pkgs(versions_as_list=False, **kwargs):
|
||||
'''
|
||||
--
|
||||
2.1.4
|
||||
|
@ -0,0 +1,86 @@
|
||||
From f187ee058eb221eb5a34d51ca5db53bb8eeea5e1 Mon Sep 17 00:00:00 2001
|
||||
From: Bo Maryniuk <bo@maryniuk.net>
|
||||
Date: Mon, 18 Apr 2016 16:25:05 +0200
|
||||
Subject: [PATCH 13/14] Prevent crash if pygit2 package is requesting
|
||||
re-compilation
|
||||
|
||||
* Prevent crash if pygit2 package is requesting re-compilation of the entire library on production systems (no *devel packages)
|
||||
|
||||
* Fix PEP8: move imports to the top of the file
|
||||
|
||||
* Move logger up
|
||||
|
||||
* Add log error message in case if exception is not an ImportError
|
||||
---
|
||||
salt/utils/gitfs.py | 33 ++++++++++++++++++++-------------
|
||||
1 file changed, 20 insertions(+), 13 deletions(-)
|
||||
|
||||
diff --git a/salt/utils/gitfs.py b/salt/utils/gitfs.py
|
||||
index 164c92e..5452c28 100644
|
||||
--- a/salt/utils/gitfs.py
|
||||
+++ b/salt/utils/gitfs.py
|
||||
@@ -19,6 +19,18 @@ import subprocess
|
||||
import time
|
||||
from datetime import datetime
|
||||
|
||||
+# Import salt libs
|
||||
+import salt.utils
|
||||
+import salt.utils.itertools
|
||||
+import salt.utils.url
|
||||
+import salt.fileserver
|
||||
+from salt.utils.process import os_is_running as pid_exists
|
||||
+from salt.exceptions import FileserverConfigError, GitLockError
|
||||
+from salt.utils.event import tagify
|
||||
+
|
||||
+# Import third party libs
|
||||
+import salt.ext.six as six
|
||||
+
|
||||
VALID_PROVIDERS = ('gitpython', 'pygit2', 'dulwich')
|
||||
# Optional per-remote params that can only be used on a per-remote basis, and
|
||||
# thus do not have defaults in salt/config.py.
|
||||
@@ -54,16 +66,8 @@ _INVALID_REPO = (
|
||||
'master to continue to use this {2} remote.'
|
||||
)
|
||||
|
||||
-# Import salt libs
|
||||
-import salt.utils
|
||||
-import salt.utils.itertools
|
||||
-import salt.utils.url
|
||||
-import salt.fileserver
|
||||
-from salt.exceptions import FileserverConfigError, GitLockError
|
||||
-from salt.utils.event import tagify
|
||||
+log = logging.getLogger(__name__)
|
||||
|
||||
-# Import third party libs
|
||||
-import salt.ext.six as six
|
||||
# pylint: disable=import-error
|
||||
try:
|
||||
import git
|
||||
@@ -79,8 +83,13 @@ try:
|
||||
GitError = pygit2.errors.GitError
|
||||
except AttributeError:
|
||||
GitError = Exception
|
||||
-except ImportError:
|
||||
- HAS_PYGIT2 = False
|
||||
+except Exception as err: # cffi VerificationError also may happen
|
||||
+ HAS_PYGIT2 = False # and pygit2 requests re-compilation
|
||||
+ # on a production system (!),
|
||||
+ # but cffi might be absent as well!
|
||||
+ # Therefore just a generic Exception class.
|
||||
+ if not isinstance(err, ImportError):
|
||||
+ log.error('Import pygit2 failed: {0}'.format(err))
|
||||
|
||||
try:
|
||||
import dulwich.errors
|
||||
@@ -93,8 +102,6 @@ except ImportError:
|
||||
HAS_DULWICH = False
|
||||
# pylint: enable=import-error
|
||||
|
||||
-log = logging.getLogger(__name__)
|
||||
-
|
||||
# Minimum versions for backend providers
|
||||
GITPYTHON_MINVER = '0.3'
|
||||
PYGIT2_MINVER = '0.20.3'
|
||||
--
|
||||
2.8.1
|
||||
|
@ -1,93 +0,0 @@
|
||||
From c26d1b6987a06e972749a10af1c54befae14c6e6 Mon Sep 17 00:00:00 2001
|
||||
From: Michael Calmer <mc@suse.de>
|
||||
Date: Tue, 16 Feb 2016 13:48:50 +0100
|
||||
Subject: [PATCH 13/22] pylint changes
|
||||
|
||||
---
|
||||
salt/modules/zypper.py | 30 ++++++++++++++++--------------
|
||||
1 file changed, 16 insertions(+), 14 deletions(-)
|
||||
|
||||
diff --git a/salt/modules/zypper.py b/salt/modules/zypper.py
|
||||
index bd9c30a..7448f8b 100644
|
||||
--- a/salt/modules/zypper.py
|
||||
+++ b/salt/modules/zypper.py
|
||||
@@ -2,7 +2,7 @@
|
||||
'''
|
||||
Package support for openSUSE via the zypper package manager
|
||||
|
||||
-:depends: - ``zypp`` Python module. Install with ``zypper install python-zypp``
|
||||
+:depends: - ``rpm`` Python module. Install with ``zypper install rpm-python``
|
||||
'''
|
||||
|
||||
# Import python libs
|
||||
@@ -11,10 +11,10 @@ import copy
|
||||
import logging
|
||||
import re
|
||||
import os
|
||||
-import rpm
|
||||
|
||||
# Import 3rd-party libs
|
||||
# pylint: disable=import-error,redefined-builtin,no-name-in-module
|
||||
+import rpm
|
||||
import salt.ext.six as six
|
||||
from salt.ext.six.moves import configparser
|
||||
from salt.ext.six.moves.urllib.parse import urlparse as _urlparse
|
||||
@@ -289,7 +289,8 @@ def version(*names, **kwargs):
|
||||
'''
|
||||
return __salt__['pkg_resource.version'](*names, **kwargs) or {}
|
||||
|
||||
-def _stringToEVR(verstring):
|
||||
+
|
||||
+def _string_to_evr(verstring):
|
||||
'''
|
||||
Split the version string into epoch, version and release and
|
||||
return this as tuple.
|
||||
@@ -304,24 +305,25 @@ def _stringToEVR(verstring):
|
||||
'''
|
||||
if verstring in [None, '']:
|
||||
return ('0', '', '')
|
||||
- i = verstring.find(':')
|
||||
- if i != -1:
|
||||
+ idx_e = verstring.find(':')
|
||||
+ if idx_e != -1:
|
||||
try:
|
||||
- epoch = str(long(verstring[:i]))
|
||||
+ epoch = str(int(verstring[:idx_e]))
|
||||
except ValueError:
|
||||
# look, garbage in the epoch field, how fun, kill it
|
||||
- epoch = '0' # this is our fallback, deal
|
||||
+ epoch = '0' # this is our fallback, deal
|
||||
else:
|
||||
epoch = '0'
|
||||
- j = verstring.find('-')
|
||||
- if j != -1:
|
||||
- version = verstring[i + 1:j]
|
||||
- release = verstring[j + 1:]
|
||||
+ idx_r = verstring.find('-')
|
||||
+ if idx_r != -1:
|
||||
+ version = verstring[idx_e + 1:idx_r]
|
||||
+ release = verstring[idx_r + 1:]
|
||||
else:
|
||||
- version = verstring[i + 1:]
|
||||
+ version = verstring[idx_e + 1:]
|
||||
release = ''
|
||||
return (epoch, version, release)
|
||||
|
||||
+
|
||||
def version_cmp(ver1, ver2):
|
||||
'''
|
||||
.. versionadded:: 2015.5.4
|
||||
@@ -338,8 +340,8 @@ def version_cmp(ver1, ver2):
|
||||
'''
|
||||
try:
|
||||
cmp_result = rpm.labelCompare(
|
||||
- _stringToEVR(ver1),
|
||||
- _stringToEVR(ver2)
|
||||
+ _string_to_evr(ver1),
|
||||
+ _string_to_evr(ver2)
|
||||
)
|
||||
if cmp_result not in (-1, 0, 1):
|
||||
raise Exception(
|
||||
--
|
||||
2.1.4
|
||||
|
@ -1,70 +0,0 @@
|
||||
From 4accc710ab2f92118f4777d13bc585d26e8e939e Mon Sep 17 00:00:00 2001
|
||||
From: Michael Calmer <mc@suse.de>
|
||||
Date: Wed, 17 Feb 2016 08:49:15 +0100
|
||||
Subject: [PATCH 14/22] Check if rpm-python can be imported
|
||||
|
||||
---
|
||||
salt/modules/zypper.py | 36 +++++++++++++++++++++---------------
|
||||
1 file changed, 21 insertions(+), 15 deletions(-)
|
||||
|
||||
diff --git a/salt/modules/zypper.py b/salt/modules/zypper.py
|
||||
index 7448f8b..d44ad6a 100644
|
||||
--- a/salt/modules/zypper.py
|
||||
+++ b/salt/modules/zypper.py
|
||||
@@ -14,10 +14,15 @@ import os
|
||||
|
||||
# Import 3rd-party libs
|
||||
# pylint: disable=import-error,redefined-builtin,no-name-in-module
|
||||
-import rpm
|
||||
import salt.ext.six as six
|
||||
from salt.ext.six.moves import configparser
|
||||
from salt.ext.six.moves.urllib.parse import urlparse as _urlparse
|
||||
+
|
||||
+try:
|
||||
+ import rpm
|
||||
+ HAS_RPM = True
|
||||
+except ImportError:
|
||||
+ HAS_RPM = False
|
||||
# pylint: enable=import-error,redefined-builtin,no-name-in-module
|
||||
|
||||
from xml.dom import minidom as dom
|
||||
@@ -338,21 +343,22 @@ def version_cmp(ver1, ver2):
|
||||
|
||||
salt '*' pkg.version_cmp '0.2-001' '0.2.0.1-002'
|
||||
'''
|
||||
- try:
|
||||
- cmp_result = rpm.labelCompare(
|
||||
- _string_to_evr(ver1),
|
||||
- _string_to_evr(ver2)
|
||||
- )
|
||||
- if cmp_result not in (-1, 0, 1):
|
||||
- raise Exception(
|
||||
- 'cmp result \'{0}\' is invalid'.format(cmp_result)
|
||||
+ if HAS_RPM:
|
||||
+ try:
|
||||
+ cmp_result = rpm.labelCompare(
|
||||
+ _string_to_evr(ver1),
|
||||
+ _string_to_evr(ver2)
|
||||
+ )
|
||||
+ if cmp_result not in (-1, 0, 1):
|
||||
+ raise Exception(
|
||||
+ 'cmp result \'{0}\' is invalid'.format(cmp_result)
|
||||
+ )
|
||||
+ return cmp_result
|
||||
+ except Exception as exc:
|
||||
+ log.warning(
|
||||
+ 'Failed to compare version \'{0}\' to \'{1}\' using '
|
||||
+ 'rpmUtils: {2}'.format(ver1, ver2, exc)
|
||||
)
|
||||
- return cmp_result
|
||||
- except Exception as exc:
|
||||
- log.warning(
|
||||
- 'Failed to compare version \'{0}\' to \'{1}\' using '
|
||||
- 'rpmUtils: {2}'.format(ver1, ver2, exc)
|
||||
- )
|
||||
return salt.utils.version_cmp(ver1, ver2)
|
||||
|
||||
|
||||
--
|
||||
2.1.4
|
||||
|
@ -0,0 +1,39 @@
|
||||
From 0961f5bd3e3b7aa3ebd75fe064044d078df62724 Mon Sep 17 00:00:00 2001
From: Michael Calmer <mc@suse.de>
Date: Mon, 18 Apr 2016 16:31:58 +0200
Subject: [PATCH 14/14] align OS grains from older SLES with current one
(#32649)

---
salt/grains/core.py | 9 +++++++--
1 file changed, 7 insertions(+), 2 deletions(-)

diff --git a/salt/grains/core.py b/salt/grains/core.py
index eb62b97..d5dbef8 100644
--- a/salt/grains/core.py
+++ b/salt/grains/core.py
@@ -1184,14 +1184,19 @@ def os_data():
for line in fhr:
if 'enterprise' in line.lower():
grains['lsb_distrib_id'] = 'SLES'
+ grains['lsb_distrib_codename'] = re.sub(r'\(.+\)', '', line).strip()
elif 'version' in line.lower():
version = re.sub(r'[^0-9]', '', line)
elif 'patchlevel' in line.lower():
patch = re.sub(r'[^0-9]', '', line)
grains['lsb_distrib_release'] = version
if patch:
- grains['lsb_distrib_release'] += ' SP' + patch
- grains['lsb_distrib_codename'] = 'n.a'
+ grains['lsb_distrib_release'] += '.' + patch
+ patchstr = 'SP' + patch
+ if grains['lsb_distrib_codename'] and patchstr not in grains['lsb_distrib_codename']:
+ grains['lsb_distrib_codename'] += ' ' + patchstr
+ if not grains['lsb_distrib_codename']:
+ grains['lsb_distrib_codename'] = 'n.a'
elif os.path.isfile('/etc/altlinux-release'):
# ALT Linux
grains['lsb_distrib_id'] = 'altlinux'
--
2.8.1

@@ -1,184 +0,0 @@
|
||||
From 5009e290b5a318b168fd03095ced7043203fe34c Mon Sep 17 00:00:00 2001
|
||||
From: Michael Calmer <mc@suse.de>
|
||||
Date: Thu, 18 Feb 2016 10:12:55 +0100
|
||||
Subject: [PATCH 15/22] call zypper with option --non-interactive everywhere
|
||||
|
||||
---
|
||||
salt/modules/zypper.py | 49 ++++++++++++++++++++++++++++++++-----------------
|
||||
1 file changed, 32 insertions(+), 17 deletions(-)
|
||||
|
||||
diff --git a/salt/modules/zypper.py b/salt/modules/zypper.py
|
||||
index d44ad6a..cb26b51 100644
|
||||
--- a/salt/modules/zypper.py
|
||||
+++ b/salt/modules/zypper.py
|
||||
@@ -56,6 +56,21 @@ def __virtual__():
|
||||
return __virtualname__
|
||||
|
||||
|
||||
+def _zypper(as_list=False):
|
||||
+ '''
|
||||
+ Return zypper command with default options as a string.
|
||||
+
|
||||
+ CMD: zypper --non-interactive
|
||||
+
|
||||
+ as_list:
|
||||
+ if set to True, the command and the default options
|
||||
+ are returned as a list
|
||||
+ '''
|
||||
+ if as_list:
|
||||
+ return ['zypper', '--non-interactive']
|
||||
+ return "zypper --non-interactive "
|
||||
+
|
||||
+
|
||||
def list_upgrades(refresh=True):
|
||||
'''
|
||||
List all available package upgrades on this system
|
||||
@@ -70,7 +85,7 @@ def list_upgrades(refresh=True):
|
||||
refresh_db()
|
||||
ret = {}
|
||||
call = __salt__['cmd.run_all'](
|
||||
- 'zypper list-updates', output_loglevel='trace'
|
||||
+ _zypper() + 'list-updates', output_loglevel='trace'
|
||||
)
|
||||
if call['retcode'] != 0:
|
||||
comment = ''
|
||||
@@ -185,7 +200,7 @@ def info_available(*names, **kwargs):
|
||||
|
||||
# Run in batches
|
||||
while batch:
|
||||
- cmd = 'zypper info -t package {0}'.format(' '.join(batch[:batch_size]))
|
||||
+ cmd = '{0} info -t package {1}'.format(_zypper(), ' '.join(batch[:batch_size]))
|
||||
pkg_info.extend(re.split(r"Information for package*", __salt__['cmd.run_stdout'](cmd, output_loglevel='trace')))
|
||||
batch = batch[batch_size:]
|
||||
|
||||
@@ -494,7 +509,7 @@ def del_repo(repo):
|
||||
repos_cfg = _get_configured_repos()
|
||||
for alias in repos_cfg.sections():
|
||||
if alias == repo:
|
||||
- cmd = ('zypper -x --non-interactive rr --loose-auth --loose-query {0}'.format(alias))
|
||||
+ cmd = ('{0} -x rr --loose-auth --loose-query {1}'.format(_zypper(), alias))
|
||||
doc = dom.parseString(__salt__['cmd.run'](cmd, output_loglevel='trace'))
|
||||
msg = doc.getElementsByTagName('message')
|
||||
if doc.getElementsByTagName('progress') and msg:
|
||||
@@ -583,7 +598,7 @@ def mod_repo(repo, **kwargs):
|
||||
try:
|
||||
# Try to parse the output and find the error,
|
||||
# but this not always working (depends on Zypper version)
|
||||
- doc = dom.parseString(__salt__['cmd.run'](('zypper -x ar {0} \'{1}\''.format(url, repo)),
|
||||
+ doc = dom.parseString(__salt__['cmd.run'](('{0} -x ar {1} \'{2}\''.format(_zypper(), url, repo)),
|
||||
output_loglevel='trace'))
|
||||
except Exception:
|
||||
# No XML out available, but it is still unknown the state of the result.
|
||||
@@ -629,7 +644,7 @@ def mod_repo(repo, **kwargs):
|
||||
cmd_opt.append("--name='{0}'".format(kwargs.get('humanname')))
|
||||
|
||||
if cmd_opt:
|
||||
- __salt__['cmd.run'](('zypper -x mr {0} \'{1}\''.format(' '.join(cmd_opt), repo)),
|
||||
+ __salt__['cmd.run'](('{0} -x mr {1} \'{2}\''.format(_zypper(), ' '.join(cmd_opt), repo)),
|
||||
output_loglevel='trace')
|
||||
|
||||
# If repo nor added neither modified, error should be thrown
|
||||
@@ -652,7 +667,7 @@ def refresh_db():
|
||||
|
||||
salt '*' pkg.refresh_db
|
||||
'''
|
||||
- cmd = 'zypper refresh'
|
||||
+ cmd = _zypper() + 'refresh'
|
||||
ret = {}
|
||||
call = __salt__['cmd.run_all'](cmd, output_loglevel='trace')
|
||||
if call['retcode'] != 0:
|
||||
@@ -799,7 +814,7 @@ def install(name=None,
|
||||
log.info('Targeting repo {0!r}'.format(fromrepo))
|
||||
else:
|
||||
fromrepoopt = ''
|
||||
- cmd_install = ['zypper', '--non-interactive']
|
||||
+ cmd_install = _zypper(as_list=True)
|
||||
if not refresh:
|
||||
cmd_install.append('--no-refresh')
|
||||
cmd_install += ['install', '--name', '--auto-agree-with-licenses']
|
||||
@@ -855,7 +870,7 @@ def upgrade(refresh=True):
|
||||
if salt.utils.is_true(refresh):
|
||||
refresh_db()
|
||||
old = list_pkgs()
|
||||
- cmd = 'zypper --non-interactive update --auto-agree-with-licenses'
|
||||
+ cmd = _zypper() + 'update --auto-agree-with-licenses'
|
||||
call = __salt__['cmd.run_all'](cmd, output_loglevel='trace')
|
||||
if call['retcode'] != 0:
|
||||
ret['result'] = False
|
||||
@@ -887,8 +902,8 @@ def _uninstall(action='remove', name=None, pkgs=None):
|
||||
return {}
|
||||
while targets:
|
||||
cmd = (
|
||||
- 'zypper --non-interactive remove {0} {1}'
|
||||
- .format(purge_arg, ' '.join(targets[:500]))
|
||||
+ '{0} remove {1} {2}'
|
||||
+ .format(_zypper(), purge_arg, ' '.join(targets[:500]))
|
||||
)
|
||||
__salt__['cmd.run'](cmd, output_loglevel='trace')
|
||||
targets = targets[500:]
|
||||
@@ -1003,7 +1018,7 @@ def clean_locks():
|
||||
if not os.path.exists("/etc/zypp/locks"):
|
||||
return out
|
||||
|
||||
- doc = dom.parseString(__salt__['cmd.run']('zypper --non-interactive -x cl', output_loglevel='trace'))
|
||||
+ doc = dom.parseString(__salt__['cmd.run'](_zypper() + '-x cl', output_loglevel='trace'))
|
||||
for node in doc.getElementsByTagName("message"):
|
||||
text = node.childNodes[0].nodeValue.lower()
|
||||
if text.startswith(LCK):
|
||||
@@ -1041,7 +1056,7 @@ def remove_lock(packages, **kwargs): # pylint: disable=unused-argument
|
||||
missing.append(pkg)
|
||||
|
||||
if removed:
|
||||
- __salt__['cmd.run'](('zypper --non-interactive rl {0}'.format(' '.join(removed))),
|
||||
+ __salt__['cmd.run'](('{0} rl {1}'.format(_zypper(), ' '.join(removed))),
|
||||
output_loglevel='trace')
|
||||
|
||||
return {'removed': len(removed), 'not_found': missing}
|
||||
@@ -1071,7 +1086,7 @@ def add_lock(packages, **kwargs): # pylint: disable=unused-argument
|
||||
added.append(pkg)
|
||||
|
||||
if added:
|
||||
- __salt__['cmd.run'](('zypper --non-interactive al {0}'.format(' '.join(added))),
|
||||
+ __salt__['cmd.run'](('{0} al {1}'.format(_zypper(), ' '.join(added))),
|
||||
output_loglevel='trace')
|
||||
|
||||
return {'added': len(added), 'packages': added}
|
||||
@@ -1204,7 +1219,7 @@ def _get_patterns(installed_only=None):
|
||||
List all known patterns in repos.
|
||||
'''
|
||||
patterns = {}
|
||||
- doc = dom.parseString(__salt__['cmd.run'](('zypper --xmlout se -t pattern'),
|
||||
+ doc = dom.parseString(__salt__['cmd.run']((_zypper() + '--xmlout se -t pattern'),
|
||||
output_loglevel='trace'))
|
||||
for element in doc.getElementsByTagName('solvable'):
|
||||
installed = element.getAttribute('status') == 'installed'
|
||||
@@ -1253,7 +1268,7 @@ def search(criteria):
|
||||
|
||||
salt '*' pkg.search <criteria>
|
||||
'''
|
||||
- doc = dom.parseString(__salt__['cmd.run'](('zypper --xmlout se {0}'.format(criteria)),
|
||||
+ doc = dom.parseString(__salt__['cmd.run'](('{0} --xmlout se {1}'.format(_zypper(), criteria)),
|
||||
output_loglevel='trace'))
|
||||
solvables = doc.getElementsByTagName('solvable')
|
||||
if not solvables:
|
||||
@@ -1302,7 +1317,7 @@ def list_products(all=False):
|
||||
'''
|
||||
ret = list()
|
||||
OEM_PATH = "/var/lib/suseRegister/OEM"
|
||||
- doc = dom.parseString(__salt__['cmd.run'](("zypper -x products{0}".format(not all and ' -i' or '')),
|
||||
+ doc = dom.parseString(__salt__['cmd.run'](("{0} -x products{1}".format(_zypper(), not all and ' -i' or '')),
|
||||
output_loglevel='trace'))
|
||||
for prd in doc.getElementsByTagName('product-list')[0].getElementsByTagName('product'):
|
||||
p_nfo = dict()
|
||||
@@ -1342,7 +1357,7 @@ def download(*packages):
|
||||
raise CommandExecutionError("No packages has been specified.")
|
||||
|
||||
doc = dom.parseString(__salt__['cmd.run'](
|
||||
- ('zypper -x --non-interactive download {0}'.format(' '.join(packages))),
|
||||
+ ('{0} -x download {1}'.format(_zypper(), ' '.join(packages))),
|
||||
output_loglevel='trace'))
|
||||
pkg_ret = {}
|
||||
for dld_result in doc.getElementsByTagName("download-result"):
|
||||
--
|
||||
2.1.4
|
||||
|
@@ -1,201 +0,0 @@
|
||||
From 8d25c4c8906581fa44380f72f0f754b56f5e30c3 Mon Sep 17 00:00:00 2001
|
||||
From: Michael Calmer <mc@suse.de>
|
||||
Date: Fri, 19 Feb 2016 11:50:31 +0100
|
||||
Subject: [PATCH 16/22] write a zypper command builder function
|
||||
|
||||
---
|
||||
salt/modules/zypper.py | 62 +++++++++++++++++++++++---------------------------
|
||||
1 file changed, 29 insertions(+), 33 deletions(-)
|
||||
|
||||
diff --git a/salt/modules/zypper.py b/salt/modules/zypper.py
|
||||
index cb26b51..f878c95 100644
|
||||
--- a/salt/modules/zypper.py
|
||||
+++ b/salt/modules/zypper.py
|
||||
@@ -56,19 +56,18 @@ def __virtual__():
|
||||
return __virtualname__
|
||||
|
||||
|
||||
-def _zypper(as_list=False):
|
||||
+def _zypper(*opts):
|
||||
'''
|
||||
- Return zypper command with default options as a string.
|
||||
+ Return zypper command with default options as a list.
|
||||
|
||||
- CMD: zypper --non-interactive
|
||||
+ opts
|
||||
+ additional options for zypper command
|
||||
|
||||
- as_list:
|
||||
- if set to True, the command and the default options
|
||||
- are returned as a list
|
||||
'''
|
||||
- if as_list:
|
||||
- return ['zypper', '--non-interactive']
|
||||
- return "zypper --non-interactive "
|
||||
+ cmd = ['zypper', '--non-interactive']
|
||||
+ cmd.extend(opts)
|
||||
+
|
||||
+ return cmd
|
||||
|
||||
|
||||
def list_upgrades(refresh=True):
|
||||
@@ -85,7 +84,7 @@ def list_upgrades(refresh=True):
|
||||
refresh_db()
|
||||
ret = {}
|
||||
call = __salt__['cmd.run_all'](
|
||||
- _zypper() + 'list-updates', output_loglevel='trace'
|
||||
+ _zypper('list-updates'), output_loglevel='trace'
|
||||
)
|
||||
if call['retcode'] != 0:
|
||||
comment = ''
|
||||
@@ -200,7 +199,7 @@ def info_available(*names, **kwargs):
|
||||
|
||||
# Run in batches
|
||||
while batch:
|
||||
- cmd = '{0} info -t package {1}'.format(_zypper(), ' '.join(batch[:batch_size]))
|
||||
+ cmd = _zypper('info', '-t', 'package', *batch[:batch_size])
|
||||
pkg_info.extend(re.split(r"Information for package*", __salt__['cmd.run_stdout'](cmd, output_loglevel='trace')))
|
||||
batch = batch[batch_size:]
|
||||
|
||||
@@ -509,7 +508,7 @@ def del_repo(repo):
|
||||
repos_cfg = _get_configured_repos()
|
||||
for alias in repos_cfg.sections():
|
||||
if alias == repo:
|
||||
- cmd = ('{0} -x rr --loose-auth --loose-query {1}'.format(_zypper(), alias))
|
||||
+ cmd = _zypper('-x', 'rr', '--loose-auth', '--loose-query', alias)
|
||||
doc = dom.parseString(__salt__['cmd.run'](cmd, output_loglevel='trace'))
|
||||
msg = doc.getElementsByTagName('message')
|
||||
if doc.getElementsByTagName('progress') and msg:
|
||||
@@ -598,8 +597,8 @@ def mod_repo(repo, **kwargs):
|
||||
try:
|
||||
# Try to parse the output and find the error,
|
||||
# but this not always working (depends on Zypper version)
|
||||
- doc = dom.parseString(__salt__['cmd.run'](('{0} -x ar {1} \'{2}\''.format(_zypper(), url, repo)),
|
||||
- output_loglevel='trace'))
|
||||
+ doc = dom.parseString(__salt__['cmd.run'](
|
||||
+ _zypper('-x', 'ar', url, repo), output_loglevel='trace'))
|
||||
except Exception:
|
||||
# No XML out available, but it is still unknown the state of the result.
|
||||
pass
|
||||
@@ -644,7 +643,8 @@ def mod_repo(repo, **kwargs):
|
||||
cmd_opt.append("--name='{0}'".format(kwargs.get('humanname')))
|
||||
|
||||
if cmd_opt:
|
||||
- __salt__['cmd.run'](('{0} -x mr {1} \'{2}\''.format(_zypper(), ' '.join(cmd_opt), repo)),
|
||||
+ cmd_opt.append(repo)
|
||||
+ __salt__['cmd.run'](_zypper('-x', 'mr', *cmd_opt),
|
||||
output_loglevel='trace')
|
||||
|
||||
# If repo nor added neither modified, error should be thrown
|
||||
@@ -667,7 +667,7 @@ def refresh_db():
|
||||
|
||||
salt '*' pkg.refresh_db
|
||||
'''
|
||||
- cmd = _zypper() + 'refresh'
|
||||
+ cmd = _zypper('refresh')
|
||||
ret = {}
|
||||
call = __salt__['cmd.run_all'](cmd, output_loglevel='trace')
|
||||
if call['retcode'] != 0:
|
||||
@@ -814,7 +814,7 @@ def install(name=None,
|
||||
log.info('Targeting repo {0!r}'.format(fromrepo))
|
||||
else:
|
||||
fromrepoopt = ''
|
||||
- cmd_install = _zypper(as_list=True)
|
||||
+ cmd_install = _zypper()
|
||||
if not refresh:
|
||||
cmd_install.append('--no-refresh')
|
||||
cmd_install += ['install', '--name', '--auto-agree-with-licenses']
|
||||
@@ -870,7 +870,7 @@ def upgrade(refresh=True):
|
||||
if salt.utils.is_true(refresh):
|
||||
refresh_db()
|
||||
old = list_pkgs()
|
||||
- cmd = _zypper() + 'update --auto-agree-with-licenses'
|
||||
+ cmd = _zypper('update', '--auto-agree-with-licenses')
|
||||
call = __salt__['cmd.run_all'](cmd, output_loglevel='trace')
|
||||
if call['retcode'] != 0:
|
||||
ret['result'] = False
|
||||
@@ -901,10 +901,7 @@ def _uninstall(action='remove', name=None, pkgs=None):
|
||||
if not targets:
|
||||
return {}
|
||||
while targets:
|
||||
- cmd = (
|
||||
- '{0} remove {1} {2}'
|
||||
- .format(_zypper(), purge_arg, ' '.join(targets[:500]))
|
||||
- )
|
||||
+ cmd = _zypper('remove', purge_arg, *targets[:500])
|
||||
__salt__['cmd.run'](cmd, output_loglevel='trace')
|
||||
targets = targets[500:]
|
||||
__context__.pop('pkg.list_pkgs', None)
|
||||
@@ -1018,7 +1015,7 @@ def clean_locks():
|
||||
if not os.path.exists("/etc/zypp/locks"):
|
||||
return out
|
||||
|
||||
- doc = dom.parseString(__salt__['cmd.run'](_zypper() + '-x cl', output_loglevel='trace'))
|
||||
+ doc = dom.parseString(__salt__['cmd.run'](_zypper('-x', 'cl'), output_loglevel='trace'))
|
||||
for node in doc.getElementsByTagName("message"):
|
||||
text = node.childNodes[0].nodeValue.lower()
|
||||
if text.startswith(LCK):
|
||||
@@ -1056,8 +1053,7 @@ def remove_lock(packages, **kwargs): # pylint: disable=unused-argument
|
||||
missing.append(pkg)
|
||||
|
||||
if removed:
|
||||
- __salt__['cmd.run'](('{0} rl {1}'.format(_zypper(), ' '.join(removed))),
|
||||
- output_loglevel='trace')
|
||||
+ __salt__['cmd.run'](_zypper('rl', *removed), output_loglevel='trace')
|
||||
|
||||
return {'removed': len(removed), 'not_found': missing}
|
||||
|
||||
@@ -1086,8 +1082,7 @@ def add_lock(packages, **kwargs): # pylint: disable=unused-argument
|
||||
added.append(pkg)
|
||||
|
||||
if added:
|
||||
- __salt__['cmd.run'](('{0} al {1}'.format(_zypper(), ' '.join(added))),
|
||||
- output_loglevel='trace')
|
||||
+ __salt__['cmd.run'](_zypper('al', *added), output_loglevel='trace')
|
||||
|
||||
return {'added': len(added), 'packages': added}
|
||||
|
||||
@@ -1219,7 +1214,7 @@ def _get_patterns(installed_only=None):
|
||||
List all known patterns in repos.
|
||||
'''
|
||||
patterns = {}
|
||||
- doc = dom.parseString(__salt__['cmd.run']((_zypper() + '--xmlout se -t pattern'),
|
||||
+ doc = dom.parseString(__salt__['cmd.run'](_zypper('--xmlout', 'se', '-t', 'pattern'),
|
||||
output_loglevel='trace'))
|
||||
for element in doc.getElementsByTagName('solvable'):
|
||||
installed = element.getAttribute('status') == 'installed'
|
||||
@@ -1268,7 +1263,7 @@ def search(criteria):
|
||||
|
||||
salt '*' pkg.search <criteria>
|
||||
'''
|
||||
- doc = dom.parseString(__salt__['cmd.run'](('{0} --xmlout se {1}'.format(_zypper(), criteria)),
|
||||
+ doc = dom.parseString(__salt__['cmd.run'](_zypper('--xmlout', 'se', criteria),
|
||||
output_loglevel='trace'))
|
||||
solvables = doc.getElementsByTagName('solvable')
|
||||
if not solvables:
|
||||
@@ -1317,8 +1312,10 @@ def list_products(all=False):
|
||||
'''
|
||||
ret = list()
|
||||
OEM_PATH = "/var/lib/suseRegister/OEM"
|
||||
- doc = dom.parseString(__salt__['cmd.run'](("{0} -x products{1}".format(_zypper(), not all and ' -i' or '')),
|
||||
- output_loglevel='trace'))
|
||||
+ cmd = _zypper('-x', 'products')
|
||||
+ if not all:
|
||||
+ cmd.append('-i')
|
||||
+ doc = dom.parseString(__salt__['cmd.run'](cmd, output_loglevel='trace'))
|
||||
for prd in doc.getElementsByTagName('product-list')[0].getElementsByTagName('product'):
|
||||
p_nfo = dict()
|
||||
for k_p_nfo, v_p_nfo in prd.attributes.items():
|
||||
@@ -1357,8 +1354,7 @@ def download(*packages):
|
||||
raise CommandExecutionError("No packages has been specified.")
|
||||
|
||||
doc = dom.parseString(__salt__['cmd.run'](
|
||||
- ('{0} -x download {1}'.format(_zypper(), ' '.join(packages))),
|
||||
- output_loglevel='trace'))
|
||||
+ _zypper('-x', 'download', *packages), output_loglevel='trace'))
|
||||
pkg_ret = {}
|
||||
for dld_result in doc.getElementsByTagName("download-result"):
|
||||
repo = dld_result.getElementsByTagName("repository")[0]
|
||||
--
|
||||
2.1.4
|
||||
|
@@ -1,49 +0,0 @@
From 978afba658cff38ebc1d6a7aecee4813796db528 Mon Sep 17 00:00:00 2001
From: Duncan Mac-Vicar P <dmacvicar@suse.de>
Date: Sat, 13 Feb 2016 00:23:30 +0100
Subject: [PATCH 17/22] Fix crash with scheduler and runners (#31106)

* runner wrapper ClientFuncsDict do not provide access to 'pack' attribute
* runners do not provide retcode, therefore ignore it in the schedule if it is not
provided by __context__
---
salt/client/mixins.py | 6 ++++++
salt/utils/schedule.py | 5 ++++-
2 files changed, 10 insertions(+), 1 deletion(-)

diff --git a/salt/client/mixins.py b/salt/client/mixins.py
index cdb1d0c..6fa3e6f 100644
--- a/salt/client/mixins.py
+++ b/salt/client/mixins.py
@@ -53,6 +53,12 @@ class ClientFuncsDict(collections.MutableMapping):
def __init__(self, client):
self.client = client

+ def __getattr__(self, attr):
+ '''
+ Provide access eg. to 'pack'
+ '''
+ return getattr(self.client.functions, attr)
+
def __setitem__(self, key, val):
raise NotImplementedError()

diff --git a/salt/utils/schedule.py b/salt/utils/schedule.py
index cae5fcf..5ed49f7 100644
--- a/salt/utils/schedule.py
+++ b/salt/utils/schedule.py
@@ -700,7 +700,10 @@ class Schedule(object):
)
)

- ret['retcode'] = self.functions.pack['__context__']['retcode']
+ # runners do not provide retcode
+ if 'retcode' in self.functions.pack['__context__']:
+ ret['retcode'] = self.functions.pack['__context__']['retcode']
+
ret['success'] = True
except Exception:
log.exception("Unhandled exception running {0}".format(ret['fun']))
--
2.1.4

@@ -1,109 +0,0 @@
|
||||
From 29ab56413c60c958d5d62b1acdea5a97ce80fdb9 Mon Sep 17 00:00:00 2001
|
||||
From: Michael Calmer <mc@suse.de>
|
||||
Date: Thu, 18 Feb 2016 12:30:19 +0100
|
||||
Subject: [PATCH 18/22] unify behavior of refresh
|
||||
|
||||
---
|
||||
salt/modules/zypper.py | 30 ++++++++++++++++++++++++------
|
||||
1 file changed, 24 insertions(+), 6 deletions(-)
|
||||
|
||||
diff --git a/salt/modules/zypper.py b/salt/modules/zypper.py
|
||||
index f878c95..f5b09c0 100644
|
||||
--- a/salt/modules/zypper.py
|
||||
+++ b/salt/modules/zypper.py
|
||||
@@ -74,6 +74,11 @@ def list_upgrades(refresh=True):
|
||||
'''
|
||||
List all available package upgrades on this system
|
||||
|
||||
+ refresh
|
||||
+ force a refresh if set to True (default).
|
||||
+ If set to False it depends on zypper if a refresh is
|
||||
+ executed.
|
||||
+
|
||||
CLI Example:
|
||||
|
||||
.. code-block:: bash
|
||||
@@ -175,6 +180,11 @@ def info_available(*names, **kwargs):
|
||||
'''
|
||||
Return the information of the named package available for the system.
|
||||
|
||||
+ refresh
|
||||
+ force a refresh if set to True (default).
|
||||
+ If set to False it depends on zypper if a refresh is
|
||||
+ executed or not.
|
||||
+
|
||||
CLI example:
|
||||
|
||||
.. code-block:: bash
|
||||
@@ -657,7 +667,7 @@ def mod_repo(repo, **kwargs):
|
||||
|
||||
def refresh_db():
|
||||
'''
|
||||
- Just run a ``zypper refresh``, return a dict::
|
||||
+ Force a repository refresh by calling ``zypper refresh --force``, return a dict::
|
||||
|
||||
{'<database name>': Bool}
|
||||
|
||||
@@ -667,7 +677,7 @@ def refresh_db():
|
||||
|
||||
salt '*' pkg.refresh_db
|
||||
'''
|
||||
- cmd = _zypper('refresh')
|
||||
+ cmd = _zypper('refresh', '--force')
|
||||
ret = {}
|
||||
call = __salt__['cmd.run_all'](cmd, output_loglevel='trace')
|
||||
if call['retcode'] != 0:
|
||||
@@ -704,7 +714,7 @@ def install(name=None,
|
||||
version=None,
|
||||
**kwargs):
|
||||
'''
|
||||
- Install the passed package(s), add refresh=True to run 'zypper refresh'
|
||||
+ Install the passed package(s), add refresh=True to force a 'zypper refresh'
|
||||
before package is installed.
|
||||
|
||||
name
|
||||
@@ -721,7 +731,9 @@ def install(name=None,
|
||||
salt '*' pkg.install <package name>
|
||||
|
||||
refresh
|
||||
- Whether or not to refresh the package database before installing.
|
||||
+ force a refresh if set to True.
|
||||
+ If set to False (default) it depends on zypper if a refresh is
|
||||
+ executed.
|
||||
|
||||
fromrepo
|
||||
Specify a package repository to install from.
|
||||
@@ -769,6 +781,9 @@ def install(name=None,
|
||||
{'<package>': {'old': '<old-version>',
|
||||
'new': '<new-version>'}}
|
||||
'''
|
||||
+ if salt.utils.is_true(refresh):
|
||||
+ refresh_db()
|
||||
+
|
||||
try:
|
||||
pkg_params, pkg_type = __salt__['pkg_resource.parse_targets'](name, pkgs, sources, **kwargs)
|
||||
except MinionError as exc:
|
||||
@@ -815,8 +830,6 @@ def install(name=None,
|
||||
else:
|
||||
fromrepoopt = ''
|
||||
cmd_install = _zypper()
|
||||
- if not refresh:
|
||||
- cmd_install.append('--no-refresh')
|
||||
cmd_install += ['install', '--name', '--auto-agree-with-licenses']
|
||||
if downloadonly:
|
||||
cmd_install.append('--download-only')
|
||||
@@ -851,6 +864,11 @@ def upgrade(refresh=True):
|
||||
'''
|
||||
Run a full system upgrade, a zypper upgrade
|
||||
|
||||
+ refresh
|
||||
+ force a refresh if set to True (default).
|
||||
+ If set to False it depends on zypper if a refresh is
|
||||
+ executed.
|
||||
+
|
||||
Return a dict containing the new package names and versions::
|
||||
|
||||
{'<package>': {'old': '<old-version>',
|
||||
--
|
||||
2.1.4
|
||||
|
@@ -1,125 +0,0 @@
|
||||
From 478871aebfcb2ddf1b1c2a47b4fccb820180c9ed Mon Sep 17 00:00:00 2001
|
||||
From: Michael Calmer <mc@suse.de>
|
||||
Date: Thu, 18 Feb 2016 12:39:52 +0100
|
||||
Subject: [PATCH 19/22] add refresh option to more functions
|
||||
|
||||
---
|
||||
salt/modules/zypper.py | 40 ++++++++++++++++++++++++++++++++++++----
|
||||
1 file changed, 36 insertions(+), 4 deletions(-)
|
||||
|
||||
diff --git a/salt/modules/zypper.py b/salt/modules/zypper.py
|
||||
index f5b09c0..9afdeef 100644
|
||||
--- a/salt/modules/zypper.py
|
||||
+++ b/salt/modules/zypper.py
|
||||
@@ -1245,16 +1245,24 @@ def _get_patterns(installed_only=None):
|
||||
return patterns
|
||||
|
||||
|
||||
-def list_patterns():
|
||||
+def list_patterns(refresh=False):
|
||||
'''
|
||||
List all known patterns from available repos.
|
||||
|
||||
+ refresh
|
||||
+ force a refresh if set to True.
|
||||
+ If set to False (default) it depends on zypper if a refresh is
|
||||
+ executed.
|
||||
+
|
||||
CLI Examples:
|
||||
|
||||
.. code-block:: bash
|
||||
|
||||
salt '*' pkg.list_patterns
|
||||
'''
|
||||
+ if salt.utils.is_true(refresh):
|
||||
+ refresh_db()
|
||||
+
|
||||
return _get_patterns()
|
||||
|
||||
|
||||
@@ -1271,16 +1279,24 @@ def list_installed_patterns():
|
||||
return _get_patterns(installed_only=True)
|
||||
|
||||
|
||||
-def search(criteria):
|
||||
+def search(criteria, refresh=False):
|
||||
'''
|
||||
List known packags, available to the system.
|
||||
|
||||
+ refresh
|
||||
+ force a refresh if set to True.
|
||||
+ If set to False (default) it depends on zypper if a refresh is
|
||||
+ executed.
|
||||
+
|
||||
CLI Examples:
|
||||
|
||||
.. code-block:: bash
|
||||
|
||||
salt '*' pkg.search <criteria>
|
||||
'''
|
||||
+ if salt.utils.is_true(refresh):
|
||||
+ refresh_db()
|
||||
+
|
||||
doc = dom.parseString(__salt__['cmd.run'](_zypper('--xmlout', 'se', criteria),
|
||||
output_loglevel='trace'))
|
||||
solvables = doc.getElementsByTagName('solvable')
|
||||
@@ -1311,13 +1327,18 @@ def _get_first_aggregate_text(node_list):
|
||||
return '\n'.join(out)
|
||||
|
||||
|
||||
-def list_products(all=False):
|
||||
+def list_products(all=False, refresh=False):
|
||||
'''
|
||||
List all available or installed SUSE products.
|
||||
|
||||
all
|
||||
List all products available or only installed. Default is False.
|
||||
|
||||
+ refresh
|
||||
+ force a refresh if set to True.
|
||||
+ If set to False (default) it depends on zypper if a refresh is
|
||||
+ executed.
|
||||
+
|
||||
Includes handling for OEM products, which read the OEM productline file
|
||||
and overwrite the release value.
|
||||
|
||||
@@ -1328,6 +1349,9 @@ def list_products(all=False):
|
||||
salt '*' pkg.list_products
|
||||
salt '*' pkg.list_products all=True
|
||||
'''
|
||||
+ if salt.utils.is_true(refresh):
|
||||
+ refresh_db()
|
||||
+
|
||||
ret = list()
|
||||
OEM_PATH = "/var/lib/suseRegister/OEM"
|
||||
cmd = _zypper('-x', 'products')
|
||||
@@ -1357,10 +1381,15 @@ def list_products(all=False):
|
||||
return ret
|
||||
|
||||
|
||||
-def download(*packages):
|
||||
+def download(refresh=False, *packages):
|
||||
"""
|
||||
Download packages to the local disk.
|
||||
|
||||
+ refresh
|
||||
+ force a refresh if set to True.
|
||||
+ If set to False (default) it depends on zypper if a refresh is
|
||||
+ executed.
|
||||
+
|
||||
CLI example:
|
||||
|
||||
.. code-block:: bash
|
||||
@@ -1371,6 +1400,9 @@ def download(*packages):
|
||||
if not packages:
|
||||
raise CommandExecutionError("No packages has been specified.")
|
||||
|
||||
+ if salt.utils.is_true(refresh):
|
||||
+ refresh_db()
|
||||
+
|
||||
doc = dom.parseString(__salt__['cmd.run'](
|
||||
_zypper('-x', 'download', *packages), output_loglevel='trace'))
|
||||
pkg_ret = {}
|
||||
--
|
||||
2.1.4
|
||||
|
@@ -1,88 +0,0 @@
|
||||
From 0a09ae513698029eb1e05cd8b6e6b45d2830a0cb Mon Sep 17 00:00:00 2001
|
||||
From: Michael Calmer <mc@suse.de>
|
||||
Date: Sun, 21 Feb 2016 11:26:51 +0100
|
||||
Subject: [PATCH 20/22] simplify checking the refresh paramater
|
||||
|
||||
---
|
||||
salt/modules/zypper.py | 16 ++++++++--------
|
||||
1 file changed, 8 insertions(+), 8 deletions(-)
|
||||
|
||||
diff --git a/salt/modules/zypper.py b/salt/modules/zypper.py
|
||||
index 9afdeef..e2cd5f9 100644
|
||||
--- a/salt/modules/zypper.py
|
||||
+++ b/salt/modules/zypper.py
|
||||
@@ -85,7 +85,7 @@ def list_upgrades(refresh=True):
|
||||
|
||||
salt '*' pkg.list_upgrades
|
||||
'''
|
||||
- if salt.utils.is_true(refresh):
|
||||
+ if refresh:
|
||||
refresh_db()
|
||||
ret = {}
|
||||
call = __salt__['cmd.run_all'](
|
||||
@@ -200,7 +200,7 @@ def info_available(*names, **kwargs):
|
||||
names = sorted(list(set(names)))
|
||||
|
||||
# Refresh db before extracting the latest package
|
||||
- if salt.utils.is_true(kwargs.pop('refresh', True)):
|
||||
+ if kwargs.pop('refresh', True):
|
||||
refresh_db()
|
||||
|
||||
pkg_info = []
|
||||
@@ -781,7 +781,7 @@ def install(name=None,
|
||||
{'<package>': {'old': '<old-version>',
|
||||
'new': '<new-version>'}}
|
||||
'''
|
||||
- if salt.utils.is_true(refresh):
|
||||
+ if refresh:
|
||||
refresh_db()
|
||||
|
||||
try:
|
||||
@@ -885,7 +885,7 @@ def upgrade(refresh=True):
|
||||
'comment': '',
|
||||
}
|
||||
|
||||
- if salt.utils.is_true(refresh):
|
||||
+ if refresh:
|
||||
refresh_db()
|
||||
old = list_pkgs()
|
||||
cmd = _zypper('update', '--auto-agree-with-licenses')
|
||||
@@ -1260,7 +1260,7 @@ def list_patterns(refresh=False):
|
||||
|
||||
salt '*' pkg.list_patterns
|
||||
'''
|
||||
- if salt.utils.is_true(refresh):
|
||||
+ if refresh:
|
||||
refresh_db()
|
||||
|
||||
return _get_patterns()
|
||||
@@ -1294,7 +1294,7 @@ def search(criteria, refresh=False):
|
||||
|
||||
salt '*' pkg.search <criteria>
|
||||
'''
|
||||
- if salt.utils.is_true(refresh):
|
||||
+ if refresh:
|
||||
refresh_db()
|
||||
|
||||
doc = dom.parseString(__salt__['cmd.run'](_zypper('--xmlout', 'se', criteria),
|
||||
@@ -1349,7 +1349,7 @@ def list_products(all=False, refresh=False):
|
||||
salt '*' pkg.list_products
|
||||
salt '*' pkg.list_products all=True
|
||||
'''
|
||||
- if salt.utils.is_true(refresh):
|
||||
+ if refresh:
|
||||
refresh_db()
|
||||
|
||||
ret = list()
|
||||
@@ -1400,7 +1400,7 @@ def download(refresh=False, *packages):
|
||||
if not packages:
|
||||
raise CommandExecutionError("No packages has been specified.")
|
||||
|
||||
- if salt.utils.is_true(refresh):
|
||||
+ if refresh:
|
||||
refresh_db()
|
||||
|
||||
doc = dom.parseString(__salt__['cmd.run'](
|
||||
--
|
||||
2.1.4
|
||||
|
@@ -1,25 +0,0 @@
From ea6898f82ddc21c73f3ea369e6af241753a2ceda Mon Sep 17 00:00:00 2001
From: Michael Calmer <mc@suse.de>
Date: Mon, 22 Feb 2016 09:51:01 +0100
Subject: [PATCH 21/22] do not change kwargs in refresh while checking a value

---
salt/modules/zypper.py | 2 +-
1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/salt/modules/zypper.py b/salt/modules/zypper.py
index e2cd5f9..1499b27 100644
--- a/salt/modules/zypper.py
+++ b/salt/modules/zypper.py
@@ -200,7 +200,7 @@ def info_available(*names, **kwargs):
names = sorted(list(set(names)))

# Refresh db before extracting the latest package
- if kwargs.pop('refresh', True):
+ if kwargs.get('refresh', True):
refresh_db()

pkg_info = []
--
2.1.4

@@ -1,33 +0,0 @@
From c9bab6bb32f9ca2e65b6e0c24283146b66bf91be Mon Sep 17 00:00:00 2001
From: Michael Calmer <mc@suse.de>
Date: Tue, 23 Feb 2016 11:46:09 +0100
Subject: [PATCH 22/22] fix argument handling for pkg.download

---
salt/modules/zypper.py | 3 ++-
1 file changed, 2 insertions(+), 1 deletion(-)

diff --git a/salt/modules/zypper.py b/salt/modules/zypper.py
index 1499b27..33e5da9 100644
--- a/salt/modules/zypper.py
+++ b/salt/modules/zypper.py
@@ -1381,7 +1381,7 @@ def list_products(all=False, refresh=False):
return ret


-def download(refresh=False, *packages):
+def download(*packages, **kwargs):
"""
Download packages to the local disk.

@@ -1397,6 +1397,7 @@ def download(refresh=False, *packages):
salt '*' pkg.download httpd
salt '*' pkg.download httpd postfix
"""
+ refresh = kwargs.get('refresh', False)
if not packages:
raise CommandExecutionError("No packages has been specified.")

--
2.1.4

@@ -1,673 +0,0 @@
|
||||
From 5ee519e885134c1afa77d9e78c53224ad70a2e51 Mon Sep 17 00:00:00 2001
|
||||
From: Bo Maryniuk <bo@suse.de>
|
||||
Date: Tue, 23 Feb 2016 17:34:37 +0100
|
||||
Subject: [PATCH 23/23] Initial Zypper Unit Tests and bugfixes
|
||||
|
||||
Add Zypper Unit Test installed products sample data
|
||||
|
||||
Add Zypper unit test: test_list_products and test_refresh_db
|
||||
|
||||
Reimplement list_upgrades to use XML output from Zypper instead
|
||||
|
||||
Rename Zypper products static test data file
|
||||
|
||||
Use renamed zypper products data file
|
||||
|
||||
Do not strip the output
|
||||
|
||||
Implement error handling test for listing upgrades
|
||||
|
||||
Add list upgrades Zypper static data
|
||||
|
||||
Implement list upgrades test
|
||||
|
||||
Use strings instead of unicode strings
|
||||
|
||||
Implement test for info_installed
|
||||
|
||||
Add Zypper static data for the available packages
|
||||
|
||||
Implement test for the info_available
|
||||
|
||||
Implement test for latest_available
|
||||
|
||||
Bugfix: when only one package, no dict is returned. Still upgrade_available should return boolean.
|
||||
|
||||
Implement test for the upgrade_available
|
||||
|
||||
Add third test package static info
|
||||
|
||||
Adjust test case for the third package in the test static data
|
||||
|
||||
Implement test for version compare, where RPM algorithm is called
|
||||
|
||||
Implement test for version compare, where python fall-back algorithm is called
|
||||
|
||||
Add mocking data
|
||||
|
||||
Implement list packages test
|
||||
|
||||
Add space before "assert" keyword
|
||||
|
||||
Fix PyLint
|
||||
|
||||
Do not use Zypper purge (reason: too dangerous)
|
||||
|
||||
Fix the docstring
|
||||
|
||||
Refactor code (a bit)
|
||||
|
||||
Implement unit test for remove and purge
|
||||
---
|
||||
salt/modules/zypper.py | 62 ++---
|
||||
tests/unit/modules/zypp/zypper-available.txt | 64 ++++++
|
||||
tests/unit/modules/zypp/zypper-products.xml | 37 +++
|
||||
tests/unit/modules/zypp/zypper-updates.xml | 33 +++
|
||||
tests/unit/modules/zypper_test.py | 324 +++++++++++++++++++++++++++
|
||||
5 files changed, 482 insertions(+), 38 deletions(-)
|
||||
create mode 100644 tests/unit/modules/zypp/zypper-available.txt
|
||||
create mode 100644 tests/unit/modules/zypp/zypper-products.xml
|
||||
create mode 100644 tests/unit/modules/zypp/zypper-updates.xml
|
||||
create mode 100644 tests/unit/modules/zypper_test.py
|
||||
|
||||
diff --git a/salt/modules/zypper.py b/salt/modules/zypper.py
|
||||
index 33e5da9..ab8bb06 100644
|
||||
--- a/salt/modules/zypper.py
|
||||
+++ b/salt/modules/zypper.py
|
||||
@@ -87,34 +87,21 @@ def list_upgrades(refresh=True):
|
||||
'''
|
||||
if refresh:
|
||||
refresh_db()
|
||||
- ret = {}
|
||||
- call = __salt__['cmd.run_all'](
|
||||
- _zypper('list-updates'), output_loglevel='trace'
|
||||
- )
|
||||
- if call['retcode'] != 0:
|
||||
- comment = ''
|
||||
- if 'stderr' in call:
|
||||
- comment += call['stderr']
|
||||
- if 'stdout' in call:
|
||||
- comment += call['stdout']
|
||||
- raise CommandExecutionError(
|
||||
- '{0}'.format(comment)
|
||||
- )
|
||||
- else:
|
||||
- out = call['stdout']
|
||||
+ ret = dict()
|
||||
+ run_data = __salt__['cmd.run_all'](_zypper('-x', 'list-updates'), output_loglevel='trace')
|
||||
+ if run_data['retcode'] != 0:
|
||||
+ msg = list()
|
||||
+ for chnl in ['stderr', 'stdout']:
|
||||
+ if run_data.get(chnl, ''):
|
||||
+ msg.append(run_data[chnl])
|
||||
+ raise CommandExecutionError(os.linesep.join(msg) or
|
||||
+ 'Zypper returned non-zero system exit. See Zypper logs for more details.')
|
||||
+
|
||||
+ doc = dom.parseString(run_data['stdout'])
|
||||
+ for update_node in doc.getElementsByTagName('update'):
|
||||
+ if update_node.getAttribute('kind') == 'package':
|
||||
+ ret[update_node.getAttribute('name')] = update_node.getAttribute('edition')
|
||||
|
||||
- for line in out.splitlines():
|
||||
- if not line:
|
||||
- continue
|
||||
- if '|' not in line:
|
||||
- continue
|
||||
- try:
|
||||
- status, repo, name, cur, avail, arch = \
|
||||
- [x.strip() for x in line.split('|')]
|
||||
- except (ValueError, IndexError):
|
||||
- continue
|
||||
- if status == 'v':
|
||||
- ret[name] = avail
|
||||
return ret
|
||||
|
||||
# Provide a list_updates function for those used to using zypper list-updates
|
||||
@@ -300,7 +287,7 @@ def upgrade_available(name):
|
||||
|
||||
salt '*' pkg.upgrade_available <package name>
|
||||
'''
|
||||
- return latest_version(name).get(name) is not None
|
||||
+ return not not latest_version(name)
|
||||
|
||||
|
||||
def version(*names, **kwargs):
|
||||
@@ -903,9 +890,9 @@ def upgrade(refresh=True):
|
||||
return ret
|
||||
|
||||
|
||||
-def _uninstall(action='remove', name=None, pkgs=None):
|
||||
+def _uninstall(name=None, pkgs=None):
|
||||
'''
|
||||
- remove and purge do identical things but with different zypper commands,
|
||||
+ Remove and purge do identical things but with different Zypper commands,
|
||||
this function performs the common logic.
|
||||
'''
|
||||
try:
|
||||
@@ -913,18 +900,17 @@ def _uninstall(action='remove', name=None, pkgs=None):
|
||||
except MinionError as exc:
|
||||
raise CommandExecutionError(exc)
|
||||
|
||||
- purge_arg = '-u' if action == 'purge' else ''
|
||||
old = list_pkgs()
|
||||
- targets = [x for x in pkg_params if x in old]
|
||||
+ targets = [target for target in pkg_params if target in old]
|
||||
if not targets:
|
||||
return {}
|
||||
+
|
||||
while targets:
|
||||
- cmd = _zypper('remove', purge_arg, *targets[:500])
|
||||
- __salt__['cmd.run'](cmd, output_loglevel='trace')
|
||||
+ __salt__['cmd.run'](_zypper('remove', *targets[:500]), output_loglevel='trace')
|
||||
targets = targets[500:]
|
||||
__context__.pop('pkg.list_pkgs', None)
|
||||
- new = list_pkgs()
|
||||
- return salt.utils.compare_dicts(old, new)
|
||||
+
|
||||
+ return salt.utils.compare_dicts(old, list_pkgs())
|
||||
|
||||
|
||||
def remove(name=None, pkgs=None, **kwargs): # pylint: disable=unused-argument
|
||||
@@ -954,7 +940,7 @@ def remove(name=None, pkgs=None, **kwargs): # pylint: disable=unused-argument
|
||||
salt '*' pkg.remove <package1>,<package2>,<package3>
|
||||
salt '*' pkg.remove pkgs='["foo", "bar"]'
|
||||
'''
|
||||
- return _uninstall(action='remove', name=name, pkgs=pkgs)
|
||||
+ return _uninstall(name=name, pkgs=pkgs)
|
||||
|
||||
|
||||
def purge(name=None, pkgs=None, **kwargs): # pylint: disable=unused-argument
|
||||
@@ -985,7 +971,7 @@ def purge(name=None, pkgs=None, **kwargs): # pylint: disable=unused-argument
|
||||
salt '*' pkg.purge <package1>,<package2>,<package3>
|
||||
salt '*' pkg.purge pkgs='["foo", "bar"]'
|
||||
'''
|
||||
- return _uninstall(action='purge', name=name, pkgs=pkgs)
|
||||
+ return _uninstall(name=name, pkgs=pkgs)
|
||||
|
||||
|
||||
def list_locks():
|
||||
diff --git a/tests/unit/modules/zypp/zypper-available.txt b/tests/unit/modules/zypp/zypper-available.txt
|
||||
new file mode 100644
|
||||
index 0000000..e1094bc
|
||||
--- /dev/null
|
||||
+++ b/tests/unit/modules/zypp/zypper-available.txt
|
||||
@@ -0,0 +1,64 @@
|
||||
+Loading repository data...
|
||||
+Reading installed packages...
|
||||
+
|
||||
+
|
||||
+Information for package vim:
|
||||
+----------------------------
|
||||
+Repository: SLE12-SP1-x86_64-Pool
|
||||
+Name: vim
|
||||
+Version: 7.4.326-2.62
|
||||
+Arch: x86_64
|
||||
+Vendor: SUSE LLC <https://www.suse.com/>
|
||||
+Support Level: Level 3
|
||||
+Installed: No
|
||||
+Status: not installed
|
||||
+Installed Size: 2,6 MiB
|
||||
+Summary: Vi IMproved
|
||||
+Description:
|
||||
+ Vim (Vi IMproved) is an almost compatible version of the UNIX editor
|
||||
+ vi. Almost every possible command can be performed using only ASCII
|
||||
+ characters. Only the 'Q' command is missing (you do not need it). Many
|
||||
+ new features have been added: multilevel undo, command line history,
|
||||
+ file name completion, block operations, and editing of binary data.
|
||||
+
|
||||
+ Vi is available for the AMIGA, MS-DOS, Windows NT, and various versions
|
||||
+ of UNIX.
|
||||
+
|
||||
+ For SUSE Linux, Vim is used as /usr/bin/vi.
|
||||
+
|
||||
+Information for package python:
|
||||
+-------------------------------
|
||||
+Repository: SLE12-SP1-x86_64-Pool
|
||||
+Name: python
|
||||
+Version: 2.7.9-20.2
|
||||
+Arch: x86_64
|
||||
+Vendor: SUSE LLC <https://www.suse.com/>
|
||||
+Support Level: Level 3
|
||||
+Installed: Yes
|
||||
+Status: up-to-date
|
||||
+Installed Size: 1,4 MiB
|
||||
+Summary: Python Interpreter
|
||||
+Description:
|
||||
+ Python is an interpreted, object-oriented programming language, and is
|
||||
+ often compared to Tcl, Perl, Scheme, or Java. You can find an overview
|
||||
+ of Python in the documentation and tutorials included in the python-doc
|
||||
+ (HTML) or python-doc-pdf (PDF) packages.
|
||||
+
|
||||
+ If you want to install third party modules using distutils, you need to
|
||||
+ install python-devel package.
|
||||
+
|
||||
+Information for package emacs:
|
||||
+------------------------------
|
||||
+Repository: SLE12-SP1-x86_64-Pool
|
||||
+Name: emacs
|
||||
+Version: 24.3-14.44
|
||||
+Arch: x86_64
|
||||
+Vendor: SUSE LLC <https://www.suse.com/>
|
||||
+Support Level: Level 3
|
||||
+Installed: Yes
|
||||
+Status: up-to-date
|
||||
+Installed Size: 63,9 MiB
|
||||
+Summary: GNU Emacs Base Package
|
||||
+Description:
|
||||
+ Basic package for the GNU Emacs editor. Requires emacs-x11 or
|
||||
+ emacs-nox.
|
||||
diff --git a/tests/unit/modules/zypp/zypper-products.xml b/tests/unit/modules/zypp/zypper-products.xml
|
||||
new file mode 100644
|
||||
index 0000000..1a50363
|
||||
--- /dev/null
|
||||
+++ b/tests/unit/modules/zypp/zypper-products.xml
|
||||
@@ -0,0 +1,37 @@
|
||||
+<?xml version='1.0'?>
|
||||
+<stream>
|
||||
+<message type="info">Loading repository data...</message>
|
||||
+<message type="info">Reading installed packages...</message>
|
||||
+<product-list>
|
||||
+<product name="SLES" version="12.1" release="0" epoch="0" arch="x86_64" vendor="SUSE LLC <https://www.suse.com/>" summary="SUSE Linux Enterprise Server 12 SP1" repo="SLE12-SP1-x86_64-Pool" productline="" registerrelease="" shortname="SLES12-SP1" flavor="POOL" isbase="false" installed="false"><endoflife time_t="1730332800" text="2024-10-31T01:00:00+01"/><registerflavor/><description>SUSE Linux Enterprise offers a comprehensive
|
||||
+ suite of products built on a single code base.
|
||||
+ The platform addresses business needs from
|
||||
+ the smallest thin-client devices to the world's
|
||||
+ most powerful high-performance computing
|
||||
+ and mainframe servers. SUSE Linux Enterprise
|
||||
+ offers common management tools and technology
|
||||
+ certifications across the platform, and
|
||||
+ each product is enterprise-class.</description></product>
|
||||
+<product name="SUSE-Manager-Proxy" version="3.0" release="0" epoch="0" arch="x86_64" vendor="obs://build.suse.de/Devel:Galaxy:Manager:Head" summary="SUSE Manager Proxy" repo="SUSE-Manager-Head" productline="" registerrelease="" shortname="SUSE Manager Proxy" flavor="DVD" isbase="false" installed="false"><endoflife time_t="1522454400" text="2018-03-31T02:00:00+02"/><registerflavor>extension</registerflavor><description>SUSE Manager Proxies extend large and/or geographically
|
||||
+dispersed SUSE Manager environments to reduce load on the SUSE Manager
|
||||
+Server, lower bandwidth needs, and provide faster local
|
||||
+updates.</description></product>
|
||||
+<product name="SUSE-Manager-Server" version="3.0" release="0" epoch="0" arch="x86_64" vendor="obs://build.suse.de/Devel:Galaxy:Manager:Head" summary="SUSE Manager Server" repo="SUSE-Manager-Head" productline="" registerrelease="" shortname="SUSE Manager Server" flavor="DVD" isbase="false" installed="false"><endoflife time_t="1522454400" text="2018-03-31T02:00:00+02"/><registerflavor>extension</registerflavor><description>SUSE Manager lets you efficiently manage physical, virtual,
|
||||
+and cloud-based Linux systems. It provides automated and cost-effective
|
||||
+configuration and software management, asset management, and system
|
||||
+provisioning.</description></product>
|
||||
+<product name="sle-manager-tools-beta" version="12" release="0" epoch="0" arch="x86_64" vendor="obs://build.suse.de/Devel:Galaxy:Manager:Head" summary="SUSE Manager Tools" repo="SUSE-Manager-Head" productline="" registerrelease="" shortname="Manager-Tools" flavor="POOL" isbase="false" installed="false"><endoflife time_t="1509408000" text="2017-10-31T01:00:00+01"/><registerflavor>extension</registerflavor><description><p>
|
||||
+ SUSE Manager Tools provide packages required to connect to a
|
||||
+ SUSE Manager Server.
|
||||
+ <p></description></product>
|
||||
+<product name="SLES" version="12.1" release="0" epoch="0" arch="x86_64" vendor="SUSE" summary="SUSE Linux Enterprise Server 12 SP1" repo="@System" productline="sles" registerrelease="" shortname="SLES12-SP1" flavor="DVD" isbase="true" installed="true"><endoflife time_t="1730332800" text="2024-10-31T01:00:00+01"/><registerflavor/><description>SUSE Linux Enterprise offers a comprehensive
|
||||
+ suite of products built on a single code base.
|
||||
+ The platform addresses business needs from
|
||||
+ the smallest thin-client devices to the world's
|
||||
+ most powerful high-performance computing
|
||||
+ and mainframe servers. SUSE Linux Enterprise
|
||||
+ offers common management tools and technology
|
||||
+ certifications across the platform, and
|
||||
+ each product is enterprise-class.</description></product>
|
||||
+</product-list>
|
||||
+</stream>
|
||||
diff --git a/tests/unit/modules/zypp/zypper-updates.xml b/tests/unit/modules/zypp/zypper-updates.xml
|
||||
new file mode 100644
|
||||
index 0000000..61fe85b
|
||||
--- /dev/null
|
||||
+++ b/tests/unit/modules/zypp/zypper-updates.xml
|
||||
@@ -0,0 +1,33 @@
|
||||
+<?xml version='1.0'?>
|
||||
+<stream>
|
||||
+<message type="info">Loading repository data...</message>
|
||||
+<message type="info">Reading installed packages...</message>
|
||||
+<update-status version="0.6">
|
||||
+<update-list>
|
||||
+ <update name="SUSEConnect" edition="0.2.33-7.1" arch="x86_64" kind="package" >
|
||||
+ <summary>Utility to register a system with the SUSE Customer Center </summary>
|
||||
+ <description>This package provides a command line tool and rubygem library for connecting a
|
||||
+client system to the SUSE Customer Center. It will connect the system to your
|
||||
+product subscriptions and enable the product repositories/services locally.</description>
|
||||
+ <license></license>
|
||||
+ <source url="http://scc.suse.de/SLE-SERVER/12-SP1/x86_64/update/" alias="SLE12-SP1-x86_64-Update"/>
|
||||
+ </update>
|
||||
+ <update name="bind-libs" edition="9.9.6P1-35.1" arch="x86_64" kind="package" >
|
||||
+ <summary>Shared libraries of BIND </summary>
|
||||
+ <description>This package contains the shared libraries of the Berkeley Internet
|
||||
+Name Domain (BIND) Domain Name System implementation of the Domain Name
|
||||
+System (DNS) protocols.</description>
|
||||
+ <license></license>
|
||||
+ <source url="http://scc.suse.de/SLE-SERVER/12-SP1/x86_64/update/" alias="SLE12-SP1-x86_64-Update"/>
|
||||
+ </update>
|
||||
+ <update name="bind-utils" edition="9.9.6P1-35.1" arch="x86_64" kind="package" >
|
||||
+ <summary>Utilities to query and test DNS </summary>
|
||||
+ <description>This package includes the utilities host, dig, and nslookup used to
|
||||
+test and query the Domain Name System (DNS). The Berkeley Internet
|
||||
+Name Domain (BIND) DNS server is found in the package named bind.</description>
|
||||
+ <license></license>
|
||||
+ <source url="http://scc.suse.de/SLE-SERVER/12-SP1/x86_64/update/" alias="SLE12-SP1-x86_64-Update"/>
|
||||
+ </update>
|
||||
+</update-list>
|
||||
+</update-status>
|
||||
+</stream>
|
||||
diff --git a/tests/unit/modules/zypper_test.py b/tests/unit/modules/zypper_test.py
|
||||
new file mode 100644
|
||||
index 0000000..de964f9
|
||||
--- /dev/null
|
||||
+++ b/tests/unit/modules/zypper_test.py
|
||||
@@ -0,0 +1,324 @@
|
||||
+# -*- coding: utf-8 -*-
|
||||
+'''
|
||||
+ :codeauthor: :email:`Bo Maryniuk <bo@suse.de>`
|
||||
+'''
|
||||
+
|
||||
+# Import Python Libs
|
||||
+from __future__ import absolute_import
|
||||
+
|
||||
+# Import Salt Testing Libs
|
||||
+from salttesting import TestCase, skipIf
|
||||
+from salttesting.mock import (
|
||||
+ MagicMock,
|
||||
+ patch,
|
||||
+ NO_MOCK,
|
||||
+ NO_MOCK_REASON
|
||||
+)
|
||||
+from salt.exceptions import CommandExecutionError
|
||||
+
|
||||
+import os
|
||||
+
|
||||
+from salttesting.helpers import ensure_in_syspath
|
||||
+
|
||||
+ensure_in_syspath('../../')
|
||||
+
|
||||
+
|
||||
+def get_test_data(filename):
|
||||
+ '''
|
||||
+ Return static test data
|
||||
+ '''
|
||||
+ return open(os.path.join(os.path.join(os.path.dirname(os.path.abspath(__file__)), 'zypp'), filename)).read()
|
||||
+
|
||||
+
|
||||
+# Import Salt Libs
|
||||
+from salt.modules import zypper
|
||||
+
|
||||
+# Globals
|
||||
+zypper.__salt__ = dict()
|
||||
+zypper.__context__ = dict()
|
||||
+zypper.rpm = None
|
||||
+
|
||||
+
|
||||
+@skipIf(NO_MOCK, NO_MOCK_REASON)
|
||||
+class ZypperTestCase(TestCase):
|
||||
+ '''
|
||||
+ Test cases for salt.modules.zypper
|
||||
+ '''
|
||||
+
|
||||
+ def test_list_upgrades(self):
|
||||
+ '''
|
||||
+ List package upgrades
|
||||
+ :return:
|
||||
+ '''
|
||||
+ ref_out = {
|
||||
+ 'stdout': get_test_data('zypper-updates.xml'),
|
||||
+ 'stderr': None,
|
||||
+ 'retcode': 0
|
||||
+ }
|
||||
+ with patch.dict(zypper.__salt__, {'cmd.run_all': MagicMock(return_value=ref_out)}):
|
||||
+ upgrades = zypper.list_upgrades(refresh=False)
|
||||
+ assert len(upgrades) == 3
|
||||
+ for pkg, version in {'SUSEConnect': '0.2.33-7.1',
|
||||
+ 'bind-utils': '9.9.6P1-35.1',
|
||||
+ 'bind-libs': '9.9.6P1-35.1'}.items():
|
||||
+ assert pkg in upgrades
|
||||
+ assert upgrades[pkg] == version
|
||||
+
|
||||
+ def test_list_upgrades_error_handling(self):
|
||||
+ '''
|
||||
+ Test error handling in the list package upgrades.
|
||||
+ :return:
|
||||
+ '''
|
||||
+ # Test handled errors
|
||||
+ ref_out = {
|
||||
+ 'stderr': 'Some handled zypper internal error',
|
||||
+ 'retcode': 1
|
||||
+ }
|
||||
+ with patch.dict(zypper.__salt__, {'cmd.run_all': MagicMock(return_value=ref_out)}):
|
||||
+ try:
|
||||
+ zypper.list_upgrades(refresh=False)
|
||||
+ except CommandExecutionError as error:
|
||||
+ assert error.message == ref_out['stderr']
|
||||
+
|
||||
+ # Test unhandled error
|
||||
+ ref_out = {
|
||||
+ 'retcode': 1
|
||||
+ }
|
||||
+ with patch.dict(zypper.__salt__, {'cmd.run_all': MagicMock(return_value=ref_out)}):
|
||||
+ try:
|
||||
+ zypper.list_upgrades(refresh=False)
|
||||
+ except CommandExecutionError as error:
|
||||
+ assert error.message == 'Zypper returned non-zero system exit. See Zypper logs for more details.'
|
||||
+
|
||||
+ def test_list_products(self):
|
||||
+ '''
|
||||
+ List products test.
|
||||
+ '''
|
||||
+ ref_out = get_test_data('zypper-products.xml')
|
||||
+ with patch.dict(zypper.__salt__, {'cmd.run': MagicMock(return_value=ref_out)}):
|
||||
+ products = zypper.list_products()
|
||||
+ assert len(products) == 5
|
||||
+ assert (['SLES', 'SLES', 'SUSE-Manager-Proxy', 'SUSE-Manager-Server', 'sle-manager-tools-beta'] ==
|
||||
+ sorted([prod['name'] for prod in products]))
|
||||
+ assert ('SUSE LLC <https://www.suse.com/>' in [product['vendor'] for product in products])
|
||||
+ assert ([False, False, False, False, True] ==
|
||||
+ sorted([product['isbase'] for product in products]))
|
||||
+ assert ([False, False, False, False, True] ==
|
||||
+ sorted([product['installed'] for product in products]))
|
||||
+ assert (['0', '0', '0', '0', '0'] ==
|
||||
+ sorted([product['release'] for product in products]))
|
||||
+ assert ([False, False, False, False, u'sles'] ==
|
||||
+ sorted([product['productline'] for product in products]))
|
||||
+ assert ([1509408000, 1522454400, 1522454400, 1730332800, 1730332800] ==
|
||||
+ sorted([product['eol_t'] for product in products]))
|
||||
+
|
||||
+ def test_refresh_db(self):
|
||||
+ '''
|
||||
+ Test if refresh DB handled correctly
|
||||
+ '''
|
||||
+ ref_out = [
|
||||
+ "Repository 'openSUSE-Leap-42.1-LATEST' is up to date.",
|
||||
+ "Repository 'openSUSE-Leap-42.1-Update' is up to date.",
|
||||
+ "Retrieving repository 'openSUSE-Leap-42.1-Update-Non-Oss' metadata",
|
||||
+ "Forcing building of repository cache",
|
||||
+ "Building repository 'openSUSE-Leap-42.1-Update-Non-Oss' cache ..........[done]",
|
||||
+ "Building repository 'salt-dev' cache",
|
||||
+ "All repositories have been refreshed."
|
||||
+ ]
|
||||
+
|
||||
+ run_out = {
|
||||
+ 'stderr': '', 'stdout': '\n'.join(ref_out), 'retcode': 0
|
||||
+ }
|
||||
+
|
||||
+ with patch.dict(zypper.__salt__, {'cmd.run_all': MagicMock(return_value=run_out)}):
|
||||
+ result = zypper.refresh_db()
|
||||
+ self.assertEqual(result.get("openSUSE-Leap-42.1-LATEST"), False)
|
||||
+ self.assertEqual(result.get("openSUSE-Leap-42.1-Update"), False)
|
||||
+ self.assertEqual(result.get("openSUSE-Leap-42.1-Update-Non-Oss"), True)
|
||||
+
|
||||
+ def test_info_installed(self):
|
||||
+ '''
|
||||
+ Test the return information of the named package(s), installed on the system.
|
||||
+
|
||||
+ :return:
|
||||
+ '''
|
||||
+ run_out = {
|
||||
+ 'virgo-dummy':
|
||||
+ {'build_date': '2015-07-09T10:55:19Z',
|
||||
+ 'vendor': 'openSUSE Build Service',
|
||||
+ 'description': 'This is the Virgo dummy package used for testing SUSE Manager',
|
||||
+ 'license': 'GPL-2.0', 'build_host': 'sheep05', 'url': 'http://www.suse.com',
|
||||
+ 'build_date_time_t': 1436432119, 'relocations': '(not relocatable)',
|
||||
+ 'source_rpm': 'virgo-dummy-1.0-1.1.src.rpm', 'install_date': '2016-02-23T16:31:57Z',
|
||||
+ 'install_date_time_t': 1456241517, 'summary': 'Virgo dummy package', 'version': '1.0',
|
||||
+ 'signature': 'DSA/SHA1, Thu Jul 9 08:55:33 2015, Key ID 27fa41bd8a7c64f9',
|
||||
+ 'release': '1.1', 'group': 'Applications/System', 'arch': 'noarch', 'size': '17992'},
|
||||
+
|
||||
+ 'libopenssl1_0_0':
|
||||
+ {'build_date': '2015-11-04T23:20:34Z', 'vendor': 'SUSE LLC <https://www.suse.com/>',
|
||||
+ 'description': 'The OpenSSL Project is a collaborative effort.',
|
||||
+ 'license': 'OpenSSL', 'build_host': 'sheep11', 'url': 'https://www.openssl.org/',
|
||||
+ 'build_date_time_t': 1446675634, 'relocations': '(not relocatable)',
|
||||
+ 'source_rpm': 'openssl-1.0.1i-34.1.src.rpm', 'install_date': '2016-02-23T16:31:35Z',
|
||||
+ 'install_date_time_t': 1456241495, 'summary': 'Secure Sockets and Transport Layer Security',
|
||||
+ 'version': '1.0.1i', 'signature': 'RSA/SHA256, Wed Nov 4 22:21:34 2015, Key ID 70af9e8139db7c82',
|
||||
+ 'release': '34.1', 'group': 'Productivity/Networking/Security', 'packager': 'https://www.suse.com/',
|
||||
+ 'arch': 'x86_64', 'size': '2576912'},
|
||||
+ }
|
||||
+ with patch.dict(zypper.__salt__, {'lowpkg.info': MagicMock(return_value=run_out)}):
|
||||
+ installed = zypper.info_installed()
|
||||
+ # Test overall products length
|
||||
+ assert len(installed) == 2
|
||||
+
|
||||
+ # Test translated fields
|
||||
+ for pkg_name, pkg_info in installed.items():
|
||||
+ assert installed[pkg_name].get('source') == run_out[pkg_name]['source_rpm']
|
||||
+
|
||||
+ # Test keys transition from the lowpkg.info
|
||||
+ for pn_key, pn_val in run_out['virgo-dummy'].items():
|
||||
+ if pn_key == 'source_rpm':
|
||||
+ continue
|
||||
+ assert installed['virgo-dummy'][pn_key] == pn_val
|
||||
+
|
||||
+ def test_info_available(self):
|
||||
+ '''
|
||||
+ Test return the information of the named package available for the system.
|
||||
+
|
||||
+ :return:
|
||||
+ '''
|
||||
+ test_pkgs = ['vim', 'emacs', 'python']
|
||||
+ ref_out = get_test_data('zypper-available.txt')
|
||||
+ with patch.dict(zypper.__salt__, {'cmd.run_stdout': MagicMock(return_value=ref_out)}):
|
||||
+ available = zypper.info_available(*test_pkgs, refresh=False)
|
||||
+ assert len(available) == 3
|
||||
+ for pkg_name, pkg_info in available.items():
|
||||
+ assert pkg_name in test_pkgs
|
||||
+
|
||||
+ assert available['emacs']['status'] == 'up-to-date'
|
||||
+ assert available['emacs']['installed']
|
||||
+ assert available['emacs']['support level'] == 'Level 3'
|
||||
+ assert available['emacs']['vendor'] == 'SUSE LLC <https://www.suse.com/>'
|
||||
+ assert available['emacs']['summary'] == 'GNU Emacs Base Package'
|
||||
+
|
||||
+ assert available['vim']['status'] == 'not installed'
|
||||
+ assert not available['vim']['installed']
|
||||
+ assert available['vim']['support level'] == 'Level 3'
|
||||
+ assert available['vim']['vendor'] == 'SUSE LLC <https://www.suse.com/>'
|
||||
+ assert available['vim']['summary'] == 'Vi IMproved'
|
||||
+
|
||||
+ @patch('salt.modules.zypper.refresh_db', MagicMock(return_value=True))
|
||||
+ def test_latest_version(self):
|
||||
+ '''
|
||||
+ Test the latest version of the named package available for upgrade or installation.
|
||||
+
|
||||
+ :return:
|
||||
+ '''
|
||||
+ ref_out = get_test_data('zypper-available.txt')
|
||||
+ with patch.dict(zypper.__salt__, {'cmd.run_stdout': MagicMock(return_value=ref_out)}):
|
||||
+ assert zypper.latest_version('vim') == '7.4.326-2.62'
|
||||
+
|
||||
+ @patch('salt.modules.zypper.refresh_db', MagicMock(return_value=True))
|
||||
+ def test_upgrade_available(self):
|
||||
+ '''
|
||||
+ Test whether or not an upgrade is available for a given package.
|
||||
+
|
||||
+ :return:
|
||||
+ '''
|
||||
+ ref_out = get_test_data('zypper-available.txt')
|
||||
+ with patch.dict(zypper.__salt__, {'cmd.run_stdout': MagicMock(return_value=ref_out)}):
|
||||
+ for pkg_name in ['emacs', 'python']:
|
||||
+ assert not zypper.upgrade_available(pkg_name)
|
||||
+ assert zypper.upgrade_available('vim')
|
||||
+
|
||||
+ @patch('salt.modules.zypper.HAS_RPM', True)
|
||||
+ def test_version_cmp_rpm(self):
|
||||
+ '''
|
||||
+ Test that version comparison uses rpm-python's labelCompare when rpm-python is installed.
|
||||
+
|
||||
+ :return:
|
||||
+ '''
|
||||
+ with patch('salt.modules.zypper.rpm', MagicMock(return_value=MagicMock)):
|
||||
+ with patch('salt.modules.zypper.rpm.labelCompare', MagicMock(return_value=0)):
|
||||
+ assert 0 == zypper.version_cmp('1', '2') # mock returns 0, which means RPM was called
|
||||
+
|
||||
+ @patch('salt.modules.zypper.HAS_RPM', False)
|
||||
+ def test_version_cmp_fallback(self):
|
||||
+ '''
|
||||
+ Test that version comparison falls back to the Python implementation when rpm-python is not installed.
|
||||
+
|
||||
+ :return:
|
||||
+ '''
|
||||
+ with patch('salt.modules.zypper.rpm', MagicMock(return_value=MagicMock)):
|
||||
+ with patch('salt.modules.zypper.rpm.labelCompare', MagicMock(return_value=0)):
|
||||
+ assert -1 == zypper.version_cmp('1', '2') # mock returns -1, a python implementation was called
|
||||
+
|
||||
+ def test_list_pkgs(self):
|
||||
+ '''
|
||||
+ Test packages listing.
|
||||
+
|
||||
+ :return:
|
||||
+ '''
|
||||
+ def _add_data(data, key, value):
|
||||
+ data[key] = value
|
||||
+
|
||||
+ rpm_out = [
|
||||
+ 'protobuf-java_|-2.6.1_|-3.1.develHead_|-',
|
||||
+ 'yast2-ftp-server_|-3.1.8_|-8.1_|-',
|
||||
+ 'jose4j_|-0.4.4_|-2.1.develHead_|-',
|
||||
+ 'apache-commons-cli_|-1.2_|-1.233_|-',
|
||||
+ 'jakarta-commons-discovery_|-0.4_|-129.686_|-',
|
||||
+ 'susemanager-build-keys-web_|-12.0_|-5.1.develHead_|-',
|
||||
+ ]
|
||||
+ with patch.dict(zypper.__salt__, {'cmd.run': MagicMock(return_value=os.linesep.join(rpm_out))}):
|
||||
+ with patch.dict(zypper.__salt__, {'pkg_resource.add_pkg': _add_data}):
|
||||
+ with patch.dict(zypper.__salt__, {'pkg_resource.sort_pkglist': MagicMock()}):
|
||||
+ with patch.dict(zypper.__salt__, {'pkg_resource.stringify': MagicMock()}):
|
||||
+ pkgs = zypper.list_pkgs()
|
||||
+ for pkg_name, pkg_version in {
|
||||
+ 'jakarta-commons-discovery': '0.4-129.686',
|
||||
+ 'yast2-ftp-server': '3.1.8-8.1',
|
||||
+ 'protobuf-java': '2.6.1-3.1.develHead',
|
||||
+ 'susemanager-build-keys-web': '12.0-5.1.develHead',
|
||||
+ 'apache-commons-cli': '1.2-1.233',
|
||||
+ 'jose4j': '0.4.4-2.1.develHead'}.items():
|
||||
+ assert pkgs.get(pkg_name)
|
||||
+ assert pkgs[pkg_name] == pkg_version
|
||||
+
|
||||
+ def test_remove_purge(self):
|
||||
+ '''
|
||||
+ Test package removal
|
||||
+ :return:
|
||||
+ '''
|
||||
+ class ListPackages(object):
|
||||
+ def __init__(self):
|
||||
+ self._packages = ['vim', 'pico']
|
||||
+ self._pkgs = {
|
||||
+ 'vim': '0.18.0',
|
||||
+ 'emacs': '24.0.1',
|
||||
+ 'pico': '0.1.1',
|
||||
+ }
|
||||
+
|
||||
+ def __call__(self):
|
||||
+ pkgs = self._pkgs.copy()
|
||||
+ for target in self._packages:
|
||||
+ if self._pkgs.get(target):
|
||||
+ del self._pkgs[target]
|
||||
+
|
||||
+ return pkgs
|
||||
+
|
||||
+ parsed_targets = [{'vim': None, 'pico': None}, None]
|
||||
+
|
||||
+ with patch.dict(zypper.__salt__, {'cmd.run': MagicMock(return_value=False)}):
|
||||
+ with patch.dict(zypper.__salt__, {'pkg_resource.parse_targets': MagicMock(return_value=parsed_targets)}):
|
||||
+ with patch.dict(zypper.__salt__, {'pkg_resource.stringify': MagicMock()}):
|
||||
+ with patch('salt.modules.zypper.list_pkgs', ListPackages()):
|
||||
+ diff = zypper.remove(name='vim,pico')
|
||||
+ for pkg_name in ['vim', 'pico']:
|
||||
+ assert diff.get(pkg_name)
|
||||
+ assert diff[pkg_name]['old']
|
||||
+ assert not diff[pkg_name]['new']
|
||||
+
|
||||
+
|
||||
+if __name__ == '__main__':
|
||||
+ from integration import run_tests
|
||||
+ run_tests(ZypperTestCase, needs_daemon=False)
|
||||
--
|
||||
2.7.2
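
The tests in the file above all rely on the same mocking pattern: zypper.__salt__ is a plain dict injected by Salt's loader, so patch.dict() can temporarily swap individual execution functions for MagicMock objects that return canned zypper output. Below is a minimal, self-contained sketch of that pattern, mirroring the latest_version test; the fixture text passed in stands in for the zypper-available.txt test data.

from salttesting.mock import MagicMock, patch

from salt.modules import zypper

# The unit-test module initialises this loader dict itself; do the same here
# so patch.dict() has something to patch.
zypper.__salt__ = getattr(zypper, '__salt__', {})


def latest_version_from_fixture(fixture_text):
    # Replace the shell call with canned output and stub out the repo refresh,
    # exactly as the test above does, then exercise the real parsing code.
    with patch.dict(zypper.__salt__, {'cmd.run_stdout': MagicMock(return_value=fixture_text)}):
        with patch('salt.modules.zypper.refresh_db', MagicMock(return_value=True)):
            return zypper.latest_version('vim')
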
|
||||
|
@@ -1,296 +0,0 @@
|
||||
From 0372b1ff62a79d0c9f384fe48969d8bae039d5a1 Mon Sep 17 00:00:00 2001
|
||||
From: Michael Calmer <mc@suse.de>
|
||||
Date: Thu, 25 Feb 2016 10:20:29 +0100
|
||||
Subject: [PATCH 24/25] proper checking if zypper exit codes and handling of
|
||||
result messages
|
||||
|
||||
add function to check zypper exit codes
|
||||
|
||||
check zypper exit code everywhere
|
||||
|
||||
add _zypper_check_result() to raise an error or return stdout
|
||||
|
||||
use _zypper_check_result()
|
||||
|
||||
remove new lines between zypper command and check result
|
||||
|
||||
restructure the code a bit
|
||||
---
|
||||
salt/modules/zypper.py | 144 +++++++++++++++++++++++++++++--------------------
|
||||
1 file changed, 85 insertions(+), 59 deletions(-)
|
||||
|
||||
diff --git a/salt/modules/zypper.py b/salt/modules/zypper.py
|
||||
index ab8bb06..d6628aa 100644
|
||||
--- a/salt/modules/zypper.py
|
||||
+++ b/salt/modules/zypper.py
|
||||
@@ -26,6 +26,7 @@ except ImportError:
|
||||
# pylint: enable=import-error,redefined-builtin,no-name-in-module
|
||||
|
||||
from xml.dom import minidom as dom
|
||||
+from xml.parsers.expat import ExpatError
|
||||
|
||||
# Import salt libs
|
||||
import salt.utils
|
||||
@@ -70,6 +71,53 @@ def _zypper(*opts):
|
||||
return cmd
|
||||
|
||||
|
||||
+def _is_zypper_error(retcode):
|
||||
+ '''
|
||||
+ Return True in case the exit code indicates a zypper error.
|
||||
+ Otherwise False
|
||||
+ '''
|
||||
+ # see man zypper for existing exit codes
|
||||
+ return not int(retcode) in [0, 100, 101, 102, 103]
|
||||
+
|
||||
+
|
||||
+def _zypper_check_result(result, xml=False):
|
||||
+ '''
|
||||
+ Check the result of a zypper command. In case of an error, it raises
|
||||
+ a CommandExecutionError. Otherwise it returns the stdout string of the
|
||||
+ command.
|
||||
+
|
||||
+ result
|
||||
+ The result of a zypper command called with cmd.run_all
|
||||
+
|
||||
+ xml
|
||||
+ Set to True if zypper command was called with --xmlout.
|
||||
+ In this case it tries to read an error message out of the XML
|
||||
+ stream. Default is False.
|
||||
+ '''
|
||||
+ if _is_zypper_error(result['retcode']):
|
||||
+ msg = list()
|
||||
+ if not xml:
|
||||
+ msg.append(result['stderr'] and result['stderr'] or "")
|
||||
+ else:
|
||||
+ try:
|
||||
+ doc = dom.parseString(result['stdout'])
|
||||
+ except ExpatError as err:
|
||||
+ log.error(err)
|
||||
+ doc = None
|
||||
+ if doc:
|
||||
+ msg_nodes = doc.getElementsByTagName('message')
|
||||
+ for node in msg_nodes:
|
||||
+ if node.getAttribute('type') == 'error':
|
||||
+ msg.append(node.childNodes[0].nodeValue)
|
||||
+ elif result['stderr'].strip():
|
||||
+ msg.append(result['stderr'].strip())
|
||||
+
|
||||
+ raise CommandExecutionError("zypper command failed: {0}".format(
|
||||
+ msg and os.linesep.join(msg) or "Check zypper logs"))
|
||||
+
|
||||
+ return result['stdout']
|
||||
+
|
||||
+
|
||||
def list_upgrades(refresh=True):
|
||||
'''
|
||||
List all available package upgrades on this system
|
||||
@@ -89,15 +137,7 @@ def list_upgrades(refresh=True):
|
||||
refresh_db()
|
||||
ret = dict()
|
||||
run_data = __salt__['cmd.run_all'](_zypper('-x', 'list-updates'), output_loglevel='trace')
|
||||
- if run_data['retcode'] != 0:
|
||||
- msg = list()
|
||||
- for chnl in ['stderr', 'stdout']:
|
||||
- if run_data.get(chnl, ''):
|
||||
- msg.append(run_data[chnl])
|
||||
- raise CommandExecutionError(os.linesep.join(msg) or
|
||||
- 'Zypper returned non-zero system exit. See Zypper logs for more details.')
|
||||
-
|
||||
- doc = dom.parseString(run_data['stdout'])
|
||||
+ doc = dom.parseString(_zypper_check_result(run_data, xml=True))
|
||||
for update_node in doc.getElementsByTagName('update'):
|
||||
if update_node.getAttribute('kind') == 'package':
|
||||
ret[update_node.getAttribute('name')] = update_node.getAttribute('edition')
|
||||
@@ -506,7 +546,8 @@ def del_repo(repo):
|
||||
for alias in repos_cfg.sections():
|
||||
if alias == repo:
|
||||
cmd = _zypper('-x', 'rr', '--loose-auth', '--loose-query', alias)
|
||||
- doc = dom.parseString(__salt__['cmd.run'](cmd, output_loglevel='trace'))
|
||||
+ ret = __salt__['cmd.run_all'](cmd, output_loglevel='trace')
|
||||
+ doc = dom.parseString(_zypper_check_result(ret, xml=True))
|
||||
msg = doc.getElementsByTagName('message')
|
||||
if doc.getElementsByTagName('progress') and msg:
|
||||
return {
|
||||
@@ -590,22 +631,8 @@ def mod_repo(repo, **kwargs):
|
||||
'Repository \'{0}\' already exists as \'{1}\'.'.format(repo, alias))
|
||||
|
||||
# Add new repo
|
||||
- doc = None
|
||||
- try:
|
||||
- # Try to parse the output and find the error,
|
||||
- # but this not always working (depends on Zypper version)
|
||||
- doc = dom.parseString(__salt__['cmd.run'](
|
||||
- _zypper('-x', 'ar', url, repo), output_loglevel='trace'))
|
||||
- except Exception:
|
||||
- # No XML out available, but it is still unknown the state of the result.
|
||||
- pass
|
||||
-
|
||||
- if doc:
|
||||
- msg_nodes = doc.getElementsByTagName('message')
|
||||
- if msg_nodes:
|
||||
- msg_node = msg_nodes[0]
|
||||
- if msg_node.getAttribute('type') == 'error':
|
||||
- raise CommandExecutionError(msg_node.childNodes[0].nodeValue)
|
||||
+ _zypper_check_result(__salt__['cmd.run_all'](_zypper('-x', 'ar', url, repo),
|
||||
+ output_loglevel='trace'), xml=True)
|
||||
|
||||
# Verify the repository has been added
|
||||
repos_cfg = _get_configured_repos()
|
||||
@@ -641,8 +668,9 @@ def mod_repo(repo, **kwargs):
|
||||
|
||||
if cmd_opt:
|
||||
cmd_opt.append(repo)
|
||||
- __salt__['cmd.run'](_zypper('-x', 'mr', *cmd_opt),
|
||||
- output_loglevel='trace')
|
||||
+ ret = __salt__['cmd.run_all'](_zypper('-x', 'mr', *cmd_opt),
|
||||
+ output_loglevel='trace')
|
||||
+ _zypper_check_result(ret, xml=True)
|
||||
|
||||
# If repo nor added neither modified, error should be thrown
|
||||
if not added and not cmd_opt:
|
||||
@@ -666,17 +694,7 @@ def refresh_db():
|
||||
'''
|
||||
cmd = _zypper('refresh', '--force')
|
||||
ret = {}
|
||||
- call = __salt__['cmd.run_all'](cmd, output_loglevel='trace')
|
||||
- if call['retcode'] != 0:
|
||||
- comment = ''
|
||||
- if 'stderr' in call:
|
||||
- comment += call['stderr']
|
||||
-
|
||||
- raise CommandExecutionError(
|
||||
- '{0}'.format(comment)
|
||||
- )
|
||||
- else:
|
||||
- out = call['stdout']
|
||||
+ out = _zypper_check_result(__salt__['cmd.run_all'](cmd, output_loglevel='trace'))
|
||||
|
||||
for line in out.splitlines():
|
||||
if not line:
|
||||
@@ -828,19 +846,18 @@ def install(name=None,
|
||||
cmd = cmd_install + targets[:500]
|
||||
targets = targets[500:]
|
||||
call = __salt__['cmd.run_all'](cmd, output_loglevel='trace', python_shell=False)
|
||||
- if call['retcode'] != 0:
|
||||
- raise CommandExecutionError(call['stderr']) # Fixme: This needs a proper report mechanism.
|
||||
- else:
|
||||
- for line in call['stdout'].splitlines():
|
||||
- match = re.match(r"^The selected package '([^']+)'.+has lower version", line)
|
||||
- if match:
|
||||
- downgrades.append(match.group(1))
|
||||
+ out = _zypper_check_result(call)
|
||||
+ for line in out.splitlines():
|
||||
+ match = re.match(r"^The selected package '([^']+)'.+has lower version", line)
|
||||
+ if match:
|
||||
+ downgrades.append(match.group(1))
|
||||
|
||||
while downgrades:
|
||||
cmd = cmd_install + ['--force'] + downgrades[:500]
|
||||
downgrades = downgrades[500:]
|
||||
|
||||
- __salt__['cmd.run'](cmd, output_loglevel='trace', python_shell=False)
|
||||
+ _zypper_check_result(__salt__['cmd.run_all'](cmd, output_loglevel='trace', python_shell=False))
|
||||
+
|
||||
__context__.pop('pkg.list_pkgs', None)
|
||||
new = list_pkgs()
|
||||
|
||||
@@ -877,7 +894,7 @@ def upgrade(refresh=True):
|
||||
old = list_pkgs()
|
||||
cmd = _zypper('update', '--auto-agree-with-licenses')
|
||||
call = __salt__['cmd.run_all'](cmd, output_loglevel='trace')
|
||||
- if call['retcode'] != 0:
|
||||
+ if _is_zypper_error(call['retcode']):
|
||||
ret['result'] = False
|
||||
if 'stderr' in call:
|
||||
ret['comment'] += call['stderr']
|
||||
@@ -906,7 +923,8 @@ def _uninstall(name=None, pkgs=None):
|
||||
return {}
|
||||
|
||||
while targets:
|
||||
- __salt__['cmd.run'](_zypper('remove', *targets[:500]), output_loglevel='trace')
|
||||
+ _zypper_check_result(__salt__['cmd.run_all'](_zypper('remove', *targets[:500]),
|
||||
+ output_loglevel='trace'))
|
||||
targets = targets[500:]
|
||||
__context__.pop('pkg.list_pkgs', None)
|
||||
|
||||
@@ -1019,7 +1037,8 @@ def clean_locks():
|
||||
if not os.path.exists("/etc/zypp/locks"):
|
||||
return out
|
||||
|
||||
- doc = dom.parseString(__salt__['cmd.run'](_zypper('-x', 'cl'), output_loglevel='trace'))
|
||||
+ ret = __salt__['cmd.run_all'](_zypper('-x', 'cl'), output_loglevel='trace')
|
||||
+ doc = dom.parseString(_zypper_check_result(ret, xml=True))
|
||||
for node in doc.getElementsByTagName("message"):
|
||||
text = node.childNodes[0].nodeValue.lower()
|
||||
if text.startswith(LCK):
|
||||
@@ -1057,7 +1076,8 @@ def remove_lock(packages, **kwargs): # pylint: disable=unused-argument
|
||||
missing.append(pkg)
|
||||
|
||||
if removed:
|
||||
- __salt__['cmd.run'](_zypper('rl', *removed), output_loglevel='trace')
|
||||
+ _zypper_check_result(__salt__['cmd.run_all'](_zypper('rl', *removed),
|
||||
+ output_loglevel='trace'))
|
||||
|
||||
return {'removed': len(removed), 'not_found': missing}
|
||||
|
||||
@@ -1086,7 +1106,8 @@ def add_lock(packages, **kwargs): # pylint: disable=unused-argument
|
||||
added.append(pkg)
|
||||
|
||||
if added:
|
||||
- __salt__['cmd.run'](_zypper('al', *added), output_loglevel='trace')
|
||||
+ _zypper_check_result(__salt__['cmd.run_all'](_zypper('al', *added),
|
||||
+ output_loglevel='trace'))
|
||||
|
||||
return {'added': len(added), 'packages': added}
|
||||
|
||||
@@ -1218,8 +1239,10 @@ def _get_patterns(installed_only=None):
|
||||
List all known patterns in repos.
|
||||
'''
|
||||
patterns = {}
|
||||
- doc = dom.parseString(__salt__['cmd.run'](_zypper('--xmlout', 'se', '-t', 'pattern'),
|
||||
- output_loglevel='trace'))
|
||||
+
|
||||
+ ret = __salt__['cmd.run_all'](_zypper('--xmlout', 'se', '-t', 'pattern'),
|
||||
+ output_loglevel='trace')
|
||||
+ doc = dom.parseString(_zypper_check_result(ret, xml=True))
|
||||
for element in doc.getElementsByTagName('solvable'):
|
||||
installed = element.getAttribute('status') == 'installed'
|
||||
if (installed_only and installed) or not installed_only:
|
||||
@@ -1283,8 +1306,9 @@ def search(criteria, refresh=False):
|
||||
if refresh:
|
||||
refresh_db()
|
||||
|
||||
- doc = dom.parseString(__salt__['cmd.run'](_zypper('--xmlout', 'se', criteria),
|
||||
- output_loglevel='trace'))
|
||||
+ ret = __salt__['cmd.run_all'](_zypper('--xmlout', 'se', criteria),
|
||||
+ output_loglevel='trace')
|
||||
+ doc = dom.parseString(_zypper_check_result(ret, xml=True))
|
||||
solvables = doc.getElementsByTagName('solvable')
|
||||
if not solvables:
|
||||
raise CommandExecutionError('No packages found by criteria "{0}".'.format(criteria))
|
||||
@@ -1343,7 +1367,9 @@ def list_products(all=False, refresh=False):
|
||||
cmd = _zypper('-x', 'products')
|
||||
if not all:
|
||||
cmd.append('-i')
|
||||
- doc = dom.parseString(__salt__['cmd.run'](cmd, output_loglevel='trace'))
|
||||
+
|
||||
+ call = __salt__['cmd.run_all'](cmd, output_loglevel='trace')
|
||||
+ doc = dom.parseString(_zypper_check_result(call, xml=True))
|
||||
for prd in doc.getElementsByTagName('product-list')[0].getElementsByTagName('product'):
|
||||
p_nfo = dict()
|
||||
for k_p_nfo, v_p_nfo in prd.attributes.items():
|
||||
@@ -1390,8 +1416,8 @@ def download(*packages, **kwargs):
|
||||
if refresh:
|
||||
refresh_db()
|
||||
|
||||
- doc = dom.parseString(__salt__['cmd.run'](
|
||||
- _zypper('-x', 'download', *packages), output_loglevel='trace'))
|
||||
+ ret = __salt__['cmd.run_all'](_zypper('-x', 'download', *packages), output_loglevel='trace')
|
||||
+ doc = dom.parseString(_zypper_check_result(ret, xml=True))
|
||||
pkg_ret = {}
|
||||
for dld_result in doc.getElementsByTagName("download-result"):
|
||||
repo = dld_result.getElementsByTagName("repository")[0]
|
||||
--
|
||||
2.1.4
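
The patch above boils down to one pattern: every zypper call goes through cmd.run_all, a small set of exit codes counts as success, and a single helper either raises or hands back stdout for further parsing. Below is a condensed, standalone sketch of that pattern; the names are local to the sketch, not Salt's own, and CommandExecutionError is replaced by a stand-in exception.

# see `man zypper`: 0 means success, 100-103 are informational exit codes
ZYPPER_OK_CODES = (0, 100, 101, 102, 103)


class ZypperCommandError(Exception):
    '''Stand-in for salt.exceptions.CommandExecutionError.'''


def check_zypper_result(result):
    '''
    `result` is shaped like the return of cmd.run_all:
    {'retcode': int, 'stdout': str, 'stderr': str}.
    Raise on a real error, otherwise return stdout for further parsing.
    '''
    if int(result['retcode']) not in ZYPPER_OK_CODES:
        message = result.get('stderr', '').strip() or 'Check zypper logs'
        raise ZypperCommandError('zypper command failed: {0}'.format(message))
    return result['stdout']


if __name__ == '__main__':
    out = check_zypper_result({'retcode': 100, 'stdout': 'updates available', 'stderr': ''})
    print(out)  # informational exit code 100 is not treated as a failure
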
|
||||
|
@@ -1,288 +0,0 @@
|
||||
From de4417cd3de8af72fe2acd2ea22ab7c04327a939 Mon Sep 17 00:00:00 2001
|
||||
From: Michael Calmer <mc@suse.de>
|
||||
Date: Fri, 26 Feb 2016 12:05:45 +0100
|
||||
Subject: [PATCH 25/25] adapt tests to new zypper_check_result() output
|
||||
|
||||
test _zypper_check_result()
|
||||
|
||||
use specialized assert functions for tests
|
||||
---
|
||||
tests/unit/modules/zypper_test.py | 158 ++++++++++++++++++++++++++------------
|
||||
1 file changed, 111 insertions(+), 47 deletions(-)
|
||||
|
||||
diff --git a/tests/unit/modules/zypper_test.py b/tests/unit/modules/zypper_test.py
|
||||
index de964f9..f89d18f 100644
|
||||
--- a/tests/unit/modules/zypper_test.py
|
||||
+++ b/tests/unit/modules/zypper_test.py
|
||||
@@ -57,12 +57,63 @@ class ZypperTestCase(TestCase):
|
||||
}
|
||||
with patch.dict(zypper.__salt__, {'cmd.run_all': MagicMock(return_value=ref_out)}):
|
||||
upgrades = zypper.list_upgrades(refresh=False)
|
||||
- assert len(upgrades) == 3
|
||||
+ self.assertEqual(len(upgrades), 3)
|
||||
for pkg, version in {'SUSEConnect': '0.2.33-7.1',
|
||||
'bind-utils': '9.9.6P1-35.1',
|
||||
'bind-libs': '9.9.6P1-35.1'}.items():
|
||||
- assert pkg in upgrades
|
||||
- assert upgrades[pkg] == version
|
||||
+ self.assertIn(pkg, upgrades)
|
||||
+ self.assertEqual(upgrades[pkg], version)
|
||||
+
|
||||
+ def test_zypper_check_result(self):
|
||||
+ '''
|
||||
+ Test zypper check result function
|
||||
+ '''
|
||||
+ cmd_out = {
|
||||
+ 'retcode': 1,
|
||||
+ 'stdout': '',
|
||||
+ 'stderr': 'This is an error'
|
||||
+ }
|
||||
+ with self.assertRaisesRegexp(CommandExecutionError, "^zypper command failed: This is an error$"):
|
||||
+ zypper._zypper_check_result(cmd_out)
|
||||
+
|
||||
+ cmd_out = {
|
||||
+ 'retcode': 0,
|
||||
+ 'stdout': 'result',
|
||||
+ 'stderr': ''
|
||||
+ }
|
||||
+ out = zypper._zypper_check_result(cmd_out)
|
||||
+ self.assertEqual(out, "result")
|
||||
+
|
||||
+ cmd_out = {
|
||||
+ 'retcode': 1,
|
||||
+ 'stdout': '',
|
||||
+ 'stderr': 'This is an error'
|
||||
+ }
|
||||
+ with self.assertRaisesRegexp(CommandExecutionError, "^zypper command failed: This is an error$"):
|
||||
+ zypper._zypper_check_result(cmd_out, xml=True)
|
||||
+
|
||||
+ cmd_out = {
|
||||
+ 'retcode': 1,
|
||||
+ 'stdout': '',
|
||||
+ 'stderr': ''
|
||||
+ }
|
||||
+ with self.assertRaisesRegexp(CommandExecutionError, "^zypper command failed: Check zypper logs$"):
|
||||
+ zypper._zypper_check_result(cmd_out, xml=True)
|
||||
+
|
||||
+ cmd_out = {
|
||||
+ 'stdout': '''<?xml version='1.0'?>
|
||||
+<stream>
|
||||
+ <message type="info">Refreshing service 'container-suseconnect'.</message>
|
||||
+ <message type="error">Some handled zypper internal error</message>
|
||||
+ <message type="error">Another zypper internal error</message>
|
||||
+</stream>
|
||||
+ ''',
|
||||
+ 'stderr': '',
|
||||
+ 'retcode': 1
|
||||
+ }
|
||||
+ with self.assertRaisesRegexp(CommandExecutionError,
|
||||
+ "^zypper command failed: Some handled zypper internal error\nAnother zypper internal error$"):
|
||||
+ zypper._zypper_check_result(cmd_out, xml=True)
|
||||
|
||||
def test_list_upgrades_error_handling(self):
|
||||
'''
|
||||
@@ -71,45 +122,53 @@ class ZypperTestCase(TestCase):
|
||||
'''
|
||||
# Test handled errors
|
||||
ref_out = {
|
||||
- 'stderr': 'Some handled zypper internal error',
|
||||
+ 'stdout': '''<?xml version='1.0'?>
|
||||
+<stream>
|
||||
+ <message type="info">Refreshing service 'container-suseconnect'.</message>
|
||||
+ <message type="error">Some handled zypper internal error</message>
|
||||
+ <message type="error">Another zypper internal error</message>
|
||||
+</stream>
|
||||
+ ''',
|
||||
'retcode': 1
|
||||
}
|
||||
with patch.dict(zypper.__salt__, {'cmd.run_all': MagicMock(return_value=ref_out)}):
|
||||
- try:
|
||||
+ with self.assertRaisesRegexp(CommandExecutionError,
|
||||
+ "^zypper command failed: Some handled zypper internal error\nAnother zypper internal error$"):
|
||||
zypper.list_upgrades(refresh=False)
|
||||
- except CommandExecutionError as error:
|
||||
- assert error.message == ref_out['stderr']
|
||||
|
||||
# Test unhandled error
|
||||
ref_out = {
|
||||
- 'retcode': 1
|
||||
+ 'retcode': 1,
|
||||
+ 'stdout': '',
|
||||
+ 'stderr': ''
|
||||
}
|
||||
with patch.dict(zypper.__salt__, {'cmd.run_all': MagicMock(return_value=ref_out)}):
|
||||
- try:
|
||||
+ with self.assertRaisesRegexp(CommandExecutionError, '^zypper command failed: Check zypper logs$'):
|
||||
zypper.list_upgrades(refresh=False)
|
||||
- except CommandExecutionError as error:
|
||||
- assert error.message == 'Zypper returned non-zero system exit. See Zypper logs for more details.'
|
||||
|
||||
def test_list_products(self):
|
||||
'''
|
||||
List products test.
|
||||
'''
|
||||
- ref_out = get_test_data('zypper-products.xml')
|
||||
- with patch.dict(zypper.__salt__, {'cmd.run': MagicMock(return_value=ref_out)}):
|
||||
+ ref_out = {
|
||||
+ 'retcode': 0,
|
||||
+ 'stdout': get_test_data('zypper-products.xml')
|
||||
+ }
|
||||
+ with patch.dict(zypper.__salt__, {'cmd.run_all': MagicMock(return_value=ref_out)}):
|
||||
products = zypper.list_products()
|
||||
- assert len(products) == 5
|
||||
- assert (['SLES', 'SLES', 'SUSE-Manager-Proxy', 'SUSE-Manager-Server', 'sle-manager-tools-beta'] ==
|
||||
+ self.assertEqual(len(products), 5)
|
||||
+ self.assertEqual(['SLES', 'SLES', 'SUSE-Manager-Proxy', 'SUSE-Manager-Server', 'sle-manager-tools-beta'],
|
||||
sorted([prod['name'] for prod in products]))
|
||||
- assert ('SUSE LLC <https://www.suse.com/>' in [product['vendor'] for product in products])
|
||||
- assert ([False, False, False, False, True] ==
|
||||
+ self.assertIn('SUSE LLC <https://www.suse.com/>', [product['vendor'] for product in products])
|
||||
+ self.assertEqual([False, False, False, False, True],
|
||||
sorted([product['isbase'] for product in products]))
|
||||
- assert ([False, False, False, False, True] ==
|
||||
+ self.assertEqual([False, False, False, False, True],
|
||||
sorted([product['installed'] for product in products]))
|
||||
- assert (['0', '0', '0', '0', '0'] ==
|
||||
+ self.assertEqual(['0', '0', '0', '0', '0'],
|
||||
sorted([product['release'] for product in products]))
|
||||
- assert ([False, False, False, False, u'sles'] ==
|
||||
+ self.assertEqual([False, False, False, False, u'sles'],
|
||||
sorted([product['productline'] for product in products]))
|
||||
- assert ([1509408000, 1522454400, 1522454400, 1730332800, 1730332800] ==
|
||||
+ self.assertEqual([1509408000, 1522454400, 1522454400, 1730332800, 1730332800],
|
||||
sorted([product['eol_t'] for product in products]))
|
||||
|
||||
def test_refresh_db(self):
|
||||
@@ -168,17 +227,17 @@ class ZypperTestCase(TestCase):
|
||||
with patch.dict(zypper.__salt__, {'lowpkg.info': MagicMock(return_value=run_out)}):
|
||||
installed = zypper.info_installed()
|
||||
# Test overall products length
|
||||
- assert len(installed) == 2
|
||||
+ self.assertEqual(len(installed), 2)
|
||||
|
||||
# Test translated fields
|
||||
for pkg_name, pkg_info in installed.items():
|
||||
- assert installed[pkg_name].get('source') == run_out[pkg_name]['source_rpm']
|
||||
+ self.assertEqual(installed[pkg_name].get('source'), run_out[pkg_name]['source_rpm'])
|
||||
|
||||
# Test keys transition from the lowpkg.info
|
||||
for pn_key, pn_val in run_out['virgo-dummy'].items():
|
||||
if pn_key == 'source_rpm':
|
||||
continue
|
||||
- assert installed['virgo-dummy'][pn_key] == pn_val
|
||||
+ self.assertEqual(installed['virgo-dummy'][pn_key], pn_val)
|
||||
|
||||
def test_info_available(self):
|
||||
'''
|
||||
@@ -190,21 +249,21 @@ class ZypperTestCase(TestCase):
|
||||
ref_out = get_test_data('zypper-available.txt')
|
||||
with patch.dict(zypper.__salt__, {'cmd.run_stdout': MagicMock(return_value=ref_out)}):
|
||||
available = zypper.info_available(*test_pkgs, refresh=False)
|
||||
- assert len(available) == 3
|
||||
+ self.assertEqual(len(available), 3)
|
||||
for pkg_name, pkg_info in available.items():
|
||||
- assert pkg_name in test_pkgs
|
||||
+ self.assertIn(pkg_name, test_pkgs)
|
||||
|
||||
- assert available['emacs']['status'] == 'up-to-date'
|
||||
- assert available['emacs']['installed']
|
||||
- assert available['emacs']['support level'] == 'Level 3'
|
||||
- assert available['emacs']['vendor'] == 'SUSE LLC <https://www.suse.com/>'
|
||||
- assert available['emacs']['summary'] == 'GNU Emacs Base Package'
|
||||
+ self.assertEqual(available['emacs']['status'], 'up-to-date')
|
||||
+ self.assertTrue(available['emacs']['installed'])
|
||||
+ self.assertEqual(available['emacs']['support level'], 'Level 3')
|
||||
+ self.assertEqual(available['emacs']['vendor'], 'SUSE LLC <https://www.suse.com/>')
|
||||
+ self.assertEqual(available['emacs']['summary'], 'GNU Emacs Base Package')
|
||||
|
||||
- assert available['vim']['status'] == 'not installed'
|
||||
- assert not available['vim']['installed']
|
||||
- assert available['vim']['support level'] == 'Level 3'
|
||||
- assert available['vim']['vendor'] == 'SUSE LLC <https://www.suse.com/>'
|
||||
- assert available['vim']['summary'] == 'Vi IMproved'
|
||||
+ self.assertEqual(available['vim']['status'], 'not installed')
|
||||
+ self.assertFalse(available['vim']['installed'])
|
||||
+ self.assertEqual(available['vim']['support level'], 'Level 3')
|
||||
+ self.assertEqual(available['vim']['vendor'], 'SUSE LLC <https://www.suse.com/>')
|
||||
+ self.assertEqual(available['vim']['summary'], 'Vi IMproved')
|
||||
|
||||
@patch('salt.modules.zypper.refresh_db', MagicMock(return_value=True))
|
||||
def test_latest_version(self):
|
||||
@@ -215,7 +274,7 @@ class ZypperTestCase(TestCase):
|
||||
'''
|
||||
ref_out = get_test_data('zypper-available.txt')
|
||||
with patch.dict(zypper.__salt__, {'cmd.run_stdout': MagicMock(return_value=ref_out)}):
|
||||
- assert zypper.latest_version('vim') == '7.4.326-2.62'
|
||||
+ self.assertEqual(zypper.latest_version('vim'), '7.4.326-2.62')
|
||||
|
||||
@patch('salt.modules.zypper.refresh_db', MagicMock(return_value=True))
|
||||
def test_upgrade_available(self):
|
||||
@@ -227,8 +286,8 @@ class ZypperTestCase(TestCase):
|
||||
ref_out = get_test_data('zypper-available.txt')
|
||||
with patch.dict(zypper.__salt__, {'cmd.run_stdout': MagicMock(return_value=ref_out)}):
|
||||
for pkg_name in ['emacs', 'python']:
|
||||
- assert not zypper.upgrade_available(pkg_name)
|
||||
- assert zypper.upgrade_available('vim')
|
||||
+ self.assertFalse(zypper.upgrade_available(pkg_name))
|
||||
+ self.assertTrue(zypper.upgrade_available('vim'))
|
||||
|
||||
@patch('salt.modules.zypper.HAS_RPM', True)
|
||||
def test_version_cmp_rpm(self):
|
||||
@@ -239,7 +298,7 @@ class ZypperTestCase(TestCase):
|
||||
'''
|
||||
with patch('salt.modules.zypper.rpm', MagicMock(return_value=MagicMock)):
|
||||
with patch('salt.modules.zypper.rpm.labelCompare', MagicMock(return_value=0)):
|
||||
- assert 0 == zypper.version_cmp('1', '2') # mock returns 0, which means RPM was called
|
||||
+ self.assertEqual(0, zypper.version_cmp('1', '2')) # mock returns 0, which means RPM was called
|
||||
|
||||
@patch('salt.modules.zypper.HAS_RPM', False)
|
||||
def test_version_cmp_fallback(self):
|
||||
@@ -250,7 +309,7 @@ class ZypperTestCase(TestCase):
|
||||
'''
|
||||
with patch('salt.modules.zypper.rpm', MagicMock(return_value=MagicMock)):
|
||||
with patch('salt.modules.zypper.rpm.labelCompare', MagicMock(return_value=0)):
|
||||
- assert -1 == zypper.version_cmp('1', '2') # mock returns -1, a python implementation was called
|
||||
+ self.assertEqual(-1, zypper.version_cmp('1', '2')) # mock returns -1, a python implementation was called
|
||||
|
||||
def test_list_pkgs(self):
|
||||
'''
|
||||
@@ -281,8 +340,8 @@ class ZypperTestCase(TestCase):
|
||||
'susemanager-build-keys-web': '12.0-5.1.develHead',
|
||||
'apache-commons-cli': '1.2-1.233',
|
||||
'jose4j': '0.4.4-2.1.develHead'}.items():
|
||||
- assert pkgs.get(pkg_name)
|
||||
- assert pkgs[pkg_name] == pkg_version
|
||||
+ self.assertTrue(pkgs.get(pkg_name))
|
||||
+ self.assertEqual(pkgs[pkg_name], pkg_version)
|
||||
|
||||
def test_remove_purge(self):
|
||||
'''
|
||||
@@ -307,16 +366,21 @@ class ZypperTestCase(TestCase):
|
||||
return pkgs
|
||||
|
||||
parsed_targets = [{'vim': None, 'pico': None}, None]
|
||||
+ cmd_out = {
|
||||
+ 'retcode': 0,
|
||||
+ 'stdout': '',
|
||||
+ 'stderr': ''
|
||||
+ }
|
||||
|
||||
- with patch.dict(zypper.__salt__, {'cmd.run': MagicMock(return_value=False)}):
|
||||
+ with patch.dict(zypper.__salt__, {'cmd.run_all': MagicMock(return_value=cmd_out)}):
|
||||
with patch.dict(zypper.__salt__, {'pkg_resource.parse_targets': MagicMock(return_value=parsed_targets)}):
|
||||
with patch.dict(zypper.__salt__, {'pkg_resource.stringify': MagicMock()}):
|
||||
with patch('salt.modules.zypper.list_pkgs', ListPackages()):
|
||||
diff = zypper.remove(name='vim,pico')
|
||||
for pkg_name in ['vim', 'pico']:
|
||||
- assert diff.get(pkg_name)
|
||||
- assert diff[pkg_name]['old']
|
||||
- assert not diff[pkg_name]['new']
|
||||
+ self.assertTrue(diff.get(pkg_name))
|
||||
+ self.assertTrue(diff[pkg_name]['old'])
|
||||
+ self.assertFalse(diff[pkg_name]['new'])
|
||||
|
||||
|
||||
if __name__ == '__main__':
|
||||
--
|
||||
2.1.4
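
On the test side, the rework above replaces try/except blocks with assertRaisesRegexp, which checks the exception type and the message text in one anchored expression. A small standalone illustration of that assertion style follows; the exception class is a local stand-in, and on Python 3 the method is spelled assertRaisesRegex.

import unittest


class CommandFailed(Exception):
    '''Local stand-in for CommandExecutionError.'''


def failing_call():
    raise CommandFailed('zypper command failed: Check zypper logs')


class ErrorMessageTest(unittest.TestCase):
    def test_error_message(self):
        # Anchored regexp: a changed prefix or suffix in the message fails the test too.
        with self.assertRaisesRegexp(CommandFailed, '^zypper command failed: Check zypper logs$'):
            failing_call()


if __name__ == '__main__':
    unittest.main()
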
|
||||
|
@@ -1,21 +0,0 @@
|
||||
From 9b8d6cbb72cd6537016a9d4da73bd3127b951845 Mon Sep 17 00:00:00 2001
|
||||
From: =?UTF-8?q?Marcus=20R=C3=BCckert?= <mrueckert@suse.de>
|
||||
Date: Wed, 2 Mar 2016 17:29:23 +0100
|
||||
Subject: [PATCH] make the suse check consistent with rh_service.py
|
||||
|
||||
---
|
||||
salt/modules/service.py | 1 +
|
||||
1 file changed, 1 insertion(+)
|
||||
|
||||
diff --git a/salt/modules/service.py b/salt/modules/service.py
|
||||
index 05db855..7aacedd 100644
|
||||
--- a/salt/modules/service.py
|
||||
+++ b/salt/modules/service.py
|
||||
@@ -37,6 +37,7 @@ def __virtual__():
|
||||
'Arch ARM',
|
||||
'ALT',
|
||||
'SUSE Enterprise Server',
|
||||
+ 'SUSE',
|
||||
'OEL',
|
||||
'Linaro',
|
||||
'elementary OS',
|
@@ -1,28 +0,0 @@
|
||||
From c0c8a77242cac5565febc9b08aeda7328d13e92f Mon Sep 17 00:00:00 2001
|
||||
From: =?UTF-8?q?Marcus=20R=C3=BCckert?= <mrueckert@suse.de>
|
||||
Date: Wed, 2 Mar 2016 17:29:54 +0100
|
||||
Subject: [PATCH] Fix numerical check of osrelease
|
||||
|
||||
After making the version check numerical in 9975508 it no longer matched
|
||||
SLES 11 properly to use the rh_service module as '11.4 > 11' evaluates
|
||||
to true. Without the rh_service module, not all of the methods needed by
|
||||
the service state are implemented on SLE 11.
|
||||
---
|
||||
salt/modules/rh_service.py | 4 ++--
|
||||
1 file changed, 2 insertions(+), 2 deletions(-)
|
||||
|
||||
diff --git a/salt/modules/rh_service.py b/salt/modules/rh_service.py
|
||||
index c425cde..910a75d 100644
|
||||
--- a/salt/modules/rh_service.py
|
||||
+++ b/salt/modules/rh_service.py
|
||||
@@ -66,8 +66,8 @@ def __virtual__():
|
||||
return (False, 'Cannot load rh_service module: '
|
||||
'osrelease grain, {0}, not a float,'.format(osrelease))
|
||||
if __grains__['os'] == 'SUSE':
|
||||
- if osrelease > 11:
|
||||
- return (False, 'Cannot load rh_service module on SUSE >= 11')
|
||||
+ if osrelease >= 12:
|
||||
+ return (False, 'Cannot load rh_service module on SUSE >= 12')
|
||||
if __grains__['os'] == 'Fedora':
|
||||
if osrelease > 15:
|
||||
return (False, 'Cannot load rh_service module on Fedora >= 15')
|
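
The fix works because the osrelease grain is compared as a float, so every SLE 11 service pack (11.1 through 11.4) is greater than 11 but still below 12. A two-line check makes the difference visible:

osrelease = float('11.4')
print(osrelease > 11)    # True  -> the old check wrongly rejected SLE 11 SPx
print(osrelease >= 12)   # False -> the new check keeps SLE 11 SPx on rh_service
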
@@ -1,734 +0,0 @@
|
||||
From 2220c5a0ae800988bf83c39b458a8747f01186c0 Mon Sep 17 00:00:00 2001
|
||||
From: Bo Maryniuk <bo@suse.de>
|
||||
Date: Fri, 12 Feb 2016 16:16:12 +0100
|
||||
Subject: [PATCH 29/29] Make use of checksum configurable (defaults to MD5,
|
||||
SHA256 suggested)
|
||||
|
||||
Set config hash_type to SHA1
|
||||
Set default hash as SHA1 in config and explain why.
|
||||
Use hash_type configuration for the Cloud
|
||||
Use configurable hash_type for general Key fingerprinting
|
||||
Use SHA1 hash by default
|
||||
Use SHA1 hash by default in Tomcat module, refactor to support different algorithms
|
||||
Use SHA1 by default instead of MD5
|
||||
Remove SHA1 to SHA256 by default
|
||||
Add note to the Tomcat module for SHA256
|
||||
Remove sha1 to sha256
|
||||
Remove SHA1 for SHA256
|
||||
Remove SHA1 in favor of SHA256
|
||||
Use MD5 hash algorithm by default (until deprecated)
|
||||
Create a mixin class that is reused by the similar daemon classes
|
||||
Use mixin for the daemon classes
|
||||
Report environment failure, if any
|
||||
Verify if hash_type is using vulnerable algorithms
|
||||
Standardize logging
|
||||
Add daemons unit test to verify hash_type settings
|
||||
Fix PyLint
|
||||
---
|
||||
conf/master | 5 +-
|
||||
conf/minion | 8 +-
|
||||
conf/proxy | 9 +-
|
||||
salt/cli/daemons.py | 83 +++++++++++++-----
|
||||
salt/cloud/__init__.py | 4 +-
|
||||
salt/crypt.py | 10 +--
|
||||
salt/key.py | 4 +-
|
||||
salt/modules/key.py | 10 +--
|
||||
salt/modules/tomcat.py | 26 ++----
|
||||
salt/modules/win_file.py | 2 +-
|
||||
salt/utils/__init__.py | 13 +--
|
||||
salt/utils/cloud.py | 3 +-
|
||||
tests/unit/daemons_test.py | 209 +++++++++++++++++++++++++++++++++++++++++++++
|
||||
13 files changed, 319 insertions(+), 67 deletions(-)
|
||||
create mode 100644 tests/unit/daemons_test.py
|
||||
|
||||
diff --git a/conf/master b/conf/master
|
||||
index 36657e8..cf05ec4 100644
|
||||
--- a/conf/master
|
||||
+++ b/conf/master
|
||||
@@ -466,9 +466,12 @@ syndic_user: salt
|
||||
#default_top: base
|
||||
|
||||
# The hash_type is the hash to use when discovering the hash of a file on
|
||||
-# the master server. The default is md5, but sha1, sha224, sha256, sha384
|
||||
+# the master server. The default is md5 but sha1, sha224, sha256, sha384
|
||||
# and sha512 are also supported.
|
||||
#
|
||||
+# WARNING: While md5 is supported, do not use it due to the high chance
|
||||
+# of possible collisions and thus a security breach.
|
||||
+#
|
||||
# Prior to changing this value, the master should be stopped and all Salt
|
||||
# caches should be cleared.
|
||||
#hash_type: md5
|
||||
diff --git a/conf/minion b/conf/minion
|
||||
index 2307f70..e17ec61 100644
|
||||
--- a/conf/minion
|
||||
+++ b/conf/minion
|
||||
@@ -440,12 +440,14 @@
|
||||
#fileserver_limit_traversal: False
|
||||
|
||||
# The hash_type is the hash to use when discovering the hash of a file in
|
||||
-# the local fileserver. The default is md5, but sha1, sha224, sha256, sha384
|
||||
-# and sha512 are also supported.
|
||||
+# the local fileserver. The default is sha256, but sha224, sha384 and sha512 are also supported.
|
||||
+#
|
||||
+# WARNING: While md5 and sha1 are also supported, do not use them due to the high chance
|
||||
+# of possible collisions and thus a security breach.
|
||||
#
|
||||
# Warning: Prior to changing this value, the minion should be stopped and all
|
||||
# Salt caches should be cleared.
|
||||
-#hash_type: md5
|
||||
+#hash_type: sha256
|
||||
|
||||
# The Salt pillar is searched for locally if file_client is set to local. If
|
||||
# this is the case, and pillar data is defined, then the pillar_roots need to
|
||||
diff --git a/conf/proxy b/conf/proxy
|
||||
index 472df35..0de6af8 100644
|
||||
--- a/conf/proxy
|
||||
+++ b/conf/proxy
|
||||
@@ -419,12 +419,15 @@
|
||||
#fileserver_limit_traversal: False
|
||||
|
||||
# The hash_type is the hash to use when discovering the hash of a file in
|
||||
-# the local fileserver. The default is md5, but sha1, sha224, sha256, sha384
|
||||
-# and sha512 are also supported.
|
||||
+# the local fileserver. The default is sha256 but sha224, sha384 and sha512
|
||||
+# are also supported.
|
||||
+#
|
||||
+# WARNING: While md5 and sha1 are also supported, do not use them due to the high chance
|
||||
+# of possible collisions and thus a security breach.
|
||||
#
|
||||
# Warning: Prior to changing this value, the minion should be stopped and all
|
||||
# Salt caches should be cleared.
|
||||
-#hash_type: md5
|
||||
+#hash_type: sha256
|
||||
|
||||
# The Salt pillar is searched for locally if file_client is set to local. If
|
||||
# this is the case, and pillar data is defined, then the pillar_roots need to
|
||||
diff --git a/salt/cli/daemons.py b/salt/cli/daemons.py
|
||||
index 7f8b8c8..b0e7b20 100644
|
||||
--- a/salt/cli/daemons.py
|
||||
+++ b/salt/cli/daemons.py
|
||||
@@ -58,7 +58,50 @@ from salt.exceptions import SaltSystemExit
|
||||
logger = salt.log.setup.logging.getLogger(__name__)
|
||||
|
||||
|
||||
-class Master(parsers.MasterOptionParser):
|
||||
+class DaemonsMixin(object): # pylint: disable=no-init
|
||||
+ '''
|
||||
+ Uses the same functions for all daemons
|
||||
+ '''
|
||||
+ def verify_hash_type(self):
|
||||
+ '''
|
||||
+ Verify and display a nag-message in the log if a vulnerable hash type is used.
|
||||
+
|
||||
+ :return:
|
||||
+ '''
|
||||
+ if self.config['hash_type'].lower() in ['md5', 'sha1']:
|
||||
+ logger.warning('IMPORTANT: Do not use {h_type} hashing algorithm! Please set "hash_type" to '
|
||||
+ 'SHA256 in Salt {d_name} config!'.format(
|
||||
+ h_type=self.config['hash_type'], d_name=self.__class__.__name__))
|
||||
+
|
||||
+ def start_log_info(self):
|
||||
+ '''
|
||||
+ Say daemon starting.
|
||||
+
|
||||
+ :return:
|
||||
+ '''
|
||||
+ logger.info('The Salt {d_name} is starting up'.format(d_name=self.__class__.__name__))
|
||||
+
|
||||
+ def shutdown_log_info(self):
|
||||
+ '''
|
||||
+ Say daemon shutting down.
|
||||
+
|
||||
+ :return:
|
||||
+ '''
|
||||
+ logger.info('The Salt {d_name} is shut down'.format(d_name=self.__class__.__name__))
|
||||
+
|
||||
+ def environment_failure(self, error):
|
||||
+ '''
|
||||
+ Log environment failure for the daemon and exit with the error code.
|
||||
+
|
||||
+ :param error:
|
||||
+ :return:
|
||||
+ '''
|
||||
+ logger.exception('Failed to create environment for {d_name}: {reason}'.format(
|
||||
+ d_name=self.__class__.__name__, reason=error.message))
|
||||
+ sys.exit(error.errno)
|
||||
+
|
||||
+
|
||||
+class Master(parsers.MasterOptionParser, DaemonsMixin): # pylint: disable=no-init
|
||||
'''
|
||||
Creates a master server
|
||||
'''
|
||||
@@ -114,8 +157,7 @@ class Master(parsers.MasterOptionParser):
|
||||
for syndic_file in os.listdir(self.config['syndic_dir']):
|
||||
os.remove(os.path.join(self.config['syndic_dir'], syndic_file))
|
||||
except OSError as err:
|
||||
- logger.exception('Failed to prepare salt environment')
|
||||
- sys.exit(err.errno)
|
||||
+ self.environment_failure(err)
|
||||
|
||||
self.setup_logfile_logger()
|
||||
verify_log(self.config)
|
||||
@@ -153,17 +195,18 @@ class Master(parsers.MasterOptionParser):
|
||||
'''
|
||||
self.prepare()
|
||||
if check_user(self.config['user']):
|
||||
- logger.info('The salt master is starting up')
|
||||
+ self.verify_hash_type()
|
||||
+ self.start_log_info()
|
||||
self.master.start()
|
||||
|
||||
def shutdown(self):
|
||||
'''
|
||||
If sub-classed, run any shutdown operations on this method.
|
||||
'''
|
||||
- logger.info('The salt master is shut down')
|
||||
+ self.shutdown_log_info()
|
||||
|
||||
|
||||
-class Minion(parsers.MinionOptionParser): # pylint: disable=no-init
|
||||
+class Minion(parsers.MinionOptionParser, DaemonsMixin): # pylint: disable=no-init
|
||||
'''
|
||||
Create a minion server
|
||||
'''
|
||||
@@ -226,8 +269,7 @@ class Minion(parsers.MinionOptionParser): # pylint: disable=no-init
|
||||
verify_files([logfile], self.config['user'])
|
||||
os.umask(current_umask)
|
||||
except OSError as err:
|
||||
- logger.exception('Failed to prepare salt environment')
|
||||
- sys.exit(err.errno)
|
||||
+ self.environment_failure(err)
|
||||
|
||||
self.setup_logfile_logger()
|
||||
verify_log(self.config)
|
||||
@@ -273,7 +315,8 @@ class Minion(parsers.MinionOptionParser): # pylint: disable=no-init
|
||||
try:
|
||||
self.prepare()
|
||||
if check_user(self.config['user']):
|
||||
- logger.info('The salt minion is starting up')
|
||||
+ self.verify_hash_type()
|
||||
+ self.start_log_info()
|
||||
self.minion.tune_in()
|
||||
finally:
|
||||
self.shutdown()
|
||||
@@ -310,10 +353,10 @@ class Minion(parsers.MinionOptionParser): # pylint: disable=no-init
|
||||
'''
|
||||
If sub-classed, run any shutdown operations on this method.
|
||||
'''
|
||||
- logger.info('The salt minion is shut down')
|
||||
+ self.shutdown_log_info()
|
||||
|
||||
|
||||
-class ProxyMinion(parsers.ProxyMinionOptionParser): # pylint: disable=no-init
|
||||
+class ProxyMinion(parsers.ProxyMinionOptionParser, DaemonsMixin): # pylint: disable=no-init
|
||||
'''
|
||||
Create a proxy minion server
|
||||
'''
|
||||
@@ -388,8 +431,7 @@ class ProxyMinion(parsers.ProxyMinionOptionParser): # pylint: disable=no-init
|
||||
os.umask(current_umask)
|
||||
|
||||
except OSError as err:
|
||||
- logger.exception('Failed to prepare salt environment')
|
||||
- sys.exit(err.errno)
|
||||
+ self.environment_failure(err)
|
||||
|
||||
self.setup_logfile_logger()
|
||||
verify_log(self.config)
|
||||
@@ -431,7 +473,8 @@ class ProxyMinion(parsers.ProxyMinionOptionParser): # pylint: disable=no-init
|
||||
try:
|
||||
self.prepare()
|
||||
if check_user(self.config['user']):
|
||||
- logger.info('The proxy minion is starting up')
|
||||
+ self.verify_hash_type()
|
||||
+ self.start_log_info()
|
||||
self.minion.tune_in()
|
||||
except (KeyboardInterrupt, SaltSystemExit) as exc:
|
||||
logger.warn('Stopping the Salt Proxy Minion')
|
||||
@@ -449,10 +492,10 @@ class ProxyMinion(parsers.ProxyMinionOptionParser): # pylint: disable=no-init
|
||||
if hasattr(self, 'minion') and 'proxymodule' in self.minion.opts:
|
||||
proxy_fn = self.minion.opts['proxymodule'].loaded_base_name + '.shutdown'
|
||||
self.minion.opts['proxymodule'][proxy_fn](self.minion.opts)
|
||||
- logger.info('The proxy minion is shut down')
|
||||
+ self.shutdown_log_info()
|
||||
|
||||
|
||||
-class Syndic(parsers.SyndicOptionParser):
|
||||
+class Syndic(parsers.SyndicOptionParser, DaemonsMixin): # pylint: disable=no-init
|
||||
'''
|
||||
Create a syndic server
|
||||
'''
|
||||
@@ -488,8 +531,7 @@ class Syndic(parsers.SyndicOptionParser):
|
||||
verify_files([logfile], self.config['user'])
|
||||
os.umask(current_umask)
|
||||
except OSError as err:
|
||||
- logger.exception('Failed to prepare salt environment')
|
||||
- sys.exit(err.errno)
|
||||
+ self.environment_failure(err)
|
||||
|
||||
self.setup_logfile_logger()
|
||||
verify_log(self.config)
|
||||
@@ -521,7 +563,8 @@ class Syndic(parsers.SyndicOptionParser):
|
||||
'''
|
||||
self.prepare()
|
||||
if check_user(self.config['user']):
|
||||
- logger.info('The salt syndic is starting up')
|
||||
+ self.verify_hash_type()
|
||||
+ self.start_log_info()
|
||||
try:
|
||||
self.syndic.tune_in()
|
||||
except KeyboardInterrupt:
|
||||
@@ -532,4 +575,4 @@ class Syndic(parsers.SyndicOptionParser):
|
||||
'''
|
||||
If sub-classed, run any shutdown operations on this method.
|
||||
'''
|
||||
- logger.info('The salt syndic is shut down')
|
||||
+ self.shutdown_log_info()
|
||||
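
The daemons.py hunk above introduces one mixin that owns the weak-hash warning plus the start/stop log lines, and Master, Minion, ProxyMinion and Syndic all inherit it next to their option parsers. A condensed sketch of that idea follows, under assumed, simplified names; the logger and classes below are placeholders, not Salt's objects.

import logging

logger = logging.getLogger(__name__)


class HashTypeCheckMixin(object):
    '''Shared behaviour: nag if the configured hash_type is known to be weak.'''

    def verify_hash_type(self):
        if self.config['hash_type'].lower() in ('md5', 'sha1'):
            logger.warning('IMPORTANT: Do not use %s hashing algorithm! Please set '
                           '"hash_type" to SHA256 in the %s config!',
                           self.config['hash_type'], self.__class__.__name__)


class ExampleDaemon(HashTypeCheckMixin):
    def __init__(self, hash_type):
        self.config = {'hash_type': hash_type}


if __name__ == '__main__':
    logging.basicConfig(level=logging.WARNING)
    ExampleDaemon('md5').verify_hash_type()     # emits the warning
    ExampleDaemon('sha256').verify_hash_type()  # stays silent
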
diff --git a/salt/cloud/__init__.py b/salt/cloud/__init__.py
|
||||
index 77186a4..733b403 100644
|
||||
--- a/salt/cloud/__init__.py
|
||||
+++ b/salt/cloud/__init__.py
|
||||
@@ -2036,7 +2036,7 @@ class Map(Cloud):
|
||||
master_temp_pub = salt.utils.mkstemp()
|
||||
with salt.utils.fopen(master_temp_pub, 'w') as mtp:
|
||||
mtp.write(pub)
|
||||
- master_finger = salt.utils.pem_finger(master_temp_pub)
|
||||
+ master_finger = salt.utils.pem_finger(master_temp_pub, sum_type=self.opts['hash_type'])
|
||||
os.unlink(master_temp_pub)
|
||||
|
||||
if master_profile.get('make_minion', True) is True:
|
||||
@@ -2121,7 +2121,7 @@ class Map(Cloud):
|
||||
# mitigate man-in-the-middle attacks
|
||||
master_pub = os.path.join(self.opts['pki_dir'], 'master.pub')
|
||||
if os.path.isfile(master_pub):
|
||||
- master_finger = salt.utils.pem_finger(master_pub)
|
||||
+ master_finger = salt.utils.pem_finger(master_pub, sum_type=self.opts['hash_type'])
|
||||
|
||||
opts = self.opts.copy()
|
||||
if self.opts['parallel']:
|
||||
diff --git a/salt/crypt.py b/salt/crypt.py
|
||||
index 907ec0c..eaf6d72 100644
|
||||
--- a/salt/crypt.py
|
||||
+++ b/salt/crypt.py
|
||||
@@ -558,11 +558,11 @@ class AsyncAuth(object):
|
||||
if self.opts.get('syndic_master', False): # Is syndic
|
||||
syndic_finger = self.opts.get('syndic_finger', self.opts.get('master_finger', False))
|
||||
if syndic_finger:
|
||||
- if salt.utils.pem_finger(m_pub_fn) != syndic_finger:
|
||||
+ if salt.utils.pem_finger(m_pub_fn, sum_type=self.opts['hash_type']) != syndic_finger:
|
||||
self._finger_fail(syndic_finger, m_pub_fn)
|
||||
else:
|
||||
if self.opts.get('master_finger', False):
|
||||
- if salt.utils.pem_finger(m_pub_fn) != self.opts['master_finger']:
|
||||
+ if salt.utils.pem_finger(m_pub_fn, sum_type=self.opts['hash_type']) != self.opts['master_finger']:
|
||||
self._finger_fail(self.opts['master_finger'], m_pub_fn)
|
||||
auth['publish_port'] = payload['publish_port']
|
||||
raise tornado.gen.Return(auth)
|
||||
@@ -1071,11 +1071,11 @@ class SAuth(AsyncAuth):
|
||||
if self.opts.get('syndic_master', False): # Is syndic
|
||||
syndic_finger = self.opts.get('syndic_finger', self.opts.get('master_finger', False))
|
||||
if syndic_finger:
|
||||
- if salt.utils.pem_finger(m_pub_fn) != syndic_finger:
|
||||
+ if salt.utils.pem_finger(m_pub_fn, sum_type=self.opts['hash_type']) != syndic_finger:
|
||||
self._finger_fail(syndic_finger, m_pub_fn)
|
||||
else:
|
||||
if self.opts.get('master_finger', False):
|
||||
- if salt.utils.pem_finger(m_pub_fn) != self.opts['master_finger']:
|
||||
+ if salt.utils.pem_finger(m_pub_fn, sum_type=self.opts['hash_type']) != self.opts['master_finger']:
|
||||
self._finger_fail(self.opts['master_finger'], m_pub_fn)
|
||||
auth['publish_port'] = payload['publish_port']
|
||||
return auth
|
||||
@@ -1089,7 +1089,7 @@ class SAuth(AsyncAuth):
|
||||
'this minion is not subject to a man-in-the-middle attack.'
|
||||
.format(
|
||||
finger,
|
||||
- salt.utils.pem_finger(master_key)
|
||||
+ salt.utils.pem_finger(master_key, sum_type=self.opts['hash_type'])
|
||||
)
|
||||
)
|
||||
sys.exit(42)
|
||||
diff --git a/salt/key.py b/salt/key.py
|
||||
index 08086a0..e4cb4eb 100644
|
||||
--- a/salt/key.py
|
||||
+++ b/salt/key.py
|
||||
@@ -933,7 +933,7 @@ class Key(object):
|
||||
path = os.path.join(self.opts['pki_dir'], key)
|
||||
else:
|
||||
path = os.path.join(self.opts['pki_dir'], status, key)
|
||||
- ret[status][key] = salt.utils.pem_finger(path)
|
||||
+ ret[status][key] = salt.utils.pem_finger(path, sum_type=self.opts['hash_type'])
|
||||
return ret
|
||||
|
||||
def finger_all(self):
|
||||
@@ -948,7 +948,7 @@ class Key(object):
|
||||
path = os.path.join(self.opts['pki_dir'], key)
|
||||
else:
|
||||
path = os.path.join(self.opts['pki_dir'], status, key)
|
||||
- ret[status][key] = salt.utils.pem_finger(path)
|
||||
+ ret[status][key] = salt.utils.pem_finger(path, sum_type=self.opts['hash_type'])
|
||||
return ret
|
||||
|
||||
|
||||
diff --git a/salt/modules/key.py b/salt/modules/key.py
|
||||
index 12762df..3e16c2d 100644
|
||||
--- a/salt/modules/key.py
|
||||
+++ b/salt/modules/key.py
|
||||
@@ -21,9 +21,8 @@ def finger():
|
||||
|
||||
salt '*' key.finger
|
||||
'''
|
||||
- return salt.utils.pem_finger(
|
||||
- os.path.join(__opts__['pki_dir'], 'minion.pub')
|
||||
- )
|
||||
+ return salt.utils.pem_finger(os.path.join(__opts__['pki_dir'], 'minion.pub'),
|
||||
+ sum_type=__opts__.get('hash_type', 'md5'))
|
||||
|
||||
|
||||
def finger_master():
|
||||
@@ -36,6 +35,5 @@ def finger_master():
|
||||
|
||||
salt '*' key.finger_master
|
||||
'''
|
||||
- return salt.utils.pem_finger(
|
||||
- os.path.join(__opts__['pki_dir'], 'minion_master.pub')
|
||||
- )
|
||||
+ return salt.utils.pem_finger(os.path.join(__opts__['pki_dir'], 'minion_master.pub'),
|
||||
+ sum_type=__opts__.get('hash_type', 'md5'))
|
||||
diff --git a/salt/modules/tomcat.py b/salt/modules/tomcat.py
|
||||
index d3df2dc..4a7f0eb 100644
|
||||
--- a/salt/modules/tomcat.py
|
||||
+++ b/salt/modules/tomcat.py
|
||||
@@ -610,7 +610,7 @@ def deploy_war(war,
|
||||
|
||||
def passwd(passwd,
|
||||
user='',
|
||||
- alg='md5',
|
||||
+ alg='sha1',
|
||||
realm=None):
|
||||
'''
|
||||
This function replaces the $CATALINA_HOME/bin/digest.sh script
|
||||
@@ -625,23 +625,15 @@ def passwd(passwd,
|
||||
salt '*' tomcat.passwd secret tomcat sha1
|
||||
salt '*' tomcat.passwd secret tomcat sha1 'Protected Realm'
|
||||
'''
|
||||
- if alg == 'md5':
|
||||
- m = hashlib.md5()
|
||||
- elif alg == 'sha1':
|
||||
- m = hashlib.sha1()
|
||||
- else:
|
||||
- return False
|
||||
-
|
||||
- if realm:
|
||||
- m.update('{0}:{1}:{2}'.format(
|
||||
- user,
|
||||
- realm,
|
||||
- passwd,
|
||||
- ))
|
||||
- else:
|
||||
- m.update(passwd)
|
||||
+ # Shouldn't it be SHA256 instead of SHA1?
|
||||
+ digest = hasattr(hashlib, alg) and getattr(hashlib, alg) or None
|
||||
+ if digest:
|
||||
+ if realm:
|
||||
+ digest.update('{0}:{1}:{2}'.format(user, realm, passwd,))
|
||||
+ else:
|
||||
+ digest.update(passwd)
|
||||
|
||||
- return m.hexdigest()
|
||||
+ return digest and digest.hexdigest() or False
|
||||
|
||||
|
||||
# Non-Manager functions
|
||||
diff --git a/salt/modules/win_file.py b/salt/modules/win_file.py
|
||||
index 7911bfc..5ea31ae 100644
|
||||
--- a/salt/modules/win_file.py
|
||||
+++ b/salt/modules/win_file.py
|
||||
@@ -842,7 +842,7 @@ def chgrp(path, group):
|
||||
return None
|
||||
|
||||
|
||||
-def stats(path, hash_type='md5', follow_symlinks=True):
|
||||
+def stats(path, hash_type='sha256', follow_symlinks=True):
|
||||
'''
|
||||
Return a dict containing the stats for a given file
|
||||
|
||||
diff --git a/salt/utils/__init__.py b/salt/utils/__init__.py
|
||||
index c6a3fd3..4e40caf 100644
|
||||
--- a/salt/utils/__init__.py
|
||||
+++ b/salt/utils/__init__.py
|
||||
@@ -858,10 +858,11 @@ def path_join(*parts):
|
||||
))
|
||||
|
||||
|
||||
-def pem_finger(path=None, key=None, sum_type='md5'):
|
||||
+def pem_finger(path=None, key=None, sum_type='sha256'):
|
||||
'''
|
||||
Pass in either a raw pem string, or the path on disk to the location of a
|
||||
- pem file, and the type of cryptographic hash to use. The default is md5.
|
||||
+ pem file, and the type of cryptographic hash to use. The default is SHA256.
|
||||
+
|
||||
The fingerprint of the pem will be returned.
|
||||
|
||||
If neither a key nor a path are passed in, a blank string will be returned.
|
||||
@@ -1979,7 +1980,7 @@ def safe_walk(top, topdown=True, onerror=None, followlinks=True, _seen=None):
|
||||
yield top, dirs, nondirs
|
||||
|
||||
|
||||
-def get_hash(path, form='md5', chunk_size=65536):
|
||||
+def get_hash(path, form='sha256', chunk_size=65536):
|
||||
'''
|
||||
Get the hash sum of a file
|
||||
|
||||
@@ -1989,10 +1990,10 @@ def get_hash(path, form='md5', chunk_size=65536):
|
||||
``get_sum`` cannot really be trusted since it is vulnerable to
|
||||
collisions: ``get_sum(..., 'xyz') == 'Hash xyz not supported'``
|
||||
'''
|
||||
- try:
|
||||
- hash_type = getattr(hashlib, form)
|
||||
- except (AttributeError, TypeError):
|
||||
+ hash_type = hasattr(hashlib, form) and getattr(hashlib, form) or None
|
||||
+ if hash_type is None:
|
||||
raise ValueError('Invalid hash type: {0}'.format(form))
|
||||
+
|
||||
with salt.utils.fopen(path, 'rb') as ifile:
|
||||
hash_obj = hash_type()
|
||||
# read the file in in chunks, not the entire file
|
||||
diff --git a/salt/utils/cloud.py b/salt/utils/cloud.py
|
||||
index d546e51..7a21166 100644
|
||||
--- a/salt/utils/cloud.py
|
||||
+++ b/salt/utils/cloud.py
|
||||
@@ -2421,6 +2421,7 @@ def init_cachedir(base=None):
|
||||
|
||||
def request_minion_cachedir(
|
||||
minion_id,
|
||||
+ opts=None,
|
||||
fingerprint='',
|
||||
pubkey=None,
|
||||
provider=None,
|
||||
@@ -2440,7 +2441,7 @@ def request_minion_cachedir(
|
||||
|
||||
if not fingerprint:
|
||||
if pubkey is not None:
|
||||
- fingerprint = salt.utils.pem_finger(key=pubkey)
|
||||
+ fingerprint = salt.utils.pem_finger(key=pubkey, sum_type=(opts and opts.get('hash_type') or 'sha256'))
|
||||
|
||||
init_cachedir(base)
|
||||
|
||||
diff --git a/tests/unit/daemons_test.py b/tests/unit/daemons_test.py
|
||||
new file mode 100644
|
||||
index 0000000..47d5e8a
|
||||
--- /dev/null
|
||||
+++ b/tests/unit/daemons_test.py
|
||||
@@ -0,0 +1,209 @@
|
||||
+# -*- coding: utf-8 -*-
|
||||
+'''
|
||||
+ :codeauthor: :email:`Bo Maryniuk <bo@suse.de>`
|
||||
+'''
|
||||
+
|
||||
+# Import python libs
|
||||
+from __future__ import absolute_import
|
||||
+
|
||||
+# Import Salt Testing libs
|
||||
+from salttesting import TestCase, skipIf
|
||||
+from salttesting.helpers import ensure_in_syspath
|
||||
+from salttesting.mock import patch, MagicMock, NO_MOCK, NO_MOCK_REASON
|
||||
+
|
||||
+ensure_in_syspath('../')
|
||||
+
|
||||
+# Import Salt libs
|
||||
+import integration
|
||||
+from salt.cli import daemons
|
||||
+
|
||||
+
|
||||
+class LoggerMock(object):
|
||||
+ '''
|
||||
+ Logger data collector
|
||||
+ '''
|
||||
+
|
||||
+ def __init__(self):
|
||||
+ '''
|
||||
+ init
|
||||
+ :return:
|
||||
+ '''
|
||||
+ self.reset()
|
||||
+
|
||||
+ def reset(self):
|
||||
+ '''
|
||||
+ Reset values
|
||||
+
|
||||
+ :return:
|
||||
+ '''
|
||||
+ self.last_message = self.last_type = None
|
||||
+
|
||||
+ def info(self, data):
|
||||
+ '''
|
||||
+ Collects the data from the logger of info type.
|
||||
+
|
||||
+ :param data:
|
||||
+ :return:
|
||||
+ '''
|
||||
+ self.last_message = data
|
||||
+ self.last_type = 'info'
|
||||
+
|
||||
+ def warning(self, data):
|
||||
+ '''
|
||||
+ Collects the data from the logger of warning type.
|
||||
+
|
||||
+ :param data:
|
||||
+ :return:
|
||||
+ '''
|
||||
+ self.last_message = data
|
||||
+ self.last_type = 'warning'
|
||||
+
|
||||
+
|
||||
+@skipIf(NO_MOCK, NO_MOCK_REASON)
|
||||
+class DaemonsStarterTestCase(TestCase, integration.SaltClientTestCaseMixIn):
|
||||
+ '''
|
||||
+ Unit test for the daemons starter classes.
|
||||
+ '''
|
||||
+
|
||||
+ def test_master_daemon_hash_type_verified(self):
|
||||
+ '''
|
||||
+ Verify that the Master daemon checks the hash_type config option.
|
||||
+
|
||||
+ :return:
|
||||
+ '''
|
||||
+ def _create_master():
|
||||
+ '''
|
||||
+ Create master instance
|
||||
+ :return:
|
||||
+ '''
|
||||
+ master = daemons.Master()
|
||||
+ master.config = {'user': 'dummy', 'hash_type': alg}
|
||||
+ for attr in ['master', 'start_log_info', 'prepare']:
|
||||
+ setattr(master, attr, MagicMock())
|
||||
+
|
||||
+ return master
|
||||
+
|
||||
+ _logger = LoggerMock()
|
||||
+ with patch('salt.cli.daemons.check_user', MagicMock(return_value=True)):
|
||||
+ with patch('salt.cli.daemons.logger', _logger):
|
||||
+ for alg in ['md5', 'sha1']:
|
||||
+ _create_master().start()
|
||||
+ self.assertEqual(_logger.last_type, 'warning')
|
||||
+ self.assertTrue(_logger.last_message)
|
||||
+ self.assertTrue(_logger.last_message.find('Do not use {alg}'.format(alg=alg)) > -1)
|
||||
+
|
||||
+ _logger.reset()
|
||||
+
|
||||
+ for alg in ['sha224', 'sha256', 'sha384', 'sha512']:
|
||||
+ _create_master().start()
|
||||
+ self.assertEqual(_logger.last_type, None)
|
||||
+ self.assertFalse(_logger.last_message)
|
||||
+
|
||||
+ def test_minion_daemon_hash_type_verified(self):
|
||||
+ '''
|
||||
+ Verify that the Minion daemon checks the hash_type config option.
|
||||
+
|
||||
+ :return:
|
||||
+ '''
|
||||
+
|
||||
+ def _create_minion():
|
||||
+ '''
|
||||
+ Create minion instance
|
||||
+ :return:
|
||||
+ '''
|
||||
+ obj = daemons.Minion()
|
||||
+ obj.config = {'user': 'dummy', 'hash_type': alg}
|
||||
+ for attr in ['minion', 'start_log_info', 'prepare', 'shutdown']:
|
||||
+ setattr(obj, attr, MagicMock())
|
||||
+
|
||||
+ return obj
|
||||
+
|
||||
+ _logger = LoggerMock()
|
||||
+ with patch('salt.cli.daemons.check_user', MagicMock(return_value=True)):
|
||||
+ with patch('salt.cli.daemons.logger', _logger):
|
||||
+ for alg in ['md5', 'sha1']:
|
||||
+ _create_minion().start()
|
||||
+ self.assertEqual(_logger.last_type, 'warning')
|
||||
+ self.assertTrue(_logger.last_message)
|
||||
+ self.assertTrue(_logger.last_message.find('Do not use {alg}'.format(alg=alg)) > -1)
|
||||
+
|
||||
+ _logger.reset()
|
||||
+
|
||||
+ for alg in ['sha224', 'sha256', 'sha384', 'sha512']:
|
||||
+ _create_minion().start()
|
||||
+ self.assertEqual(_logger.last_type, None)
|
||||
+ self.assertFalse(_logger.last_message)
|
||||
+
|
||||
+ def test_proxy_minion_daemon_hash_type_verified(self):
|
||||
+ '''
|
||||
+ Verify if ProxyMinion is verifying hash_type config option.
|
||||
+
|
||||
+ :return:
|
||||
+ '''
|
||||
+
|
||||
+ def _create_proxy_minion():
|
||||
+ '''
|
||||
+ Create proxy minion instance
|
||||
+ :return:
|
||||
+ '''
|
||||
+ obj = daemons.ProxyMinion()
|
||||
+ obj.config = {'user': 'dummy', 'hash_type': alg}
|
||||
+ for attr in ['minion', 'start_log_info', 'prepare', 'shutdown']:
|
||||
+ setattr(obj, attr, MagicMock())
|
||||
+
|
||||
+ return obj
|
||||
+
|
||||
+ _logger = LoggerMock()
|
||||
+ with patch('salt.cli.daemons.check_user', MagicMock(return_value=True)):
|
||||
+ with patch('salt.cli.daemons.logger', _logger):
|
||||
+ for alg in ['md5', 'sha1']:
|
||||
+ _create_proxy_minion().start()
|
||||
+ self.assertEqual(_logger.last_type, 'warning')
|
||||
+ self.assertTrue(_logger.last_message)
|
||||
+ self.assertTrue(_logger.last_message.find('Do not use {alg}'.format(alg=alg)) > -1)
|
||||
+
|
||||
+ _logger.reset()
|
||||
+
|
||||
+ for alg in ['sha224', 'sha256', 'sha384', 'sha512']:
|
||||
+ _create_proxy_minion().start()
|
||||
+ self.assertEqual(_logger.last_type, None)
|
||||
+ self.assertFalse(_logger.last_message)
|
||||
+
|
||||
+ def test_syndic_daemon_hash_type_verified(self):
|
||||
+ '''
|
||||
+ Verify if Syndic is verifying hash_type config option.
|
||||
+
|
||||
+ :return:
|
||||
+ '''
|
||||
+
|
||||
+ def _create_syndic():
|
||||
+ '''
|
||||
+ Create syndic instance
|
||||
+ :return:
|
||||
+ '''
|
||||
+ obj = daemons.Syndic()
|
||||
+ obj.config = {'user': 'dummy', 'hash_type': alg}
|
||||
+ for attr in ['syndic', 'start_log_info', 'prepare', 'shutdown']:
|
||||
+ setattr(obj, attr, MagicMock())
|
||||
+
|
||||
+ return obj
|
||||
+
|
||||
+ _logger = LoggerMock()
|
||||
+ with patch('salt.cli.daemons.check_user', MagicMock(return_value=True)):
|
||||
+ with patch('salt.cli.daemons.logger', _logger):
|
||||
+ for alg in ['md5', 'sha1']:
|
||||
+ _create_syndic().start()
|
||||
+ self.assertEqual(_logger.last_type, 'warning')
|
||||
+ self.assertTrue(_logger.last_message)
|
||||
+ self.assertTrue(_logger.last_message.find('Do not use {alg}'.format(alg=alg)) > -1)
|
||||
+
|
||||
+ _logger.reset()
|
||||
+
|
||||
+ for alg in ['sha224', 'sha256', 'sha384', 'sha512']:
|
||||
+ _create_syndic().start()
|
||||
+ self.assertEqual(_logger.last_type, None)
|
||||
+ self.assertFalse(_logger.last_message)
|
||||
+
|
||||
+if __name__ == '__main__':
|
||||
+ from integration import run_tests
|
||||
+ run_tests(DaemonsStarterTestCase, needs_daemon=False)
|
||||
--
|
||||
2.7.2
|
||||
|
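The test file above drives the Master, Minion, ProxyMinion and Syndic starters with weak and strong hash_type values and inspects the last logger call through LoggerMock. A rough sketch of the check those tests imply, assuming the daemons warn on md5/sha1 at start-up (warn_weak_hash_type is an illustrative name, not the actual salt.cli.daemons code):

# Sketch of the hash_type sanity check exercised by the tests above.
import logging

logger = logging.getLogger(__name__)
WEAK_HASHES = ('md5', 'sha1')

def warn_weak_hash_type(config):
    # Warn when the configured hash_type is known to be weak.
    hash_type = config.get('hash_type', 'md5')
    if hash_type in WEAK_HASHES:
        logger.warning('Do not use %s: the configured hash_type is insecure, '
                       'switch to sha256 or stronger', hash_type)

warn_weak_hash_type({'hash_type': 'sha1'})    # emits the warning the tests look for
warn_weak_hash_type({'hash_type': 'sha256'})  # stays silent

Because LoggerMock only keeps the most recent call, the tests assert immediately after each start() and reset the mock before switching to the strong algorithms.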
@@ -1,228 +0,0 @@
|
||||
From 79d5477cfa5e85d2480bb07e49ecaeff423f5238 Mon Sep 17 00:00:00 2001
|
||||
From: Bo Maryniuk <bo@suse.de>
|
||||
Date: Thu, 10 Mar 2016 13:25:20 +0100
|
||||
Subject: [PATCH 30/33] Bugfix: on SLE11 series base product reported as
|
||||
additional
|
||||
|
||||
Add SLE11 product info snapshot, rename previous
|
||||
|
||||
Update test case to cover SLE11 and SLE12
|
||||
---
|
||||
salt/modules/zypper.py | 2 +-
|
||||
.../unit/modules/zypp/zypper-products-sle11sp3.xml | 37 +++++++++++++++
|
||||
.../unit/modules/zypp/zypper-products-sle12sp1.xml | 37 +++++++++++++++
|
||||
tests/unit/modules/zypp/zypper-products.xml | 37 ---------------
|
||||
tests/unit/modules/zypper_test.py | 52 +++++++++++++---------
|
||||
5 files changed, 107 insertions(+), 58 deletions(-)
|
||||
create mode 100644 tests/unit/modules/zypp/zypper-products-sle11sp3.xml
|
||||
create mode 100644 tests/unit/modules/zypp/zypper-products-sle12sp1.xml
|
||||
delete mode 100644 tests/unit/modules/zypp/zypper-products.xml
|
||||
|
||||
diff --git a/salt/modules/zypper.py b/salt/modules/zypper.py
|
||||
index d6628aa..1c6b31d 100644
|
||||
--- a/salt/modules/zypper.py
|
||||
+++ b/salt/modules/zypper.py
|
||||
@@ -1373,7 +1373,7 @@ def list_products(all=False, refresh=False):
|
||||
for prd in doc.getElementsByTagName('product-list')[0].getElementsByTagName('product'):
|
||||
p_nfo = dict()
|
||||
for k_p_nfo, v_p_nfo in prd.attributes.items():
|
||||
- p_nfo[k_p_nfo] = k_p_nfo not in ['isbase', 'installed'] and v_p_nfo or v_p_nfo == 'true'
|
||||
+ p_nfo[k_p_nfo] = k_p_nfo not in ['isbase', 'installed'] and v_p_nfo or v_p_nfo in ['true', '1']
|
||||
p_nfo['eol'] = prd.getElementsByTagName('endoflife')[0].getAttribute('text')
|
||||
p_nfo['eol_t'] = int(prd.getElementsByTagName('endoflife')[0].getAttribute('time_t'))
|
||||
p_nfo['description'] = " ".join(
|
||||
diff --git a/tests/unit/modules/zypp/zypper-products-sle11sp3.xml b/tests/unit/modules/zypp/zypper-products-sle11sp3.xml
|
||||
new file mode 100644
|
||||
index 0000000..89a85e3
|
||||
--- /dev/null
|
||||
+++ b/tests/unit/modules/zypp/zypper-products-sle11sp3.xml
|
||||
@@ -0,0 +1,37 @@
|
||||
+<?xml version='1.0'?>
|
||||
+<stream>
|
||||
+<message type="info">Refreshing service 'nu_novell_com'.</message>
|
||||
+<message type="info">Loading repository data...</message>
|
||||
+<message type="info">Reading installed packages...</message>
|
||||
+<product-list>
|
||||
+<product name="SUSE_SLES" version="11.3" release="1.138" epoch="0" arch="x86_64" productline="" registerrelease="" vendor="SUSE LINUX Products GmbH, Nuernberg, Germany" summary="SUSE Linux Enterprise Server 11 SP3" shortname="" flavor="" isbase="0" repo="nu_novell_com:SLES11-SP3-Pool" installed="0"><endoflife time_t="0" text="1970-01-01T01:00:00+0100"/>0x7ffdb538e948<description>SUSE Linux Enterprise offers a comprehensive
|
||||
+ suite of products built on a single code base.
|
||||
+ The platform addresses business needs from
|
||||
+ the smallest thin-client devices to the world’s
|
||||
+ most powerful high-performance computing
|
||||
+ and mainframe servers. SUSE Linux Enterprise
|
||||
+ offers common management tools and technology
|
||||
+ certifications across the platform, and
|
||||
+ each product is enterprise-class.</description></product>
|
||||
+<product name="SUSE_SLES-SP4-migration" version="11.3" release="1.4" epoch="0" arch="x86_64" productline="" registerrelease="" vendor="SUSE LINUX Products GmbH, Nuernberg, Germany" summary="SUSE_SLES Service Pack 4 Migration Product" shortname="" flavor="" isbase="0" repo="nu_novell_com:SLES11-SP3-Updates" installed="0"><endoflife time_t="0" text="1970-01-01T01:00:00+0100"/>0x7ffdb538e948<description>SUSE Linux Enterprise offers a comprehensive
|
||||
+ suite of products built on a single code base.
|
||||
+ The platform addresses business needs from
|
||||
+ the smallest thin-client devices to the world’s
|
||||
+ most powerful high-performance computing
|
||||
+ and mainframe servers. SUSE Linux Enterprise
|
||||
+ offers common management tools and technology
|
||||
+ certifications across the platform, and
|
||||
+ each product is enterprise-class.</description></product>
|
||||
+<product name="SUSE_SLES" version="11.3" release="1.201" epoch="0" arch="x86_64" productline="" registerrelease="" vendor="SUSE LINUX Products GmbH, Nuernberg, Germany" summary="SUSE Linux Enterprise Server 11 SP3" shortname="" flavor="" isbase="0" repo="nu_novell_com:SLES11-SP3-Updates" installed="0"><endoflife time_t="0" text="1970-01-01T01:00:00+0100"/>0x7ffdb538e948<description>SUSE Linux Enterprise offers a comprehensive
|
||||
+ suite of products built on a single code base.
|
||||
+ The platform addresses business needs from
|
||||
+ the smallest thin-client devices to the world’s
|
||||
+ most powerful high-performance computing
|
||||
+ and mainframe servers. SUSE Linux Enterprise
|
||||
+ offers common management tools and technology
|
||||
+ certifications across the platform, and
|
||||
+ each product is enterprise-class.</description></product>
|
||||
+<product name="SUSE-Manager-Server" version="2.1" release="1.2" epoch="0" arch="x86_64" productline="" registerrelease="" vendor="SUSE LINUX Products GmbH, Nuernberg, Germany" summary="SUSE Manager Server" shortname="" flavor="cd" isbase="0" repo="nu_novell_com:SUSE-Manager-Server-2.1-Pool" installed="0"><endoflife time_t="0" text="1970-01-01T01:00:00+0100"/>0x7ffdb538e948<description>SUSE Manager Server appliance</description></product>
|
||||
+<product name="SUSE-Manager-Server" version="2.1" release="1.2" epoch="0" arch="x86_64" productline="manager" registerrelease="" vendor="SUSE LINUX Products GmbH, Nuernberg, Germany" summary="SUSE Manager Server" shortname="" flavor="cd" isbase="1" repo="@System" installed="1"><endoflife time_t="0" text="1970-01-01T01:00:00+0100"/>0x7ffdb538e948<description>SUSE Manager Server appliance</description></product>
|
||||
+</product-list>
|
||||
+</stream>
|
||||
diff --git a/tests/unit/modules/zypp/zypper-products-sle12sp1.xml b/tests/unit/modules/zypp/zypper-products-sle12sp1.xml
|
||||
new file mode 100644
|
||||
index 0000000..1a50363
|
||||
--- /dev/null
|
||||
+++ b/tests/unit/modules/zypp/zypper-products-sle12sp1.xml
|
||||
@@ -0,0 +1,37 @@
|
||||
+<?xml version='1.0'?>
|
||||
+<stream>
|
||||
+<message type="info">Loading repository data...</message>
|
||||
+<message type="info">Reading installed packages...</message>
|
||||
+<product-list>
|
||||
+<product name="SLES" version="12.1" release="0" epoch="0" arch="x86_64" vendor="SUSE LLC <https://www.suse.com/>" summary="SUSE Linux Enterprise Server 12 SP1" repo="SLE12-SP1-x86_64-Pool" productline="" registerrelease="" shortname="SLES12-SP1" flavor="POOL" isbase="false" installed="false"><endoflife time_t="1730332800" text="2024-10-31T01:00:00+01"/><registerflavor/><description>SUSE Linux Enterprise offers a comprehensive
|
||||
+ suite of products built on a single code base.
|
||||
+ The platform addresses business needs from
|
||||
+ the smallest thin-client devices to the world's
|
||||
+ most powerful high-performance computing
|
||||
+ and mainframe servers. SUSE Linux Enterprise
|
||||
+ offers common management tools and technology
|
||||
+ certifications across the platform, and
|
||||
+ each product is enterprise-class.</description></product>
|
||||
+<product name="SUSE-Manager-Proxy" version="3.0" release="0" epoch="0" arch="x86_64" vendor="obs://build.suse.de/Devel:Galaxy:Manager:Head" summary="SUSE Manager Proxy" repo="SUSE-Manager-Head" productline="" registerrelease="" shortname="SUSE Manager Proxy" flavor="DVD" isbase="false" installed="false"><endoflife time_t="1522454400" text="2018-03-31T02:00:00+02"/><registerflavor>extension</registerflavor><description>SUSE Manager Proxies extend large and/or geographically
|
||||
+dispersed SUSE Manager environments to reduce load on the SUSE Manager
|
||||
+Server, lower bandwidth needs, and provide faster local
|
||||
+updates.</description></product>
|
||||
+<product name="SUSE-Manager-Server" version="3.0" release="0" epoch="0" arch="x86_64" vendor="obs://build.suse.de/Devel:Galaxy:Manager:Head" summary="SUSE Manager Server" repo="SUSE-Manager-Head" productline="" registerrelease="" shortname="SUSE Manager Server" flavor="DVD" isbase="false" installed="false"><endoflife time_t="1522454400" text="2018-03-31T02:00:00+02"/><registerflavor>extension</registerflavor><description>SUSE Manager lets you efficiently manage physical, virtual,
|
||||
+and cloud-based Linux systems. It provides automated and cost-effective
|
||||
+configuration and software management, asset management, and system
|
||||
+provisioning.</description></product>
|
||||
+<product name="sle-manager-tools-beta" version="12" release="0" epoch="0" arch="x86_64" vendor="obs://build.suse.de/Devel:Galaxy:Manager:Head" summary="SUSE Manager Tools" repo="SUSE-Manager-Head" productline="" registerrelease="" shortname="Manager-Tools" flavor="POOL" isbase="false" installed="false"><endoflife time_t="1509408000" text="2017-10-31T01:00:00+01"/><registerflavor>extension</registerflavor><description><p>
|
||||
+ SUSE Manager Tools provide packages required to connect to a
|
||||
+ SUSE Manager Server.
|
||||
+ <p></description></product>
|
||||
+<product name="SLES" version="12.1" release="0" epoch="0" arch="x86_64" vendor="SUSE" summary="SUSE Linux Enterprise Server 12 SP1" repo="@System" productline="sles" registerrelease="" shortname="SLES12-SP1" flavor="DVD" isbase="true" installed="true"><endoflife time_t="1730332800" text="2024-10-31T01:00:00+01"/><registerflavor/><description>SUSE Linux Enterprise offers a comprehensive
|
||||
+ suite of products built on a single code base.
|
||||
+ The platform addresses business needs from
|
||||
+ the smallest thin-client devices to the world's
|
||||
+ most powerful high-performance computing
|
||||
+ and mainframe servers. SUSE Linux Enterprise
|
||||
+ offers common management tools and technology
|
||||
+ certifications across the platform, and
|
||||
+ each product is enterprise-class.</description></product>
|
||||
+</product-list>
|
||||
+</stream>
|
||||
diff --git a/tests/unit/modules/zypp/zypper-products.xml b/tests/unit/modules/zypp/zypper-products.xml
|
||||
deleted file mode 100644
|
||||
index 1a50363..0000000
|
||||
--- a/tests/unit/modules/zypp/zypper-products.xml
|
||||
+++ /dev/null
|
||||
@@ -1,37 +0,0 @@
|
||||
-<?xml version='1.0'?>
|
||||
-<stream>
|
||||
-<message type="info">Loading repository data...</message>
|
||||
-<message type="info">Reading installed packages...</message>
|
||||
-<product-list>
|
||||
-<product name="SLES" version="12.1" release="0" epoch="0" arch="x86_64" vendor="SUSE LLC <https://www.suse.com/>" summary="SUSE Linux Enterprise Server 12 SP1" repo="SLE12-SP1-x86_64-Pool" productline="" registerrelease="" shortname="SLES12-SP1" flavor="POOL" isbase="false" installed="false"><endoflife time_t="1730332800" text="2024-10-31T01:00:00+01"/><registerflavor/><description>SUSE Linux Enterprise offers a comprehensive
|
||||
- suite of products built on a single code base.
|
||||
- The platform addresses business needs from
|
||||
- the smallest thin-client devices to the world's
|
||||
- most powerful high-performance computing
|
||||
- and mainframe servers. SUSE Linux Enterprise
|
||||
- offers common management tools and technology
|
||||
- certifications across the platform, and
|
||||
- each product is enterprise-class.</description></product>
|
||||
-<product name="SUSE-Manager-Proxy" version="3.0" release="0" epoch="0" arch="x86_64" vendor="obs://build.suse.de/Devel:Galaxy:Manager:Head" summary="SUSE Manager Proxy" repo="SUSE-Manager-Head" productline="" registerrelease="" shortname="SUSE Manager Proxy" flavor="DVD" isbase="false" installed="false"><endoflife time_t="1522454400" text="2018-03-31T02:00:00+02"/><registerflavor>extension</registerflavor><description>SUSE Manager Proxies extend large and/or geographically
|
||||
-dispersed SUSE Manager environments to reduce load on the SUSE Manager
|
||||
-Server, lower bandwidth needs, and provide faster local
|
||||
-updates.</description></product>
|
||||
-<product name="SUSE-Manager-Server" version="3.0" release="0" epoch="0" arch="x86_64" vendor="obs://build.suse.de/Devel:Galaxy:Manager:Head" summary="SUSE Manager Server" repo="SUSE-Manager-Head" productline="" registerrelease="" shortname="SUSE Manager Server" flavor="DVD" isbase="false" installed="false"><endoflife time_t="1522454400" text="2018-03-31T02:00:00+02"/><registerflavor>extension</registerflavor><description>SUSE Manager lets you efficiently manage physical, virtual,
|
||||
-and cloud-based Linux systems. It provides automated and cost-effective
|
||||
-configuration and software management, asset management, and system
|
||||
-provisioning.</description></product>
|
||||
-<product name="sle-manager-tools-beta" version="12" release="0" epoch="0" arch="x86_64" vendor="obs://build.suse.de/Devel:Galaxy:Manager:Head" summary="SUSE Manager Tools" repo="SUSE-Manager-Head" productline="" registerrelease="" shortname="Manager-Tools" flavor="POOL" isbase="false" installed="false"><endoflife time_t="1509408000" text="2017-10-31T01:00:00+01"/><registerflavor>extension</registerflavor><description><p>
|
||||
- SUSE Manager Tools provide packages required to connect to a
|
||||
- SUSE Manager Server.
|
||||
- <p></description></product>
|
||||
-<product name="SLES" version="12.1" release="0" epoch="0" arch="x86_64" vendor="SUSE" summary="SUSE Linux Enterprise Server 12 SP1" repo="@System" productline="sles" registerrelease="" shortname="SLES12-SP1" flavor="DVD" isbase="true" installed="true"><endoflife time_t="1730332800" text="2024-10-31T01:00:00+01"/><registerflavor/><description>SUSE Linux Enterprise offers a comprehensive
|
||||
- suite of products built on a single code base.
|
||||
- The platform addresses business needs from
|
||||
- the smallest thin-client devices to the world's
|
||||
- most powerful high-performance computing
|
||||
- and mainframe servers. SUSE Linux Enterprise
|
||||
- offers common management tools and technology
|
||||
- certifications across the platform, and
|
||||
- each product is enterprise-class.</description></product>
|
||||
-</product-list>
|
||||
-</stream>
|
||||
diff --git a/tests/unit/modules/zypper_test.py b/tests/unit/modules/zypper_test.py
|
||||
index f89d18f..5c4eb67 100644
|
||||
--- a/tests/unit/modules/zypper_test.py
|
||||
+++ b/tests/unit/modules/zypper_test.py
|
||||
@@ -150,26 +150,38 @@ class ZypperTestCase(TestCase):
|
||||
'''
|
||||
List products test.
|
||||
'''
|
||||
- ref_out = {
|
||||
- 'retcode': 0,
|
||||
- 'stdout': get_test_data('zypper-products.xml')
|
||||
- }
|
||||
- with patch.dict(zypper.__salt__, {'cmd.run_all': MagicMock(return_value=ref_out)}):
|
||||
- products = zypper.list_products()
|
||||
- self.assertEqual(len(products), 5)
|
||||
- self.assertEqual(['SLES', 'SLES', 'SUSE-Manager-Proxy', 'SUSE-Manager-Server', 'sle-manager-tools-beta'],
|
||||
- sorted([prod['name'] for prod in products]))
|
||||
- self.assertIn('SUSE LLC <https://www.suse.com/>', [product['vendor'] for product in products])
|
||||
- self.assertEqual([False, False, False, False, True],
|
||||
- sorted([product['isbase'] for product in products]))
|
||||
- self.assertEqual([False, False, False, False, True],
|
||||
- sorted([product['installed'] for product in products]))
|
||||
- self.assertEqual(['0', '0', '0', '0', '0'],
|
||||
- sorted([product['release'] for product in products]))
|
||||
- self.assertEqual([False, False, False, False, u'sles'],
|
||||
- sorted([product['productline'] for product in products]))
|
||||
- self.assertEqual([1509408000, 1522454400, 1522454400, 1730332800, 1730332800],
|
||||
- sorted([product['eol_t'] for product in products]))
|
||||
+ for filename, test_data in {
|
||||
+ 'zypper-products-sle12sp1.xml': {
|
||||
+ 'name': ['SLES', 'SLES', 'SUSE-Manager-Proxy',
|
||||
+ 'SUSE-Manager-Server', 'sle-manager-tools-beta'],
|
||||
+ 'vendor': 'SUSE LLC <https://www.suse.com/>',
|
||||
+ 'release': ['0', '0', '0', '0', '0'],
|
||||
+ 'productline': [False, False, False, False, 'sles'],
|
||||
+ 'eol_t': [1509408000, 1522454400, 1522454400, 1730332800, 1730332800],
|
||||
+ 'isbase': [False, False, False, False, True],
|
||||
+ 'installed': [False, False, False, False, True],
|
||||
+ },
|
||||
+ 'zypper-products-sle11sp3.xml': {
|
||||
+ 'name': ['SUSE-Manager-Server', 'SUSE-Manager-Server',
|
||||
+ 'SUSE_SLES', 'SUSE_SLES', 'SUSE_SLES-SP4-migration'],
|
||||
+ 'vendor': 'SUSE LINUX Products GmbH, Nuernberg, Germany',
|
||||
+ 'release': ['1.138', '1.2', '1.2', '1.201', '1.4'],
|
||||
+ 'productline': [False, False, False, False, 'manager'],
|
||||
+ 'eol_t': [0, 0, 0, 0, 0],
|
||||
+ 'isbase': [False, False, False, False, True],
|
||||
+ 'installed': [False, False, False, False, True],
|
||||
+ }}.items():
|
||||
+ ref_out = {
|
||||
+ 'retcode': 0,
|
||||
+ 'stdout': get_test_data(filename)
|
||||
+ }
|
||||
+
|
||||
+ with patch.dict(zypper.__salt__, {'cmd.run_all': MagicMock(return_value=ref_out)}):
|
||||
+ products = zypper.list_products()
|
||||
+ self.assertEqual(len(products), 5)
|
||||
+ self.assertIn(test_data['vendor'], [product['vendor'] for product in products])
|
||||
+ for kwd in ['name', 'isbase', 'installed', 'release', 'productline', 'eol_t']:
|
||||
+ self.assertEqual(test_data[kwd], sorted([prod[kwd] for prod in products]))
|
||||
|
||||
def test_refresh_db(self):
|
||||
'''
|
||||
--
|
||||
2.1.4
|
||||
|
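The one-line zypper.py change above widens the boolean coercion so that SLE11's "0"/"1" attribute values are handled alongside SLE12's "false"/"true". A small sketch of that expression, with the empty-string quirk the new unit test data depends on called out (coerce_attr is an illustrative wrapper, not part of the patch):

# The 'and/or' chain has a side effect the unit test relies on: empty string
# attributes (e.g. productline="") fall through to the 'or' branch and come
# out as False rather than ''.
BOOL_ATTRS = ('isbase', 'installed')

def coerce_attr(key, value):
    # Same expression as in the patch, just given a name here for clarity.
    return key not in BOOL_ATTRS and value or value in ('true', '1')

print(coerce_attr('isbase', '1'))         # True  (SLE11 style)
print(coerce_attr('installed', 'false'))  # False (SLE12 style)
print(coerce_attr('name', 'SLES'))        # 'SLES'
print(coerce_attr('productline', ''))     # False, not '', matching the test data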
@@ -1,42 +0,0 @@
|
||||
From d3d7d20b569ad1ae5bc8a7ba1ac6652aa2e47ec5 Mon Sep 17 00:00:00 2001
|
||||
From: rallytime <nicole@saltstack.com>
|
||||
Date: Tue, 23 Feb 2016 17:20:47 -0700
|
||||
Subject: [PATCH 31/33] Only use LONGSIZE in rpm.info if available. Otherwise,
|
||||
use SIZE.
|
||||
|
||||
Fixes #31366
|
||||
---
|
||||
salt/modules/rpm.py | 10 +++++++++-
|
||||
1 file changed, 9 insertions(+), 1 deletion(-)
|
||||
|
||||
diff --git a/salt/modules/rpm.py b/salt/modules/rpm.py
|
||||
index 51c72c9..cdf91a6 100644
|
||||
--- a/salt/modules/rpm.py
|
||||
+++ b/salt/modules/rpm.py
|
||||
@@ -422,6 +422,14 @@ def info(*packages, **attr):
|
||||
salt '*' lowpkg.info apache2 bash attr=version
|
||||
salt '*' lowpkg.info apache2 bash attr=version,build_date_iso,size
|
||||
'''
|
||||
+ # LONGSIZE is not a valid tag for all versions of rpm. If LONGSIZE isn't
|
||||
+ # available, then we can just use SIZE for older versions. See Issue #31366.
|
||||
+ rpm_tags = __salt__['cmd.run_all']('rpm --querytags')
|
||||
+ rpm_tags = rpm_tags.get('stdout').split('\n')
|
||||
+ if 'LONGSIZE' in rpm_tags:
|
||||
+ size_tag = '%{LONGSIZE}'
|
||||
+ else:
|
||||
+ size_tag = '%{SIZE}'
|
||||
|
||||
cmd = packages and "rpm -q {0}".format(' '.join(packages)) or "rpm -qa"
|
||||
|
||||
@@ -440,7 +448,7 @@ def info(*packages, **attr):
|
||||
"build_host": "build_host: %{BUILDHOST}\\n",
|
||||
"group": "group: %{GROUP}\\n",
|
||||
"source_rpm": "source_rpm: %{SOURCERPM}\\n",
|
||||
- "size": "size: %{LONGSIZE}\\n",
|
||||
+ "size": "size: " + size_tag + "\\n",
|
||||
"arch": "arch: %{ARCH}\\n",
|
||||
"license": "%|LICENSE?{license: %{LICENSE}\\n}|",
|
||||
"signature": "signature: %|DSAHEADER?{%{DSAHEADER:pgpsig}}:{%|RSAHEADER?{%{RSAHEADER:pgpsig}}:"
|
||||
--
|
||||
2.1.4
|
||||
|
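The rpm.info change above probes `rpm --querytags` and only uses LONGSIZE when the installed rpm supports it (issue #31366). A standalone sketch of that probe, using subprocess directly instead of Salt's cmd.run_all wiring:

# Fall back from LONGSIZE to SIZE on older rpm versions.
import subprocess

def size_query_tag():
    try:
        tags = subprocess.check_output(['rpm', '--querytags'],
                                       universal_newlines=True).split('\n')
    except (OSError, subprocess.CalledProcessError):
        return '%{SIZE}'  # be conservative if rpm is unavailable
    return '%{LONGSIZE}' if 'LONGSIZE' in tags else '%{SIZE}'

query_format = 'size: ' + size_query_tag() + '\\n'
print(query_format)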
@@ -1,26 +0,0 @@
|
||||
From 5c0c3da01e3e64e7614d9d3cc52d8d9c18a06230 Mon Sep 17 00:00:00 2001
|
||||
From: rallytime <nicole@saltstack.com>
|
||||
Date: Tue, 23 Feb 2016 17:26:52 -0700
|
||||
Subject: [PATCH 32/33] Add error check when retcode is 0, but stderr is
|
||||
present
|
||||
|
||||
---
|
||||
salt/modules/rpm.py | 2 ++
|
||||
1 file changed, 2 insertions(+)
|
||||
|
||||
diff --git a/salt/modules/rpm.py b/salt/modules/rpm.py
|
||||
index cdf91a6..00cbd5d 100644
|
||||
--- a/salt/modules/rpm.py
|
||||
+++ b/salt/modules/rpm.py
|
||||
@@ -485,6 +485,8 @@ def info(*packages, **attr):
|
||||
if 'stderr' in call:
|
||||
comment += (call['stderr'] or call['stdout'])
|
||||
raise CommandExecutionError('{0}'.format(comment))
|
||||
+ elif 'error' in call['stderr']:
|
||||
+ raise CommandExecutionError(call['stderr'])
|
||||
else:
|
||||
out = call['stdout']
|
||||
|
||||
--
|
||||
2.1.4
|
||||
|
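The second rpm.py change treats an 'error' marker on stderr as a failure even when the exit code is 0. A minimal sketch of that defensive check, assuming the retcode/stdout/stderr dict shape cmd.run_all returns (check_call is an illustrative name):

# Treat a zero exit code with an error on stderr as a failure.
class CommandExecutionError(Exception):
    pass

def check_call(call):
    if call['retcode'] != 0:
        comment = call.get('stderr') or call.get('stdout') or ''
        raise CommandExecutionError(comment)
    elif 'error' in call.get('stderr', ''):
        raise CommandExecutionError(call['stderr'])
    return call['stdout']

print(check_call({'retcode': 0, 'stdout': 'ok', 'stderr': ''}))  # 'ok'
# check_call({'retcode': 0, 'stdout': '', 'stderr': 'error: bad header'})  # raises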
@@ -1,39 +0,0 @@
|
||||
From ba08f6714222622467215c23c8284f992830e047 Mon Sep 17 00:00:00 2001
|
||||
From: Richard McIntosh <richard.c.mcintosh@gmail.com>
|
||||
Date: Thu, 10 Mar 2016 16:46:14 +0100
|
||||
Subject: [PATCH 33/33] fixing init system dectection on sles 11, refs #31617
|
||||
|
||||
---
|
||||
salt/modules/rh_service.py | 11 ++++++++---
|
||||
1 file changed, 8 insertions(+), 3 deletions(-)
|
||||
|
||||
diff --git a/salt/modules/rh_service.py b/salt/modules/rh_service.py
|
||||
index 910a75d..c8ebb52 100644
|
||||
--- a/salt/modules/rh_service.py
|
||||
+++ b/salt/modules/rh_service.py
|
||||
@@ -60,14 +60,19 @@ def __virtual__():
|
||||
if __grains__['os'] in enable:
|
||||
if __grains__['os'] == 'XenServer':
|
||||
return __virtualname__
|
||||
+
|
||||
+ if __grains__['os'] == 'SUSE':
|
||||
+ if str(__grains__['osrelease']).startswith('11'):
|
||||
+ return __virtualname__
|
||||
+ else:
|
||||
+ return (False, 'Cannot load rh_service module on SUSE > 11')
|
||||
+
|
||||
try:
|
||||
osrelease = float(__grains__.get('osrelease', 0))
|
||||
except ValueError:
|
||||
return (False, 'Cannot load rh_service module: '
|
||||
'osrelease grain, {0}, not a float,'.format(osrelease))
|
||||
- if __grains__['os'] == 'SUSE':
|
||||
- if osrelease >= 12:
|
||||
- return (False, 'Cannot load rh_service module on SUSE >= 12')
|
||||
+
|
||||
if __grains__['os'] == 'Fedora':
|
||||
if osrelease > 15:
|
||||
return (False, 'Cannot load rh_service module on Fedora >= 15')
|
||||
--
|
||||
2.1.4
|
||||
|
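The rh_service change above restricts the module to the 11.x series on SUSE and rejects anything newer before the generic osrelease float conversion runs. A sketch of that gate, with the grains passed in explicitly (virtual_gate stands in for the module's __virtual__; in Salt, __grains__ is injected by the loader):

# Only load the sysvinit-style service module on SLES 11; SLE 12 uses systemd.
__virtualname__ = 'service'

def virtual_gate(grains):
    if grains.get('os') == 'SUSE':
        if str(grains.get('osrelease', '')).startswith('11'):
            return __virtualname__
        return (False, 'Cannot load rh_service module on SUSE > 11')
    # remaining distro checks (Fedora, RHEL, ...) elided
    return __virtualname__

print(virtual_gate({'os': 'SUSE', 'osrelease': '11.4'}))  # 'service'
print(virtual_gate({'os': 'SUSE', 'osrelease': 12.1}))    # (False, '...')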
@@ -1,963 +0,0 @@
|
||||
From 17bd6cf1edbcab0e343bc8fe0382756e1fd6c2a0 Mon Sep 17 00:00:00 2001
|
||||
From: Erik Johnson <palehose@gmail.com>
|
||||
Date: Fri, 11 Mar 2016 15:05:57 -0600
|
||||
Subject: [PATCH 34/35] Fix git_pillar race condition
|
||||
|
||||
- Strip whitespace when splitting
|
||||
- Add GitLockError exception class
|
||||
- salt.utils.gitfs: rewrite locking code
|
||||
|
||||
This does a few things:
|
||||
|
||||
1. Introduces the concept of a checkout lock, to prevent concurrent
|
||||
"_pillar" master funcs from trying to checkout the repo at the same
|
||||
time.
|
||||
2. Refrains from checking out unless the SHA has changed.
|
||||
3. Cleans up some usage of the GitProvider subclass' "url" attribute
|
||||
when "id" should be used.
|
||||
|
||||
- salt.runners.cache: Add ability to clear checkout locks
|
||||
- Pass through the lock_type
|
||||
This is necessary for "salt-run fileserver.clear_lock" to work
|
||||
- salt.fileserver: Add ability to clear checkout locks
|
||||
- Fix duplicate output
|
||||
- Use remote_ref instead of local_ref to see if checkout is necessary
|
||||
---
|
||||
salt/exceptions.py | 33 +++
|
||||
salt/fileserver/__init__.py | 10 +-
|
||||
salt/fileserver/gitfs.py | 4 +-
|
||||
salt/runners/cache.py | 31 ++-
|
||||
salt/runners/fileserver.py | 12 +-
|
||||
salt/utils/__init__.py | 12 ++
|
||||
salt/utils/gitfs.py | 499 +++++++++++++++++++++++++++++++++-----------
|
||||
7 files changed, 463 insertions(+), 138 deletions(-)
|
||||
|
||||
diff --git a/salt/exceptions.py b/salt/exceptions.py
|
||||
index 67bf323255ee..ed52f8c3622b 100644
|
||||
--- a/salt/exceptions.py
|
||||
+++ b/salt/exceptions.py
|
||||
@@ -98,6 +98,39 @@ class FileserverConfigError(SaltException):
|
||||
'''
|
||||
|
||||
|
||||
+class FileLockError(SaltException):
|
||||
+ '''
|
||||
+ Used when an error occurs obtaining a file lock
|
||||
+ '''
|
||||
+ def __init__(self, msg, time_start=None, *args, **kwargs):
|
||||
+ super(FileLockError, self).__init__(msg, *args, **kwargs)
|
||||
+ if time_start is None:
|
||||
+ log.warning(
|
||||
+ 'time_start should be provided when raising a FileLockError. '
|
||||
+ 'Defaulting to current time as a fallback, but this may '
|
||||
+ 'result in an inaccurate timeout.'
|
||||
+ )
|
||||
+ self.time_start = time.time()
|
||||
+ else:
|
||||
+ self.time_start = time_start
|
||||
+
|
||||
+
|
||||
+class GitLockError(SaltException):
|
||||
+ '''
|
||||
+ Raised when an uncaught error occurs in the midst of obtaining an
|
||||
+ update/checkout lock in salt.utils.gitfs.
|
||||
+
|
||||
+ NOTE: While this uses the errno param similar to an OSError, this exception
|
||||
+ class is *not* as subclass of OSError. This is done intentionally, so that
|
||||
+ this exception class can be caught in a try/except without being caught as
|
||||
+ an OSError.
|
||||
+ '''
|
||||
+ def __init__(self, errno, strerror, *args, **kwargs):
|
||||
+ super(GitLockError, self).__init__(strerror, *args, **kwargs)
|
||||
+ self.errno = errno
|
||||
+ self.strerror = strerror
|
||||
+
|
||||
+
|
||||
class SaltInvocationError(SaltException, TypeError):
|
||||
'''
|
||||
Used when the wrong number of arguments are sent to modules or invalid
|
||||
diff --git a/salt/fileserver/__init__.py b/salt/fileserver/__init__.py
|
||||
index c40e512d940c..8ff6f223a5eb 100644
|
||||
--- a/salt/fileserver/__init__.py
|
||||
+++ b/salt/fileserver/__init__.py
|
||||
@@ -272,7 +272,7 @@ def is_file_ignored(opts, fname):
|
||||
return False
|
||||
|
||||
|
||||
-def clear_lock(clear_func, lock_type, remote=None):
|
||||
+def clear_lock(clear_func, role, remote=None, lock_type='update'):
|
||||
'''
|
||||
Function to allow non-fileserver functions to clear update locks
|
||||
|
||||
@@ -282,7 +282,7 @@ def clear_lock(clear_func, lock_type, remote=None):
|
||||
lists, one containing messages describing successfully cleared locks,
|
||||
and one containing messages describing errors encountered.
|
||||
|
||||
- lock_type
|
||||
+ role
|
||||
What type of lock is being cleared (gitfs, git_pillar, etc.). Used
|
||||
solely for logging purposes.
|
||||
|
||||
@@ -290,14 +290,16 @@ def clear_lock(clear_func, lock_type, remote=None):
|
||||
Optional string which should be used in ``func`` to pattern match so
|
||||
that a subset of remotes can be targeted.
|
||||
|
||||
+ lock_type : update
|
||||
+ Which type of lock to clear
|
||||
|
||||
Returns the return data from ``clear_func``.
|
||||
'''
|
||||
- msg = 'Clearing update lock for {0} remotes'.format(lock_type)
|
||||
+ msg = 'Clearing {0} lock for {1} remotes'.format(lock_type, role)
|
||||
if remote:
|
||||
msg += ' matching {0}'.format(remote)
|
||||
log.debug(msg)
|
||||
- return clear_func(remote=remote)
|
||||
+ return clear_func(remote=remote, lock_type=lock_type)
|
||||
|
||||
|
||||
class Fileserver(object):
|
||||
diff --git a/salt/fileserver/gitfs.py b/salt/fileserver/gitfs.py
|
||||
index fc92964334e5..8f74e92c8649 100644
|
||||
--- a/salt/fileserver/gitfs.py
|
||||
+++ b/salt/fileserver/gitfs.py
|
||||
@@ -93,13 +93,13 @@ def clear_cache():
|
||||
return gitfs.clear_cache()
|
||||
|
||||
|
||||
-def clear_lock(remote=None):
|
||||
+def clear_lock(remote=None, lock_type='update'):
|
||||
'''
|
||||
Clear update.lk
|
||||
'''
|
||||
gitfs = salt.utils.gitfs.GitFS(__opts__)
|
||||
gitfs.init_remotes(__opts__['gitfs_remotes'], PER_REMOTE_OVERRIDES)
|
||||
- return gitfs.clear_lock(remote=remote)
|
||||
+ return gitfs.clear_lock(remote=remote, lock_type=lock_type)
|
||||
|
||||
|
||||
def lock(remote=None):
|
||||
diff --git a/salt/runners/cache.py b/salt/runners/cache.py
|
||||
index 674a85e9e75f..b96489773b8d 100644
|
||||
--- a/salt/runners/cache.py
|
||||
+++ b/salt/runners/cache.py
|
||||
@@ -238,7 +238,7 @@ def clear_all(tgt=None, expr_form='glob'):
|
||||
clear_mine_flag=True)
|
||||
|
||||
|
||||
-def clear_git_lock(role, remote=None):
|
||||
+def clear_git_lock(role, remote=None, **kwargs):
|
||||
'''
|
||||
.. versionadded:: 2015.8.2
|
||||
|
||||
@@ -261,12 +261,23 @@ def clear_git_lock(role, remote=None):
|
||||
have their lock cleared. For example, a ``remote`` value of **github**
|
||||
will remove the lock from all github.com remotes.
|
||||
|
||||
+ type : update,checkout
|
||||
+ The types of lock to clear. Can be ``update``, ``checkout``, or both of
|
||||
+ them (either comma-separated or as a Python list).
|
||||
+
|
||||
+ .. versionadded:: 2015.8.9
|
||||
+
|
||||
CLI Example:
|
||||
|
||||
.. code-block:: bash
|
||||
|
||||
salt-run cache.clear_git_lock git_pillar
|
||||
'''
|
||||
+ kwargs = salt.utils.clean_kwargs(**kwargs)
|
||||
+ type_ = salt.utils.split_input(kwargs.pop('type', ['update', 'checkout']))
|
||||
+ if kwargs:
|
||||
+ salt.utils.invalid_kwargs(kwargs)
|
||||
+
|
||||
if role == 'gitfs':
|
||||
git_objects = [salt.utils.gitfs.GitFS(__opts__)]
|
||||
git_objects[0].init_remotes(__opts__['gitfs_remotes'],
|
||||
@@ -315,11 +326,15 @@ def clear_git_lock(role, remote=None):
|
||||
|
||||
ret = {}
|
||||
for obj in git_objects:
|
||||
- cleared, errors = _clear_lock(obj.clear_lock, role, remote)
|
||||
- if cleared:
|
||||
- ret.setdefault('cleared', []).extend(cleared)
|
||||
- if errors:
|
||||
- ret.setdefault('errors', []).extend(errors)
|
||||
+ for lock_type in type_:
|
||||
+ cleared, errors = _clear_lock(obj.clear_lock,
|
||||
+ role,
|
||||
+ remote=remote,
|
||||
+ lock_type=lock_type)
|
||||
+ if cleared:
|
||||
+ ret.setdefault('cleared', []).extend(cleared)
|
||||
+ if errors:
|
||||
+ ret.setdefault('errors', []).extend(errors)
|
||||
if not ret:
|
||||
- ret = 'No locks were removed'
|
||||
- salt.output.display_output(ret, 'nested', opts=__opts__)
|
||||
+ return 'No locks were removed'
|
||||
+ return ret
|
||||
diff --git a/salt/runners/fileserver.py b/salt/runners/fileserver.py
|
||||
index c6efe0221a3c..7ec3d5e3e2a9 100644
|
||||
--- a/salt/runners/fileserver.py
|
||||
+++ b/salt/runners/fileserver.py
|
||||
@@ -293,8 +293,8 @@ def clear_cache(backend=None):
|
||||
if errors:
|
||||
ret['errors'] = errors
|
||||
if not ret:
|
||||
- ret = 'No cache was cleared'
|
||||
- salt.output.display_output(ret, 'nested', opts=__opts__)
|
||||
+ return 'No cache was cleared'
|
||||
+ return ret
|
||||
|
||||
|
||||
def clear_lock(backend=None, remote=None):
|
||||
@@ -334,8 +334,8 @@ def clear_lock(backend=None, remote=None):
|
||||
if errors:
|
||||
ret['errors'] = errors
|
||||
if not ret:
|
||||
- ret = 'No locks were removed'
|
||||
- salt.output.display_output(ret, 'nested', opts=__opts__)
|
||||
+ return 'No locks were removed'
|
||||
+ return ret
|
||||
|
||||
|
||||
def lock(backend=None, remote=None):
|
||||
@@ -376,5 +376,5 @@ def lock(backend=None, remote=None):
|
||||
if errors:
|
||||
ret['errors'] = errors
|
||||
if not ret:
|
||||
- ret = 'No locks were set'
|
||||
- salt.output.display_output(ret, 'nested', opts=__opts__)
|
||||
+ return 'No locks were set'
|
||||
+ return ret
|
||||
diff --git a/salt/utils/__init__.py b/salt/utils/__init__.py
|
||||
index 4e40cafd25d7..5ee73b168349 100644
|
||||
--- a/salt/utils/__init__.py
|
||||
+++ b/salt/utils/__init__.py
|
||||
@@ -2875,3 +2875,15 @@ def invalid_kwargs(invalid_kwargs, raise_exc=True):
|
||||
raise SaltInvocationError(msg)
|
||||
else:
|
||||
return msg
|
||||
+
|
||||
+
|
||||
+def split_input(val):
|
||||
+ '''
|
||||
+ Take an input value and split it into a list, returning the resulting list
|
||||
+ '''
|
||||
+ if isinstance(val, list):
|
||||
+ return val
|
||||
+ try:
|
||||
+ return [x.strip() for x in val.split(',')]
|
||||
+ except AttributeError:
|
||||
+ return [x.strip() for x in str(val).split(',')]
|
||||
diff --git a/salt/utils/gitfs.py b/salt/utils/gitfs.py
|
||||
index 411da5a2e8cd..289aa9ee5288 100644
|
||||
--- a/salt/utils/gitfs.py
|
||||
+++ b/salt/utils/gitfs.py
|
||||
@@ -3,6 +3,7 @@
|
||||
# Import python libs
|
||||
from __future__ import absolute_import
|
||||
import copy
|
||||
+import contextlib
|
||||
import distutils.version # pylint: disable=import-error,no-name-in-module
|
||||
import errno
|
||||
import fnmatch
|
||||
@@ -15,6 +16,7 @@ import shlex
|
||||
import shutil
|
||||
import stat
|
||||
import subprocess
|
||||
+import time
|
||||
from datetime import datetime
|
||||
|
||||
VALID_PROVIDERS = ('gitpython', 'pygit2', 'dulwich')
|
||||
@@ -57,7 +59,7 @@ import salt.utils
|
||||
import salt.utils.itertools
|
||||
import salt.utils.url
|
||||
import salt.fileserver
|
||||
-from salt.exceptions import FileserverConfigError
|
||||
+from salt.exceptions import FileserverConfigError, GitLockError
|
||||
from salt.utils.event import tagify
|
||||
|
||||
# Import third party libs
|
||||
@@ -298,29 +300,8 @@ class GitProvider(object):
|
||||
_check_ref(ret, base_ref, rname)
|
||||
return ret
|
||||
|
||||
- def check_lock(self):
|
||||
- '''
|
||||
- Used by the provider-specific fetch() function to check the existence
|
||||
- of an update lock, and set the lock if not present. If the lock exists
|
||||
- already, or if there was a problem setting the lock, this function
|
||||
- returns False. If the lock was successfully set, return True.
|
||||
- '''
|
||||
- if os.path.exists(self.lockfile):
|
||||
- log.warning(
|
||||
- 'Update lockfile is present for {0} remote \'{1}\', '
|
||||
- 'skipping. If this warning persists, it is possible that the '
|
||||
- 'update process was interrupted. Removing {2} or running '
|
||||
- '\'salt-run cache.clear_git_lock {0}\' will allow updates to '
|
||||
- 'continue for this remote.'
|
||||
- .format(self.role, self.id, self.lockfile)
|
||||
- )
|
||||
- return False
|
||||
- errors = self.lock()[-1]
|
||||
- if errors:
|
||||
- log.error('Unable to set update lock for {0} remote \'{1}\', '
|
||||
- 'skipping.'.format(self.role, self.id))
|
||||
- return False
|
||||
- return True
|
||||
+ def _get_lock_file(self, lock_type='update'):
|
||||
+ return os.path.join(self.gitdir, lock_type + '.lk')
|
||||
|
||||
def check_root(self):
|
||||
'''
|
||||
@@ -344,65 +325,143 @@ class GitProvider(object):
|
||||
'''
|
||||
return []
|
||||
|
||||
- def clear_lock(self):
|
||||
+ def clear_lock(self, lock_type='update'):
|
||||
'''
|
||||
Clear update.lk
|
||||
'''
|
||||
+ lock_file = self._get_lock_file(lock_type=lock_type)
|
||||
+
|
||||
def _add_error(errlist, exc):
|
||||
msg = ('Unable to remove update lock for {0} ({1}): {2} '
|
||||
- .format(self.url, self.lockfile, exc))
|
||||
+ .format(self.url, lock_file, exc))
|
||||
log.debug(msg)
|
||||
errlist.append(msg)
|
||||
|
||||
success = []
|
||||
failed = []
|
||||
- if os.path.exists(self.lockfile):
|
||||
- try:
|
||||
- os.remove(self.lockfile)
|
||||
- except OSError as exc:
|
||||
- if exc.errno == errno.EISDIR:
|
||||
- # Somehow this path is a directory. Should never happen
|
||||
- # unless some wiseguy manually creates a directory at this
|
||||
- # path, but just in case, handle it.
|
||||
- try:
|
||||
- shutil.rmtree(self.lockfile)
|
||||
- except OSError as exc:
|
||||
- _add_error(failed, exc)
|
||||
- else:
|
||||
+
|
||||
+ try:
|
||||
+ os.remove(lock_file)
|
||||
+ except OSError as exc:
|
||||
+ if exc.errno == errno.ENOENT:
|
||||
+ # No lock file present
|
||||
+ pass
|
||||
+ elif exc.errno == errno.EISDIR:
|
||||
+ # Somehow this path is a directory. Should never happen
|
||||
+ # unless some wiseguy manually creates a directory at this
|
||||
+ # path, but just in case, handle it.
|
||||
+ try:
|
||||
+ shutil.rmtree(lock_file)
|
||||
+ except OSError as exc:
|
||||
_add_error(failed, exc)
|
||||
else:
|
||||
- msg = 'Removed lock for {0} remote \'{1}\''.format(
|
||||
+ _add_error(failed, exc)
|
||||
+ else:
|
||||
+ msg = 'Removed {0} lock for {1} remote \'{2}\''.format(
|
||||
+ lock_type,
|
||||
+ self.role,
|
||||
+ self.id
|
||||
+ )
|
||||
+ log.debug(msg)
|
||||
+ success.append(msg)
|
||||
+ return success, failed
|
||||
+
|
||||
+ def fetch(self):
|
||||
+ '''
|
||||
+ Fetch the repo. If the local copy was updated, return True. If the
|
||||
+ local copy was already up-to-date, return False.
|
||||
+
|
||||
+ This function requires that a _fetch() function be implemented in a
|
||||
+ sub-class.
|
||||
+ '''
|
||||
+ try:
|
||||
+ with self.gen_lock(lock_type='update'):
|
||||
+ log.debug('Fetching %s remote \'%s\'', self.role, self.id)
|
||||
+ # Run provider-specific fetch code
|
||||
+ return self._fetch()
|
||||
+ except GitLockError as exc:
|
||||
+ if exc.errno == errno.EEXIST:
|
||||
+ log.warning(
|
||||
+ 'Update lock file is present for %s remote \'%s\', '
|
||||
+ 'skipping. If this warning persists, it is possible that '
|
||||
+ 'the update process was interrupted, but the lock could '
|
||||
+ 'also have been manually set. Removing %s or running '
|
||||
+ '\'salt-run cache.clear_git_lock %s type=update\' will '
|
||||
+ 'allow updates to continue for this remote.',
|
||||
+ self.role,
|
||||
+ self.id,
|
||||
+ self._get_lock_file(lock_type='update'),
|
||||
self.role,
|
||||
- self.id
|
||||
)
|
||||
- log.debug(msg)
|
||||
- success.append(msg)
|
||||
- return success, failed
|
||||
+ return False
|
||||
+
|
||||
+ def _lock(self, lock_type='update', failhard=False):
|
||||
+ '''
|
||||
+ Place a lock file if (and only if) it does not already exist.
|
||||
+ '''
|
||||
+ try:
|
||||
+ fh_ = os.open(self._get_lock_file(lock_type),
|
||||
+ os.O_CREAT | os.O_EXCL | os.O_WRONLY)
|
||||
+ with os.fdopen(fh_, 'w'):
|
||||
+ # Write the lock file and close the filehandle
|
||||
+ pass
|
||||
+ except (OSError, IOError) as exc:
|
||||
+ if exc.errno == errno.EEXIST:
|
||||
+ if failhard:
|
||||
+ raise
|
||||
+ return None
|
||||
+ else:
|
||||
+ msg = 'Unable to set {0} lock for {1} ({2}): {3} '.format(
|
||||
+ lock_type,
|
||||
+ self.id,
|
||||
+ self._get_lock_file(lock_type),
|
||||
+ exc
|
||||
+ )
|
||||
+ log.error(msg)
|
||||
+ raise GitLockError(exc.errno, msg)
|
||||
+ msg = 'Set {0} lock for {1} remote \'{2}\''.format(
|
||||
+ lock_type,
|
||||
+ self.role,
|
||||
+ self.id
|
||||
+ )
|
||||
+ log.debug(msg)
|
||||
+ return msg
|
||||
|
||||
def lock(self):
|
||||
'''
|
||||
- Place an update.lk
|
||||
+ Place a lock file and report on the success/failure. This is an
|
||||
+ interface to be used by the fileserver runner, so it is hard-coded to
|
||||
+ perform an update lock. We aren't using the gen_lock()
|
||||
+ contextmanager here because the lock is meant to stay and not be
|
||||
+ automatically removed.
|
||||
'''
|
||||
success = []
|
||||
failed = []
|
||||
- if not os.path.exists(self.lockfile):
|
||||
- try:
|
||||
- with salt.utils.fopen(self.lockfile, 'w+') as fp_:
|
||||
- fp_.write('')
|
||||
- except (IOError, OSError) as exc:
|
||||
- msg = ('Unable to set update lock for {0} ({1}): {2} '
|
||||
- .format(self.url, self.lockfile, exc))
|
||||
- log.error(msg)
|
||||
- failed.append(msg)
|
||||
- else:
|
||||
- msg = 'Set lock for {0} remote \'{1}\''.format(
|
||||
- self.role,
|
||||
- self.id
|
||||
- )
|
||||
- log.debug(msg)
|
||||
- success.append(msg)
|
||||
+ try:
|
||||
+ result = self._lock(lock_type='update')
|
||||
+ except GitLockError as exc:
|
||||
+ failed.append(exc.strerror)
|
||||
+ else:
|
||||
+ if result is not None:
|
||||
+ success.append(result)
|
||||
return success, failed
|
||||
|
||||
+ @contextlib.contextmanager
|
||||
+ def gen_lock(self, lock_type='update'):
|
||||
+ '''
|
||||
+ Set and automatically clear a lock
|
||||
+ '''
|
||||
+ lock_set = False
|
||||
+ try:
|
||||
+ self._lock(lock_type=lock_type, failhard=True)
|
||||
+ lock_set = True
|
||||
+ yield
|
||||
+ except (OSError, IOError, GitLockError) as exc:
|
||||
+ raise GitLockError(exc.errno, exc.strerror)
|
||||
+ finally:
|
||||
+ if lock_set:
|
||||
+ self.clear_lock(lock_type=lock_type)
|
||||
+
|
||||
def init_remote(self):
|
||||
'''
|
||||
This function must be overridden in a sub-class
|
||||
@@ -432,13 +491,14 @@ class GitProvider(object):
|
||||
blacklist=self.env_blacklist
|
||||
)
|
||||
|
||||
- def envs(self):
|
||||
+ def _fetch(self):
|
||||
'''
|
||||
- This function must be overridden in a sub-class
|
||||
+ Provider-specific code for fetching, must be implemented in a
|
||||
+ sub-class.
|
||||
'''
|
||||
raise NotImplementedError()
|
||||
|
||||
- def fetch(self):
|
||||
+ def envs(self):
|
||||
'''
|
||||
This function must be overridden in a sub-class
|
||||
'''
|
||||
@@ -504,17 +564,67 @@ class GitPython(GitProvider):
|
||||
|
||||
def checkout(self):
|
||||
'''
|
||||
- Checkout the configured branch/tag
|
||||
+ Checkout the configured branch/tag. We catch an "Exception" class here
|
||||
+ instead of a specific exception class because the exceptions raised by
|
||||
+ GitPython when running these functions vary in different versions of
|
||||
+ GitPython.
|
||||
'''
|
||||
- for ref in ('origin/' + self.branch, self.branch):
|
||||
+ try:
|
||||
+ head_sha = self.repo.rev_parse('HEAD').hexsha
|
||||
+ except Exception:
|
||||
+ # Should only happen the first time we are checking out, since
|
||||
+ # we fetch first before ever checking anything out.
|
||||
+ head_sha = None
|
||||
+
|
||||
+ # 'origin/' + self.branch ==> matches a branch head
|
||||
+ # 'tags/' + self.branch + '@{commit}' ==> matches tag's commit
|
||||
+ for rev_parse_target, checkout_ref in (
|
||||
+ ('origin/' + self.branch, 'origin/' + self.branch),
|
||||
+ ('tags/' + self.branch + '@{commit}', 'tags/' + self.branch)):
|
||||
try:
|
||||
- self.repo.git.checkout(ref)
|
||||
+ target_sha = self.repo.rev_parse(rev_parse_target).hexsha
|
||||
+ except Exception:
|
||||
+ # ref does not exist
|
||||
+ continue
|
||||
+ else:
|
||||
+ if head_sha == target_sha:
|
||||
+ # No need to checkout, we're already up-to-date
|
||||
+ return self.check_root()
|
||||
+
|
||||
+ try:
|
||||
+ with self.gen_lock(lock_type='checkout'):
|
||||
+ self.repo.git.checkout(checkout_ref)
|
||||
+ log.debug(
|
||||
+ '%s remote \'%s\' has been checked out to %s',
|
||||
+ self.role,
|
||||
+ self.id,
|
||||
+ checkout_ref
|
||||
+ )
|
||||
+ except GitLockError as exc:
|
||||
+ if exc.errno == errno.EEXIST:
|
||||
+ # Re-raise with a different strerror containing a
|
||||
+ # more meaningful error message for the calling
|
||||
+ # function.
|
||||
+ raise GitLockError(
|
||||
+ exc.errno,
|
||||
+ 'Checkout lock exists for {0} remote \'{1}\''
|
||||
+ .format(self.role, self.id)
|
||||
+ )
|
||||
+ else:
|
||||
+ log.error(
|
||||
+ 'Error %d encountered obtaining checkout lock '
|
||||
+ 'for %s remote \'%s\'',
|
||||
+ exc.errno,
|
||||
+ self.role,
|
||||
+ self.id
|
||||
+ )
|
||||
+ return None
|
||||
except Exception:
|
||||
continue
|
||||
return self.check_root()
|
||||
log.error(
|
||||
- 'Failed to checkout {0} from {1} remote \'{2}\': remote ref does '
|
||||
- 'not exist'.format(self.branch, self.role, self.id)
|
||||
+ 'Failed to checkout %s from %s remote \'%s\': remote ref does '
|
||||
+ 'not exist', self.branch, self.role, self.id
|
||||
)
|
||||
return None
|
||||
|
||||
@@ -555,7 +665,7 @@ class GitPython(GitProvider):
|
||||
log.error(_INVALID_REPO.format(self.cachedir, self.url))
|
||||
return new
|
||||
|
||||
- self.lockfile = os.path.join(self.repo.working_dir, 'update.lk')
|
||||
+ self.gitdir = os.path.join(self.repo.working_dir, '.git')
|
||||
|
||||
if not self.repo.remotes:
|
||||
try:
|
||||
@@ -604,13 +714,11 @@ class GitPython(GitProvider):
|
||||
ref_paths = [x.path for x in self.repo.refs]
|
||||
return self._get_envs_from_ref_paths(ref_paths)
|
||||
|
||||
- def fetch(self):
|
||||
+ def _fetch(self):
|
||||
'''
|
||||
Fetch the repo. If the local copy was updated, return True. If the
|
||||
local copy was already up-to-date, return False.
|
||||
'''
|
||||
- if not self.check_lock():
|
||||
- return False
|
||||
origin = self.repo.remotes[0]
|
||||
try:
|
||||
fetch_results = origin.fetch()
|
||||
@@ -772,7 +880,61 @@ class Pygit2(GitProvider):
|
||||
remote_ref = 'refs/remotes/origin/' + self.branch
|
||||
tag_ref = 'refs/tags/' + self.branch
|
||||
|
||||
+ try:
|
||||
+ local_head = self.repo.lookup_reference('HEAD')
|
||||
+ except KeyError:
|
||||
+ log.warning(
|
||||
+ 'HEAD not present in %s remote \'%s\'', self.role, self.id
|
||||
+ )
|
||||
+ return None
|
||||
+
|
||||
+ try:
|
||||
+ head_sha = local_head.get_object().hex
|
||||
+ except AttributeError:
|
||||
+ # Shouldn't happen, but just in case a future pygit2 API change
|
||||
+ # breaks things, avoid a traceback and log an error.
|
||||
+ log.error(
|
||||
+ 'Unable to get SHA of HEAD for %s remote \'%s\'',
|
||||
+ self.role, self.id
|
||||
+ )
|
||||
+ return None
|
||||
+ except KeyError:
|
||||
+ head_sha = None
|
||||
+
|
||||
refs = self.repo.listall_references()
|
||||
+
|
||||
+ def _perform_checkout(checkout_ref, branch=True):
|
||||
+ '''
|
||||
+ DRY function for checking out either a branch or a tag
|
||||
+ '''
|
||||
+ try:
|
||||
+ with self.gen_lock(lock_type='checkout'):
|
||||
+ # Checkout the local branch corresponding to the
|
||||
+ # remote ref.
|
||||
+ self.repo.checkout(checkout_ref)
|
||||
+ if branch:
|
||||
+ self.repo.reset(oid, pygit2.GIT_RESET_HARD)
|
||||
+ return True
|
||||
+ except GitLockError as exc:
|
||||
+ if exc.errno == errno.EEXIST:
|
||||
+ # Re-raise with a different strerror containing a
|
||||
+ # more meaningful error message for the calling
|
||||
+ # function.
|
||||
+ raise GitLockError(
|
||||
+ exc.errno,
|
||||
+ 'Checkout lock exists for {0} remote \'{1}\''
|
||||
+ .format(self.role, self.id)
|
||||
+ )
|
||||
+ else:
|
||||
+ log.error(
|
||||
+ 'Error %d encountered obtaining checkout lock '
|
||||
+ 'for %s remote \'%s\'',
|
||||
+ exc.errno,
|
||||
+ self.role,
|
||||
+ self.id
|
||||
+ )
|
||||
+ return False
|
||||
+
|
||||
try:
|
||||
if remote_ref in refs:
|
||||
# Get commit id for the remote ref
|
||||
@@ -782,41 +944,99 @@ class Pygit2(GitProvider):
|
||||
# it at the commit id of the remote ref
|
||||
self.repo.create_reference(local_ref, oid)
|
||||
|
||||
- # Check HEAD ref existence (checking out local_ref when HEAD
|
||||
- # ref doesn't exist will raise an exception in pygit2 >= 0.21),
|
||||
- # and create the HEAD ref if it is missing.
|
||||
- head_ref = self.repo.lookup_reference('HEAD').target
|
||||
- if head_ref not in refs and head_ref != local_ref:
|
||||
- branch_name = head_ref.partition('refs/heads/')[-1]
|
||||
- if not branch_name:
|
||||
- # Shouldn't happen, but log an error if it does
|
||||
- log.error(
|
||||
- 'pygit2 was unable to resolve branch name from '
|
||||
- 'HEAD ref \'{0}\' in {1} remote \'{2}\''.format(
|
||||
- head_ref, self.role, self.id
|
||||
+ try:
|
||||
+ target_sha = \
|
||||
+ self.repo.lookup_reference(remote_ref).get_object().hex
|
||||
+ except KeyError:
|
||||
+ log.error(
|
||||
+ 'pygit2 was unable to get SHA for %s in %s remote '
|
||||
+ '\'%s\'', local_ref, self.role, self.id
|
||||
+ )
|
||||
+ return None
|
||||
+
|
||||
+ # Only perform a checkout if HEAD and target are not pointing
|
||||
+ # at the same SHA1.
|
||||
+ if head_sha != target_sha:
|
||||
+ # Check existence of the ref in refs/heads/ which
|
||||
+ # corresponds to the local HEAD. Checking out local_ref
|
||||
+ # below when the local ref for HEAD is missing will raise an
|
||||
+ # exception in pygit2 >= 0.21. If this ref is not present,
|
||||
+ # create it. The "head_ref != local_ref" check ensures we
|
||||
+ # don't try to add this ref if it is not necessary, as it
|
||||
+ # would have been added above already. head_ref would be
|
||||
+ # the same as local_ref if the branch name was changed but
|
||||
+ # the cachedir was not (for example if a "name" parameter
|
||||
+ # was used in a git_pillar remote, or if we are using
|
||||
+ # winrepo which takes the basename of the repo as the
|
||||
+ # cachedir).
|
||||
+ head_ref = local_head.target
|
||||
+ # If head_ref is not a string, it will point to a
|
||||
+ # pygit2.Oid object and we are in detached HEAD mode.
|
||||
+ # Therefore, there is no need to add a local reference. If
|
||||
+ # head_ref == local_ref, then the local reference for HEAD
|
||||
+ # in refs/heads/ already exists and again, no need to add.
|
||||
+ if isinstance(head_ref, six.string_types) \
|
||||
+ and head_ref not in refs and head_ref != local_ref:
|
||||
+ branch_name = head_ref.partition('refs/heads/')[-1]
|
||||
+ if not branch_name:
|
||||
+ # Shouldn't happen, but log an error if it does
|
||||
+ log.error(
|
||||
+ 'pygit2 was unable to resolve branch name from '
|
||||
+ 'HEAD ref \'{0}\' in {1} remote \'{2}\''.format(
|
||||
+ head_ref, self.role, self.id
|
||||
+ )
|
||||
)
|
||||
+ return None
|
||||
+ remote_head = 'refs/remotes/origin/' + branch_name
|
||||
+ if remote_head not in refs:
|
||||
+ log.error(
|
||||
+ 'Unable to find remote ref \'{0}\' in {1} remote '
|
||||
+ '\'{2}\''.format(head_ref, self.role, self.id)
|
||||
+ )
|
||||
+ return None
|
||||
+ self.repo.create_reference(
|
||||
+ head_ref,
|
||||
+ self.repo.lookup_reference(remote_head).target
|
||||
)
|
||||
+
|
||||
+ if not _perform_checkout(local_ref, branch=True):
|
||||
return None
|
||||
- remote_head = 'refs/remotes/origin/' + branch_name
|
||||
- if remote_head not in refs:
|
||||
- log.error(
|
||||
- 'Unable to find remote ref \'{0}\' in {1} remote '
|
||||
- '\'{2}\''.format(head_ref, self.role, self.id)
|
||||
- )
|
||||
- return None
|
||||
- self.repo.create_reference(
|
||||
- head_ref,
|
||||
- self.repo.lookup_reference(remote_head).target
|
||||
- )
|
||||
|
||||
- # Point HEAD at the local ref
|
||||
- self.repo.checkout(local_ref)
|
||||
- # Reset HEAD to the commit id of the remote ref
|
||||
- self.repo.reset(oid, pygit2.GIT_RESET_HARD)
|
||||
+ # Return the relative root, if present
return self.check_root()
+
elif tag_ref in refs:
- self.repo.checkout(tag_ref)
- return self.check_root()
+ tag_obj = self.repo.revparse_single(tag_ref)
+ if not isinstance(tag_obj, pygit2.Tag):
+ log.error(
+ '%s does not correspond to pygit2.Tag object',
+ tag_ref
+ )
+ else:
+ try:
+ # If no AttributeError raised, this is an annotated tag
+ tag_sha = tag_obj.target.hex
+ except AttributeError:
+ try:
+ tag_sha = tag_obj.hex
+ except AttributeError:
+ # Shouldn't happen, but could if a future pygit2
+ # API change breaks things.
+ log.error(
+ 'Unable to resolve %s from %s remote \'%s\' '
+ 'to either an annotated or non-annotated tag',
+ tag_ref, self.role, self.id
+ )
+ return None
+
+ if head_sha != target_sha:
+ if not _perform_checkout(local_ref, branch=False):
+ return None
+
+ # Return the relative root, if present
+ return self.check_root()
+ except GitLockError:
+ raise
except Exception as exc:
log.error(
'Failed to checkout {0} from {1} remote \'{2}\': {3}'.format(
@@ -921,7 +1141,7 @@ class Pygit2(GitProvider):
log.error(_INVALID_REPO.format(self.cachedir, self.url))
return new

- self.lockfile = os.path.join(self.repo.workdir, 'update.lk')
+ self.gitdir = os.path.join(self.repo.workdir, '.git')

if not self.repo.remotes:
try:
@@ -997,13 +1217,11 @@ class Pygit2(GitProvider):
ref_paths = self.repo.listall_references()
return self._get_envs_from_ref_paths(ref_paths)

- def fetch(self):
+ def _fetch(self):
'''
Fetch the repo. If the local copy was updated, return True. If the
local copy was already up-to-date, return False.
'''
- if not self.check_lock():
- return False
origin = self.repo.remotes[0]
refs_pre = self.repo.listall_references()
fetch_kwargs = {}
@@ -1345,13 +1563,11 @@ class Dulwich(GitProvider): # pylint: disable=abstract-method
ref_paths = self.get_env_refs(self.repo.get_refs())
return self._get_envs_from_ref_paths(ref_paths)

- def fetch(self):
+ def _fetch(self):
'''
Fetch the repo. If the local copy was updated, return True. If the
local copy was already up-to-date, return False.
'''
- if not self.check_lock():
- return False
# origin is just a url here, there is no origin object
origin = self.url
client, path = \
@@ -1613,6 +1829,23 @@ class Dulwich(GitProvider): # pylint: disable=abstract-method
new = False
if not os.listdir(self.cachedir):
# Repo cachedir is empty, initialize a new repo there
+ self.repo = dulwich.repo.Repo.init(self.cachedir)
+ new = True
+ else:
+ # Repo cachedir exists, try to attach
+ try:
+ self.repo = dulwich.repo.Repo(self.cachedir)
+ except dulwich.repo.NotGitRepository:
+ log.error(_INVALID_REPO.format(self.cachedir, self.url))
+ return new
+
+ self.gitdir = os.path.join(self.repo.path, '.git')
+
+ # Read in config file and look for the remote
+ try:
+ conf = self.get_conf()
+ conf.get(('remote', 'origin'), 'url')
+ except KeyError:
try:
self.repo = dulwich.repo.Repo.init(self.cachedir)
new = True
@@ -1827,9 +2060,9 @@ class GitBase(object):
)
return errors

- def clear_lock(self, remote=None):
+ def clear_lock(self, remote=None, lock_type='update'):
'''
- Clear update.lk
+ Clear update.lk for all remotes
'''
cleared = []
errors = []
@@ -1844,7 +2077,7 @@ class GitBase(object):
# remote was non-string, try again
if not fnmatch.fnmatch(repo.url, six.text_type(remote)):
continue
- success, failed = repo.clear_lock()
+ success, failed = repo.clear_lock(lock_type=lock_type)
cleared.extend(success)
errors.extend(failed)
return cleared, errors
@@ -1870,8 +2103,6 @@ class GitBase(object):
'\'{2}\''.format(exc, self.role, repo.id),
exc_info_on_loglevel=logging.DEBUG
)
- finally:
- repo.clear_lock()
return changed

def lock(self, remote=None):
@@ -1936,7 +2167,7 @@ class GitBase(object):
self.hash_cachedir,
self.find_file
)
- except (IOError, OSError):
+ except (OSError, IOError):
# Hash file won't exist if no files have yet been served up
pass

@@ -2166,6 +2397,38 @@ class GitBase(object):
)
)

+ def do_checkout(self, repo):
+ '''
+ Common code for git_pillar/winrepo to handle locking and checking out
+ of a repo.
+ '''
+ time_start = time.time()
+ while time.time() - time_start <= 5:
+ try:
+ return repo.checkout()
+ except GitLockError as exc:
+ if exc.errno == errno.EEXIST:
+ time.sleep(0.1)
+ continue
+ else:
+ log.error(
+ 'Error %d encountered while obtaining checkout '
+ 'lock for %s remote \'%s\': %s',
+ exc.errno,
+ repo.role,
+ repo.id,
+ exc
+ )
+ break
+ else:
+ log.error(
+ 'Timed out waiting for checkout lock to be released for '
+ '%s remote \'%s\'. If this error persists, run \'salt-run '
+ 'cache.clear_git_lock %s type=checkout\' to clear it.',
+ self.role, repo.id, self.role
+ )
+ return None
+

class GitFS(GitBase):
'''
@@ -2460,7 +2723,7 @@ class GitPillar(GitBase):
'''
self.pillar_dirs = {}
for repo in self.remotes:
- cachedir = repo.checkout()
+ cachedir = self.do_checkout(repo)
if cachedir is not None:
# Figure out which environment this remote should be assigned
if repo.env:
@@ -2502,6 +2765,6 @@ class WinRepo(GitBase):
'''
self.winrepo_dirs = {}
for repo in self.remotes:
- cachedir = repo.checkout()
+ cachedir = self.do_checkout(repo)
if cachedir is not None:
- self.winrepo_dirs[repo.url] = cachedir
+ self.winrepo_dirs[repo.id] = cachedir
--
2.7.2

@@ -1,67 +0,0 @@
From bb8048d4bd842746b09dbafe3a610e0d7c3e1bc2 Mon Sep 17 00:00:00 2001
From: Bo Maryniuk <bo@suse.de>
Date: Tue, 8 Mar 2016 16:00:26 +0100
Subject: [PATCH 35/35] Fix the always-false behavior on checking state

- Fix PEP8 continuation
- Keep first level away from lists.
- Adjust test
---
salt/utils/__init__.py | 15 +++++----------
tests/unit/utils/utils_test.py | 2 +-
2 files changed, 6 insertions(+), 11 deletions(-)

diff --git a/salt/utils/__init__.py b/salt/utils/__init__.py
index 5ee73b168349..8c8309e99f95 100644
--- a/salt/utils/__init__.py
+++ b/salt/utils/__init__.py
@@ -1741,7 +1741,7 @@ def gen_state_tag(low):
return '{0[state]}_|-{0[__id__]}_|-{0[name]}_|-{0[fun]}'.format(low)


-def check_state_result(running):
+def check_state_result(running, recurse=False):
'''
Check the total return value of the run and determine if the running
dict has any issues
@@ -1754,20 +1754,15 @@ def check_state_result(running):

ret = True
for state_result in six.itervalues(running):
- if not isinstance(state_result, dict):
- # return false when hosts return a list instead of a dict
+ if not recurse and not isinstance(state_result, dict):
ret = False
- if ret:
+ if ret and isinstance(state_result, dict):
result = state_result.get('result', _empty)
if result is False:
ret = False
# only override return value if we are not already failed
- elif (
- result is _empty
- and isinstance(state_result, dict)
- and ret
- ):
- ret = check_state_result(state_result)
+ elif result is _empty and isinstance(state_result, dict) and ret:
+ ret = check_state_result(state_result, recurse=True)
# return as soon as we got a failure
if not ret:
break
diff --git a/tests/unit/utils/utils_test.py b/tests/unit/utils/utils_test.py
index 611bfce0ed4b..261af69b59fc 100644
--- a/tests/unit/utils/utils_test.py
+++ b/tests/unit/utils/utils_test.py
@@ -406,7 +406,7 @@ class UtilsTestCase(TestCase):
('test_state0', {'result': True}),
('test_state', {'result': True}),
])),
- ('host2', [])
+ ('host2', OrderedDict([]))
]))
])
}
--
2.7.2

@@ -1,3 +0,0 @@
version https://git-lfs.github.com/spec/v1
oid sha256:61a2f9cff77dd11fc6bf7630d82d1955238818dfa7eedb53e6bf3edbbc9d6029
size 6877927
salt-2015.8.8.tar.gz
@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:b2ecce7bf562cfcd6586d66ade278f268bb89023f0fa0accaa55f90b8a668ef5
size 6982904
salt.changes
@@ -1,3 +1,133 @@
-------------------------------------------------------------------
Wed Apr 20 09:27:31 UTC 2016 - bmaryniuk@suse.com

- Prevent crash if pygit2 package requests recompilation.
Add:
* 0013-Prevent-crash-if-pygit2-package-is-requesting-re-com.patch
- Align OS grains from older SLES with the current one (bsc#975757)
Add:
* 0014-align-OS-grains-from-older-SLES-with-current-one-326.patch

-------------------------------------------------------------------
Sat Apr 16 15:15:07 UTC 2016 - mc@suse.com

- remove patches which produce duplicate functions:
Remove:
* 0004-implement-version_cmp-for-zypper.patch
* 0005-pylint-changes.patch
* 0006-Check-if-rpm-python-can-be-imported.patch
- remove patches which add and revert the same file
Remove:
* 0007-Initial-Zypper-Unit-Tests-and-bugfixes.patch
* 0009-Bugfix-on-SLE11-series-base-product-reported-as-addi.patch
- rename patches:
0008-do-not-generate-a-date-in-a-comment-to-prevent-rebui.patch to
0004-do-not-generate-a-date-in-a-comment-to-prevent-rebui.patch
0010-Use-SHA256-hash-type-by-default.patch to
0005-Use-SHA256-hash-type-by-default.patch
0011-Update-to-2015.8.8.2.patch to
0006-Update-to-2015.8.8.2.patch
0012-Force-sort-the-RPM-output-to-ensure-latest-version-o.patch to
0007-Force-sort-the-RPM-output-to-ensure-latest-version-o.patch
0013-Cleaner-deprecation-process-with-decorators.patch to
0008-Cleaner-deprecation-process-with-decorators.patch
- fix sorting by latest package
Add:
* 0009-fix-sorting-by-latest-version-when-called-with-an-at.patch
- Prevent metadata download when getting installed products
Add:
* 0010-Prevent-metadata-download-when-getting-installed-pro.patch
- Check if EOL is available in a particular product (bsc#975093)
Add:
* 0011-Check-if-EOL-is-available-in-a-particular-product-bs.patch
- Bugfix: salt-key crashes if tries to generate keys
to the directory w/o write access (bsc#969320)
Add:
* 0012-Bugfix-salt-key-crashes-if-tries-to-generate-keys-to.patch

-------------------------------------------------------------------
Thu Apr 7 10:25:36 UTC 2016 - bmaryniuk@suse.com

- Deprecation process using decorators and re-implementation
of status.update function.
Add:
* 0013-Cleaner-deprecation-process-with-decorators.patch

-------------------------------------------------------------------
Fri Apr 1 19:19:10 UTC 2016 - aboe76@gmail.com

- Reverted the fake 2015.8.8.2 patch, with the right one,
- this patch only contains:
- https://github.com/saltstack/salt/pull/32135
- https://github.com/saltstack/salt/pull/32023
- https://github.com/saltstack/salt/pull/32117

-------------------------------------------------------------------
Fri Apr 1 12:16:10 UTC 2016 - bmaryniuk@suse.com

- Ensure that in case of multi-packages installed on the system,
the latest is reported by pkg.info_installed (bsc#972490)
Add:
* 0012-Force-sort-the-RPM-output-to-ensure-latest-version-o.patch

-------------------------------------------------------------------
Wed Mar 30 22:33:19 UTC 2016 - tampakrap@opensuse.org

- Update to the fake 2015.8.8.2 release
upstream released a bunch of fixes on top of 2015.8.8, without creating a new
tag and proper release. This commit includes all the changes between tag
v2015.8.8 and commit ID 596444e2b447b7378dbcdfeb9fc9610b90057745 which
introduces the fake 2015.8.8.2 release.
see https://docs.saltstack.com/en/latest/topics/releases/2015.8.8.html#salt-2015-8-8-2

-------------------------------------------------------------------
Thu Mar 24 17:34:03 UTC 2016 - tampakrap@opensuse.org

- Update to 2015.8.8
see https://docs.saltstack.com/en/latest/topics/releases/2015.8.8.html
Patches renamed:
* 0004-implement-version_cmp-for-zypper.patch
* 0005-pylint-changes.patch
* 0006-Check-if-rpm-python-can-be-imported.patch
* 0007-Initial-Zypper-Unit-Tests-and-bugfixes.patch
* 0008-do-not-generate-a-date-in-a-comment-to-prevent-rebui.patch
* 0009-Bugfix-on-SLE11-series-base-product-reported-as-addi.patch
* 0010-Use-SHA256-hash-type-by-default.patch
Patches removed:
* 0004-Fix-pkg.latest-prevent-crash-on-multiple-package-ins.patch
* 0005-Fix-package-status-filtering-on-latest-version-and-i.patch
* 0006-add_key-reject_key-do-not-crash-w-Permission-denied-.patch
* 0007-Force-kill-websocket-s-child-processes-faster-than-d.patch
* 0008-Fix-types-in-the-output-data-and-return-just-a-list-.patch
* 0009-The-functions-in-the-state-module-that-return-a-retc.patch
* 0010-add-handling-for-OEM-products.patch
* 0011-improve-doc-for-list_pkgs.patch
* 0012-implement-version_cmp-for-zypper.patch
* 0013-pylint-changes.patch
* 0014-Check-if-rpm-python-can-be-imported.patch
* 0015-call-zypper-with-option-non-interactive-everywhere.patch
* 0016-write-a-zypper-command-builder-function.patch
* 0017-Fix-crash-with-scheduler-and-runners-31106.patch
* 0018-unify-behavior-of-refresh.patch
* 0019-add-refresh-option-to-more-functions.patch
* 0020-simplify-checking-the-refresh-paramater.patch
* 0021-do-not-change-kwargs-in-refresh-while-checking-a-val.patch
* 0022-fix-argument-handling-for-pkg.download.patch
* 0023-Initial-Zypper-Unit-Tests-and-bugfixes.patch
* 0024-proper-checking-if-zypper-exit-codes-and-handling-of.patch
* 0025-adapt-tests-to-new-zypper_check_result-output.patch
* 0026-do-not-generate-a-date-in-a-comment-to-prevent-rebui.patch
* 0027-make-suse-check-consistent-with-rh_service.patch
* 0028-fix-numerical-check-of-osrelease.patch
* 0029-Make-use-of-checksum-configurable-defaults-to-MD5-SH.patch
* 0030-Bugfix-on-SLE11-series-base-product-reported-as-addi.patch
* 0031-Only-use-LONGSIZE-in-rpm.info-if-available.-Otherwis.patch
* 0032-Add-error-check-when-retcode-is-0-but-stderr-is-pres.patch
* 0033-fixing-init-system-dectection-on-sles-11-refs-31617.patch
* 0034-Fix-git_pillar-race-condition.patch
* 0035-Fix-the-always-false-behavior-on-checking-state.patch
* 0036-Use-SHA256-hash-type-by-default.patch

-------------------------------------------------------------------
Thu Mar 17 12:09:14 UTC 2016 - bmaryniuk@suse.com

salt.spec
@@ -36,7 +36,7 @@
%bcond_without docs

Name: salt
Version: 2015.8.7
Version: 2015.8.8
Release: 0
Summary: A parallel remote execution system
License: Apache-2.0
@@ -53,64 +53,28 @@ Patch1: 0001-tserong-suse.com-We-don-t-have-python-systemd-so-not.patch
Patch2: 0002-Run-salt-master-as-dedicated-salt-user.patch
# PATCH-FIX-OPENSUSE https://github.com/saltstack/salt/pull/30424
Patch3: 0003-Check-if-byte-strings-are-properly-encoded-in-UTF-8.patch
# PATCH-FIX-OPENSUSE https://github.com/saltstack/salt/pull/30611
Patch4: 0004-Fix-pkg.latest-prevent-crash-on-multiple-package-ins.patch
# PATCH-FIX-OPENSUSE https://github.com/saltstack/salt/pull/30663
Patch5: 0005-Fix-package-status-filtering-on-latest-version-and-i.patch
# PATCH-FIX-UPSTREAM https://github.com/saltstack/salt/pull/30998
Patch6: 0006-add_key-reject_key-do-not-crash-w-Permission-denied-.patch
# PATCH-FIX-UPSTREAM https://github.com/saltstack/salt/pull/31125
Patch7: 0007-Force-kill-websocket-s-child-processes-faster-than-d.patch
# PATCH-FIX-UPSTREAM https://github.com/saltstack/salt/pull/31134
Patch8: 0008-Fix-types-in-the-output-data-and-return-just-a-list-.patch
# PATCH-FIX-UPSTREAM
Patch9: 0009-The-functions-in-the-state-module-that-return-a-retc.patch
# PATCH-FIX-UPSTREAM https://github.com/saltstack/salt/pull/31237
Patch10: 0010-add-handling-for-OEM-products.patch
# PATCH-FIX-UPSTREAM https://github.com/saltstack/salt/pull/31234
Patch11: 0011-improve-doc-for-list_pkgs.patch
# PATCH-FIX-UPSTREAM https://github.com/saltstack/salt/pull/31233
Patch12: 0012-implement-version_cmp-for-zypper.patch
Patch13: 0013-pylint-changes.patch
Patch14: 0014-Check-if-rpm-python-can-be-imported.patch
# PATCH-FIX-UPSTREAM https://github.com/saltstack/salt/pull/31305
Patch15: 0015-call-zypper-with-option-non-interactive-everywhere.patch
Patch16: 0016-write-a-zypper-command-builder-function.patch
# PATCH-FIX-UPSTREAM https://github.com/saltstack/salt/pull/31189
Patch17: 0017-Fix-crash-with-scheduler-and-runners-31106.patch
# PATCH-FIX-UPSTREAM https://github.com/saltstack/salt/pull/31378
Patch18: 0018-unify-behavior-of-refresh.patch
Patch19: 0019-add-refresh-option-to-more-functions.patch
Patch20: 0020-simplify-checking-the-refresh-paramater.patch
Patch21: 0021-do-not-change-kwargs-in-refresh-while-checking-a-val.patch
# PATCH-FIX-UPSTREAM https://github.com/saltstack/salt/pull/31429
Patch22: 0022-fix-argument-handling-for-pkg.download.patch
# PATCH-FIX-UPSTREAM https://github.com/saltstack/salt/pull/31479
# https://github.com/saltstack/salt/pull/31488
Patch23: 0023-Initial-Zypper-Unit-Tests-and-bugfixes.patch
# PATCH-FIX-UPSTREAM https://github.com/saltstack/salt/pull/31508
Patch24: 0024-proper-checking-if-zypper-exit-codes-and-handling-of.patch
Patch25: 0025-adapt-tests-to-new-zypper_check_result-output.patch
# PATCH-FIX-OPENSUSE prevent rebuilds in OBS
Patch26: 0026-do-not-generate-a-date-in-a-comment-to-prevent-rebui.patch
# PATCH-FIX-UPSTREAM https://github.com/saltstack/salt/pull/31629
Patch27: 0027-make-suse-check-consistent-with-rh_service.patch
Patch28: 0028-fix-numerical-check-of-osrelease.patch
# PATCH-FIX-UPSTREAM https://github.com/saltstack/salt/pull/31162
Patch29: 0029-Make-use-of-checksum-configurable-defaults-to-MD5-SH.patch
# PATCH-FIX-UPSTREAM https://github.com/saltstack/salt/pull/31786
Patch30: 0030-Bugfix-on-SLE11-series-base-product-reported-as-addi.patch
# PATCH-FIX-UPSTREAM https://github.com/saltstack/salt/pull/31445
Patch31: 0031-Only-use-LONGSIZE-in-rpm.info-if-available.-Otherwis.patch
Patch32: 0032-Add-error-check-when-retcode-is-0-but-stderr-is-pres.patch
# PATCH-FIX-UPSTREAM https://github.com/saltstack/salt/pull/31793
Patch33: 0033-fixing-init-system-dectection-on-sles-11-refs-31617.patch
# PATCH-FIX-UPSTREAM https://github.com/saltstack/salt/pull/31836
Patch34: 0034-Fix-git_pillar-race-condition.patch
# PATCH-FIX-UPSTREAM https://github.com/saltstack/salt/pull/31745
Patch35: 0035-Fix-the-always-false-behavior-on-checking-state.patch
Patch4: 0004-do-not-generate-a-date-in-a-comment-to-prevent-rebui.patch
# PATCH-FIX-OPENSUSE - Upstream default hash type is set to MD5, while we require SHA256 (bsc#955373)
Patch36: 0036-Use-SHA256-hash-type-by-default.patch
Patch5: 0005-Use-SHA256-hash-type-by-default.patch
# PATCH-FIX-UPSTREAM https://docs.saltstack.com/en/latest/topics/releases/2015.8.8.html#salt-2015-8-8-2
Patch6: 0006-Update-to-2015.8.8.2.patch
# PATCH-FIX-UPSTREAM https://github.com/saltstack/salt/pull/32243
Patch7: 0007-Force-sort-the-RPM-output-to-ensure-latest-version-o.patch
# PATCH-FIX-UPSTREAM https://github.com/saltstack/salt/pull/32068
Patch8: 0008-Cleaner-deprecation-process-with-decorators.patch
# PATCH-FIX_UPSTREAM https://github.com/saltstack/salt/pull/32323
Patch9: 0009-fix-sorting-by-latest-version-when-called-with-an-at.patch
# PATCH-FIX_UPSTREAM https://github.com/saltstack/salt/pull/32353
Patch10: 0010-Prevent-metadata-download-when-getting-installed-pro.patch
# PATCH-FIX_UPSTREAM https://github.com/saltstack/salt/pull/32505
Patch11: 0011-Check-if-EOL-is-available-in-a-particular-product-bs.patch
# PATCH-FIX_UPSTREAM https://github.com/saltstack/salt/pull/32436
Patch12: 0012-Bugfix-salt-key-crashes-if-tries-to-generate-keys-to.patch
# PATCH-FIX_UPSTREAM https://github.com/saltstack/salt/pull/32652
Patch13: 0013-Prevent-crash-if-pygit2-package-is-requesting-re-com.patch
# PATCH-FIX_UPSTREAM https://github.com/saltstack/salt/pull/32649
Patch14: 0014-align-OS-grains-from-older-SLES-with-current-one-326.patch

BuildRoot: %{_tmppath}/%{name}-%{version}-build
BuildRequires: logrotate
@@ -467,28 +431,6 @@ cp %{S:1} .
%patch12 -p1
%patch13 -p1
%patch14 -p1
%patch15 -p1
%patch16 -p1
%patch17 -p1
%patch18 -p1
%patch19 -p1
%patch20 -p1
%patch21 -p1
%patch22 -p1
%patch23 -p1
%patch24 -p1
%patch25 -p1
%patch26 -p1
%patch27 -p1
%patch28 -p1
%patch29 -p1
%patch30 -p1
%patch31 -p1
%patch32 -p1
%patch33 -p1
%patch34 -p1
%patch35 -p1
%patch36 -p1

%build
python setup.py --salt-transport=both build
