Work around the glibc.i686 package config: it has a topadd block in its
_link, which apparently causes the disturl to be inconsistent with glibc
even for the same srcmd5. The lastsuccess state will also become
outdated when a revision reuses the same srcmd5.
Note: this is the proper workaround, superseding commit
5daf2bb6509abaf966268c9d6b13aa4e330a8af3 ('Workaround glibc.i686
package config').
Add glibc.i686 to the specs if the request is for glibc, so that
repo-checker does not ignore glibc.i686 binaries in case some binaries
need them. Also ignore build_excluded if the package is i686-only, and
download its i586 packages instead.
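A minimal sketch of that behaviour in Python, assuming hypothetical
helper names (specs_for_request, wanted_architectures) rather than
repo-checker's actual functions:

    def specs_for_request(package, specs):
        """Return the spec names repo-checker should consider for a request."""
        specs = list(specs)
        # glibc.i686 is built from its own spec; add it explicitly so its
        # binaries are not ignored when other binaries need them.
        if package == 'glibc' and 'glibc.i686' not in specs:
            specs.append('glibc.i686')
        return specs

    def wanted_architectures(package, build_excluded):
        """Pick the architectures whose binaries should be downloaded."""
        if package.endswith('.i686'):
            # i686-only package: ignore build_excluded and fetch the i586
            # binaries, which is where they are actually published.
            return ['i586']
        return [] if build_excluded else ['x86_64', 'i586']

    print(specs_for_request('glibc', ['glibc']))                    # ['glibc', 'glibc.i686']
    print(wanted_architectures('glibc.i686', build_excluded=True))  # ['i586']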
The additional message is clearer than only reporting a bad disturl
with no matching downloads available. Note that, before check_disturl()
is executed, the revision is also checked in check_specs(), but that
check is mainly for the second spec.
Also make the DECLINED message consistent.
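A rough sketch of the revision check and the more explicit message,
assuming the usual OBS DISTURL layout
(obs://<api>/<project>/<repo>/<md5>-<package>); this is illustrative
only and not the project's actual check_disturl():

    import re

    DISTURL_RE = re.compile(
        r'^obs://[^/]+/[^/]+/[^/]+/(?P<md5>[0-9a-f]{32})-(?P<package>.+)$')

    def check_disturl(disturl, expected_srcmd5):
        """Compare the revision embedded in the DISTURL with the request's srcmd5."""
        match = DISTURL_RE.match(disturl)
        if not match:
            return False, 'cannot parse DISTURL: %s' % disturl
        if match.group('md5') != expected_srcmd5:
            # Tell the reviewer what actually mismatched instead of a generic
            # "bad disturl / no matching downloads available" message.
            return False, ('binaries were built from revision %s, but the '
                           'request is at revision %s; waiting for a rebuild'
                           % (match.group('md5'), expected_srcmd5))
        return True, 'ok'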
Some users are confused because the old message "package never built"
is often not true, especially in longer-lasting stagings. In those
cases the binaries might have been built but have since been replaced
by newer ones - the effect is the same: repo-checker cannot retrieve
them in order to perform the full checks.
Add only the common Factory build repository to the candidate repos,
and re-introduce the build_excluded attribute. If a request has no good
repository, i.e. it builds only against ARM, PPC, images, i586 or
whatever, leave it there for a human to check.
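A hedged sketch of that repository selection, assuming repositories
are handed in as (name, architectures) pairs; the repository name
'standard' and the helper names are placeholders, not the tool's real
data model:

    def candidate_repos(repos, common_repo='standard'):
        """Keep only the common Factory build repository that builds x86_64."""
        return [(name, archs) for name, archs in repos
                if name == common_repo and 'x86_64' in archs]

    def needs_manual_review(repos):
        # Requests that build only against ARM, PPC, images or i586 end up
        # with no candidate repository; leave them for a human to check.
        return not candidate_repos(repos)

    print(needs_manual_review([('images', ['x86_64']),
                               ('standard', ['i586'])]))  # True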
Verified with SR#359327, SR#360767, SR#361364 and SR#361243.
- Add withhistory to recover the creator
- Remove the creator user check for now; I need to figure out how to
  detect who the maintainer is in the tgt_project
- Check that the package is not needed in the tgt_project
  (usually Factory)
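A hedged sketch of recovering the creator via withhistory through the
plain OBS request API; the endpoint layout and the history XML
attributes are assumptions that may vary between OBS versions, and real
code needs authentication:

    import urllib.request
    import xml.etree.ElementTree as ET

    APIURL = 'https://api.opensuse.org'

    def request_creator(request_id, opener=urllib.request.urlopen):
        """Fetch a request with its history and return who created it."""
        url = '%s/request/%s?withhistory=1' % (APIURL, request_id)
        with opener(url) as response:  # authentication is omitted here
            root = ET.fromstring(response.read())
        history = root.findall('history')
        # The oldest history entry should record the request's creator.
        return history[0].get('who') if history else None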
The request can be partially cached. When this is the case, we still
try to download the packages (the download is avoided at the last
moment by the original cache system), but the prepopulated data in
'download' and 'missings' in the request is aggregated with the new
data.
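A minimal sketch of that aggregation, assuming the request object
simply carries 'downloads' and 'missings' attributes; the real
structures in repo-checker may differ:

    from collections import defaultdict

    class Request(object):
        def __init__(self):
            self.downloads = defaultdict(list)  # assumed: repo -> binary paths
            self.missings = set()               # assumed: packages still missing

    def aggregate(request, new_downloads, new_missings):
        """Merge freshly fetched data into a partially cached request."""
        for repo, paths in new_downloads.items():
            # Prepopulated (cached) entries stay; only unseen paths are added.
            request.downloads[repo].extend(
                p for p in paths if p not in request.downloads[repo])
        request.missings |= set(new_missings)
        return request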