RHV 4.3.x & RHEL 7.7 Update Dependency Hell

RHEL 7.7 went GA, and dependency hell descended and broke my RHV Manager and RHV Host. I needed to fix this.

Red Hat Enterprise Linux 7.6 going 7.7
Red Hat Virtualization 4.3.1

Error Message:
--> Processing Dependency: rsyslog = 8.24.0-34.el7 for package: rsyslog-mmnormalize-8.24.0-34.el7.x86_64
--> Finished Dependency Resolution
Error: Package: rsyslog-elasticsearch-8.24.0-34.el7.x86_64 (rhel-7-server-rhv-4.3-manager-rpms)
           Requires: rsyslog = 8.24.0-34.el7
           Removing: rsyslog-8.24.0-34.el7.x86_64 (installed)
               rsyslog = 8.24.0-34.el7
           Updated By: rsyslog-8.24.0-38.el7.x86_64 (rhel-7-server-rpms)
               rsyslog = 8.24.0-38.el7
           Available: rsyslog-7.4.7-6.el7.x86_64 (rhel-7-server-rpms)
               rsyslog = 7.4.7-6.el7

Updates on the RHV-Manager

yum update --exclude=rsyslog*

Updating the RHV Host - you can put the same exclusion in /etc/yum.conf so it persists across updates.
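A minimal sketch of that yum.conf entry - the exclude pattern is the same one used for the manual update above, added under the existing [main] section:

```
# in the existing [main] section of /etc/yum.conf
[main]
exclude=rsyslog*
```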


If you have yet to allow the host to update from 7.6 to 7.7, you can prevent it with this:
subscription-manager release --set=7.6
… thus pinning it to that release until the dependencies are resolved.

But things are not always as clear-cut as the examples you are given to work with - especially if, like me, you assumed the issue was local rather than with the release. In which case, read on.


There is a KCS article with a workaround for use until the yum dependencies are fixed; however, it didn't fix my issues. In its defence, that is mainly because I had assumed it was me who had made the mess, not the software provider. In making this sweeping assumption, I dug myself into a series of joyous holes, starting with DHCP running on a VM that was down and could not be started, and a Satellite that only kept the most current versions of RHEL 7 - joy that pretty much uprooted the goalposts and headed off over the horizon with them.

Fixing the RHEL host that acts as the RHV hypervisor was painless enough - reinstall it, and pin it to 7.6:

subscription-manager release --set=7.6

You can run subscription-manager release --unset once the issue has been resolved.
For updating the RHV-M, you can simply tell it "I see no ships", and it is more than happy with that:

yum update --exclude=rsyslog*

However, let's assume that you have managed - through 'learning experiences' (that will not be repeated) - to embrace the deep joy of having to reinstall again; suddenly the above doesn't really cut it.

So here are the steps - give or take - for the challenged who find themselves in a hole: looking at the console in the evening with a glass of wine, pondering what they did so wrong to deserve such a steaming pile to work with.

Let's start by assuming that you have attached the system to the correct SKU / Pool ID, and that you have the toys you are looking for. I have not removed everything - simply added to the defaults after registration:

subscription-manager repos --enable=rhel-7-server-rpms \
--enable=rhel-7-server-supplementary-rpms \
--enable=rhel-7-server-rhv-4.3-manager-rpms \
--enable=rhel-7-server-rhv-4-manager-tools-rpms \
--enable=rhel-7-server-ansible-2-rpms

yum clean all

subscription-manager release --set=7.6

yum -y update && reboot

So far, so good - you are suckered into a false sense of security.

yum -y install rhvm

…which churns away nicely, then out of the blue:

Error: Package: ovirt-log-collector-4.3.3-1.el7ev.noarch (rhel-7-server-rhv-4.3-manager-rpms)
Requires: sos >= 3.7
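What yum is enforcing here is just version ordering. A quick illustration of the same comparison with sort -V (a sketch - this is not rpm's actual rpmvercmp algorithm, but it orders these strings the same way):

```shell
#!/bin/sh
# Sketch of the comparison behind "Requires: sos >= 3.7".
required="3.7"
available="3.6"

# sort -V orders version strings numerically per component;
# the last line of the sorted output is the highest version.
highest=$(printf '%s\n%s\n' "$required" "$available" | sort -V | tail -n 1)

if [ "$available" = "$required" ] || [ "$highest" = "$available" ]; then
  echo "sos $available satisfies >= $required"
else
  echo "sos $available does NOT satisfy >= $required"
fi
```

With available=3.6 this prints "sos 3.6 does NOT satisfy >= 3.7", which is exactly the wall the installer runs into.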


A quick check of the versions available shows there is no (spoon?) 3.7 - but there is a 3.6 release of sos (which has a surprising dependency if, like me, you were thinking 'don't install it'). At which point the wonder of suck-it-and-see surfaces, with the forced installation of a lesser version of ovirt-log-collector and its requirement for a lesser (and thus available) sos release:

# yum search ovirt-log-collector --show-duplicates
Loaded plugins: product-id, search-disabled-repos, subscription-
: manager, versionlock
================ N/S matched: ovirt-log-collector =================
ovirt-log-collector-4.3.2-1.el7ev.noarch : Log Collector for oVirt
: Engine
ovirt-log-collector-4.3.3-1.el7ev.noarch : Log Collector for oVirt
: Engine

# yum search sos --show-duplicates
Loaded plugins: product-id, search-disabled-repos, subscription-
: manager, versionlock
======================== N/S matched: sos =========================

sos-3.6-17.el7_6.noarch : A set of tools to gather troubleshooting
: information from a system
sos-3.6-19.el7_6.noarch : A set of tools to gather troubleshooting
: information from a system

Ahh - that will do NICELY. So it looks like we can forcibly install the versions that play nicely together, as follows:

# yum -y install rsyslog-mmnormalize-8.24.0-34.el7

# yum install -y rsyslog-elasticsearch-8.24.0-34.el7

# yum install -y ovirt-log-collector-4.3.2-1.el7ev

# yum install rhvm

…and we have successful completion of the build. HAPPY days (he says looking out the window through the mottled lens of hard rain at dark overcast skies on a midsummer day in August).

THAT, sir, is a working solution. Not suitable for enterprise/production - but just about everything you need for the home lab and for getting on with the rest of your week.
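To stop a routine yum update from later undoing that careful pinning, one option - a sketch, assuming the versionlock plugin shown loaded in the yum output above is yum-plugin-versionlock, and that I have its usual list format right - is to lock the working versions in /etc/yum/pluginconf.d/versionlock.list:

```
0:rsyslog-8.24.0-34.el7.*
0:rsyslog-mmnormalize-8.24.0-34.el7.*
0:rsyslog-elasticsearch-8.24.0-34.el7.*
0:ovirt-log-collector-4.3.2-1.el7ev.*
```

yum versionlock list shows the active locks, and yum versionlock clear removes them once the fixed packages reach the CDN.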

Bugger. engine-setup now runs through as far as PACKAGES and says "non".

--== PACKAGES ==--
[ INFO ] Checking for product updates…
[ ERROR ] Yum [u'ovirt-log-collector-4.3.3-1.el7ev.noarch requires sos >= 3.7']
[ INFO ] Yum Performing yum transaction rollback
[ ERROR ] Failed to execute stage 'Environment customization': [u'ovirt-log-collector-4.3.3-1.el7ev.noarch requires sos >= 3.7']
[ INFO ] Stage: Clean up
Log file is located at /var/log/ovirt-engine/setup/ovirt-engine-setup-20190814114826-0cspj1.log
[ INFO ] Generating answer file '/var/lib/ovirt-engine/setup/answers/20190814115641-setup.conf'
[ INFO ] Stage: Pre-termination
[ INFO ] Stage: Termination
[ ERROR ] Execution of setup failed

This will never do. So, edit /etc/yum.conf to exclude the newer broken packages, then:
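One way to do that edit - a sketch, where the exclude list is my assumption based on the versions forced in earlier - is to stop yum (and thus engine-setup's update check) from seeing the newer packages:

```
# in the existing [main] section of /etc/yum.conf
[main]
exclude=rsyslog* ovirt-log-collector*
```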


yum clean all

…and then re-run engine-setup - knowing we have two packages that will party together… AND SUCCESS!

Hopefully, given that the dependency fixes are currently RELEASE_PENDING and PUSH_READY, they should be out to the CDN before the end of the week with a bit of luck… but nonetheless, this insight and a personal solution may assist if you see similar.
