LU-10858: lustre-initialization-1 lustre-initialization fails for SLES12 SP2 and SP3

Details

    • Type: Bug
    • Resolution: Fixed
    • Priority: Minor
    • Fix Version: Lustre 2.11.0

    Description

      This issue was created by maloo for James Nunez <james.a.nunez@intel.com>

      This issue relates to the following test suite run: https://testing.hpdd.intel.com/test_sets/199f17d0-3149-11e8-b74b-52540065bddc

      lustre-initialization failed with the following error:

      'lustre-initialization failed'
      

      Looking at the autotest log, we see that the Lustre tests are not installed:

      2018-03-26T21:56:49 trevis-18vm1: /usr/lib64/lustre/tests/cfg/: No such file or directory
      2018-03-26T21:56:49 pdsh@trevis-18vm1: trevis-18vm1: ssh exited with exit code 1
      2018-03-26T21:56:49 trevis-18vm3: /usr/lib64/lustre/tests/cfg/: No such file or directory
      2018-03-26T21:56:49 pdsh@trevis-18vm1: trevis-18vm3: ssh exited with exit code 1
      2018-03-26T21:56:49 trevis-18vm4: /usr/lib64/lustre/tests/cfg/: No such file or directory
      2018-03-26T21:56:49 pdsh@trevis-18vm1: trevis-18vm4: ssh exited with exit code 1
      2018-03-26T21:56:49 trevis-18vm2: /usr/lib64/lustre/tests/cfg/: No such file or directory
      2018-03-26T21:56:49 pdsh@trevis-18vm1: trevis-18vm2: ssh exited with exit code 1
      
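      For reference, a hypothetical way to confirm on the nodes that the lustre-tests package never got installed (reusing the pdsh setup already shown above):

      # Hypothetical check, not from the original report: query the lustre-tests RPM
      # and the owner of the missing directory on each node; both queries should
      # fail if the tests were never installed.
      pdsh -w "trevis-18vm[1-4]" 'rpm -q lustre-tests; rpm -qf /usr/lib64/lustre/tests/cfg/'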

      Yet, looking at the node console logs, I don't see any failures related to installing RPMs. The console logs for all of the nodes end with:

      Welcome to SUSE Linux Enterprise Server 12 SP3  (x86_64) - Kernel 4.4.114-94.11-default (ttyS0).
      
      
      trevis-18vm2 login: [   80.209414] random: nonblocking pool is initialized
      
      <ConMan> Console [trevis-18vm2] disconnected from <trevis-18:6001> at 03-26 22:56.
      

      This failure started with master build #3731.

      Another test session that failed in this way is at
      https://testing.hpdd.intel.com/test_sessions/fb84aaa9-888e-4d17-9a76-1cfd67d415aa

      VVVVVVV DO NOT REMOVE LINES BELOW, Added by Maloo for auto-association VVVVVVV
      lustre-initialization-1 lustre-initialization - 'lustre-initialization failed'

    Activity

            pjones Peter Jones added a comment -

            Landed for 2.11


            gerrit Gerrit Updater added a comment -

            Oleg Drokin (oleg.drokin@intel.com) merged in patch https://review.whamcloud.com/31815/
            Subject: LU-10858 build: handle yaml library packaging on SLES systems
            Project: fs/lustre-release
            Branch: master
            Current Patch Set:
            Commit: 20ad3ed15c321c7740988728c49a97105c59a3c4


            gerrit Gerrit Updater added a comment -

            James Simmons (uja.ornl@yahoo.com) uploaded a new patch: https://review.whamcloud.com/31815
            Subject: LU-10858 build: handle yaml library packaging on SLES systems
            Project: fs/lustre-release
            Branch: master
            Current Patch Set: 1
            Commit: fc4a9793c5ef2a3abd260474fc9f9dc2e9102673


            simmonsja James A Simmons added a comment -

            Actually that fix looks good, Bob. I'm going to try it.

            bogl Bob Glossman (Inactive) added a comment - edited

            I don't think the exact package name matters. As long as the .rpm has a Provides of "zlib" in it, it will be found and installed when required as a dependency.

            In any case, support for SLES11 has been stopped or is going away soon on master.

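            For illustration (a hypothetical check, not part of the original comment), rpm can report which installed package satisfies a plain "zlib" dependency:

            # Ask rpm which installed package Provides the capability named "zlib";
            # the package name itself does not matter, only the Provides entry.
            rpm -q --whatprovides zlib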
            yujian Jian Yu added a comment -

            Hi Bob,
            For SLES 11, the package name is 'zlib', and for SLES 12 it is 'libz1'.

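            As an illustration (hypothetical path, not from the original comment), querying the owner of the shared library shows the naming difference:

            # Expected to report libz1 on SLES 12 and zlib on SLES 11; the path is
            # assumed to be the usual x86_64 location of the zlib shared library.
            rpm -qf /lib64/libz.so.1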
            bogl Bob Glossman (Inactive) added a comment - edited

            If having the extra BuildRequires makes mock behave better, I have no major objection.
            For manual builds, I have found that examining the config.log of failed builds usually gives enough clues to figure it out.

            If you insist on it, I suggest something like the following to adapt to the different names:

            --- a/lustre.spec.in
            +++ b/lustre.spec.in
            @@ -81,12 +81,14 @@
             %global modules_fs_path /lib/modules/%{kversion}/%{kmoddir}
             
             %if %{_vendor}=="redhat" || %{_vendor}=="fedora"
            +	%global requires_yaml_name libyaml
             	%global requires_kmod_name kmod-%{lustre_name}
             	%if %{with lustre_tests}
             		%global requires_kmod_tests_name kmod-%{lustre_name}-tests
             	%endif
             	%global requires_kmod_version %{version}
             %else	#for Suse
            +	%global requires_yaml_name libyaml-0-2
             	%global requires_kmod_name %{lustre_name}-kmp
             	%if %{with lustre_tests}
             		%global requires_kmod_tests_name %{lustre_name}-tests-kmp
            @@ -132,7 +134,8 @@ Source6: kmp-lustre-osd-zfs.files
             Source7: kmp-lustre-tests.files
             URL: https://wiki.hpdd.intel.com/
             BuildRoot: %{_tmppath}/lustre-%{version}-root
            -Requires: %{requires_kmod_name} = %{requires_kmod_version} libyaml zlib
            +Requires: %{requires_kmod_name} = %{requires_kmod_version} zlib
            +Requires: %{requires_yaml_name}
             BuildRequires: libtool libyaml-devel zlib-devel
             %if %{with servers}
             Requires: lustre-osd
            

            I don't say this is the best fix, but I think it will work.

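            As a side note (a hypothetical check, not part of the original comment), the branch taken by the %if in the hunk above can be confirmed on a given build host:

            # Evaluate the vendor macro the spec conditional keys on; this is expected
            # to print "redhat" on RHEL/Fedora hosts and "suse" on SLES hosts.
            rpm --eval '%{_vendor}'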
            simmonsja James A Simmons added a comment - edited

            The BuildRequires were added to make people who use mock and other build systems like that happy. For something like mock, you drop in the source RPM, which will use BuildRequires to pull down the development RPMs needed to build the packages.

            libtool appears to be the same. The issue is that the logs from the failure are pretty useless. Do you have something better?

             

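            For illustration (hypothetical config name and paths, not part of the original comment), the mock workflow described above looks roughly like this:

            # Build a source RPM from the spec, then let mock resolve the BuildRequires
            # in a clean chroot and rebuild it; <target-config> is a placeholder for
            # whatever mock configuration matches the target distribution.
            rpmbuild -bs lustre.spec
            mock -r <target-config> --rebuild ~/rpmbuild/SRPMS/lustre-*.src.rpm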

            bogl Bob Glossman (Inactive) added a comment -

            Since the names are the same I don't see any harm.
            I just question the need for it at all.
            I'm of the "if it's not broke, don't fix it" school of thought.

            yujian Jian Yu added a comment -

            I was questioning the need for the added BuildRequires.

            Since the package names of libyaml-devel and zlib-devel are both correct on RHEL and SLES, I wonder whether the following line in lustre.spec.in can cause any issue:

            BuildRequires: libtool libyaml-devel zlib-devel
            
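            As an illustration (hypothetical commands, not part of the original comment), the -devel package names can be checked on each distribution before building:

            # On a SLES build host:
            zypper info libyaml-devel zlib-devel
            # On a RHEL build host:
            yum info libyaml-devel zlib-devel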

            People

              Assignee: bogl Bob Glossman (Inactive)
              Reporter: maloo Maloo
              Votes: 0
              Watchers: 8
