<!-- 
RSS generated by JIRA (9.4.14#940014-sha1:734e6822bbf0d45eff9af51f82432957f73aa32c) at Sat Feb 10 02:17:12 UTC 2024

It is possible to restrict the fields that are returned in this document by specifying the 'field' parameter in your request.
For example, to request only the issue key and summary append 'field=key&field=summary' to the URL of your request.
-->
<rss version="0.92" >
<channel>
    <title>Whamcloud Community JIRA</title>
    <link>https://jira.whamcloud.com</link>
    <description>This file is an XML representation of an issue</description>
<language>en-us</language>
    <build-info>
        <version>9.4.14</version>
        <build-number>940014</build-number>
        <build-date>05-12-2023</build-date>
    </build-info>


<item>
            <title>[LU-8398] conf-sanity test_32a: test_32a failed with 1</title>
                <link>https://jira.whamcloud.com/browse/LU-8398</link>
                <project id="10000" key="LU">Lustre</project>
                    <description>&lt;p&gt;This issue was created by maloo for Bob Glossman &amp;lt;bob.glossman@intel.com&amp;gt;&lt;/p&gt;

&lt;p&gt;This issue relates to the following test suite run: &lt;a href=&quot;https://testing.hpdd.intel.com/test_sets/4f228b3e-4975-11e6-9f8e-5254006e85c2&quot; class=&quot;external-link&quot; target=&quot;_blank&quot; rel=&quot;nofollow noopener&quot;&gt;https://testing.hpdd.intel.com/test_sets/4f228b3e-4975-11e6-9f8e-5254006e85c2&lt;/a&gt;.&lt;/p&gt;

&lt;p&gt;The sub-test test_32a failed with the following error:&lt;/p&gt;
&lt;div class=&quot;preformatted panel&quot; style=&quot;border-width: 1px;&quot;&gt;&lt;div class=&quot;preformattedContent panelContent&quot;&gt;
&lt;pre&gt;test_32a failed with 1
&lt;/pre&gt;
&lt;/div&gt;&lt;/div&gt;

&lt;p&gt;while this fail is in the same test as &lt;a href=&quot;https://jira.whamcloud.com/browse/LU-7035&quot; title=&quot;conf-sanity test_32a failed with memory leak:(class_obd.c:633:cleanup_obdclass()) obd_memory max: *, leaked: *&quot; class=&quot;issue-link&quot; data-issue-key=&quot;LU-7035&quot;&gt;&lt;del&gt;LU-7035&lt;/del&gt;&lt;/a&gt; the errors are different.&lt;br/&gt;
Several errors are seen in the test log:&lt;/p&gt;
&lt;div class=&quot;preformatted panel&quot; style=&quot;border-width: 1px;&quot;&gt;&lt;div class=&quot;preformattedContent panelContent&quot;&gt;
&lt;pre&gt;persistent mount opts: 
Parameters: lov.stripecount=0 lov.stripesize=1048576 mdt.identity_upcall=/usr/sbin/l_getidentity sys.timeout=20

exiting before disk write.
IOC_LIBCFS_GET_NI error 22: Invalid argument
&lt;/pre&gt;
&lt;/div&gt;&lt;/div&gt;
&lt;p&gt;  and&lt;/p&gt;
&lt;div class=&quot;preformatted panel&quot; style=&quot;border-width: 1px;&quot;&gt;&lt;div class=&quot;preformattedContent panelContent&quot;&gt;
&lt;pre&gt;CMD: onyx-64 mount -t lustre -o exclude=t32fs-OST0000 t32fs-mdt1/mdt1 /tmp/t32/mnt/mdt
onyx-64: mount.lustre: mount t32fs-mdt1/mdt1 at /tmp/t32/mnt/mdt failed: No such file or directory
onyx-64: Is the MGS specification correct?
onyx-64: Is the filesystem name correct?
onyx-64: If upgrading, is the copied client log valid? (see upgrade docs)
CMD: onyx-64 losetup -a
 conf-sanity test_32a: @@@@@@ FAIL: Mounting the MDT 
  Trace dump:
  = /usr/lib64/lustre/tests/test-framework.sh:4713:error_noexit()
  = /usr/lib64/lustre/tests/conf-sanity.sh:1730:t32_test()
  = /usr/lib64/lustre/tests/conf-sanity.sh:2066:test_32a()
  = /usr/lib64/lustre/tests/test-framework.sh:4991:run_one()
  = /usr/lib64/lustre/tests/test-framework.sh:5028:run_one_logged()
  = /usr/lib64/lustre/tests/test-framework.sh:4893:run_test()
  = /usr/lib64/lustre/tests/conf-sanity.sh:2070:main()
&lt;/pre&gt;
&lt;/div&gt;&lt;/div&gt;
&lt;p&gt;  and&lt;/p&gt;
&lt;div class=&quot;preformatted panel&quot; style=&quot;border-width: 1px;&quot;&gt;&lt;div class=&quot;preformattedContent panelContent&quot;&gt;
&lt;pre&gt;CMD: onyx-64 zpool destroy t32fs-ost1
 conf-sanity test_32a: @@@@@@ FAIL: test_32a failed with 1 
  Trace dump:
  = /usr/lib64/lustre/tests/test-framework.sh:4713:error_noexit()
  = /usr/lib64/lustre/tests/test-framework.sh:4744:error()
  = /usr/lib64/lustre/tests/test-framework.sh:4991:run_one()
  = /usr/lib64/lustre/tests/test-framework.sh:5028:run_one_logged()
  = /usr/lib64/lustre/tests/test-framework.sh:4893:run_test()
  = /usr/lib64/lustre/tests/conf-sanity.sh:2070:main()
&lt;/pre&gt;
&lt;/div&gt;&lt;/div&gt;

&lt;p&gt;Info required for matching: conf-sanity 32a&lt;/p&gt;</description>
                <environment></environment>
        <key id="38170">LU-8398</key>
            <summary>conf-sanity test_32a: test_32a failed with 1</summary>
                <type id="1" iconUrl="https://jira.whamcloud.com/secure/viewavatar?size=xsmall&amp;avatarId=11303&amp;avatarType=issuetype">Bug</type>
                                            <priority id="3" iconUrl="https://jira.whamcloud.com/images/icons/priorities/major.svg">Major</priority>
                        <status id="5" iconUrl="https://jira.whamcloud.com/images/icons/statuses/resolved.png" description="A resolution has been taken, and it is awaiting verification by reporter. From here issues are either reopened, or are closed.">Resolved</status>
                    <statusCategory id="3" key="done" colorName="success"/>
                                    <resolution id="5">Cannot Reproduce</resolution>
                                        <assignee username="wc-triage">WC Triage</assignee>
                                    <reporter username="maloo">Maloo</reporter>
                        <labels>
                    </labels>
                <created>Thu, 14 Jul 2016 13:16:06 +0000</created>
                <updated>Tue, 28 Apr 2020 07:52:55 +0000</updated>
                            <resolved>Tue, 28 Apr 2020 07:52:54 +0000</resolved>
                                    <version>Lustre 2.9.0</version>
                    <version>Lustre 2.10.0</version>
                                                        <due></due>
                            <votes>0</votes>
                                    <watches>13</watches>
                                                                            <comments>
                            <comment id="162564" author="yujian" created="Fri, 19 Aug 2016 19:24:05 +0000"  >&lt;p&gt;More failure instances on master branch:&lt;br/&gt;
&lt;a href=&quot;https://testing.hpdd.intel.com/sub_tests/01cfc912-6565-11e6-b5b1-5254006e85c2&quot; class=&quot;external-link&quot; target=&quot;_blank&quot; rel=&quot;nofollow noopener&quot;&gt;https://testing.hpdd.intel.com/sub_tests/01cfc912-6565-11e6-b5b1-5254006e85c2&lt;/a&gt;&lt;br/&gt;
&lt;a href=&quot;https://testing.hpdd.intel.com/sub_tests/f07ec770-6074-11e6-906c-5254006e85c2&quot; class=&quot;external-link&quot; target=&quot;_blank&quot; rel=&quot;nofollow noopener&quot;&gt;https://testing.hpdd.intel.com/sub_tests/f07ec770-6074-11e6-906c-5254006e85c2&lt;/a&gt;&lt;br/&gt;
&lt;a href=&quot;https://testing.hpdd.intel.com/sub_tests/21005b46-5712-11e6-b2e2-5254006e85c2&quot; class=&quot;external-link&quot; target=&quot;_blank&quot; rel=&quot;nofollow noopener&quot;&gt;https://testing.hpdd.intel.com/sub_tests/21005b46-5712-11e6-b2e2-5254006e85c2&lt;/a&gt;&lt;/p&gt;</comment>
                            <comment id="162565" author="yujian" created="Fri, 19 Aug 2016 19:32:49 +0000"  >&lt;p&gt;Console log on MDS:&lt;/p&gt;
&lt;div class=&quot;preformatted panel&quot; style=&quot;border-width: 1px;&quot;&gt;&lt;div class=&quot;preformattedContent panelContent&quot;&gt;
&lt;pre&gt;Lustre: DEBUG MARKER: mount -t lustre -o exclude=t32fs-OST0000 t32fs-mdt1/mdt1 /tmp/t32/mnt/mdt
Lustre: MGS: Connection restored to MGC192.168.5.144@o2ib_0 (at 0@lo)
Lustre: Skipped 32 previous similar messages
LustreError: 126628:0:(ldlm_lib.c:459:client_obd_setup()) can&apos;t add initial connection
LustreError: 126628:0:(osp_dev.c:1150:osp_init0()) t32fs-MDT0001-osp-MDT0000: can&apos;t setup obd: rc = -2
LustreError: 126628:0:(obd_config.c:578:class_setup()) setup t32fs-MDT0001-osp-MDT0000 failed (-2)
LustreError: 126628:0:(obd_config.c:1671:class_config_llog_handler()) MGC192.168.5.144@o2ib: cfg command failed: rc = -2 
Lustre:    cmd=cf003 0:t32fs-MDT0001-osp-MDT0000  1:t32fs-MDT0001_UUID  2:10.100.4.87@tcp
LustreError: 15c-8: MGC192.168.5.144@o2ib: The configuration from log &apos;t32fs-MDT0000&apos; failed (-2). This may be the result of communication errors between this node and the MGS, a bad configuration, or other errors. See the syslog for more information.
LustreError: 126533:0:(obd_mount_server.c:1352:server_start_targets()) failed to start server t32fs-MDT0000: -2
LustreError: 126533:0:(obd_mount_server.c:1844:server_fill_super()) Unable to start targets: -2
Lustre: Failing over t32fs-MDT0000
LustreError: 126533:0:(obd_mount.c:1453:lustre_fill_super()) Unable to mount  (-2)
Lustre: DEBUG MARKER: losetup -a
Lustre: DEBUG MARKER: /usr/sbin/lctl mark  conf-sanity test_32a: @@@@@@ FAIL: Mounting the MDT
&lt;/pre&gt;
&lt;/div&gt;&lt;/div&gt;

&lt;p&gt;All of the failures occurred on onyx-&lt;span class=&quot;error&quot;&gt;&amp;#91;64-67&amp;#93;&lt;/span&gt; test nodes with IB network.&lt;/p&gt;</comment>
                            <comment id="162566" author="yujian" created="Fri, 19 Aug 2016 19:35:12 +0000"  >&lt;p&gt;The failure occurred before in &lt;a href=&quot;https://jira.whamcloud.com/browse/LU-2200&quot; title=&quot;Test failure on test suite conf-sanity, subtest test_32a&quot; class=&quot;issue-link&quot; data-issue-key=&quot;LU-2200&quot;&gt;&lt;del&gt;LU-2200&lt;/del&gt;&lt;/a&gt; and Nathaniel created patch &lt;a href=&quot;http://review.whamcloud.com/6197&quot; class=&quot;external-link&quot; target=&quot;_blank&quot; rel=&quot;nofollow noopener&quot;&gt;http://review.whamcloud.com/6197&lt;/a&gt; to fix that. Now the failure occurs again.&lt;/p&gt;</comment>
                            <comment id="165798" author="yong.fan" created="Tue, 13 Sep 2016 06:27:30 +0000"  >&lt;p&gt;Another failure instance on master:&lt;br/&gt;
&lt;a href=&quot;https://testing.hpdd.intel.com/test_sets/51d0cee2-73b7-11e6-8afd-5254006e85c2&quot; class=&quot;external-link&quot; target=&quot;_blank&quot; rel=&quot;nofollow noopener&quot;&gt;https://testing.hpdd.intel.com/test_sets/51d0cee2-73b7-11e6-8afd-5254006e85c2&lt;/a&gt;&lt;/p&gt;</comment>
                            <comment id="168888" author="cengku9660" created="Mon, 10 Oct 2016 07:37:48 +0000"  >&lt;p&gt;Similar instance on master, but with a different error number.&lt;br/&gt;
&lt;a href=&quot;https://testing.hpdd.intel.com/test_sets/b192bcbe-8eba-11e6-a9b0-5254006e85c2&quot; class=&quot;external-link&quot; target=&quot;_blank&quot; rel=&quot;nofollow noopener&quot;&gt;https://testing.hpdd.intel.com/test_sets/b192bcbe-8eba-11e6-a9b0-5254006e85c2&lt;/a&gt;&lt;/p&gt;</comment>
                            <comment id="171199" author="niu" created="Wed, 26 Oct 2016 16:07:34 +0000"  >&lt;p&gt;The network interface on onyx-&lt;span class=&quot;error&quot;&gt;&amp;#91;64-67&amp;#93;&lt;/span&gt; is IB, but I&apos;m not sure why &lt;a href=&quot;http://review.whamcloud.com/#/c/6197/&quot; class=&quot;external-link&quot; target=&quot;_blank&quot; rel=&quot;nofollow noopener&quot;&gt;http://review.whamcloud.com/#/c/6197/&lt;/a&gt; didn&apos;t fix it. Perhaps there are multiple types of interfaces on onyx-&lt;span class=&quot;error&quot;&gt;&amp;#91;64-67&amp;#93;&lt;/span&gt;, and the test script can&apos;t handle that case well?&lt;/p&gt;</comment>
                            <comment id="175891" author="bogl" created="Thu, 1 Dec 2016 14:39:20 +0000"  >&lt;p&gt;more on master:&lt;br/&gt;
&lt;a href=&quot;https://testing.hpdd.intel.com/test_sets/cb2a99e6-b776-11e6-be4d-5254006e85c2&quot; class=&quot;external-link&quot; target=&quot;_blank&quot; rel=&quot;nofollow noopener&quot;&gt;https://testing.hpdd.intel.com/test_sets/cb2a99e6-b776-11e6-be4d-5254006e85c2&lt;/a&gt;&lt;br/&gt;
&lt;a href=&quot;https://testing.hpdd.intel.com/test_sets/8585bbcc-b82b-11e6-847d-5254006e85c2&quot; class=&quot;external-link&quot; target=&quot;_blank&quot; rel=&quot;nofollow noopener&quot;&gt;https://testing.hpdd.intel.com/test_sets/8585bbcc-b82b-11e6-847d-5254006e85c2&lt;/a&gt;&lt;/p&gt;</comment>
                            <comment id="182996" author="casperjx" created="Thu, 2 Feb 2017 00:08:53 +0000"  >&lt;p&gt;On the master branch (v2.9.52, b3499), the conf-sanity test_32a failure also caused 11 subsequent subtest failures after 32a.&lt;/p&gt;</comment>
                            <comment id="201974" author="red" created="Thu, 13 Jul 2017 06:43:25 +0000"  >&lt;p&gt;I found an issue ( &lt;a href=&quot;https://jira.hpdd.intel.com/browse/LU-9760&quot; class=&quot;external-link&quot; target=&quot;_blank&quot; rel=&quot;nofollow noopener&quot;&gt;https://jira.hpdd.intel.com/browse/LU-9760&lt;/a&gt; ) that seems very much like this one; can someone help with this?&lt;/p&gt;</comment>
                            <comment id="219763" author="mdiep" created="Thu, 1 Feb 2018 20:14:23 +0000"  >&lt;p&gt;+1 on master:&lt;br/&gt;
&lt;a href=&quot;https://testing.hpdd.intel.com/test_sets/ebcce414-06e5-11e8-a10a-52540065bddc&quot; class=&quot;external-link&quot; target=&quot;_blank&quot; rel=&quot;nofollow noopener&quot;&gt;https://testing.hpdd.intel.com/test_sets/ebcce414-06e5-11e8-a10a-52540065bddc&lt;/a&gt;&lt;/p&gt;</comment>
                            <comment id="268740" author="adilger" created="Tue, 28 Apr 2020 07:52:55 +0000"  >&lt;p&gt;Close old issue that has not been reported in a long time.&lt;/p&gt;</comment>
                    </comments>
                <issuelinks>
                            <issuelinktype id="10010">
                    <name>Duplicate</name>
                                                                <inwardlinks description="is duplicated by">
                                        <issuelink>
            <issuekey id="47185">LU-9760</issuekey>
        </issuelink>
                            </inwardlinks>
                                    </issuelinktype>
                    </issuelinks>
                <attachments>
                    </attachments>
                <subtasks>
                    </subtasks>
                <customfields>
                    <customfield id="customfield_10890" key="com.atlassian.jira.plugins.jira-development-integration-plugin:devsummary">
                        <customfieldname>Development</customfieldname>
                        <customfieldvalues>
                            
                        </customfieldvalues>
                    </customfield>
                    <customfield id="customfield_10390" key="com.pyxis.greenhopper.jira:gh-lexo-rank">
                        <customfieldname>Rank</customfieldname>
                        <customfieldvalues>
                            <customfieldvalue>1|hzyhgf:</customfieldvalue>

                        </customfieldvalues>
                    </customfield>
                    <customfield id="customfield_10090" key="com.pyxis.greenhopper.jira:gh-global-rank">
                        <customfieldname>Rank (Obsolete)</customfieldname>
                        <customfieldvalues>
                            <customfieldvalue>9223372036854775807</customfieldvalue>
                        </customfieldvalues>
                    </customfield>
                    <customfield id="customfield_10060" key="com.atlassian.jira.plugin.system.customfieldtypes:select">
                        <customfieldname>Severity</customfieldname>
                        <customfieldvalues>
                                <customfieldvalue key="10022"><![CDATA[3]]></customfieldvalue>

                        </customfieldvalues>
                    </customfield>
                </customfields>
    </item>
</channel>
</rss>