<!-- 
RSS generated by JIRA (9.4.14#940014-sha1:734e6822bbf0d45eff9af51f82432957f73aa32c) at Sat Feb 10 02:34:03 UTC 2024

It is possible to restrict the fields that are returned in this document by specifying the 'field' parameter in your request.
For example, to request only the issue key and summary append 'field=key&field=summary' to the URL of your request.
-->
<rss version="0.92" >
<channel>
    <title>Whamcloud Community JIRA</title>
    <link>https://jira.whamcloud.com</link>
    <description>This file is an XML representation of an issue</description>
<language>en-us</language>
    <build-info>
        <version>9.4.14</version>
        <build-number>940014</build-number>
        <build-date>05-12-2023</build-date>
    </build-info>


<item>
            <title>[LU-10326] sanity test 60a times out on &#8216;umount -d /mnt/lustre-mds1&#8217;</title>
                <link>https://jira.whamcloud.com/browse/LU-10326</link>
                <project id="10000" key="LU">Lustre</project>
                    <description>&lt;p&gt;sanity test 60a hangs on unmount of the MDS for Ubuntu clients only. The last thing seen in the client test log is&lt;/p&gt;
&lt;div class=&quot;preformatted panel&quot; style=&quot;border-width: 1px;&quot;&gt;&lt;div class=&quot;preformattedContent panelContent&quot;&gt;
&lt;pre&gt;NOW reload debugging syms..
CMD: trevis-18vm4 /usr/sbin/lctl dk
CMD: trevis-18vm4 which llog_reader 2&amp;gt; /dev/null
CMD: trevis-18vm4 grep -c /mnt/lustre-mds1&apos; &apos; /proc/mounts
Stopping /mnt/lustre-mds1 (opts:) on trevis-18vm4
CMD: trevis-18vm4 umount -d /mnt/lustre-mds1
&lt;/pre&gt;
&lt;/div&gt;&lt;/div&gt;

&lt;p&gt;The dmesg log on the MDS (vm4) shows that all llog_test.c tests ran to completion, but cleanup/unmount of the MDS then failed:&lt;/p&gt;
&lt;div class=&quot;preformatted panel&quot; style=&quot;border-width: 1px;&quot;&gt;&lt;div class=&quot;preformattedContent panelContent&quot;&gt;
&lt;pre&gt;[ 3162.697626] Lustre: DEBUG MARKER: ls -d /sbin/llog_reader
[ 3163.062999] Lustre: DEBUG MARKER: grep -c /mnt/lustre-mds1&apos; &apos; /proc/mounts
[ 3163.339465] Lustre: DEBUG MARKER: umount -d /mnt/lustre-mds1
[ 3163.485523] LustreError: 25441:0:(ldlm_resource.c:1094:ldlm_resource_complain()) lustre-MDT0000-lwp-MDT0000: namespace resource [0x200000006:0x1010000:0x0].0x0 (ffff88005b72b6c0) refcount nonzero (1) after lock cleanup; forcing cleanup.
[ 3163.489978] LustreError: 25441:0:(ldlm_resource.c:1676:ldlm_resource_dump()) --- Resource: [0x200000006:0x1010000:0x0].0x0 (ffff88005b72b6c0) refcount = 2
[ 3163.493845] LustreError: 25441:0:(ldlm_resource.c:1679:ldlm_resource_dump()) Granted locks (in reverse order):
[ 3163.496226] LustreError: 25441:0:(ldlm_resource.c:1682:ldlm_resource_dump()) ### ### ns: lustre-MDT0000-lwp-MDT0000 lock: ffff88005b6a6d80/0xa7b5899cf22cc13d lrc: 2/1,0 mode: CR/CR res: [0x200000006:0x1010000:0x0].0x0 rrc: 3 type: PLN flags: 0x1106400000000 nid: local remote: 0xa7b5899cf22cc1bb expref: -99 pid: 12740 timeout: 0 lvb_type: 2
[ 3163.502995] LustreError: 25441:0:(ldlm_resource.c:1676:ldlm_resource_dump()) --- Resource: [0x200000006:0x10000:0x0].0x0 (ffff880000047e40) refcount = 2
[ 3163.507253] LustreError: 25441:0:(ldlm_resource.c:1679:ldlm_resource_dump()) Granted locks (in reverse order):
[ 3163.509922] Lustre: Failing over lustre-MDT0000
[ 3165.457763] Lustre: lustre-MDT0000: Not available for connect from 10.9.4.212@tcp (stopping)
[ 3165.572163] LustreError: 25441:0:(genops.c:436:class_free_dev()) Cleanup lustre-QMT0000 returned -95
[ 3165.574521] LustreError: 25441:0:(genops.c:436:class_free_dev()) Skipped 1 previous similar message
[ 3167.767456] Lustre: lustre-MDT0000: Not available for connect from 10.9.4.213@tcp (stopping)
[ 3167.770166] Lustre: Skipped 6 previous similar messages
[ 3170.455396] Lustre: lustre-MDT0000: Not available for connect from 10.9.4.212@tcp (stopping)
[ 3172.756491] Lustre: lustre-MDT0000: Not available for connect from 10.9.4.213@tcp (stopping)
[ 3172.759177] Lustre: Skipped 7 previous similar messages
&lt;/pre&gt;
&lt;/div&gt;&lt;/div&gt;

&lt;p&gt;This failure first appeared on October 27, 2017 (2.10.54) for the master branch and on November 27, 2017 (2.10.2 RC1) for b2_10.&lt;/p&gt;

&lt;p&gt;Logs for this failure are at:&lt;br/&gt;
master branch&lt;br/&gt;
&lt;a href=&quot;https://testing.hpdd.intel.com/test_sets/7f763f4e-d76b-11e7-a066-52540065bddc&quot; class=&quot;external-link&quot; target=&quot;_blank&quot; rel=&quot;nofollow noopener&quot;&gt;https://testing.hpdd.intel.com/test_sets/7f763f4e-d76b-11e7-a066-52540065bddc&lt;/a&gt;&lt;br/&gt;
&lt;a href=&quot;https://testing.hpdd.intel.com/test_sets/6860bbbe-d021-11e7-a066-52540065bddc&quot; class=&quot;external-link&quot; target=&quot;_blank&quot; rel=&quot;nofollow noopener&quot;&gt;https://testing.hpdd.intel.com/test_sets/6860bbbe-d021-11e7-a066-52540065bddc&lt;/a&gt; (interop with 2.9.0)&lt;br/&gt;
&lt;a href=&quot;https://testing.hpdd.intel.com/test_sets/591a4292-ca59-11e7-9840-52540065bddc&quot; class=&quot;external-link&quot; target=&quot;_blank&quot; rel=&quot;nofollow noopener&quot;&gt;https://testing.hpdd.intel.com/test_sets/591a4292-ca59-11e7-9840-52540065bddc&lt;/a&gt;&lt;br/&gt;
&lt;a href=&quot;https://testing.hpdd.intel.com/test_sets/54b95766-bb6c-11e7-84a9-52540065bddc&quot; class=&quot;external-link&quot; target=&quot;_blank&quot; rel=&quot;nofollow noopener&quot;&gt;https://testing.hpdd.intel.com/test_sets/54b95766-bb6c-11e7-84a9-52540065bddc&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;b2_10&lt;br/&gt;
&lt;a href=&quot;https://testing.hpdd.intel.com/test_sets/bbeaa6be-d459-11e7-9c63-52540065bddc&quot; class=&quot;external-link&quot; target=&quot;_blank&quot; rel=&quot;nofollow noopener&quot;&gt;https://testing.hpdd.intel.com/test_sets/bbeaa6be-d459-11e7-9c63-52540065bddc&lt;/a&gt;&lt;/p&gt;
</description>
                <environment>Ubuntu Lustre clients</environment>
        <key id="49553">LU-10326</key>
            <summary>sanity test 60a times out on &#8216;umount -d /mnt/lustre-mds1&#8217;</summary>
                <type id="1" iconUrl="https://jira.whamcloud.com/secure/viewavatar?size=xsmall&amp;avatarId=11303&amp;avatarType=issuetype">Bug</type>
                                            <priority id="4" iconUrl="https://jira.whamcloud.com/images/icons/priorities/minor.svg">Minor</priority>
                        <status id="5" iconUrl="https://jira.whamcloud.com/images/icons/statuses/resolved.png" description="A resolution has been taken, and it is awaiting verification by reporter. From here issues are either reopened, or are closed.">Resolved</status>
                    <statusCategory id="3" key="done" colorName="success"/>
                                    <resolution id="3">Duplicate</resolution>
                                        <assignee username="wc-triage">WC Triage</assignee>
                                    <reporter username="jamesanunez">James Nunez</reporter>
                        <labels>
                            <label>ubuntu</label>
                    </labels>
                <created>Mon, 4 Dec 2017 22:43:17 +0000</created>
                <updated>Tue, 19 Mar 2019 17:40:28 +0000</updated>
                            <resolved>Thu, 4 Jan 2018 20:54:30 +0000</resolved>
                                    <version>Lustre 2.11.0</version>
                    <version>Lustre 2.10.2</version>
                    <version>Lustre 2.12.0</version>
                <due></due>
                            <votes>0</votes>
                                    <watches>3</watches>
                <comments>
                            <comment id="215304" author="jhammond" created="Tue, 5 Dec 2017 14:19:47 +0000"  >&lt;p&gt;Likely due to the same cause as &lt;a href=&quot;https://jira.whamcloud.com/browse/LU-10320&quot; title=&quot;sanity test 17g fails with &#8216;FAIL: &amp;lt;machine-name&amp;gt;:LBUG/LASSERT detected&#8217;.&quot; class=&quot;issue-link&quot; data-issue-key=&quot;LU-10320&quot;&gt;&lt;del&gt;LU-10320&lt;/del&gt;&lt;/a&gt;.&lt;/p&gt;</comment>
                            <comment id="228099" author="sarah" created="Thu, 17 May 2018 18:10:38 +0000"  >&lt;p&gt;+1 on b2_10 &lt;a href=&quot;https://testing.hpdd.intel.com/test_sets/051774a8-5956-11e8-abc3-52540065bddc&quot; class=&quot;external-link&quot; target=&quot;_blank&quot; rel=&quot;nofollow noopener&quot;&gt;https://testing.hpdd.intel.com/test_sets/051774a8-5956-11e8-abc3-52540065bddc&lt;/a&gt;&lt;/p&gt;</comment>
                            <comment id="228816" author="sarah" created="Wed, 30 May 2018 00:44:29 +0000"  >&lt;p&gt;In tag-2.11.52 SLES12sp3 server/client testing, sanity 60a failed for a similar reason. sanity test_17g passed in the same session.&lt;/p&gt;

&lt;p&gt;&lt;a href=&quot;https://testing.hpdd.intel.com/test_sets/652db46e-5a74-11e8-abc3-52540065bddc&quot; class=&quot;external-link&quot; target=&quot;_blank&quot; rel=&quot;nofollow noopener&quot;&gt;https://testing.hpdd.intel.com/test_sets/652db46e-5a74-11e8-abc3-52540065bddc&lt;/a&gt;&lt;/p&gt;</comment>
                            <comment id="244252" author="sarah" created="Tue, 19 Mar 2019 17:39:55 +0000"  >&lt;p&gt;A similar issue was hit in interop testing of 2.10.7:&lt;br/&gt;
&lt;a href=&quot;https://testing.whamcloud.com/test_sets/5db78bc0-432a-11e9-92fe-52540065bddc&quot; class=&quot;external-link&quot; target=&quot;_blank&quot; rel=&quot;nofollow noopener&quot;&gt;https://testing.whamcloud.com/test_sets/5db78bc0-432a-11e9-92fe-52540065bddc&lt;/a&gt;&lt;/p&gt;</comment>
                    </comments>
                <issuelinks>
                            <issuelinktype id="10010">
                    <name>Duplicate</name>
                                                                <inwardlinks description="is duplicated by">
                                        <issuelink>
            <issuekey id="49540">LU-10320</issuekey>
        </issuelink>
                            </inwardlinks>
                                    </issuelinktype>
                    </issuelinks>
                <attachments>
                    </attachments>
                <subtasks>
                    </subtasks>
                <customfields>
                    <customfield id="customfield_10890" key="com.atlassian.jira.plugins.jira-development-integration-plugin:devsummary">
                        <customfieldname>Development</customfieldname>
                        <customfieldvalues>
                            
                        </customfieldvalues>
                    </customfield>
                    <customfield id="customfield_10390" key="com.pyxis.greenhopper.jira:gh-lexo-rank">
                        <customfieldname>Rank</customfieldname>
                        <customfieldvalues>
                            <customfieldvalue>1|hzzoq7:</customfieldvalue>

                        </customfieldvalues>
                    </customfield>
                    <customfield id="customfield_10090" key="com.pyxis.greenhopper.jira:gh-global-rank">
                        <customfieldname>Rank (Obsolete)</customfieldname>
                        <customfieldvalues>
                            <customfieldvalue>9223372036854775807</customfieldvalue>
                        </customfieldvalues>
                    </customfield>
                    <customfield id="customfield_10060" key="com.atlassian.jira.plugin.system.customfieldtypes:select">
                        <customfieldname>Severity</customfieldname>
                        <customfieldvalues>
                                <customfieldvalue key="10022"><![CDATA[3]]></customfieldvalue>

                        </customfieldvalues>
                    </customfield>
                </customfields>
    </item>
</channel>
</rss>