<!-- 
RSS generated by JIRA (9.4.14#940014-sha1:734e6822bbf0d45eff9af51f82432957f73aa32c) at Sat Feb 10 03:19:51 UTC 2024

It is possible to restrict the fields that are returned in this document by specifying the 'field' parameter in your request.
For example, to request only the issue key and summary append 'field=key&field=summary' to the URL of your request.
-->
<rss version="0.92" >
<channel>
    <title>Whamcloud Community JIRA</title>
    <link>https://jira.whamcloud.com</link>
    <description>This file is an XML representation of an issue</description>
    <language>en-us</language>
    <build-info>
        <version>9.4.14</version>
        <build-number>940014</build-number>
        <build-date>05-12-2023</build-date>
    </build-info>


<item>
            <title>[LU-15616] sanity-lnet test_226: Timeout occurred after 112 minutes, last suite running was sanity-lnet</title>
                <link>https://jira.whamcloud.com/browse/LU-15616</link>
                <project id="10000" key="LU">Lustre</project>
                    <description>&lt;p&gt;This issue was created by maloo for Chris Horn &amp;lt;chris.horn@hpe.com&amp;gt;&lt;/p&gt;

&lt;p&gt;This issue relates to the following test suite run: &lt;a href=&quot;https://testing.whamcloud.com/test_sets/01cd6eae-ac03-48f6-980d-3977224dfaad&quot; class=&quot;external-link&quot; target=&quot;_blank&quot; rel=&quot;nofollow noopener&quot;&gt;https://testing.whamcloud.com/test_sets/01cd6eae-ac03-48f6-980d-3977224dfaad&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;test_226 failed with the following error:&lt;/p&gt;
&lt;div class=&quot;preformatted panel&quot; style=&quot;border-width: 1px;&quot;&gt;&lt;div class=&quot;preformattedContent panelContent&quot;&gt;
&lt;pre&gt;Timeout occurred after 112 minutes, last suite running was sanity-lnet
&lt;/pre&gt;
&lt;/div&gt;&lt;/div&gt;

&lt;p&gt;LNetNIFini() and the discovery thread appear to have hit a deadlock:&lt;/p&gt;
&lt;div class=&quot;preformatted panel&quot; style=&quot;border-width: 1px;&quot;&gt;&lt;div class=&quot;preformattedContent panelContent&quot;&gt;
&lt;pre&gt;[Thu Mar  3 19:42:41 2022] INFO: task lnet_discovery:424118 blocked for more than 120 seconds.
[Thu Mar  3 19:42:41 2022]       Tainted: G           OE    --------- -  - 4.18.0-240.22.1.el8_lustre.x86_64 #1
[Thu Mar  3 19:42:41 2022] &quot;echo 0 &amp;gt; /proc/sys/kernel/hung_task_timeout_secs&quot; disables this message.
[Thu Mar  3 19:42:41 2022] lnet_discovery  D    0 424118      2 0x80004080
[Thu Mar  3 19:42:41 2022] Call Trace:
[Thu Mar  3 19:42:41 2022]  __schedule+0x2c4/0x700
[Thu Mar  3 19:42:41 2022]  schedule+0x38/0xa0
[Thu Mar  3 19:42:41 2022]  schedule_preempt_disabled+0xa/0x10
[Thu Mar  3 19:42:41 2022]  __mutex_lock.isra.5+0x2d0/0x4a0
[Thu Mar  3 19:42:41 2022]  lnet_peer_discovery+0x929/0x16c0 [lnet]
[Thu Mar  3 19:42:41 2022]  ? finish_wait+0x80/0x80
[Thu Mar  3 19:42:41 2022]  ? lnet_peer_merge_data+0xff0/0xff0 [lnet]
[Thu Mar  3 19:42:41 2022]  kthread+0x112/0x130
[Thu Mar  3 19:42:41 2022]  ? kthread_flush_work_fn+0x10/0x10
[Thu Mar  3 19:42:41 2022]  ret_from_fork+0x35/0x40
[Thu Mar  3 19:42:41 2022] INFO: task lnetctl:428295 blocked for more than 120 seconds.
[Thu Mar  3 19:42:41 2022]       Tainted: G           OE    --------- -  - 4.18.0-240.22.1.el8_lustre.x86_64 #1
[Thu Mar  3 19:42:41 2022] &quot;echo 0 &amp;gt; /proc/sys/kernel/hung_task_timeout_secs&quot; disables this message.
[Thu Mar  3 19:42:41 2022] lnetctl         D    0 428295 428283 0x00004080
[Thu Mar  3 19:42:41 2022] Call Trace:
[Thu Mar  3 19:42:41 2022]  __schedule+0x2c4/0x700
[Thu Mar  3 19:42:41 2022]  ? __wake_up_common_lock+0x89/0xc0
[Thu Mar  3 19:42:41 2022]  schedule+0x38/0xa0
[Thu Mar  3 19:42:41 2022]  lnet_peer_discovery_stop+0x112/0x260 [lnet]
[Thu Mar  3 19:42:41 2022]  ? finish_wait+0x80/0x80
[Thu Mar  3 19:42:41 2022]  LNetNIFini+0x5e/0x100 [lnet]
[Thu Mar  3 19:42:41 2022]  lnet_ioctl+0x220/0x260 [lnet]
[Thu Mar  3 19:42:41 2022]  notifier_call_chain+0x47/0x70
[Thu Mar  3 19:42:41 2022]  blocking_notifier_call_chain+0x3e/0x60
[Thu Mar  3 19:42:41 2022]  libcfs_psdev_ioctl+0x346/0x590 [libcfs]
[Thu Mar  3 19:42:41 2022]  do_vfs_ioctl+0xa4/0x640
[Thu Mar  3 19:42:41 2022]  ? syscall_trace_enter+0x1d3/0x2c0
[Thu Mar  3 19:42:41 2022]  ksys_ioctl+0x60/0x90
[Thu Mar  3 19:42:41 2022]  __x64_sys_ioctl+0x16/0x20
[Thu Mar  3 19:42:41 2022]  do_syscall_64+0x5b/0x1a0
[Thu Mar  3 19:42:41 2022]  entry_SYSCALL_64_after_hwframe+0x65/0xca
&lt;/pre&gt;
&lt;/div&gt;&lt;/div&gt;

&lt;p&gt;LNetNIFini() holds the ln_api_mutex and is waiting for the discovery thread to stop, but the discovery thread needs that same mutex in order to make progress, so neither side can advance.&lt;/p&gt;
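
&lt;p&gt;The pattern reduces to a small userspace sketch (an illustration only; the thread, mutex and flag names below are invented for the sketch and are not the actual LNet code): one thread holds a mutex while waiting for a worker to exit, and the worker can only exit after taking that same mutex.&lt;/p&gt;
&lt;div class=&quot;preformatted panel&quot; style=&quot;border-width: 1px;&quot;&gt;&lt;div class=&quot;preformattedContent panelContent&quot;&gt;
&lt;pre&gt;/* Hypothetical sketch of the deadlock; build with: gcc -pthread deadlock_sketch.c */
#include &amp;lt;pthread.h&amp;gt;
#include &amp;lt;stdio.h&amp;gt;
#include &amp;lt;unistd.h&amp;gt;

static pthread_mutex_t api_mutex = PTHREAD_MUTEX_INITIALIZER; /* stands in for ln_api_mutex */
static int stop;

/* Stands in for lnet_peer_discovery(): the worker only checks its stop flag
 * with the mutex held, so it cannot exit while someone else owns the mutex. */
static void *discovery(void *arg)
{
        for (;;) {
                pthread_mutex_lock(&amp;amp;api_mutex);
                if (stop) {
                        pthread_mutex_unlock(&amp;amp;api_mutex);
                        break;
                }
                /* ... process the discovery queue ... */
                pthread_mutex_unlock(&amp;amp;api_mutex);
                usleep(1000);
        }
        return NULL;
}

int main(void)
{
        pthread_t thr;

        pthread_create(&amp;amp;thr, NULL, discovery, NULL);
        sleep(1);

        /* Stands in for LNetNIFini(): take the mutex, ask the worker to stop,
         * then wait for it -- the worker is blocked on the mutex and can never
         * observe &apos;stop&apos;, so the join never returns. */
        pthread_mutex_lock(&amp;amp;api_mutex);
        stop = 1;
        pthread_join(thr, NULL);                /* deadlocks here */
        pthread_mutex_unlock(&amp;amp;api_mutex);

        printf(&quot;never reached\n&quot;);
        return 0;
}
&lt;/pre&gt;
&lt;/div&gt;&lt;/div&gt;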





&lt;p&gt;VVVVVVV DO NOT REMOVE LINES BELOW, Added by Maloo for auto-association VVVVVVV&lt;br/&gt;
sanity-lnet test_226 - Timeout occurred after 112 minutes, last suite running was sanity-lnet&lt;/p&gt;</description>
                <environment></environment>
        <key id="68964">LU-15616</key>
            <summary>sanity-lnet test_226: Timeout occurred after 112 minutes, last suite running was sanity-lnet</summary>
                <type id="1" iconUrl="https://jira.whamcloud.com/secure/viewavatar?size=xsmall&amp;avatarId=11303&amp;avatarType=issuetype">Bug</type>
                                            <priority id="4" iconUrl="https://jira.whamcloud.com/images/icons/priorities/minor.svg">Minor</priority>
                        <status id="5" iconUrl="https://jira.whamcloud.com/images/icons/statuses/resolved.png" description="A resolution has been taken, and it is awaiting verification by reporter. From here issues are either reopened, or are closed.">Resolved</status>
                    <statusCategory id="3" key="done" colorName="success"/>
                                    <resolution id="1">Fixed</resolution>
                                        <assignee username="hornc">Chris Horn</assignee>
                                    <reporter username="maloo">Maloo</reporter>
                        <labels>
                    </labels>
                <created>Fri, 4 Mar 2022 15:33:00 +0000</created>
                <updated>Fri, 23 Sep 2022 16:42:18 +0000</updated>
                            <resolved>Sun, 3 Apr 2022 18:18:58 +0000</resolved>
                                                    <fixVersion>Lustre 2.15.0</fixVersion>
                                        <due></due>
                            <votes>0</votes>
                                    <watches>3</watches>
                                                                            <comments>
                            <comment id="328092" author="hornc" created="Fri, 4 Mar 2022 15:35:45 +0000"  >&lt;p&gt;&lt;a href=&quot;https://jira.whamcloud.com/browse/LU-12233&quot; title=&quot;Deadlock on LNet shutdown&quot; class=&quot;issue-link&quot; data-issue-key=&quot;LU-12233&quot;&gt;&lt;del&gt;LU-12233&lt;/del&gt;&lt;/a&gt; was another issue with deadlock on shutdown.&lt;/p&gt;

&lt;p&gt;I&apos;ve seen other variations of this. For example, an LNet shutdown initiated by lustre_rmmod:&lt;/p&gt;
&lt;div class=&quot;preformatted panel&quot; style=&quot;border-width: 1px;&quot;&gt;&lt;div class=&quot;preformattedContent panelContent&quot;&gt;
&lt;pre&gt;crash_x86_64&amp;gt; bt
PID: 524217  TASK: ffff8fc35f65af80  CPU: 2   COMMAND: &quot;rmmod&quot;
 #0 [ffffa02f069abc68] __schedule at ffffffffae54e1d4
 #1 [ffffa02f069abd00] schedule at ffffffffae54e648
 #2 [ffffa02f069abd10] schedule_timeout at ffffffffae551cd3
 #3 [ffffa02f069abda8] kiblnd_shutdown at ffffffffc0fa67e4 [ko2iblnd]
 #4 [ffffa02f069abe28] lnet_shutdown_lndni at ffffffffc0ef0e43 [lnet]
 #5 [ffffa02f069abe60] lnet_shutdown_lndnet at ffffffffc0ef10ac [lnet]
 #6 [ffffa02f069abe88] lnet_shutdown_lndnets at ffffffffc0ef2029 [lnet]
 #7 [ffffa02f069abed0] LNetNIFini at ffffffffc0ef3ccb [lnet]
 #8 [ffffa02f069abed8] ptlrpc_exit at ffffffffc14e772e [ptlrpc]
 #9 [ffffa02f069abee0] __x64_sys_delete_module at ffffffffadd8a0f9
#10 [ffffa02f069abf38] do_syscall_64 at ffffffffadc0420b
#11 [ffffa02f069abf50] entry_SYSCALL_64_after_hwframe at ffffffffae6000ad
    RIP: 00007f91fc94e72b  RSP: 00007ffcdb74c588  RFLAGS: 00000206
    RAX: ffffffffffffffda  RBX: 0000557518d633d0  RCX: 00007f91fc94e72b
    RDX: 000000000000000a  RSI: 0000000000000800  RDI: 0000557518d63438
    RBP: 0000000000000000   R8: 00007ffcdb74b501   R9: 0000000000000000
    R10: 00007f91fc9c2200  R11: 0000000000000206  R12: 00007ffcdb74c7b0
    R13: 00007ffcdb74e5f9  R14: 0000557518d612a0  R15: 0000557518d633d0
    ORIG_RAX: 00000000000000b0  CS: 0033  SS: 002b
crash_x86_64&amp;gt;
&lt;/pre&gt;
&lt;/div&gt;&lt;/div&gt;

&lt;p&gt;LNetNIFini() takes ln_api_mutex:&lt;/p&gt;
&lt;div class=&quot;preformatted panel&quot; style=&quot;border-width: 1px;&quot;&gt;&lt;div class=&quot;preformattedContent panelContent&quot;&gt;
&lt;pre&gt;LNetNIFini(void)
{
        mutex_lock(&amp;amp;the_lnet.ln_api_mutex);
&lt;/pre&gt;
&lt;/div&gt;&lt;/div&gt;

&lt;p&gt;The shutdown seems to be stuck in kiblnd_shutdown():&lt;/p&gt;
&lt;div class=&quot;preformatted panel&quot; style=&quot;border-width: 1px;&quot;&gt;&lt;div class=&quot;preformattedContent panelContent&quot;&gt;
&lt;pre&gt;*hornc@cflosbld09 LUS-10766 $ grep kiblnd_shutdown dk.log.fmt
00000800:00000200:1.0:1645008262.725728:0:524217:0:(o2iblnd.c:3050:kiblnd_shutdown()) 10.12.0.50@o2ib4000: waiting for 1 peers to disconnect
00000800:00000200:1.0:1645008265.797688:0:524217:0:(o2iblnd.c:3050:kiblnd_shutdown()) 10.12.0.50@o2ib4000: waiting for 1 peers to disconnect
00000800:00000200:1.0:1645008272.965750:0:524217:0:(o2iblnd.c:3050:kiblnd_shutdown()) 10.12.0.50@o2ib4000: waiting for 1 peers to disconnect
00000800:00000200:1.0:1645008288.325727:0:524217:0:(o2iblnd.c:3050:kiblnd_shutdown()) 10.12.0.50@o2ib4000: waiting for 1 peers to disconnect
00000800:00000200:1.0:1645008320.069708:0:524217:0:(o2iblnd.c:3050:kiblnd_shutdown()) 10.12.0.50@o2ib4000: waiting for 1 peers to disconnect
00000800:00000200:1.0:1645008384.581724:0:524217:0:(o2iblnd.c:3050:kiblnd_shutdown()) 10.12.0.50@o2ib4000: waiting for 1 peers to disconnect
00000800:00000200:2.0:1645008514.629693:0:524217:0:(o2iblnd.c:3050:kiblnd_shutdown()) 10.12.0.50@o2ib4000: waiting for 1 peers to disconnect
00000800:00000200:2.0:1645008775.749732:0:524217:0:(o2iblnd.c:3050:kiblnd_shutdown()) 10.12.0.50@o2ib4000: waiting for 1 peers to disconnect
00000800:00000200:8.0:1645009299.013721:0:524217:0:(o2iblnd.c:3050:kiblnd_shutdown()) 10.12.0.50@o2ib4000: waiting for 1 peers to disconnect
00000800:00000200:8.0:1645010346.565723:0:524217:0:(o2iblnd.c:3050:kiblnd_shutdown()) 10.12.0.50@o2ib4000: waiting for 1 peers to disconnect
00000800:00000200:18.0:1645012442.693692:0:524217:0:(o2iblnd.c:3050:kiblnd_shutdown()) 10.12.0.50@o2ib4000: waiting for 1 peers to disconnect
00000800:00000200:4.0:1645016635.973730:0:524217:0:(o2iblnd.c:3050:kiblnd_shutdown()) 10.12.0.50@o2ib4000: waiting for 1 peers to disconnect
00000800:00000200:5.0:1645025023.557725:0:524217:0:(o2iblnd.c:3050:kiblnd_shutdown()) 10.12.0.50@o2ib4000: waiting for 1 peers to disconnect
*hornc@cflosbld09 LUS-10766 $
&lt;/pre&gt;
&lt;/div&gt;&lt;/div&gt;

&lt;p&gt;The hung tasks in the kernel log come from someone running lctl commands:&lt;/p&gt;
&lt;div class=&quot;preformatted panel&quot; style=&quot;border-width: 1px;&quot;&gt;&lt;div class=&quot;preformattedContent panelContent&quot;&gt;
&lt;pre&gt;crash_x86_64&amp;gt; ps | grep lctl
  620605  620604   6  ffff8fc2bf4c5f00  UN   0.0   32392   2656  lctl
  621757  621756   5  ffff8fad312797c0  UN   0.0   32392   2828  lctl
crash_x86_64&amp;gt; bt 620605
PID: 620605  TASK: ffff8fc2bf4c5f00  CPU: 6   COMMAND: &quot;lctl&quot;
 #0 [ffffa02f082c7c38] __schedule at ffffffffae54e1d4
 #1 [ffffa02f082c7cd0] schedule at ffffffffae54e648
 #2 [ffffa02f082c7ce0] schedule_preempt_disabled at ffffffffae54e98a
 #3 [ffffa02f082c7ce8] __mutex_lock at ffffffffae550690
 #4 [ffffa02f082c7d78] LNetNIInit at ffffffffc0ef5634 [lnet]
 #5 [ffffa02f082c7dd0] lnet_ioctl at ffffffffc0f16fdf [lnet]
 #6 [ffffa02f082c7de8] notifier_call_chain at ffffffffadd06197
 #7 [ffffa02f082c7e18] blocking_notifier_call_chain at ffffffffadd068be
 #8 [ffffa02f082c7e40] libcfs_psdev_ioctl at ffffffffc0c20cb6 [libcfs]
 #9 [ffffa02f082c7e80] do_vfs_ioctl at ffffffffadf2e084
#10 [ffffa02f082c7ef8] ksys_ioctl at ffffffffadf2e6c0
#11 [ffffa02f082c7f30] __x64_sys_ioctl at ffffffffadf2e706
#12 [ffffa02f082c7f38] do_syscall_64 at ffffffffadc0420b
#13 [ffffa02f082c7f50] entry_SYSCALL_64_after_hwframe at ffffffffae6000ad
    RIP: 00007fa4dc43563b  RSP: 00007ffee2603018  RFLAGS: 00000206
    RAX: ffffffffffffffda  RBX: 00007fa4ddbb8b20  RCX: 00007fa4dc43563b
    RDX: 00007ffee2603060  RSI: 00000000c0086532  RDI: 0000000000000003
    RBP: 00000000c0086532   R8: 00007fa4ddde5960   R9: 0000000000000003
    R10: 000000000000000f  R11: 0000000000000206  R12: 000055fd4563253b
    R13: 00007ffee2603060  R14: 0000000000000000  R15: 0000000000000000
    ORIG_RAX: 0000000000000010  CS: 0033  SS: 002b
crash_x86_64&amp;gt; bt 621757
PID: 621757  TASK: ffff8fad312797c0  CPU: 5   COMMAND: &quot;lctl&quot;
 #0 [ffffa02f06727c38] __schedule at ffffffffae54e1d4
 #1 [ffffa02f06727cd0] schedule at ffffffffae54e648
 #2 [ffffa02f06727ce0] schedule_preempt_disabled at ffffffffae54e98a
 #3 [ffffa02f06727ce8] __mutex_lock at ffffffffae550690
 #4 [ffffa02f06727d78] LNetNIInit at ffffffffc0ef5634 [lnet]
 #5 [ffffa02f06727dd0] lnet_ioctl at ffffffffc0f16fdf [lnet]
 #6 [ffffa02f06727de8] notifier_call_chain at ffffffffadd06197
 #7 [ffffa02f06727e18] blocking_notifier_call_chain at ffffffffadd068be
 #8 [ffffa02f06727e40] libcfs_psdev_ioctl at ffffffffc0c20cb6 [libcfs]
 #9 [ffffa02f06727e80] do_vfs_ioctl at ffffffffadf2e084
#10 [ffffa02f06727ef8] ksys_ioctl at ffffffffadf2e6c0
#11 [ffffa02f06727f30] __x64_sys_ioctl at ffffffffadf2e706
#12 [ffffa02f06727f38] do_syscall_64 at ffffffffadc0420b
#13 [ffffa02f06727f50] entry_SYSCALL_64_after_hwframe at ffffffffae6000ad
    RIP: 00007f881e90c63b  RSP: 00007ffca55ea0f8  RFLAGS: 00000202
    RAX: ffffffffffffffda  RBX: 00007f882008fb20  RCX: 00007f881e90c63b
    RDX: 00007ffca55ea140  RSI: 00000000c0086532  RDI: 0000000000000003
    RBP: 00000000c0086532   R8: 00007f88202bc960   R9: 0000000000000003
    R10: 000000000000000f  R11: 0000000000000202  R12: 0000561639d7753b
    R13: 00007ffca55ea140  R14: 0000000000000000  R15: 0000000000000000
    ORIG_RAX: 0000000000000010  CS: 0033  SS: 002b
crash_x86_64&amp;gt;
&lt;/pre&gt;
&lt;/div&gt;&lt;/div&gt;

&lt;p&gt;The lctl commands attempt to acquire the ln_api_mutex. Since it is held by the shutdown code, and the shutdown itself is stuck, the lctl commands hang as well.&lt;/p&gt;
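
&lt;p&gt;The same contention can be sketched in userspace (an illustration only; the names below are invented and this is not the actual LNet code): one thread holds the mutex while it waits for an external event that never arrives, and every later caller that needs the mutex queues up behind it indefinitely.&lt;/p&gt;
&lt;div class=&quot;preformatted panel&quot; style=&quot;border-width: 1px;&quot;&gt;&lt;div class=&quot;preformattedContent panelContent&quot;&gt;
&lt;pre&gt;/* Hypothetical sketch; build with: gcc -pthread stuck_holder_sketch.c */
#include &amp;lt;pthread.h&amp;gt;
#include &amp;lt;stdio.h&amp;gt;
#include &amp;lt;unistd.h&amp;gt;

static pthread_mutex_t api_mutex = PTHREAD_MUTEX_INITIALIZER; /* stands in for ln_api_mutex */
static int peers_disconnected;          /* never becomes true in this sketch */

/* Stands in for the shutdown path (LNetNIFini() -&amp;gt; kiblnd_shutdown()):
 * it owns the mutex while polling for a condition that never happens. */
static void *shutdown_thread(void *arg)
{
        pthread_mutex_lock(&amp;amp;api_mutex);
        while (!peers_disconnected) {
                fprintf(stderr, &quot;waiting for peers to disconnect\n&quot;);
                sleep(5);
        }
        pthread_mutex_unlock(&amp;amp;api_mutex);
        return NULL;
}

/* Stands in for an lctl/lnetctl ioctl (e.g. LNetNIInit()): it only needs the mutex. */
static void *ioctl_thread(void *arg)
{
        pthread_mutex_lock(&amp;amp;api_mutex);   /* blocks for as long as shutdown is stuck */
        pthread_mutex_unlock(&amp;amp;api_mutex);
        return NULL;
}

int main(void)
{
        pthread_t shut, ioc;

        pthread_create(&amp;amp;shut, NULL, shutdown_thread, NULL);
        sleep(1);
        pthread_create(&amp;amp;ioc, NULL, ioctl_thread, NULL);

        pthread_join(ioc, NULL);            /* hangs, like the lctl tasks above */
        pthread_join(shut, NULL);
        return 0;
}
&lt;/pre&gt;
&lt;/div&gt;&lt;/div&gt;</comment>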
                            <comment id="328301" author="gerrit" created="Mon, 7 Mar 2022 17:33:00 +0000"  >&lt;p&gt;&quot;Chris Horn &amp;lt;chris.horn@hpe.com&amp;gt;&quot; uploaded a new patch: &lt;a href=&quot;https://review.whamcloud.com/46727&quot; class=&quot;external-link&quot; target=&quot;_blank&quot; rel=&quot;nofollow noopener&quot;&gt;https://review.whamcloud.com/46727&lt;/a&gt;&lt;br/&gt;
Subject: &lt;a href=&quot;https://jira.whamcloud.com/browse/LU-15616&quot; title=&quot;sanity-lnet test_226: Timeout occurred after 112 minutes, last suite running was sanity-lnet&quot; class=&quot;issue-link&quot; data-issue-key=&quot;LU-15616&quot;&gt;&lt;del&gt;LU-15616&lt;/del&gt;&lt;/a&gt; lnet: ln_api_mutex deadlocks&lt;br/&gt;
Project: fs/lustre-release&lt;br/&gt;
Branch: master&lt;br/&gt;
Current Patch Set: 1&lt;br/&gt;
Commit: 60722c29610275f533b4ca8e28fa9852c570a52d&lt;/p&gt;</comment>
                            <comment id="330865" author="hornc" created="Fri, 1 Apr 2022 18:53:31 +0000"  >&lt;p&gt;One more example of this with 2.12:&lt;/p&gt;
&lt;div class=&quot;preformatted panel&quot; style=&quot;border-width: 1px;&quot;&gt;&lt;div class=&quot;preformattedContent panelContent&quot;&gt;
&lt;pre&gt;&amp;lt;root@ncn-w001&amp;gt;ps axOT | grep -Pi &apos;^\s*\S+\s+\S+\s+D&apos;
1117335 ?        D      0:00 [kiblnd_sd_02_00]
1142272 ?        D      0:00 lnetctl lnet unconfigure
&amp;lt;root@ncn-w001&amp;gt;cat /proc/1117335/stack
[&amp;lt;0&amp;gt;] lnet_nid2peerni_locked+0x53/0x170 [lnet]
[&amp;lt;0&amp;gt;] lnet_parse+0x4c9/0x12e0 [lnet]
[&amp;lt;0&amp;gt;] kiblnd_handle_rx+0x1bd/0x630 [ko2iblnd]
[&amp;lt;0&amp;gt;] kiblnd_scheduler+0xfc8/0x1020 [ko2iblnd]
[&amp;lt;0&amp;gt;] kthread+0x10d/0x130
[&amp;lt;0&amp;gt;] ret_from_fork+0x22/0x40
&amp;lt;root@ncn-w001&amp;gt;cat /proc/1142272/stack
[&amp;lt;0&amp;gt;] kiblnd_shutdown+0xfa/0x490 [ko2iblnd]
[&amp;lt;0&amp;gt;] lnet_shutdown_lndni+0x265/0x460 [lnet]
[&amp;lt;0&amp;gt;] lnet_shutdown_lndnet+0x67/0xd0 [lnet]
[&amp;lt;0&amp;gt;] lnet_shutdown_lndnets+0x119/0x2d0 [lnet]
[&amp;lt;0&amp;gt;] LNetNIFini+0x8b/0x100 [lnet]
[&amp;lt;0&amp;gt;] lnet_ioctl+0x222/0x250 [lnet]
[&amp;lt;0&amp;gt;] notifier_call_chain+0x47/0x70
[&amp;lt;0&amp;gt;] blocking_notifier_call_chain+0x3e/0x60
[&amp;lt;0&amp;gt;] libcfs_ioctl+0xab/0x4a0 [libcfs]
[&amp;lt;0&amp;gt;] libcfs_psdev_ioctl+0xba/0xd0 [libcfs]
[&amp;lt;0&amp;gt;] do_vfs_ioctl+0xa0/0x680
[&amp;lt;0&amp;gt;] ksys_ioctl+0x70/0x80
[&amp;lt;0&amp;gt;] __x64_sys_ioctl+0x16/0x20
[&amp;lt;0&amp;gt;] do_syscall_64+0x5b/0x1e0
[&amp;lt;0&amp;gt;] entry_SYSCALL_64_after_hwframe+0x44/0xa9
&amp;lt;root@ncn-w001
&lt;/pre&gt;
&lt;/div&gt;&lt;/div&gt;</comment>
                            <comment id="330916" author="gerrit" created="Sun, 3 Apr 2022 16:08:53 +0000"  >&lt;p&gt;&quot;Oleg Drokin &amp;lt;green@whamcloud.com&amp;gt;&quot; merged in patch &lt;a href=&quot;https://review.whamcloud.com/46727/&quot; class=&quot;external-link&quot; target=&quot;_blank&quot; rel=&quot;nofollow noopener&quot;&gt;https://review.whamcloud.com/46727/&lt;/a&gt;&lt;br/&gt;
Subject: &lt;a href=&quot;https://jira.whamcloud.com/browse/LU-15616&quot; title=&quot;sanity-lnet test_226: Timeout occurred after 112 minutes, last suite running was sanity-lnet&quot; class=&quot;issue-link&quot; data-issue-key=&quot;LU-15616&quot;&gt;&lt;del&gt;LU-15616&lt;/del&gt;&lt;/a&gt; lnet: ln_api_mutex deadlocks&lt;br/&gt;
Project: fs/lustre-release&lt;br/&gt;
Branch: master&lt;br/&gt;
Current Patch Set: &lt;br/&gt;
Commit: 22de0bd145b649768b16dd42559d326af3c13200&lt;/p&gt;</comment>
                            <comment id="330920" author="pjones" created="Sun, 3 Apr 2022 18:18:58 +0000"  >&lt;p&gt;Landed for 2.15&lt;/p&gt;</comment>
                    </comments>
                <issuelinks>
                            <issuelinktype id="10010">
                    <name>Duplicate</name>
                                                                <inwardlinks description="is duplicated by">
                                        <issuelink>
            <issuekey id="69103">LU-15650</issuekey>
        </issuelink>
            <issuelink>
            <issuekey id="55330">LU-12148</issuekey>
        </issuelink>
            <issuelink>
            <issuekey id="58026">LU-13218</issuekey>
        </issuelink>
                            </inwardlinks>
                                    </issuelinktype>
                            <issuelinktype id="10011">
                    <name>Related</name>
                                                                <inwardlinks description="is related to">
                                        <issuelink>
            <issuekey id="69493">LU-15705</issuekey>
        </issuelink>
                            </inwardlinks>
                                    </issuelinktype>
                    </issuelinks>
                <attachments>
                    </attachments>
                <subtasks>
                    </subtasks>
                <customfields>
                                                                                                                                                                                            <customfield id="customfield_10890" key="com.atlassian.jira.plugins.jira-development-integration-plugin:devsummary">
                        <customfieldname>Development</customfieldname>
                        <customfieldvalues>
                            
                        </customfieldvalues>
                    </customfield>
                                                                                                                                                                                                                                                                                                                                                        <customfield id="customfield_10390" key="com.pyxis.greenhopper.jira:gh-lexo-rank">
                        <customfieldname>Rank</customfieldname>
                        <customfieldvalues>
                            <customfieldvalue>1|i02jz3:</customfieldvalue>

                        </customfieldvalues>
                    </customfield>
                                                                <customfield id="customfield_10090" key="com.pyxis.greenhopper.jira:gh-global-rank">
                        <customfieldname>Rank (Obsolete)</customfieldname>
                        <customfieldvalues>
                            <customfieldvalue>9223372036854775807</customfieldvalue>
                        </customfieldvalues>
                    </customfield>
                                                                                            <customfield id="customfield_10060" key="com.atlassian.jira.plugin.system.customfieldtypes:select">
                        <customfieldname>Severity</customfieldname>
                        <customfieldvalues>
                                <customfieldvalue key="10022"><![CDATA[3]]></customfieldvalue>

                        </customfieldvalues>
                    </customfield>
                                                                                                                                                                                                                                                                                                                                                        </customfields>
    </item>
</channel>
</rss>