<!-- 
RSS generated by JIRA (9.4.14#940014-sha1:734e6822bbf0d45eff9af51f82432957f73aa32c) at Sat Feb 10 02:55:55 UTC 2024

It is possible to restrict the fields that are returned in this document by specifying the 'field' parameter in your request.
For example, to request only the issue key and summary append 'field=key&field=summary' to the URL of your request.
-->
<rss version="0.92">
<channel>
    <title>Whamcloud Community JIRA</title>
    <link>https://jira.whamcloud.com</link>
    <description>This file is an XML representation of an issue</description>
<language>en-us</language>
    <build-info>
        <version>9.4.14</version>
        <build-number>940014</build-number>
        <build-date>05-12-2023</build-date>
    </build-info>


<item>
            <title>[LU-12818] replay-single test_70b and other tests fail with &#8220;Numerical result out of range&#8221; error</title>
                <link>https://jira.whamcloud.com/browse/LU-12818</link>
                <project id="10000" key="LU">Lustre</project>
                    <description>&lt;p&gt;Many replay-single tests fail for RHEL8 and Ubuntu18.04 with the &#8220;Numerical result out of range&#8221; error.  All tests have different failure messages, but they all have the &#8220;Numerical result out of range&#8221; error during file open. &lt;/p&gt;

&lt;p&gt;Looking at the suite_log for &lt;a href=&quot;https://testing.whamcloud.com/test_sets/0f9f03d0-c158-11e9-90ad-52540065bddc&quot; class=&quot;external-link&quot; target=&quot;_blank&quot; rel=&quot;nofollow noopener&quot;&gt;https://testing.whamcloud.com/test_sets/0f9f03d0-c158-11e9-90ad-52540065bddc&lt;/a&gt;, we see replay-single tests 13, 14, 15, 18, 21, 22, 23, 31, 32, 33a, 53b, 53c, 53g, 53h, and 70b all fail&lt;/p&gt;
&lt;div class=&quot;preformatted panel&quot; style=&quot;border-width: 1px;&quot;&gt;&lt;div class=&quot;preformattedContent panelContent&quot;&gt;
&lt;pre&gt;== replay-single test 13: open chmod 0 |x| write close =============================================== 20:22:03 (1566073323)
multiop /mnt/lustre/f13.replay-single vO_wc
TMPPIPE=/tmp/multiop_open_wait_pipe.29190
open(O_RDWR|O_CREAT): Numerical result out of range
 replay-single test_13: @@@@@@ FAIL: multiop_bg_pause /mnt/lustre/f13.replay-single failed 
&#8230;
== replay-single test 14: open(O_CREAT), unlink |X| close ============================================ 20:22:07 (1566073327)
multiop /mnt/lustre/f14.replay-single vO_tSc
TMPPIPE=/tmp/multiop_open_wait_pipe.29190
open(O_RDWR|O_CREAT): Numerical result out of range
 replay-single test_14: @@@@@@ FAIL: multiop_bg_pause /mnt/lustre/f14.replay-single failed 
&#8230;
== replay-single test 15: open(O_CREAT), unlink |X|  touch new, close ================================ 20:22:11 (1566073331)
multiop /mnt/lustre/f15.replay-single vO_tSc
TMPPIPE=/tmp/multiop_open_wait_pipe.29190
open(O_RDWR|O_CREAT): Numerical result out of range
 replay-single test_15: @@@@@@ FAIL: multiop_bg_pause /mnt/lustre/f15.replay-single failed 
&#8230;
trevis-8vm6: mdc.lustre-MDT0000-mdc-*.mds_server_uuid in FULL state after 0 sec
touch: cannot touch &apos;/mnt/lustre/f18.replay-single-3&apos;: Numerical result out of range
 replay-single test_18: @@@@@@ FAIL: touch /mnt/lustre/f18.replay-single-3 failed 
&#8230;
trevis-8vm6: mdc.lustre-MDT0000-mdc-*.mds_server_uuid in FULL state after 0 sec
touch: cannot touch &apos;/mnt/lustre/f21.replay-single-2&apos;: Numerical result out of range
 replay-single test_21: @@@@@@ FAIL: touch /mnt/lustre/f21.replay-single-2 failed 
&#8230;
== replay-single test 22: open(O_CREAT), |X| unlink, replay, close (test mds_cleanup_orphans) ======== 20:33:35 (1566074015)
multiop /mnt/lustre/f22.replay-single vO_tSc
TMPPIPE=/tmp/multiop_open_wait_pipe.29190
open(O_RDWR|O_CREAT): Numerical result out of range
&#8230;
== replay-single test 23: open(O_CREAT), |X| unlink touch new, replay, close (test mds_cleanup_orphans) ====================================================================================================== 20:33:39 (1566074019)
multiop /mnt/lustre/f23.replay-single vO_tSc
TMPPIPE=/tmp/multiop_open_wait_pipe.29190
open(O_RDWR|O_CREAT): Numerical result out of range
 replay-single test_23: @@@@@@ FAIL: multiop_bg_pause /mnt/lustre/f23.replay-single failed 
&#8230;
== replay-single test 31: open(O_CREAT) two, unlink one, |X| unlink one, close two (test mds_cleanup_orphans) ====================================================================================================== 20:44:50 (1566074690)
multiop /mnt/lustre/f31.replay-single-1 vO_tSc
TMPPIPE=/tmp/multiop_open_wait_pipe.29190
open(O_RDWR|O_CREAT): Numerical result out of range
 replay-single test_31: @@@@@@ FAIL: multiop_bg_pause /mnt/lustre/f31.replay-single-1 failed 
&#8230;
== replay-single test 32: close() notices client eviction; close() after client eviction ============= 20:44:54 (1566074694)
multiop /mnt/lustre/f32.replay-single vO_c
TMPPIPE=/tmp/multiop_open_wait_pipe.29190
open(O_RDWR|O_CREAT): Numerical result out of range
 replay-single test_32: @@@@@@ FAIL: multiop_bg_pause /mnt/lustre/f32.replay-single failed 
&#8230;
== replay-single test 33a: fid seq shouldn&apos;t be reused after abort recovery ========================== 20:44:58 (1566074698)
open(/mnt/lustre/f33a.replay-single-0) error: Numerical result out of range
total: 0 open/close in 0.00 seconds: 0.00 ops/second
 replay-single test_33a: @@@@@@ FAIL: createmany create /mnt/lustre/f33a.replay-single failed 
&#8230;
== replay-single test 53b: |X| open request while two MDC requests in flight ========================= 21:35:13 (1566077713)
multiop /mnt/lustre/d53b.replay-single-1/f vO_c
TMPPIPE=/tmp/multiop_open_wait_pipe.29190
open(O_RDWR|O_CREAT): Numerical result out of range
 replay-single test_53b: @@@@@@ FAIL: multiop_bg_pause /mnt/lustre/d53b.replay-single-1/f failed 
&#8230;
== replay-single test 53c: |X| open request and close request while two MDC requests in flight ======= 21:35:17 (1566077717)
open(O_RDWR|O_CREAT): Numerical result out of range
CMD: trevis-8vm10 lctl set_param fail_loc=0x80000107
fail_loc=0x80000107
CMD: trevis-8vm10 lctl set_param fail_loc=0x80000115
fail_loc=0x80000115
/usr/lib64/lustre/tests/replay-single.sh: line 1294: kill: (15973) - No such process
&#8230;
== replay-single test 53g: |X| drop open reply and close request while close and open are both in flight ====================================================================================================== 21:39:58 (1566077998)
open(O_RDWR|O_CREAT): Numerical result out of range
CMD: trevis-8vm11 lctl set_param fail_loc=0x119
fail_loc=0x119
CMD: trevis-8vm11 lctl set_param fail_loc=0x80000115
fail_loc=0x80000115
/usr/lib64/lustre/tests/replay-single.sh: line 1437: kill: (22835) - No such process
CMD: trevis-8vm11 lctl set_param fail_loc=0
fail_loc=0
 replay-single test_53g: @@@@@@ FAIL: close_pid doesn&apos;t exist 
&#8230;
== replay-single test 53h: open request and close reply while two MDC requests in flight ============= 21:40:07 (1566078007)
open(O_RDWR|O_CREAT): Numerical result out of range
CMD: trevis-8vm11 lctl set_param fail_loc=0x80000107
fail_loc=0x80000107
CMD: trevis-8vm11 lctl set_param fail_loc=0x8000013b
fail_loc=0x8000013b
/usr/lib64/lustre/tests/replay-single.sh: line 1475: kill: (23667) - No such process
 replay-single test_53h: @@@@@@ FAIL: close_pid doesn&apos;t exist 
&#8230;
trevis-8vm7: [44191] open ./clients/client0/~dmtmp/PARADOX/__40D6B.DB failed for handle 10888 (Numerical result out of range)
trevis-8vm7: (44193) ERROR: handle 10888 was not found
trevis-8vm7: Child failed with status 1
trevis-8vm7: dbench: no process found
pdsh@trevis-8vm4: trevis-8vm7: ssh exited with exit code 1
trevis-8vm4: [44069] open ./clients/client0/~dmtmp/PARADOX/__3F2C4.DB failed for handle 10851 (Numerical result out of range)
trevis-8vm4: (44071) ERROR: handle 10851 was not found
trevis-8vm6:    1     44591     0.68 MB/sec  execute 193 sec  latency 72567.258 ms
trevis-8vm4: Child failed with status 1
trevis-8vm4: dbench: no process found
pdsh@trevis-8vm4: trevis-8vm4: ssh exited with exit code 1
trevis-8vm6: [44660] open ./clients/client0/~dmtmp/PARADOX/__50172.DB failed for handle 11004 (Numerical result out of range)
trevis-8vm6: (44662) ERROR: handle 11004 was not found
trevis-8vm6: Child failed with status 1
trevis-8vm6: dbench: no process found
&#8230;
trevis-8vm6: dbench: no process found
pdsh@trevis-8vm4: trevis-8vm6: ssh exited with exit code 1
 replay-single test_70b: @@@@@@ FAIL: dbench stopped on some of trevis-8vm4.trevis.whamcloud.com,trevis-8vm6,trevis-8vm7! 
&lt;/pre&gt;
&lt;/div&gt;&lt;/div&gt;

&lt;p&gt;Different test sessions have different failing tests. For example, tests 28, 29, 30, 31, 32, 53f, 53g, and 70b fail in the Ubuntu 18.04 test session &lt;a href=&quot;https://testing.whamcloud.com/test_sets/9742e3aa-cc8f-11e9-98c8-52540065bddc&quot; class=&quot;external-link&quot; target=&quot;_blank&quot; rel=&quot;nofollow noopener&quot;&gt;https://testing.whamcloud.com/test_sets/9742e3aa-cc8f-11e9-98c8-52540065bddc&lt;/a&gt;. &lt;/p&gt;

&lt;p&gt;Although the error messages are different, this issue may be related to &lt;a href=&quot;https://jira.whamcloud.com/browse/LU-10613&quot; title=&quot;replay-single tests 20c, 21, 23, 24, 25, 26, 30, 48, 53f, 53g, 62, 70b, 70c,  fails on open with &#8216; No space left on device&#8217;&quot; class=&quot;issue-link&quot; data-issue-key=&quot;LU-10613&quot;&gt;LU-10613&lt;/a&gt;.&lt;/p&gt;

&lt;p&gt;Logs for other failed test sessions are at:&lt;br/&gt;
&lt;a href=&quot;https://testing.whamcloud.com/test_sets/b80265a8-d8af-11e9-a25b-52540065bddc&quot; class=&quot;external-link&quot; target=&quot;_blank&quot; rel=&quot;nofollow noopener&quot;&gt;https://testing.whamcloud.com/test_sets/b80265a8-d8af-11e9-a25b-52540065bddc&lt;/a&gt;&lt;br/&gt;
&lt;a href=&quot;https://testing.whamcloud.com/test_sets/c68821fe-b9fa-11e9-97d5-52540065bddc&quot; class=&quot;external-link&quot; target=&quot;_blank&quot; rel=&quot;nofollow noopener&quot;&gt;https://testing.whamcloud.com/test_sets/c68821fe-b9fa-11e9-97d5-52540065bddc&lt;/a&gt;&lt;br/&gt;
&lt;a href=&quot;https://testing.whamcloud.com/test_sets/6615df92-d105-11e9-90ad-52540065bddc&quot; class=&quot;external-link&quot; target=&quot;_blank&quot; rel=&quot;nofollow noopener&quot;&gt;https://testing.whamcloud.com/test_sets/6615df92-d105-11e9-90ad-52540065bddc&lt;/a&gt;&lt;br/&gt;
&lt;a href=&quot;https://testing.whamcloud.com/test_sets/95922fb6-d80b-11e9-98c8-52540065bddc&quot; class=&quot;external-link&quot; target=&quot;_blank&quot; rel=&quot;nofollow noopener&quot;&gt;https://testing.whamcloud.com/test_sets/95922fb6-d80b-11e9-98c8-52540065bddc&lt;/a&gt;&lt;br/&gt;
&lt;a href=&quot;https://testing.whamcloud.com/test_sets/211af22a-ddcf-11e9-a197-52540065bddc&quot; class=&quot;external-link&quot; target=&quot;_blank&quot; rel=&quot;nofollow noopener&quot;&gt;https://testing.whamcloud.com/test_sets/211af22a-ddcf-11e9-a197-52540065bddc&lt;/a&gt;&lt;/p&gt;</description>
                <environment>RHEL8 and Ubuntu 18.04</environment>
        <key id="57023">LU-12818</key>
            <summary>replay-single test_70b and other tests fail with &#8220;Numerical result out of range&#8221; error</summary>
                <type id="1" iconUrl="https://jira.whamcloud.com/secure/viewavatar?size=xsmall&amp;avatarId=11303&amp;avatarType=issuetype">Bug</type>
                                            <priority id="4" iconUrl="https://jira.whamcloud.com/images/icons/priorities/minor.svg">Minor</priority>
                        <status id="5" iconUrl="https://jira.whamcloud.com/images/icons/statuses/resolved.png" description="A resolution has been taken, and it is awaiting verification by reporter. From here issues are either reopened, or are closed.">Resolved</status>
                    <statusCategory id="3" key="done" colorName="success"/>
                                    <resolution id="5">Cannot Reproduce</resolution>
                                        <assignee username="hongchao.zhang">Hongchao Zhang</assignee>
                                    <reporter username="jamesanunez">James Nunez</reporter>
                        <labels>
                            <label>rhel8</label>
                            <label>ubuntu18</label>
                            <label>zfs</label>
                    </labels>
                <created>Fri, 27 Sep 2019 19:39:24 +0000</created>
                <updated>Thu, 31 Mar 2022 15:20:08 +0000</updated>
                            <resolved>Thu, 31 Mar 2022 15:20:08 +0000</resolved>
                                    <version>Lustre 2.13.0</version>
                    <version>Lustre 2.14.0</version>
                    <version>Lustre 2.12.4</version>
                                                        <due></due>
                            <votes>0</votes>
                                    <watches>6</watches>
                                                                            <comments>
                            <comment id="257570" author="pjones" created="Sun, 3 Nov 2019 14:09:34 +0000"  >&lt;p&gt;Hongchao&lt;/p&gt;

&lt;p&gt;Can you please advise&lt;/p&gt;

&lt;p&gt;Peter&lt;/p&gt;</comment>
                            <comment id="258345" author="gerrit" created="Fri, 15 Nov 2019 03:24:04 +0000"  >&lt;p&gt;Hongchao Zhang (hongchao@whamcloud.com) uploaded a new patch: &lt;a href=&quot;https://review.whamcloud.com/36762&quot; class=&quot;external-link&quot; target=&quot;_blank&quot; rel=&quot;nofollow noopener&quot;&gt;https://review.whamcloud.com/36762&lt;/a&gt;&lt;br/&gt;
Subject: &lt;a href=&quot;https://jira.whamcloud.com/browse/LU-12818&quot; title=&quot;replay-single test_70b and other tests fail with &#8220;Numerical result out of range&#8221; error&quot; class=&quot;issue-link&quot; data-issue-key=&quot;LU-12818&quot;&gt;&lt;del&gt;LU-12818&lt;/del&gt;&lt;/a&gt; test: debug patch&lt;br/&gt;
Project: fs/lustre-release&lt;br/&gt;
Branch: master&lt;br/&gt;
Current Patch Set: 1&lt;br/&gt;
Commit: 6ca3d6684d7c6adbc812a1c84c9afa8c8fcc7050&lt;/p&gt;</comment>
                            <comment id="262324" author="adilger" created="Fri, 31 Jan 2020 17:36:37 +0000"  >&lt;p&gt;I just saw this failure recently for a patch I was looking at, but I didn&apos;t find this bug at the time, so +1 but I can&apos;t link to the test results.&lt;/p&gt;</comment>
                            <comment id="289029" author="jhammond" created="Fri, 8 Jan 2021 13:57:28 +0000"  >&lt;p&gt;&lt;a href=&quot;https://jira.whamcloud.com/secure/ViewProfile.jspa?name=ys&quot; class=&quot;user-hover&quot; rel=&quot;ys&quot;&gt;ys&lt;/a&gt;&apos;s comment from &lt;a href=&quot;https://jira.whamcloud.com/browse/LU-13995&quot; title=&quot;conf-sanity_112: Numerical result out of range&quot; class=&quot;issue-link&quot; data-issue-key=&quot;LU-13995&quot;&gt;&lt;del&gt;LU-13995&lt;/del&gt;&lt;/a&gt;. &lt;/p&gt;

&lt;p&gt;Looks like we need to ensure ost0 is FULL before creating on it?&lt;/p&gt;
&lt;div class=&quot;preformatted panel&quot; style=&quot;border-width: 1px;&quot;&gt;&lt;div class=&quot;preformattedContent panelContent&quot;&gt;
&lt;pre&gt;
---client side

00010000:00000001:1.0:1603787411.358695:0:25028:0:(ldlm_lock.c:249:ldlm_lock_put()) Process leaving
00010000:00000001:1.0:1603787411.358696:0:25028:0:(ldlm_request.c:1136:ldlm_cli_enqueue()) Process leaving (rc=301 : 301 : 12d)
00000002:00000001:1.0:1603787411.358697:0:25028:0:(mdc_locks.c:686:mdc_finish_enqueue()) Process entered
00000002:00100000:1.0:1603787411.358698:0:25028:0:(mdc_locks.c:741:mdc_finish_enqueue()) @@@ op=1 disposition=3, status=-34  req@ffff8b1a79d88d80 x1681692846730048/t0(0) o101-&amp;gt;lustre-MDT0000-mdc-ffff8b1a7a48f000@10.9.7.66@tcp:12/10 lens 648/600 e 0 to 0 dl 1603787418 ref 1 fl Complete:RQU/0/0 rc 301/301 job:&apos;lfs.0&apos;

--server side

00020000:00000001:1.0:1603787411.359031:0:23957:0:(lod_qos.c:108:lod_statfs_and_check()) Process entered
00000004:00000001:1.0:1603787411.359032:0:23957:0:(osp_dev.c:774:osp_statfs()) Process entered
00000004:00000001:1.0:1603787411.359033:0:23957:0:(osp_dev.c:780:osp_statfs()) Process leaving (rc=18446744073709551509 : -107 : ffffffffffffff95)
00020000:01000000:1.0:1603787411.359034:0:23957:0:(lod_qos.c:143:lod_statfs_and_check()) lustre-OST0000-osc-MDT0000: turns inactive
00020000:00000001:1.0:1603787411.359034:0:23957:0:(lod_qos.c:171:lod_statfs_and_check()) Process leaving (rc=18446744073709551495 : -121 : ffffffffffffff87)
00020000:00000001:1.0:1603787411.359035:0:23957:0:(lod_qos.c:108:lod_statfs_and_check()) Process entered
00000004:00000001:1.0:1603787411.359036:0:23957:0:(osp_dev.c:774:osp_statfs()) Process entered
00000004:00001000:1.0:1603787411.359037:0:23957:0:(osp_dev.c:797:osp_statfs()) lustre-OST0001-osc-MDT0000: 95744 blocks, 94976 free, 84480 avail, 4096 bsize, 1 reserved mb low, 3 reserved mb high, 12231 files, 11872 free files
00000004:00000001:1.0:1603787411.359039:0:23957:0:(osp_dev.c:816:osp_statfs()) Process leaving (rc=0 : 0 : 0)
00020000:01000000:1.0:1603787411.359040:0:23957:0:(lod_qos.c:143:lod_statfs_and_check()) lustre-OST0001-osc-MDT0000: turns inactive
00020000:00000001:1.0:1603787411.359040:0:23957:0:(lod_qos.c:171:lod_statfs_and_check()) Process leaving (rc=18446744073709551511 : -105 : ffffffffffffff97)
00020000:00000001:1.0:1603787411.359041:0:23957:0:(lod_qos.c:232:lod_qos_statfs_update()) Process leaving
00020000:00000001:1.0:1603787411.359042:0:23957:0:(lod_qos.c:2509:lod_qos_prep_create()) Process leaving via out (rc=18446744073709551582 : -34 : 0xffffffffffffffde)
00020000:00000001:1.0:1603787411.359043:0:23957:0:(lod_qos.c:2607:lod_qos_prep_create()) Process leaving (rc=18446744073709551582 : -34 : ffffffffffffffde)
00020000:00000001:1.0:1603787411.359044:0:23957:0:(lod_qos.c:2663:lod_prepare_create()) Process leaving (rc=18446744073709551582 : -34 : ffffffffffffffde)
00000004:00000001:1.0:1603787411.359045:0:23957:0:(lod_object.c:5474:lod_declare_striped_create()) Process leaving via out (rc=18446744073709551582 : -34 : 0xffffffffffffffde)
00000004:00000010:1.0:1603787411.359046:0:23957:0:(lod_lov.c:492:lod_free_comp_buffer()) kfreed &apos;entries&apos;: 120 at ffff93c7af951f80.
00000004:00000001:1.0:1603787411.359047:0:23957:0:(lod_object.c:5509:lod_declare_striped_create()) Process leaving (rc=18446744073709551582 : -34 : ffffffffffffffde)
00000004:00000001:1.0:1603787411.359048:0:23957:0:(lod_object.c:3491:lod_declare_xattr_set()) Process leaving (rc=18446744073709551582 : -34 : ffffffffffffffde)
00000004:00000001:1.0:1603787411.359049:0:23957:0:(mdd_dir.c:1923:mdd_create_data()) Process leaving via stop (rc=18446744073709551582 : -34 : 0xffffffffffffffde)

&lt;/pre&gt;
&lt;/div&gt;&lt;/div&gt;</comment>
                            <comment id="289032" author="jhammond" created="Fri, 8 Jan 2021 13:58:17 +0000"  >&lt;p&gt;&lt;a href=&quot;https://testing.whamcloud.com/test_sets/a242a505-11a4-4c88-9317-21d1ce234094&quot; class=&quot;external-link&quot; target=&quot;_blank&quot; rel=&quot;nofollow noopener&quot;&gt;https://testing.whamcloud.com/test_sets/a242a505-11a4-4c88-9317-21d1ce234094&lt;/a&gt;&lt;/p&gt;</comment>
                            <comment id="290028" author="adilger" created="Thu, 21 Jan 2021 13:54:32 +0000"  >&lt;p&gt;+1 on master &lt;a href=&quot;https://testing.whamcloud.com/test_sets/9072748b-8448-46af-a2df-0e006ed552f1&quot; class=&quot;external-link&quot; target=&quot;_blank&quot; rel=&quot;nofollow noopener&quot;&gt;https://testing.whamcloud.com/test_sets/9072748b-8448-46af-a2df-0e006ed552f1&lt;/a&gt;&lt;/p&gt;</comment>
                            <comment id="290341" author="adilger" created="Tue, 26 Jan 2021 02:23:06 +0000"  >&lt;p&gt;There is a patch: &lt;a href=&quot;https://review.whamcloud.com/37393&quot; class=&quot;external-link&quot; target=&quot;_blank&quot; rel=&quot;nofollow noopener&quot;&gt;https://review.whamcloud.com/37393&lt;/a&gt; &quot;&lt;tt&gt;&lt;a href=&quot;https://jira.whamcloud.com/browse/LU-13184&quot; title=&quot;conf-sanity test_112: problem creating f112.conf-sanity.0 on OST0000&quot; class=&quot;issue-link&quot; data-issue-key=&quot;LU-13184&quot;&gt;&lt;del&gt;LU-13184&lt;/del&gt;&lt;/a&gt; tests: wait for OST startup in test_112&lt;/tt&gt;&quot; which may fix this problem.&lt;/p&gt;</comment>
                    </comments>
                <issuelinks>
                            <issuelinktype id="10010">
                    <name>Duplicate</name>
                                                                <inwardlinks description="is duplicated by">
                                        <issuelink>
            <issuekey id="60946">LU-13995</issuekey>
        </issuelink>
                            </inwardlinks>
                                    </issuelinktype>
                            <issuelinktype id="10011">
                    <name>Related</name>
                                            <outwardlinks description="is related to ">
                                        <issuelink>
            <issuekey id="58029">LU-13221</issuekey>
        </issuelink>
            <issuelink>
            <issuekey id="60085">LU-13813</issuekey>
        </issuelink>
                            </outwardlinks>
                                                                <inwardlinks description="is related to">
                                        <issuelink>
            <issuekey id="56494">LU-12589</issuekey>
        </issuelink>
            <issuelink>
            <issuekey id="57959">LU-13184</issuekey>
        </issuelink>
                            </inwardlinks>
                                    </issuelinktype>
                    </issuelinks>
                <attachments>
                    </attachments>
                <subtasks>
                    </subtasks>
                <customfields>
                    <customfield id="customfield_10890" key="com.atlassian.jira.plugins.jira-development-integration-plugin:devsummary">
                        <customfieldname>Development</customfieldname>
                        <customfieldvalues>
                            
                        </customfieldvalues>
                    </customfield>
                    <customfield id="customfield_10390" key="com.pyxis.greenhopper.jira:gh-lexo-rank">
                        <customfieldname>Rank</customfieldname>
                        <customfieldvalues>
                            <customfieldvalue>1|i00ngf:</customfieldvalue>

                        </customfieldvalues>
                    </customfield>
                    <customfield id="customfield_10090" key="com.pyxis.greenhopper.jira:gh-global-rank">
                        <customfieldname>Rank (Obsolete)</customfieldname>
                        <customfieldvalues>
                            <customfieldvalue>9223372036854775807</customfieldvalue>
                        </customfieldvalues>
                    </customfield>
                    <customfield id="customfield_10060" key="com.atlassian.jira.plugin.system.customfieldtypes:select">
                        <customfieldname>Severity</customfieldname>
                        <customfieldvalues>
                                <customfieldvalue key="10022"><![CDATA[3]]></customfieldvalue>

                        </customfieldvalues>
                    </customfield>
                </customfields>
    </item>
</channel>
</rss>