<!-- 
RSS generated by JIRA (9.4.14#940014-sha1:734e6822bbf0d45eff9af51f82432957f73aa32c) at Sat Feb 10 02:07:20 UTC 2024

It is possible to restrict the fields that are returned in this document by specifying the 'field' parameter in your request.
For example, to request only the issue key and summary append 'field=key&field=summary' to the URL of your request.
-->
<rss version="0.92" >
<channel>
    <title>Whamcloud Community JIRA</title>
    <link>https://jira.whamcloud.com</link>
    <description>This file is an XML representation of an issue</description>
<language>en-us</language>
    <build-info>
        <version>9.4.14</version>
        <build-number>940014</build-number>
        <build-date>05-12-2023</build-date>
    </build-info>


<item>
            <title>[LU-7256] sanity-lfsck TIMEOUT on umount /mnt/mds4</title>
                <link>https://jira.whamcloud.com/browse/LU-7256</link>
                <project id="10000" key="LU">Lustre</project>
                    <description>&lt;p&gt;After all the tests in sanity-lfsck run, the suite hangs at umount for /mnt/mds4. All subtests are marked as PASS. Logs are at &lt;a href=&quot;https://testing.hpdd.intel.com/test_sets/fb298f72-6a05-11e5-9d0a-5254006e85c2&quot; class=&quot;external-link&quot; target=&quot;_blank&quot; rel=&quot;nofollow noopener&quot;&gt;https://testing.hpdd.intel.com/test_sets/fb298f72-6a05-11e5-9d0a-5254006e85c2&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;In the suite_stdout log, the last thing we see is&lt;/p&gt;
&lt;div class=&quot;preformatted panel&quot; style=&quot;border-width: 1px;&quot;&gt;&lt;div class=&quot;preformattedContent panelContent&quot;&gt;
&lt;pre&gt;13:48:04:CMD: shadow-23vm8 grep -c /mnt/mds4&apos; &apos; /proc/mounts
13:48:04:Stopping /mnt/mds4 (opts:-f) on shadow-23vm8
13:48:04:CMD: shadow-23vm8 umount -d -f /mnt/mds4
14:47:53:********** Timeout by autotest system **********
&lt;/pre&gt;
&lt;/div&gt;&lt;/div&gt;

&lt;p&gt;Looking at the logs for the last test run, sanity-lfsck test_31h, it&#8217;s clear that something is not functioning correctly, though exactly what went wrong isn&#8217;t clear to me. From the MDS2, 3, and 4 consoles, there are more problems communicating between the MDTs than is normal for this test:&lt;/p&gt;
&lt;div class=&quot;preformatted panel&quot; style=&quot;border-width: 1px;&quot;&gt;&lt;div class=&quot;preformattedContent panelContent&quot;&gt;
&lt;pre&gt;14:23:44:Lustre: lustre-MDT0003: Not available for connect from 10.1.5.32@tcp (stopping)
14:23:44:Lustre: Skipped 226 previous similar messages
14:23:44:LustreError: 137-5: lustre-MDT0001_UUID: not available for connect from 10.1.5.32@tcp (no target). If you are running an HA pair check that the target is mounted on the other server.
14:23:44:LustreError: Skipped 585 previous similar messages
14:23:44:Lustre: 4063:0:(client.c:2092:ptlrpc_expire_one_request()) @@@ Request sent has timed out for slow reply: [sent 1443880351/real 1443880351]  req@ffff880058aa2080 x1514012107003420/t0(0) o250-&amp;gt;MGC10.1.5.33@tcp@10.1.5.33@tcp:26/25 lens 520/544 e 0 to 1 dl 1443880376 ref 1 fl Rpc:XN/0/ffffffff rc 0/-1
14:23:44:Lustre: 4063:0:(client.c:2092:ptlrpc_expire_one_request()) Skipped 50 previous similar messages
14:23:44:Lustre: lustre-MDT0003: Not available for connect from 10.1.5.32@tcp (stopping)
14:23:44:Lustre: Skipped 475 previous similar messages
14:23:44:LustreError: 137-5: lustre-MDT0001_UUID: not available for connect from 10.1.5.32@tcp (no target). If you are running an HA pair check that the target is mounted on the other server.
14:23:44:LustreError: Skipped 1191 previous similar messages
14:23:44:Lustre: 4063:0:(client.c:2092:ptlrpc_expire_one_request()) @@@ Request sent has timed out for slow reply: [sent 1443880961/real 1443880961]  req@ffff880058a93cc0 x1514012107006512/t0(0) o38-&amp;gt;lustre-MDT0000-osp-MDT0003@10.1.5.33@tcp:24/4 lens 520/544 e 0 to 1 dl 1443880986 ref 1 fl Rpc:XN/0/ffffffff rc 0/-1
14:23:44:Lustre: 4063:0:(client.c:2092:ptlrpc_expire_one_request()) Skipped 40 previous similar messages
14:23:44:Lustre: lustre-MDT0003: Not available for connect from 10.1.5.32@tcp (stopping)
14:23:44:Lustre: Skipped 475 previous similar messages
14:23:44:LustreError: 137-5: lustre-MDT0001_UUID: not available for connect from 10.1.5.32@tcp (no target). If you are running an HA pair check that the target is mounted on the other server.
14:23:44:LustreError: Skipped 1191 previous similar messages
14:23:44:Lustre: 4063:0:(client.c:2092:ptlrpc_expire_one_request()) @@@ Request sent has timed out for slow reply: [sent 1443881581/real 1443881581]  req@ffff88007cb1fc80 x1514012107009652/t0(0) o250-&amp;gt;MGC10.1.5.33@tcp@10.1.5.33@tcp:26/25 lens 520/544 e 0 to 1 dl 1443881606 ref 1 fl Rpc:XN/0/ffffffff rc 0/-1
14:23:44:Lustre: 4063:0:(client.c:2092:ptlrpc_expire_one_request()) Skipped 40 previous similar messages
14:23:44:Lustre: lustre-MDT0003: Not available for connect from 10.1.5.32@tcp (stopping)
14:23:44:Lustre: Skipped 475 previous similar messages
14:23:44:LustreError: 137-5: lustre-MDT0001_UUID: not available for connect from 10.1.5.32@tcp (no target). If you are running an HA pair check that the target is mounted on the other server.
14:23:44:LustreError: Skipped 1189 previous similar messages
14:23:44:Lustre: 4063:0:(client.c:2092:ptlrpc_expire_one_request()) @@@ Request sent has timed out for slow reply: [sent 1443882191/real 1443882191]  req@ffff88006705b980 x1514012107012736/t0(0) o38-&amp;gt;lustre-MDT0000-osp-MDT0003@10.1.5.33@tcp:24/4 lens 520/544 e 0 to 1 dl 1443882216 ref 1 fl Rpc:XN/0/ffffffff rc 0/-1
&lt;/pre&gt;
&lt;/div&gt;&lt;/div&gt;

&lt;p&gt;There are also stack traces in the console logs for all nodes for test 31h, but there are no Lustre function calls anywhere close to the top of the stack traces.&lt;/p&gt;

&lt;p&gt;There have been three recent occurrences of this issue, with logs at:&lt;br/&gt;
&lt;a href=&quot;https://testing.hpdd.intel.com/test_sets/42501ed6-69be-11e5-9fbf-5254006e85c2&quot; class=&quot;external-link&quot; target=&quot;_blank&quot; rel=&quot;nofollow noopener&quot;&gt;https://testing.hpdd.intel.com/test_sets/42501ed6-69be-11e5-9fbf-5254006e85c2&lt;/a&gt;&lt;br/&gt;
&lt;a href=&quot;https://testing.hpdd.intel.com/test_sets/60742a32-6a19-11e5-9fbf-5254006e85c2&quot; class=&quot;external-link&quot; target=&quot;_blank&quot; rel=&quot;nofollow noopener&quot;&gt;https://testing.hpdd.intel.com/test_sets/60742a32-6a19-11e5-9fbf-5254006e85c2&lt;/a&gt;&lt;br/&gt;
&lt;a href=&quot;https://testing.hpdd.intel.com/test_sets/fb298f72-6a05-11e5-9d0a-5254006e85c2&quot; class=&quot;external-link&quot; target=&quot;_blank&quot; rel=&quot;nofollow noopener&quot;&gt;https://testing.hpdd.intel.com/test_sets/fb298f72-6a05-11e5-9d0a-5254006e85c2&lt;/a&gt;&lt;br/&gt;
all on 2015-10-03 in review-dne-part-2. &lt;/p&gt;</description>
                <environment>review-dne-part-2 in autotest</environment>
        <key id="32504">LU-7256</key>
            <summary>sanity-lfsck TIMEOUT on umount /mnt/mds4</summary>
                <type id="1" iconUrl="https://jira.whamcloud.com/secure/viewavatar?size=xsmall&amp;avatarId=11303&amp;avatarType=issuetype">Bug</type>
                                            <priority id="2" iconUrl="https://jira.whamcloud.com/images/icons/priorities/critical.svg">Critical</priority>
                        <status id="5" iconUrl="https://jira.whamcloud.com/images/icons/statuses/resolved.png" description="A resolution has been taken, and it is awaiting verification by reporter. From here issues are either reopened, or are closed.">Resolved</status>
                    <statusCategory id="3" key="done" colorName="success"/>
                                    <resolution id="1">Fixed</resolution>
                                        <assignee username="yong.fan">nasf</assignee>
                                    <reporter username="jamesanunez">James Nunez</reporter>
                        <labels>
                    </labels>
                <created>Mon, 5 Oct 2015 23:54:07 +0000</created>
                <updated>Tue, 28 Feb 2017 16:49:57 +0000</updated>
                            <resolved>Mon, 14 Mar 2016 03:28:12 +0000</resolved>
                                    <version>Lustre 2.8.0</version>
                                    <fixVersion>Lustre 2.9.0</fixVersion>
                                        <due></due>
                            <votes>0</votes>
                                    <watches>5</watches>
                                                                            <comments>
                            <comment id="130266" author="standan" created="Tue, 13 Oct 2015 18:46:18 +0000"  >&lt;p&gt;Another instance, but in this case it timed out on umount of /mnt/ost7:&lt;br/&gt;
&lt;a href=&quot;https://testing.hpdd.intel.com/test_sets/9b283adc-6dee-11e5-b960-5254006e85c2&quot; class=&quot;external-link&quot; target=&quot;_blank&quot; rel=&quot;nofollow noopener&quot;&gt;https://testing.hpdd.intel.com/test_sets/9b283adc-6dee-11e5-b960-5254006e85c2&lt;/a&gt;&lt;/p&gt;</comment>
                            <comment id="130762" author="jamesanunez" created="Mon, 19 Oct 2015 15:26:59 +0000"  >&lt;p&gt;sanity-lfsck is timing out on unmount of mds4 frequently. Here are some recent logs:&lt;br/&gt;
2015-10-12 06:00:57 - &lt;a href=&quot;https://testing.hpdd.intel.com/test_sets/e61e890c-70db-11e5-95b7-5254006e85c2&quot; class=&quot;external-link&quot; target=&quot;_blank&quot; rel=&quot;nofollow noopener&quot;&gt;https://testing.hpdd.intel.com/test_sets/e61e890c-70db-11e5-95b7-5254006e85c2&lt;/a&gt;&lt;br/&gt;
2015-10-13 21:47:32 - &lt;a href=&quot;https://testing.hpdd.intel.com/test_sets/efa762d4-7228-11e5-b344-5254006e85c2&quot; class=&quot;external-link&quot; target=&quot;_blank&quot; rel=&quot;nofollow noopener&quot;&gt;https://testing.hpdd.intel.com/test_sets/efa762d4-7228-11e5-b344-5254006e85c2&lt;/a&gt;&lt;br/&gt;
2015-10-16 18:41:32 - &lt;a href=&quot;https://testing.hpdd.intel.com/test_sets/3876f938-7487-11e5-b47d-5254006e85c2&quot; class=&quot;external-link&quot; target=&quot;_blank&quot; rel=&quot;nofollow noopener&quot;&gt;https://testing.hpdd.intel.com/test_sets/3876f938-7487-11e5-b47d-5254006e85c2&lt;/a&gt;&lt;br/&gt;
2015-10-16 20:01:49 - &lt;a href=&quot;https://testing.hpdd.intel.com/test_sets/d52a8c14-7477-11e5-8f32-5254006e85c2&quot; class=&quot;external-link&quot; target=&quot;_blank&quot; rel=&quot;nofollow noopener&quot;&gt;https://testing.hpdd.intel.com/test_sets/d52a8c14-7477-11e5-8f32-5254006e85c2&lt;/a&gt;&lt;br/&gt;
2015-10-17 22:08:38 - &lt;a href=&quot;https://testing.hpdd.intel.com/test_sets/a70cb296-7551-11e5-b12f-5254006e85c2&quot; class=&quot;external-link&quot; target=&quot;_blank&quot; rel=&quot;nofollow noopener&quot;&gt;https://testing.hpdd.intel.com/test_sets/a70cb296-7551-11e5-b12f-5254006e85c2&lt;/a&gt;&lt;br/&gt;
2015-10-26 12:08:40 - &lt;a href=&quot;https://testing.hpdd.intel.com/test_sets/5a92f46c-7c12-11e5-88cf-5254006e85c2&quot; class=&quot;external-link&quot; target=&quot;_blank&quot; rel=&quot;nofollow noopener&quot;&gt;https://testing.hpdd.intel.com/test_sets/5a92f46c-7c12-11e5-88cf-5254006e85c2&lt;/a&gt;&lt;br/&gt;
2015-10-28 16:28:57 - &lt;a href=&quot;https://testing.hpdd.intel.com/test_sets/6070d91c-7dca-11e5-bca9-5254006e85c2&quot; class=&quot;external-link&quot; target=&quot;_blank&quot; rel=&quot;nofollow noopener&quot;&gt;https://testing.hpdd.intel.com/test_sets/6070d91c-7dca-11e5-bca9-5254006e85c2&lt;/a&gt;&lt;br/&gt;
2015-11-02 22:50:30 - &lt;a href=&quot;https://testing.hpdd.intel.com/test_sets/52c3408c-81eb-11e5-846d-5254006e85c2&quot; class=&quot;external-link&quot; target=&quot;_blank&quot; rel=&quot;nofollow noopener&quot;&gt;https://testing.hpdd.intel.com/test_sets/52c3408c-81eb-11e5-846d-5254006e85c2&lt;/a&gt;&lt;br/&gt;
2015-11-10 11:55:14 - &lt;a href=&quot;https://testing.hpdd.intel.com/test_sets/d6087414-87e8-11e5-b19e-5254006e85c2&quot; class=&quot;external-link&quot; target=&quot;_blank&quot; rel=&quot;nofollow noopener&quot;&gt;https://testing.hpdd.intel.com/test_sets/d6087414-87e8-11e5-b19e-5254006e85c2&lt;/a&gt;&lt;br/&gt;
2015-11-11 08:48:35 - &lt;a href=&quot;https://testing.hpdd.intel.com/test_sets/2ecb581e-8889-11e5-9053-5254006e85c2&quot; class=&quot;external-link&quot; target=&quot;_blank&quot; rel=&quot;nofollow noopener&quot;&gt;https://testing.hpdd.intel.com/test_sets/2ecb581e-8889-11e5-9053-5254006e85c2&lt;/a&gt;&lt;br/&gt;
2015-11-15 22:30:59 - &lt;a href=&quot;https://testing.hpdd.intel.com/test_sets/20febbe6-8c20-11e5-ae56-5254006e85c2&quot; class=&quot;external-link&quot; target=&quot;_blank&quot; rel=&quot;nofollow noopener&quot;&gt;https://testing.hpdd.intel.com/test_sets/20febbe6-8c20-11e5-ae56-5254006e85c2&lt;/a&gt;&lt;br/&gt;
2015-11-15 23:51:07 - &lt;a href=&quot;https://testing.hpdd.intel.com/test_sets/aa5507a0-8c2b-11e5-ae56-5254006e85c2&quot; class=&quot;external-link&quot; target=&quot;_blank&quot; rel=&quot;nofollow noopener&quot;&gt;https://testing.hpdd.intel.com/test_sets/aa5507a0-8c2b-11e5-ae56-5254006e85c2&lt;/a&gt;&lt;br/&gt;
2015-12-06 22:52:07 - &lt;a href=&quot;https://testing.hpdd.intel.com/test_sets/7b00656a-9ca4-11e5-8e88-5254006e85c2&quot; class=&quot;external-link&quot; target=&quot;_blank&quot; rel=&quot;nofollow noopener&quot;&gt;https://testing.hpdd.intel.com/test_sets/7b00656a-9ca4-11e5-8e88-5254006e85c2&lt;/a&gt;&lt;/p&gt;</comment>
                            <comment id="134851" author="gerrit" created="Tue, 1 Dec 2015 14:35:02 +0000"  >&lt;p&gt;Fan Yong (fan.yong@intel.com) uploaded a new patch: &lt;a href=&quot;http://review.whamcloud.com/17406&quot; class=&quot;external-link&quot; target=&quot;_blank&quot; rel=&quot;nofollow noopener&quot;&gt;http://review.whamcloud.com/17406&lt;/a&gt;&lt;br/&gt;
Subject: &lt;a href=&quot;https://jira.whamcloud.com/browse/LU-7256&quot; title=&quot;sanity-lfsck TIMEOUT on umount /mnt/mds4&quot; class=&quot;issue-link&quot; data-issue-key=&quot;LU-7256&quot;&gt;&lt;del&gt;LU-7256&lt;/del&gt;&lt;/a&gt; tests: wait LFSCK to exit before next test&lt;br/&gt;
Project: fs/lustre-release&lt;br/&gt;
Branch: master&lt;br/&gt;
Current Patch Set: 1&lt;br/&gt;
Commit: 8df8ab7ce104f873ed74bbf59355b7225b42d07f&lt;/p&gt;</comment>
                            <comment id="137661" author="jamesanunez" created="Wed, 30 Dec 2015 17:33:32 +0000"  >&lt;p&gt;More failures on master:&lt;br/&gt;
2015-12-29 04:52:47 - &lt;a href=&quot;https://testing.hpdd.intel.com/test_sets/12276386-ae1a-11e5-aa1f-5254006e85c2&quot; class=&quot;external-link&quot; target=&quot;_blank&quot; rel=&quot;nofollow noopener&quot;&gt;https://testing.hpdd.intel.com/test_sets/12276386-ae1a-11e5-aa1f-5254006e85c2&lt;/a&gt;&lt;br/&gt;
2016-01-06 02:25:20 - &lt;a href=&quot;https://testing.hpdd.intel.com/test_sets/da54d49c-b464-11e5-aa1f-5254006e85c2&quot; class=&quot;external-link&quot; target=&quot;_blank&quot; rel=&quot;nofollow noopener&quot;&gt;https://testing.hpdd.intel.com/test_sets/da54d49c-b464-11e5-aa1f-5254006e85c2&lt;/a&gt;&lt;br/&gt;
2016-01-11 12:36:38 - &lt;a href=&quot;https://testing.hpdd.intel.com/test_sets/9fc31ea2-b896-11e5-8c15-5254006e85c2&quot; class=&quot;external-link&quot; target=&quot;_blank&quot; rel=&quot;nofollow noopener&quot;&gt;https://testing.hpdd.intel.com/test_sets/9fc31ea2-b896-11e5-8c15-5254006e85c2&lt;/a&gt;&lt;br/&gt;
2016-01-20 12:27:31 - &lt;a href=&quot;https://testing.hpdd.intel.com/test_sets/f0ef1be0-bf9f-11e5-8f04-5254006e85c2&quot; class=&quot;external-link&quot; target=&quot;_blank&quot; rel=&quot;nofollow noopener&quot;&gt;https://testing.hpdd.intel.com/test_sets/f0ef1be0-bf9f-11e5-8f04-5254006e85c2&lt;/a&gt;&lt;/p&gt;</comment>
                            <comment id="139434" author="rhenwood" created="Wed, 20 Jan 2016 14:22:56 +0000"  >&lt;p&gt;another failure on review-dne-part-2:&lt;/p&gt;

&lt;p&gt;&lt;a href=&quot;https://testing.hpdd.intel.com/test_sessions/30aec778-bf4e-11e5-8f04-5254006e85c2&quot; class=&quot;external-link&quot; target=&quot;_blank&quot; rel=&quot;nofollow noopener&quot;&gt;https://testing.hpdd.intel.com/test_sessions/30aec778-bf4e-11e5-8f04-5254006e85c2&lt;/a&gt;&lt;/p&gt;</comment>
                            <comment id="139435" author="rhenwood" created="Wed, 20 Jan 2016 14:26:16 +0000"  >&lt;p&gt;Also, review-dne-part-2+sanity-lfsck have caused 2 of the last 10 canaries to fail:&lt;/p&gt;</comment>

&lt;p&gt;&lt;a href=&quot;https://wiki.hpdd.intel.com/display/~rhenwood/Canary+patch&quot; class=&quot;external-link&quot; target=&quot;_blank&quot; rel=&quot;nofollow noopener&quot;&gt;https://wiki.hpdd.intel.com/display/~rhenwood/Canary+patch&lt;/a&gt;&lt;/p&gt;</comment>
                            <comment id="139599" author="yong.fan" created="Thu, 21 Jan 2016 16:52:11 +0000"  >&lt;p&gt;The patch &lt;a href=&quot;http://review.whamcloud.com/#/c/17406/&quot; class=&quot;external-link&quot; target=&quot;_blank&quot; rel=&quot;nofollow noopener&quot;&gt;http://review.whamcloud.com/#/c/17406/&lt;/a&gt; has been refreshed. Please try with that.&lt;/p&gt;</comment>
                            <comment id="145338" author="gerrit" created="Sun, 13 Mar 2016 06:25:49 +0000"  >&lt;p&gt;Oleg Drokin (oleg.drokin@intel.com) merged in patch &lt;a href=&quot;http://review.whamcloud.com/17406/&quot; class=&quot;external-link&quot; target=&quot;_blank&quot; rel=&quot;nofollow noopener&quot;&gt;http://review.whamcloud.com/17406/&lt;/a&gt;&lt;br/&gt;
Subject: &lt;a href=&quot;https://jira.whamcloud.com/browse/LU-7256&quot; title=&quot;sanity-lfsck TIMEOUT on umount /mnt/mds4&quot; class=&quot;issue-link&quot; data-issue-key=&quot;LU-7256&quot;&gt;&lt;del&gt;LU-7256&lt;/del&gt;&lt;/a&gt; tests: wait current LFSCK to exit before next test&lt;br/&gt;
Project: fs/lustre-release&lt;br/&gt;
Branch: master&lt;br/&gt;
Current Patch Set: &lt;br/&gt;
Commit: 6871453b053d5756aba2321122db4564df5c1c57&lt;/p&gt;</comment>
                            <comment id="145383" author="yong.fan" created="Mon, 14 Mar 2016 03:28:12 +0000"  >&lt;p&gt;The patch has been landed to master.&lt;/p&gt;</comment>
                    </comments>
                <issuelinks>
                            <issuelinktype id="10010">
                    <name>Duplicate</name>
                                                                <inwardlinks description="is duplicated by">
                                        <issuelink>
            <issuekey id="34772">LU-7793</issuekey>
        </issuelink>
                            </inwardlinks>
                                    </issuelinktype>
                            <issuelinktype id="10011">
                    <name>Related</name>
                                            <outwardlinks description="is related to ">
                                                        </outwardlinks>
                                                                <inwardlinks description="is related to">
                                        <issuelink>
            <issuekey id="32355">LU-7221</issuekey>
        </issuelink>
            <issuelink>
            <issuekey id="34036">LU-7648</issuekey>
        </issuelink>
                            </inwardlinks>
                                    </issuelinktype>
                    </issuelinks>
                <attachments>
                    </attachments>
                <subtasks>
                    </subtasks>
                <customfields>
                    <customfield id="customfield_10890" key="com.atlassian.jira.plugins.jira-development-integration-plugin:devsummary">
                        <customfieldname>Development</customfieldname>
                        <customfieldvalues>
                            
                        </customfieldvalues>
                    </customfield>
                    <customfield id="customfield_10390" key="com.pyxis.greenhopper.jira:gh-lexo-rank">
                        <customfieldname>Rank</customfieldname>
                        <customfieldvalues>
                            <customfieldvalue>1|hzxpnj:</customfieldvalue>

                        </customfieldvalues>
                    </customfield>
                    <customfield id="customfield_10090" key="com.pyxis.greenhopper.jira:gh-global-rank">
                        <customfieldname>Rank (Obsolete)</customfieldname>
                        <customfieldvalues>
                            <customfieldvalue>9223372036854775807</customfieldvalue>
                        </customfieldvalues>
                    </customfield>
                    <customfield id="customfield_10060" key="com.atlassian.jira.plugin.system.customfieldtypes:select">
                        <customfieldname>Severity</customfieldname>
                        <customfieldvalues>
                                <customfieldvalue key="10022"><![CDATA[3]]></customfieldvalue>

                        </customfieldvalues>
                    </customfield>
                </customfields>
    </item>
</channel>
</rss>