[LU-3741] SLES11SP2 server, failure on test suite sanityn test_16 Created: 13/Aug/13  Updated: 09/Jan/20  Resolved: 09/Jan/20

Status: Resolved
Project: Lustre
Component/s: None
Affects Version/s: Lustre 2.6.0
Fix Version/s: None

Type: Bug Priority: Critical
Reporter: Maloo Assignee: Bob Glossman (Inactive)
Resolution: Low Priority Votes: 0
Labels: ssf

Issue Links:
Related
is related to LU-3743 SLES11SP2 server,Failure on test suit... Resolved
Severity: 3
Rank (Obsolete): 9658

 Description   

This issue was created by maloo for sarah <sarah@whamcloud.com>

This issue relates to the following test suite run: http://maloo.whamcloud.com/test_sets/10fb6bd4-00c4-11e3-b06c-52540035b04c.

The sub-test test_16 failed with the following error:

test_16 returned 1

Info required for matching: sanityn 16



 Comments   
Comment by Jodi Levi (Inactive) [ 14/Aug/13 ]

Niu/Jinshan,
could one of you please have a look at this? This is fsx data corruption.
Thank you!

Comment by Niu Yawei (Inactive) [ 15/Aug/13 ]

LU-3743 does not look like a duplicate of this one: this ticket is about an fsx mmap read error on a SLES client, whereas LU-3743 is about an fsx read error on a RHEL client (with a SLES server). I copied the maloo link of LU-3743 here for reference: https://maloo.whamcloud.com/test_sets/ab5e3332-00c4-11e3-b06c-52540035b04c

LU-3743 looks like another instance of LU-2304. Xiong, any idea?

Comment by Jian Yu [ 10/Jan/14 ]

Lustre build: http://build.whamcloud.com/job/lustre-reviews/20841/
Distro/Arch: SLES11SP3/x86_64 (both server and client)

sanity-benchmark test fsx hit the same failure:
https://maloo.whamcloud.com/test_sets/dd8224e0-7987-11e3-a27b-52540035b04c

Comment by Stephen Champion [ 15/Jan/14 ]

FYI, the last time I checked, all fsx-based tests fail with a SLES11SP[23] OSS using ldiskfs, regardless of client type. I have not tested with zfs, but I suspect the problem is corruption following write after truncate in the ldiskfs code derived from the SLES ext4 source.

Comment by Andreas Dilger [ 09/Jan/20 ]

Close old bug

Generated at Sat Feb 10 01:36:31 UTC 2024 using Jira 9.4.14#940014-sha1:734e6822bbf0d45eff9af51f82432957f73aa32c.