[LU-5119] Failure on test suite sanity-lfsck test_18a: FAIL: (6.1) Expect 1 fixed on mds1, but got: 2 Created: 29/May/14  Updated: 09/Jan/20  Resolved: 09/Jan/20

Status: Resolved
Project: Lustre
Component/s: None
Affects Version/s: Lustre 2.6.0
Fix Version/s: None

Type: Bug Priority: Minor
Reporter: Maloo Assignee: WC Triage
Resolution: Cannot Reproduce Votes: 0
Labels: lfsck
Environment:

server and client: lustre-master build # 2052


Severity: 3
Rank (Obsolete): 14116

Description

This issue was created by maloo for sarah <sarah@whamcloud.com>

This issue relates to the following test suite run: http://maloo.whamcloud.com/test_sets/79b63a6c-de1f-11e3-90aa-52540035b04c.

The sub-test test_18a failed with the following error:

(6.1) Expect 1 fixed on mds1, but got: 2

Inject failure, to make the MDT-object lost its layout EA
CMD: client-6vm3 /usr/sbin/lctl set_param fail_loc=0x1615
fail_loc=0x1615
CMD: client-6vm3 /usr/sbin/lctl set_param fail_loc=0
fail_loc=0
The file size should be incorrect since layout EA is lost
Trigger layout LFSCK on all devices to find out orphan OST-object
CMD: client-6vm3 /usr/sbin/lctl lfsck_start -M lustre-MDT0000 -t layout -r -o
Started LFSCK on the device lustre-MDT0000: layout.
CMD: client-6vm3 /usr/sbin/lctl get_param -n mdd.lustre-MDT0000.lfsck_layout | awk '/^status/ { print \$2 }'
CMD: client-6vm3 /usr/sbin/lctl get_param -n mdd.lustre-MDT0000.lfsck_layout | awk '/^status/ { print \$2 }'
CMD: client-6vm4 /usr/sbin/lctl get_param -n obdfilter.lustre-OST0000.lfsck_layout
CMD: client-6vm4 /usr/sbin/lctl get_param -n obdfilter.lustre-OST0001.lfsck_layout
CMD: client-6vm4 /usr/sbin/lctl get_param -n obdfilter.lustre-OST0002.lfsck_layout
CMD: client-6vm4 /usr/sbin/lctl get_param -n obdfilter.lustre-OST0003.lfsck_layout
CMD: client-6vm3 /usr/sbin/lctl get_param -n mdd.lustre-MDT0000.lfsck_layout
 sanity-lfsck test_18a: @@@@@@ FAIL: (6.1) Expect 1 fixed on mds1, but got: 2 
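The log above polls the LFSCK state by piping `lctl get_param -n mdd.lustre-MDT0000.lfsck_layout` through `awk` until the status line reports completion. A minimal sketch of that wait loop is below; `wait_for_lfsck_status` and its `getter` argument are hypothetical names for illustration, standing in for the real `lctl get_param` invocation, and are not part of the test suite's actual helpers:

```shell
#!/bin/sh
# Hedged sketch of the status-polling step seen in the log.
# "getter" is an assumption: in the real run it would be something like
#   lctl get_param -n mdd.lustre-MDT0000.lfsck_layout
# whose output contains a line such as "status: completed".
wait_for_lfsck_status() {
    expected=$1      # e.g. "completed"
    getter=$2        # command (or function) that prints the lfsck_layout output
    retries=${3:-10} # how many 1-second polls before giving up

    i=0
    while [ "$i" -lt "$retries" ]; do
        # Same extraction the test log shows: take field 2 of the "status" line.
        status=$($getter | awk '/^status/ { print $2 }')
        [ "$status" = "$expected" ] && return 0
        i=$((i + 1))
        sleep 1
    done
    return 1
}
```

In the failing run the status check itself succeeded; the mismatch was in the subsequent repaired-object count (1 expected, 2 reported), so a loop like this would have returned 0 before the count assertion fired.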


Comments
Comment by Andreas Dilger [ 09/Jan/20 ]

Close old bug

Generated at Sat Feb 10 01:48:41 UTC 2024 using Jira 9.4.14#940014-sha1:734e6822bbf0d45eff9af51f82432957f73aa32c.