[LU-8104] Failover : replay-single test_70b: Restart of mds1 failed! Created: 04/May/16  Updated: 01/Mar/18

Status: Open
Project: Lustre
Component/s: None
Affects Version/s: Lustre 2.9.0
Fix Version/s: None

Type: Bug Priority: Minor
Reporter: Maloo Assignee: WC Triage
Resolution: Unresolved Votes: 0
Labels: None
Environment:

EL7 Server/Client - ZFS
master, build# 3350


Issue Links:
Related
is related to LU-10708 replay-single test_20b: Restart of md... Open
Severity: 3
Rank (Obsolete): 9223372036854775807

Description

This issue was created by maloo for Saurabh Tandan <saurabh.tandan@intel.com>

This issue relates to the following test suite run: https://testing.hpdd.intel.com/test_sets/f9e24536-09a2-11e6-9b34-5254006e85c2.

The sub-test test_70b failed with the following error:

test failed to respond and timed out

Test log:

trevis-18vm8: cannot import 'lustre-mdt1': no such pool available
 replay-single test_70b: @@@@@@ FAIL: Restart of mds1 failed! 
trevis-18vm6:    1     11393     0.07 MB/sec  execute 607 sec  latency 69690.310 ms
trevis-18vm1:    1     11393     0.07 MB/sec  execute 608 sec  latency 69713.185 ms
trevis-18vm5:    1     11451     0.07 MB/sec  execute 608 sec  latency 69984.105 ms
trevis-18vm6:    1     11393     0.07 MB/sec  execute 608 sec  latency 70690.511 ms
trevis-18vm1:    1     11393     0.07 MB/sec  execute 609 sec  latency 70713.311 ms
trevis-18vm8: invalid parameter 'dump_kernel'
trevis-18vm8: open(dump_kernel) failed: No such file or directory
Resetting fail_loc on all nodes...CMD: trevis-18vm1.trevis.hpdd.intel.com,trevis-18vm5,trevis-18vm6,trevis-18vm7,trevis-18vm8 lctl set_param -n fail_loc=0 	    fail_val=0 2>/dev/null || true
trevis-18vm5:    1     11451     0.07 MB/sec  execute 609 sec  latency 70984.367 ms
done.
trevis-18vm6:    1     11393     0.07 MB/sec  execute 609 sec  latency 71690.847 ms
trevis-18vm1:    1     11393     0.07 MB/sec  execute 610 sec  latency 71713.437 ms
trevis-18vm5:    1     11451     0.07 MB/sec  execute 610 sec  latency 71984.601 ms
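The key failure in the log is `cannot import 'lustre-mdt1': no such pool available`, i.e. the MDT's ZFS pool was not visible on trevis-18vm8 when the test tried to restart mds1. A minimal diagnostic sketch one might run on the MDS node to narrow this down is below; the pool name `lustre-mdt1` comes from the log above, but the helper function, messages, and device search path are assumptions, not part of the test framework:

```shell
#!/bin/sh
# Hedged diagnostic sketch (not from the ticket): check why
# 'zpool import lustre-mdt1' might report "no such pool available".
check_mdt_pool() {
    pool="$1"
    if ! command -v zpool >/dev/null 2>&1; then
        echo "zfs-utils not installed on this node"
        return 1
    fi
    if zpool list "$pool" >/dev/null 2>&1; then
        # Pool is already imported on this host.
        echo "pool '$pool' is already imported"
    elif zpool import 2>/dev/null | grep -q "pool: $pool"; then
        # Pool is exported but its labels are visible.
        echo "pool '$pool' is visible; try: zpool import $pool"
    else
        # Labels may live on devices outside the default search path,
        # or the pool may still be imported on the failover partner.
        echo "pool '$pool' not visible; try: zpool import -d /dev/disk/by-id"
        return 1
    fi
}

check_mdt_pool lustre-mdt1
```

If the last branch fires even with an explicit `-d` directory, the usual suspects in a failover setup are the pool still being held (imported) by the partner node, or the shared devices not yet being presented to this host.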

Generated at Sat Feb 10 02:14:40 UTC 2024 using Jira 9.4.14#940014-sha1:734e6822bbf0d45eff9af51f82432957f73aa32c.