[LU-7657] sanity-krb5 test_151 fails with 'mount with default flavor should have failed' Created: 13/Jan/16 Updated: 14/Dec/21 Resolved: 14/Dec/21 |
|
| Status: | Resolved |
| Project: | Lustre |
| Component/s: | None |
| Affects Version/s: | Lustre 2.8.0, Lustre 2.9.0 |
| Fix Version/s: | None |
| Type: | Bug | Priority: | Minor |
| Reporter: | James Nunez (Inactive) | Assignee: | WC Triage |
| Resolution: | Cannot Reproduce | Votes: | 0 |
| Labels: | None | ||
| Environment: |
eagle cluster with Lustre tag 2.7.65 |
||
| Severity: | 3 |
| Rank (Obsolete): | 9223372036854775807 |
| Description |
|
Running the sanity-krb5 test suite on Lustre systems with a separate MGS and MDS on the same node, or with a combined MGS/MDS, test 151 fails with 'mount with default flavor should have failed'. This does not happen when the MGS and MDS run on separate nodes. From the test_log, we can see that the complaint is that the MDS is already mounted:

Starting mgs: /dev/vda3 /lustre/scratch/mdt0
pdsh@eagle-51vm6: eagle-51vm1: ssh exited with exit code 1
pdsh@eagle-51vm6: eagle-51vm1: ssh exited with exit code 1
Started scratch-MDT0000
Starting mds1: /dev/vda3 /lustre/scratch/mdt0
eagle-51vm1: mount.lustre: according to /etc/mtab /dev/vda3 is already mounted on /lustre/scratch/mdt0
pdsh@eagle-51vm6: eagle-51vm1: ssh exited with exit code 17
Start of /dev/vda3 on mds1 failed 17
eagle-51vm1: mgc.*.mgs_server_uuid in FULL state after 0 sec
sanity-krb5 test_151: @@@@@@ FAIL: mount with default flavor should have failed

It seems that the test code expects to start the MGS and MDS separately. From test 151:

stopall
# start gss daemon on mgs node
combined_mgs_mds || start_gss_daemons $mgs_HOST "$LSVCGSSD -v"
# start mgs
start mgs $(mgsdevname 1) $MDS_MOUNT_OPTS
# mount mgs with default flavor; in the current framework it means mgs+mdt1.
# the connection of the mgc of mdt1 to the mgs is expected to fail.
DEVNAME=$(mdsdevname 1)
start mds1 $DEVNAME $MDS_MOUNT_OPTS
Should this test be skipped for a combined MGS/MDS setup? Logs are at: |
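If skipping is the right fix, one option is a guard at the top of test_151, similar to the combined_mgs_mds check the test already uses before starting lsvcgssd. The sketch below is hypothetical: `test_151_guard` is an illustrative name, and `combined_mgs_mds` is reimplemented here as a simple device comparison using assumed `MGSDEV`/`MDSDEV` variables so the snippet is self-contained; the real test-framework helper works differently.

```shell
#!/bin/sh
# Hypothetical sketch of a skip guard for test_151 on combined MGS/MDS setups.
# NOTE: combined_mgs_mds below is a stand-in for the Lustre test-framework
# helper; here it just compares assumed MGSDEV/MDSDEV device variables.
combined_mgs_mds() {
	[ "$MGSDEV" = "$MDSDEV" ]
}

test_151_guard() {
	if combined_mgs_mds; then
		# Mounting the MDT would find the shared device already
		# mounted (mount.lustre exits 17), so skip instead.
		echo "SKIP: test_151 needs a separate MGS device"
		return 0
	fi
	echo "RUN: test_151"
}

# Example: same device for MGS and MDT, as in the failure above.
MGSDEV=/dev/vda3 MDSDEV=/dev/vda3 test_151_guard
```

With separate devices (e.g. MGSDEV=/dev/vda3, MDSDEV=/dev/vdb3) the guard would let the test run.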