LU-7436: conf-sanity test_91: @@@@@@ FAIL: found cc0b3805-41ce-ef63-799a-a55708b119b7 192.168.113.19@tcp on MDT


Details

    • Type: Bug
    • Resolution: Fixed
    • Priority: Minor
    • Affects Version/s: Lustre 2.8.0
    • Fix Version/s: Lustre 2.8.0
    • Component/s: None
    • Environment: interop 2.5.x <-> master client
    • Severity: 3
    • Rank (Obsolete): 9223372036854775807

    Description

      Configuration: 4 nodes - 1 MDS / 1 OSS / 2 clients
      Kernel releases:
      2.6.32_431.17.1.x86_64
      2.6.32_431.29.2.el6.x86_64_g70e90c3

      Server: Lustre 2.5.1.x6
      Client: Lustre 2.7.62

      stdout.log
      == conf-sanity test 91: evict-by-nid support == 05:55:39 (1447566939)
      ../libcfs/libcfs/libcfs options: 'cpu_npartitions=2'
      Loading modules from /usr/lib64/lustre/tests/..
      detected 2 online CPUs by sysfs
      Force libcfs to create 2 CPU partitions
      debug=-1
      subsystem_debug=all -lnet -lnd -pinger
      ../lnet/lnet/lnet options: 'accept=all'
      ../lnet/klnds/socklnd/ksocklnd options: 'sock_timeout=10'
      gss/krb5 is not supported
      start mds service on fre1317
      Starting mds1: -o rw,user_xattr  /dev/vdb /mnt/mds1
      fre1317: mount.lustre: set /sys/block/vdb/queue/max_sectors_kb to 2147483647
      fre1317: 
      pdsh@fre1319: fre1317: ssh exited with exit code 1
      pdsh@fre1319: fre1317: ssh exited with exit code 1
      Started lustre-MDT0000
      start ost1 service on fre1318
      Starting ost1: -o user_xattr  /dev/vdb /mnt/ost1
      fre1318: mount.lustre: set /sys/block/vdb/queue/max_sectors_kb to 2147483647
      fre1318: 
      pdsh@fre1319: fre1318: ssh exited with exit code 1
      pdsh@fre1319: fre1318: ssh exited with exit code 1
      Started lustre-OST0000
      mount lustre on /mnt/lustre.....
      Starting client: fre1319:  -o user_xattr,flock fre1317@tcp:/lustre /mnt/lustre
      setup single mount lustre success
      list nids on mdt:
      mdt.lustre-MDT0000.exports.0@lo
      mdt.lustre-MDT0000.exports.192.168.113.18@tcp
      mdt.lustre-MDT0000.exports.192.168.113.19@tcp
      mdt.lustre-MDT0000.exports.clear
      uuid from 192\.168\.113\.19@tcp:
      mdt.lustre-MDT0000.exports.192.168.113.19@tcp.uuid=cc0b3805-41ce-ef63-799a-a55708b119b7
      manual umount lustre on /mnt/lustre....
      evict 192\.168\.113\.19@tcp
       conf-sanity test_91: @@@@@@ FAIL: found cc0b3805-41ce-ef63-799a-a55708b119b7 192\.168\.113\.19@tcp on MDT 
        Trace dump:
        = /usr/lib64/lustre/tests/../tests/test-framework.sh:4812:error_noexit()
        = /usr/lib64/lustre/tests/../tests/test-framework.sh:4843:error()
        = /usr/lib64/lustre/tests/conf-sanity.sh:6091:test_91()
        = /usr/lib64/lustre/tests/../tests/test-framework.sh:5090:run_one()
        = /usr/lib64/lustre/tests/../tests/test-framework.sh:5127:run_one_logged()
        = /usr/lib64/lustre/tests/../tests/test-framework.sh:4944:run_test()
        = /usr/lib64/lustre/tests/conf-sanity.sh:6104:main()
      Dumping lctl log to /tmp/test_logs/1447566903/conf-sanity.test_91.*.1447566962.log
      fre1320: Warning: Permanently added 'fre1319,192.168.113.19' (RSA) to the list of known hosts.
      fre1317: Warning: Permanently added 'fre1319,192.168.113.19' (RSA) to the list of known hosts.
      fre1318: Warning: Permanently added 'fre1319,192.168.113.19' (RSA) to the list of known hosts.
      FAIL 91 (24s)
      
      
      
      stderr.log
      fre1317: mount.lustre: set /sys/block/vdb/queue/max_sectors_kb to 2147483647
      fre1317: 
      pdsh@fre1319: fre1317: ssh exited with exit code 1
      fre1318: mount.lustre: set /sys/block/vdb/queue/max_sectors_kb to 2147483647
      fre1318: 
      pdsh@fre1319: fre1318: ssh exited with exit code 1
      pdsh@fre1319: fre1317: ssh exited with exit code 3
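
      The failing check boils down to the sequence sketched below, which can be rerun by hand on the MDS (fre1317 in this run) to see whether the client export survives the eviction. This is a minimal sketch reconstructed from the log above rather than the exact test_91 code; the parameter names are taken from the output shown, and the "nid:<NID>" value written to evict_client is the evict-by-nid form the test exercises; whether the 2.5.x MDT honours that form is exactly what is in question in this interop run.

      # Client export for the NID is present and carries a UUID
      # (dots in the NID are escaped, as in the test output above):
      lctl get_param 'mdt.lustre-MDT0000.exports.192\.168\.113\.19@tcp.uuid'

      # The test manually unmounts the client, then evicts every export
      # from that NID -- the "evict-by-nid support" under test:
      lctl set_param mdt.lustre-MDT0000.evict_client=nid:192.168.113.19@tcp

      # After the eviction the export should be gone; test_91 fails because
      # the same UUID is still reported here by the 2.5.x MDT:
      lctl get_param 'mdt.lustre-MDT0000.exports.192\.168\.113\.19@tcp.uuid'

      In this run the final get_param still returns cc0b3805-41ce-ef63-799a-a55708b119b7, which is the condition test_91 reports as "found <uuid> <nid> on MDT".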
      
      
      
      

    Attachments

      1. 91.lctl.tgz (1.30 MB, uploaded by parinay v kondekar)

    People

      Assignee: Alex Zhuravlev (bzzz)
      Reporter: parinay v kondekar (parinay) (Inactive)
      Votes: 0
      Watchers: 3
