[Mon Dec 9 06:17:01 2019][ 0.000000] Initializing cgroup subsys cpu [Mon Dec 9 06:17:01 2019][ 0.000000] Initializing cgroup subsys cpuacct [Mon Dec 9 06:17:01 2019][ 0.000000] Linux version 3.10.0-957.27.2.el7_lustre.pl2.x86_64 (sthiell@oak-rbh01) (gcc version 4.8.5 20150623 (Red Hat 4.8.5-39) (GCC) ) #1 SMP Thu Nov 7 15:26:16 PST 2019 [Mon Dec 9 06:17:01 2019][ 0.000000] Command line: BOOT_IMAGE=/boot/vmlinuz-3.10.0-957.27.2.el7_lustre.pl2.x86_64 root=UUID=c4f754c4-e7db-49b7-baed-d6c7905c5cdc ro crashkernel=auto nomodeset console=ttyS0,115200 LANG=en_US.UTF-8 [Mon Dec 9 06:17:01 2019][ 0.000000] e820: BIOS-provided physical RAM map: [Mon Dec 9 06:17:01 2019][ 0.000000] BIOS-e820: [mem 0x0000000000000000-0x000000000008efff] usable [Mon Dec 9 06:17:01 2019][ 0.000000] BIOS-e820: [mem 0x000000000008f000-0x000000000008ffff] ACPI NVS [Mon Dec 9 06:17:01 2019][ 0.000000] BIOS-e820: [mem 0x0000000000090000-0x000000000009ffff] usable [Mon Dec 9 06:17:01 2019][ 0.000000] BIOS-e820: [mem 0x0000000000100000-0x000000004f773fff] usable [Mon Dec 9 06:17:01 2019][ 0.000000] BIOS-e820: [mem 0x000000004f774000-0x000000005777cfff] reserved [Mon Dec 9 06:17:01 2019][ 0.000000] BIOS-e820: [mem 0x000000005777d000-0x000000006cacefff] usable [Mon Dec 9 06:17:01 2019][ 0.000000] BIOS-e820: [mem 0x000000006cacf000-0x000000006efcefff] reserved [Mon Dec 9 06:17:01 2019][ 0.000000] BIOS-e820: [mem 0x000000006efcf000-0x000000006fdfefff] ACPI NVS [Mon Dec 9 06:17:01 2019][ 0.000000] BIOS-e820: [mem 0x000000006fdff000-0x000000006fffefff] ACPI data [Mon Dec 9 06:17:01 2019][ 0.000000] BIOS-e820: [mem 0x000000006ffff000-0x000000006fffffff] usable [Mon Dec 9 06:17:01 2019][ 0.000000] BIOS-e820: [mem 0x0000000070000000-0x000000008fffffff] reserved [Mon Dec 9 06:17:01 2019][ 0.000000] BIOS-e820: [mem 0x00000000fec10000-0x00000000fec10fff] reserved [Mon Dec 9 06:17:01 2019][ 0.000000] BIOS-e820: [mem 0x00000000fed80000-0x00000000fed80fff] reserved [Mon Dec 9 06:17:01 2019][ 0.000000] BIOS-e820: [mem 0x0000000100000000-0x000000107f37ffff] usable [Mon Dec 9 06:17:01 2019][ 0.000000] BIOS-e820: [mem 0x000000107f380000-0x000000107fffffff] reserved [Mon Dec 9 06:17:01 2019][ 0.000000] BIOS-e820: [mem 0x0000001080000000-0x000000207ff7ffff] usable [Mon Dec 9 06:17:01 2019][ 0.000000] BIOS-e820: [mem 0x000000207ff80000-0x000000207fffffff] reserved [Mon Dec 9 06:17:01 2019][ 0.000000] BIOS-e820: [mem 0x0000002080000000-0x000000307ff7ffff] usable [Mon Dec 9 06:17:01 2019][ 0.000000] BIOS-e820: [mem 0x000000307ff80000-0x000000307fffffff] reserved [Mon Dec 9 06:17:01 2019][ 0.000000] BIOS-e820: [mem 0x0000003080000000-0x000000407ff7ffff] usable [Mon Dec 9 06:17:01 2019][ 0.000000] BIOS-e820: [mem 0x000000407ff80000-0x000000407fffffff] reserved [Mon Dec 9 06:17:01 2019][ 0.000000] NX (Execute Disable) protection: active [Mon Dec 9 06:17:01 2019][ 0.000000] extended physical RAM map: [Mon Dec 9 06:17:01 2019][ 0.000000] reserve setup_data: [mem 0x0000000000000000-0x000000000008efff] usable [Mon Dec 9 06:17:01 2019][ 0.000000] reserve setup_data: [mem 0x000000000008f000-0x000000000008ffff] ACPI NVS [Mon Dec 9 06:17:01 2019][ 0.000000] reserve setup_data: [mem 0x0000000000090000-0x000000000009ffff] usable [Mon Dec 9 06:17:01 2019][ 0.000000] reserve setup_data: [mem 0x0000000000100000-0x0000000037ac001f] usable [Mon Dec 9 06:17:01 2019][ 0.000000] reserve setup_data: [mem 0x0000000037ac0020-0x0000000037ad865f] usable [Mon Dec 9 06:17:01 2019][ 0.000000] reserve setup_data: [mem 0x0000000037ad8660-0x0000000037ad901f] usable [Mon Dec 9 
06:17:02 2019][ 0.000000] reserve setup_data: [mem 0x0000000037ad9020-0x0000000037b0265f] usable [Mon Dec 9 06:17:02 2019][ 0.000000] reserve setup_data: [mem 0x0000000037b02660-0x0000000037b0301f] usable [Mon Dec 9 06:17:02 2019][ 0.000000] reserve setup_data: [mem 0x0000000037b03020-0x0000000037b0b05f] usable [Mon Dec 9 06:17:02 2019][ 0.000000] reserve setup_data: [mem 0x0000000037b0b060-0x0000000037b0c01f] usable [Mon Dec 9 06:17:02 2019][ 0.000000] reserve setup_data: [mem 0x0000000037b0c020-0x0000000037b3dc5f] usable [Mon Dec 9 06:17:02 2019][ 0.000000] reserve setup_data: [mem 0x0000000037b3dc60-0x0000000037b3e01f] usable [Mon Dec 9 06:17:02 2019][ 0.000000] reserve setup_data: [mem 0x0000000037b3e020-0x0000000037b6fc5f] usable [Mon Dec 9 06:17:02 2019][ 0.000000] reserve setup_data: [mem 0x0000000037b6fc60-0x0000000037b7001f] usable [Mon Dec 9 06:17:02 2019][ 0.000000] reserve setup_data: [mem 0x0000000037b70020-0x0000000037c11c5f] usable [Mon Dec 9 06:17:02 2019][ 0.000000] reserve setup_data: [mem 0x0000000037c11c60-0x000000004f773fff] usable [Mon Dec 9 06:17:02 2019][ 0.000000] reserve setup_data: [mem 0x000000004f774000-0x000000005777cfff] reserved [Mon Dec 9 06:17:02 2019][ 0.000000] reserve setup_data: [mem 0x000000005777d000-0x000000006cacefff] usable [Mon Dec 9 06:17:02 2019][ 0.000000] reserve setup_data: [mem 0x000000006cacf000-0x000000006efcefff] reserved [Mon Dec 9 06:17:02 2019][ 0.000000] reserve setup_data: [mem 0x000000006efcf000-0x000000006fdfefff] ACPI NVS [Mon Dec 9 06:17:02 2019][ 0.000000] reserve setup_data: [mem 0x000000006fdff000-0x000000006fffefff] ACPI data [Mon Dec 9 06:17:02 2019][ 0.000000] reserve setup_data: [mem 0x000000006ffff000-0x000000006fffffff] usable [Mon Dec 9 06:17:02 2019][ 0.000000] reserve setup_data: [mem 0x0000000070000000-0x000000008fffffff] reserved [Mon Dec 9 06:17:02 2019][ 0.000000] reserve setup_data: [mem 0x00000000fec10000-0x00000000fec10fff] reserved [Mon Dec 9 06:17:02 2019][ 0.000000] reserve setup_data: [mem 0x00000000fed80000-0x00000000fed80fff] reserved [Mon Dec 9 06:17:02 2019][ 0.000000] reserve setup_data: [mem 0x0000000100000000-0x000000107f37ffff] usable [Mon Dec 9 06:17:02 2019][ 0.000000] reserve setup_data: [mem 0x000000107f380000-0x000000107fffffff] reserved [Mon Dec 9 06:17:02 2019][ 0.000000] reserve setup_data: [mem 0x0000001080000000-0x000000207ff7ffff] usable [Mon Dec 9 06:17:02 2019][ 0.000000] reserve setup_data: [mem 0x000000207ff80000-0x000000207fffffff] reserved [Mon Dec 9 06:17:02 2019][ 0.000000] reserve setup_data: [mem 0x0000002080000000-0x000000307ff7ffff] usable [Mon Dec 9 06:17:02 2019][ 0.000000] reserve setup_data: [mem 0x000000307ff80000-0x000000307fffffff] reserved [Mon Dec 9 06:17:02 2019][ 0.000000] reserve setup_data: [mem 0x0000003080000000-0x000000407ff7ffff] usable [Mon Dec 9 06:17:02 2019][ 0.000000] reserve setup_data: [mem 0x000000407ff80000-0x000000407fffffff] reserved [Mon Dec 9 06:17:02 2019][ 0.000000] efi: EFI v2.50 by Dell Inc. 
[Mon Dec 9 06:17:02 2019][ 0.000000] efi: ACPI=0x6fffe000 ACPI 2.0=0x6fffe014 SMBIOS=0x6eab5000 SMBIOS 3.0=0x6eab3000 [Mon Dec 9 06:17:02 2019][ 0.000000] efi: mem00: type=3, attr=0xf, range=[0x0000000000000000-0x0000000000001000) (0MB) [Mon Dec 9 06:17:02 2019][ 0.000000] efi: mem01: type=2, attr=0xf, range=[0x0000000000001000-0x0000000000002000) (0MB) [Mon Dec 9 06:17:02 2019][ 0.000000] efi: mem02: type=7, attr=0xf, range=[0x0000000000002000-0x0000000000010000) (0MB) [Mon Dec 9 06:17:02 2019][ 0.000000] efi: mem03: type=3, attr=0xf, range=[0x0000000000010000-0x0000000000014000) (0MB) [Mon Dec 9 06:17:02 2019][ 0.000000] efi: mem04: type=7, attr=0xf, range=[0x0000000000014000-0x0000000000063000) (0MB) [Mon Dec 9 06:17:02 2019][ 0.000000] efi: mem05: type=3, attr=0xf, range=[0x0000000000063000-0x000000000008f000) (0MB) [Mon Dec 9 06:17:02 2019][ 0.000000] efi: mem06: type=10, attr=0xf, range=[0x000000000008f000-0x0000000000090000) (0MB) [Mon Dec 9 06:17:02 2019][ 0.000000] efi: mem07: type=3, attr=0xf, range=[0x0000000000090000-0x00000000000a0000) (0MB) [Mon Dec 9 06:17:02 2019][ 0.000000] efi: mem08: type=4, attr=0xf, range=[0x0000000000100000-0x0000000000120000) (0MB) [Mon Dec 9 06:17:02 2019][ 0.000000] efi: mem09: type=7, attr=0xf, range=[0x0000000000120000-0x0000000000c00000) (10MB) [Mon Dec 9 06:17:02 2019][ 0.000000] efi: mem10: type=3, attr=0xf, range=[0x0000000000c00000-0x0000000001000000) (4MB) [Mon Dec 9 06:17:02 2019][ 0.000000] efi: mem11: type=2, attr=0xf, range=[0x0000000001000000-0x000000000267b000) (22MB) [Mon Dec 9 06:17:02 2019][ 0.000000] efi: mem12: type=7, attr=0xf, range=[0x000000000267b000-0x0000000004000000) (25MB) [Mon Dec 9 06:17:02 2019][ 0.000000] efi: mem13: type=4, attr=0xf, range=[0x0000000004000000-0x000000000403b000) (0MB) [Mon Dec 9 06:17:02 2019][ 0.000000] efi: mem14: type=7, attr=0xf, range=[0x000000000403b000-0x0000000037ac0000) (826MB) [Mon Dec 9 06:17:02 2019][ 0.000000] efi: mem15: type=2, attr=0xf, range=[0x0000000037ac0000-0x000000004edd7000) (371MB) [Mon Dec 9 06:17:02 2019][ 0.000000] efi: mem16: type=7, attr=0xf, range=[0x000000004edd7000-0x000000004eddb000) (0MB) [Mon Dec 9 06:17:02 2019][ 0.000000] efi: mem17: type=2, attr=0xf, range=[0x000000004eddb000-0x000000004eddd000) (0MB) [Mon Dec 9 06:17:02 2019][ 0.000000] efi: mem18: type=1, attr=0xf, range=[0x000000004eddd000-0x000000004eefa000) (1MB) [Mon Dec 9 06:17:02 2019][ 0.000000] efi: mem19: type=2, attr=0xf, range=[0x000000004eefa000-0x000000004f019000) (1MB) [Mon Dec 9 06:17:02 2019][ 0.000000] efi: mem20: type=1, attr=0xf, range=[0x000000004f019000-0x000000004f128000) (1MB) [Mon Dec 9 06:17:02 2019][ 0.000000] efi: mem21: type=3, attr=0xf, range=[0x000000004f128000-0x000000004f774000) (6MB) [Mon Dec 9 06:17:02 2019][ 0.000000] efi: mem22: type=0, attr=0xf, range=[0x000000004f774000-0x000000005777d000) (128MB) [Mon Dec 9 06:17:02 2019][ 0.000000] efi: mem23: type=3, attr=0xf, range=[0x000000005777d000-0x000000005796e000) (1MB) [Mon Dec 9 06:17:02 2019][ 0.000000] efi: mem24: type=4, attr=0xf, range=[0x000000005796e000-0x000000005b4cf000) (59MB) [Mon Dec 9 06:17:02 2019][ 0.000000] efi: mem25: type=3, attr=0xf, range=[0x000000005b4cf000-0x000000005b8cf000) (4MB) [Mon Dec 9 06:17:02 2019][ 0.000000] efi: mem26: type=7, attr=0xf, range=[0x000000005b8cf000-0x0000000064a36000) (145MB) [Mon Dec 9 06:17:02 2019][ 0.000000] efi: mem27: type=4, attr=0xf, range=[0x0000000064a36000-0x0000000064a43000) (0MB) [Mon Dec 9 06:17:02 2019][ 0.000000] efi: mem28: type=7, attr=0xf, 
range=[0x0000000064a43000-0x0000000064a47000) (0MB) [Mon Dec 9 06:17:02 2019][ 0.000000] efi: mem29: type=4, attr=0xf, range=[0x0000000064a47000-0x0000000065061000) (6MB) [Mon Dec 9 06:17:02 2019][ 0.000000] efi: mem30: type=7, attr=0xf, range=[0x0000000065061000-0x0000000065062000) (0MB) [Mon Dec 9 06:17:02 2019][ 0.000000] efi: mem31: type=4, attr=0xf, range=[0x0000000065062000-0x0000000065069000) (0MB) [Mon Dec 9 06:17:02 2019][ 0.000000] efi: mem32: type=7, attr=0xf, range=[0x0000000065069000-0x000000006506a000) (0MB) [Mon Dec 9 06:17:02 2019][ 0.000000] efi: mem33: type=4, attr=0xf, range=[0x000000006506a000-0x000000006506b000) (0MB) [Mon Dec 9 06:17:02 2019][ 0.000000] efi: mem34: type=7, attr=0xf, range=[0x000000006506b000-0x000000006506c000) (0MB) [Mon Dec 9 06:17:02 2019][ 0.000000] efi: mem35: type=4, attr=0xf, range=[0x000000006506c000-0x0000000065076000) (0MB) [Mon Dec 9 06:17:02 2019][ 0.000000] efi: mem36: type=7, attr=0xf, range=[0x0000000065076000-0x0000000065077000) (0MB) [Mon Dec 9 06:17:02 2019][ 0.000000] efi: mem37: type=4, attr=0xf, range=[0x0000000065077000-0x000000006507d000) (0MB) [Mon Dec 9 06:17:02 2019][ 0.000000] efi: mem38: type=7, attr=0xf, range=[0x000000006507d000-0x000000006507e000) (0MB) [Mon Dec 9 06:17:02 2019][ 0.000000] efi: mem39: type=4, attr=0xf, range=[0x000000006507e000-0x0000000065083000) (0MB) [Mon Dec 9 06:17:02 2019][ 0.000000] efi: mem40: type=7, attr=0xf, range=[0x0000000065083000-0x0000000065086000) (0MB) [Mon Dec 9 06:17:02 2019][ 0.000000] efi: mem41: type=4, attr=0xf, range=[0x0000000065086000-0x0000000065093000) (0MB) [Mon Dec 9 06:17:02 2019][ 0.000000] efi: mem42: type=7, attr=0xf, range=[0x0000000065093000-0x0000000065094000) (0MB) [Mon Dec 9 06:17:02 2019][ 0.000000] efi: mem43: type=4, attr=0xf, range=[0x0000000065094000-0x000000006509f000) (0MB) [Mon Dec 9 06:17:02 2019][ 0.000000] efi: mem44: type=7, attr=0xf, range=[0x000000006509f000-0x00000000650a0000) (0MB) [Mon Dec 9 06:17:02 2019][ 0.000000] efi: mem45: type=4, attr=0xf, range=[0x00000000650a0000-0x00000000650a1000) (0MB) [Mon Dec 9 06:17:02 2019][ 0.000000] efi: mem46: type=7, attr=0xf, range=[0x00000000650a1000-0x00000000650a2000) (0MB) [Mon Dec 9 06:17:02 2019][ 0.000000] efi: mem47: type=4, attr=0xf, range=[0x00000000650a2000-0x00000000650aa000) (0MB) [Mon Dec 9 06:17:02 2019][ 0.000000] efi: mem48: type=7, attr=0xf, range=[0x00000000650aa000-0x00000000650ab000) (0MB) [Mon Dec 9 06:17:02 2019][ 0.000000] efi: mem49: type=4, attr=0xf, range=[0x00000000650ab000-0x00000000650ad000) (0MB) [Mon Dec 9 06:17:02 2019][ 0.000000] efi: mem50: type=7, attr=0xf, range=[0x00000000650ad000-0x00000000650ae000) (0MB) [Mon Dec 9 06:17:02 2019][ 0.000000] efi: mem51: type=4, attr=0xf, range=[0x00000000650ae000-0x00000000650b4000) (0MB) [Mon Dec 9 06:17:02 2019][ 0.000000] efi: mem52: type=7, attr=0xf, range=[0x00000000650b4000-0x00000000650b5000) (0MB) [Mon Dec 9 06:17:02 2019][ 0.000000] efi: mem53: type=4, attr=0xf, range=[0x00000000650b5000-0x00000000650d5000) (0MB) [Mon Dec 9 06:17:02 2019][ 0.000000] efi: mem54: type=7, attr=0xf, range=[0x00000000650d5000-0x00000000650d6000) (0MB) [Mon Dec 9 06:17:02 2019][ 0.000000] efi: mem55: type=4, attr=0xf, range=[0x00000000650d6000-0x0000000065432000) (3MB) [Mon Dec 9 06:17:02 2019][ 0.000000] efi: mem56: type=7, attr=0xf, range=[0x0000000065432000-0x0000000065433000) (0MB) [Mon Dec 9 06:17:02 2019][ 0.000000] efi: mem57: type=4, attr=0xf, range=[0x0000000065433000-0x000000006543b000) (0MB) [Mon Dec 9 06:17:02 2019][ 0.000000] efi: mem58: 
type=7, attr=0xf, range=[0x000000006543b000-0x000000006543c000) (0MB) [Mon Dec 9 06:17:02 2019][ 0.000000] efi: mem59: type=4, attr=0xf, range=[0x000000006543c000-0x000000006544e000) (0MB) [Mon Dec 9 06:17:02 2019][ 0.000000] efi: mem60: type=7, attr=0xf, range=[0x000000006544e000-0x000000006544f000) (0MB) [Mon Dec 9 06:17:02 2019][ 0.000000] efi: mem61: type=4, attr=0xf, range=[0x000000006544f000-0x0000000065463000) (0MB) [Mon Dec 9 06:17:02 2019][ 0.000000] efi: mem62: type=7, attr=0xf, range=[0x0000000065463000-0x0000000065464000) (0MB) [Mon Dec 9 06:17:02 2019][ 0.000000] efi: mem63: type=4, attr=0xf, range=[0x0000000065464000-0x0000000065473000) (0MB) [Mon Dec 9 06:17:02 2019][ 0.000000] efi: mem64: type=7, attr=0xf, range=[0x0000000065473000-0x0000000065474000) (0MB) [Mon Dec 9 06:17:02 2019][ 0.000000] efi: mem65: type=4, attr=0xf, range=[0x0000000065474000-0x00000000654c5000) (0MB) [Mon Dec 9 06:17:02 2019][ 0.000000] efi: mem66: type=7, attr=0xf, range=[0x00000000654c5000-0x00000000654c6000) (0MB) [Mon Dec 9 06:17:02 2019][ 0.000000] efi: mem67: type=4, attr=0xf, range=[0x00000000654c6000-0x00000000654d9000) (0MB) [Mon Dec 9 06:17:02 2019][ 0.000000] efi: mem68: type=7, attr=0xf, range=[0x00000000654d9000-0x00000000654db000) (0MB) [Mon Dec 9 06:17:02 2019][ 0.000000] efi: mem69: type=4, attr=0xf, range=[0x00000000654db000-0x00000000654e0000) (0MB) [Mon Dec 9 06:17:02 2019][ 0.000000] efi: mem70: type=7, attr=0xf, range=[0x00000000654e0000-0x00000000654e1000) (0MB) [Mon Dec 9 06:17:02 2019][ 0.000000] efi: mem71: type=4, attr=0xf, range=[0x00000000654e1000-0x00000000654fa000) (0MB) [Mon Dec 9 06:17:02 2019][ 0.000000] efi: mem72: type=7, attr=0xf, range=[0x00000000654fa000-0x00000000654fb000) (0MB) [Mon Dec 9 06:17:02 2019][ 0.000000] efi: mem73: type=4, attr=0xf, range=[0x00000000654fb000-0x0000000065508000) (0MB) [Mon Dec 9 06:17:02 2019][ 0.000000] efi: mem74: type=7, attr=0xf, range=[0x0000000065508000-0x0000000065509000) (0MB) [Mon Dec 9 06:17:02 2019][ 0.000000] efi: mem75: type=4, attr=0xf, range=[0x0000000065509000-0x000000006550b000) (0MB) [Mon Dec 9 06:17:02 2019][ 0.000000] efi: mem76: type=7, attr=0xf, range=[0x000000006550b000-0x000000006550c000) (0MB) [Mon Dec 9 06:17:02 2019][ 0.000000] efi: mem77: type=4, attr=0xf, range=[0x000000006550c000-0x000000006550e000) (0MB) [Mon Dec 9 06:17:02 2019][ 0.000000] efi: mem78: type=7, attr=0xf, range=[0x000000006550e000-0x000000006550f000) (0MB) [Mon Dec 9 06:17:02 2019][ 0.000000] efi: mem79: type=4, attr=0xf, range=[0x000000006550f000-0x0000000065513000) (0MB) [Mon Dec 9 06:17:02 2019][ 0.000000] efi: mem80: type=7, attr=0xf, range=[0x0000000065513000-0x0000000065514000) (0MB) [Mon Dec 9 06:17:02 2019][ 0.000000] efi: mem81: type=4, attr=0xf, range=[0x0000000065514000-0x0000000065515000) (0MB) [Mon Dec 9 06:17:02 2019][ 0.000000] efi: mem82: type=7, attr=0xf, range=[0x0000000065515000-0x0000000065516000) (0MB) [Mon Dec 9 06:17:02 2019][ 0.000000] efi: mem83: type=4, attr=0xf, range=[0x0000000065516000-0x0000000065522000) (0MB) [Mon Dec 9 06:17:02 2019][ 0.000000] efi: mem84: type=7, attr=0xf, range=[0x0000000065522000-0x0000000065523000) (0MB) [Mon Dec 9 06:17:02 2019][ 0.000000] efi: mem85: type=4, attr=0xf, range=[0x0000000065523000-0x0000000065593000) (0MB) [Mon Dec 9 06:17:02 2019][ 0.000000] efi: mem86: type=7, attr=0xf, range=[0x0000000065593000-0x0000000065594000) (0MB) [Mon Dec 9 06:17:02 2019][ 0.000000] efi: mem87: type=4, attr=0xf, range=[0x0000000065594000-0x000000006559c000) (0MB) [Mon Dec 9 06:17:02 2019][ 
0.000000] efi: mem88: type=7, attr=0xf, range=[0x000000006559c000-0x000000006559d000) (0MB) [Mon Dec 9 06:17:02 2019][ 0.000000] efi: mem89: type=4, attr=0xf, range=[0x000000006559d000-0x00000000655c4000) (0MB) [Mon Dec 9 06:17:02 2019][ 0.000000] efi: mem90: type=7, attr=0xf, range=[0x00000000655c4000-0x00000000655c5000) (0MB) [Mon Dec 9 06:17:03 2019][ 0.000000] efi: mem91: type=4, attr=0xf, range=[0x00000000655c5000-0x00000000655ea000) (0MB) [Mon Dec 9 06:17:03 2019][ 0.000000] efi: mem92: type=7, attr=0xf, range=[0x00000000655ea000-0x00000000655eb000) (0MB) [Mon Dec 9 06:17:03 2019][ 0.000000] efi: mem93: type=4, attr=0xf, range=[0x00000000655eb000-0x00000000655f1000) (0MB) [Mon Dec 9 06:17:03 2019][ 0.000000] efi: mem94: type=7, attr=0xf, range=[0x00000000655f1000-0x00000000655f2000) (0MB) [Mon Dec 9 06:17:03 2019][ 0.000000] efi: mem95: type=4, attr=0xf, range=[0x00000000655f2000-0x000000006b8cf000) (98MB) [Mon Dec 9 06:17:03 2019][ 0.000000] efi: mem96: type=7, attr=0xf, range=[0x000000006b8cf000-0x000000006b8d0000) (0MB) [Mon Dec 9 06:17:03 2019][ 0.000000] efi: mem97: type=3, attr=0xf, range=[0x000000006b8d0000-0x000000006cacf000) (17MB) [Mon Dec 9 06:17:03 2019][ 0.000000] efi: mem98: type=6, attr=0x800000000000000f, range=[0x000000006cacf000-0x000000006cbcf000) (1MB) [Mon Dec 9 06:17:03 2019][ 0.000000] efi: mem99: type=5, attr=0x800000000000000f, range=[0x000000006cbcf000-0x000000006cdcf000) (2MB) [Mon Dec 9 06:17:03 2019][ 0.000000] efi: mem100: type=0, attr=0xf, range=[0x000000006cdcf000-0x000000006efcf000) (34MB) [Mon Dec 9 06:17:03 2019][ 0.000000] efi: mem101: type=10, attr=0xf, range=[0x000000006efcf000-0x000000006fdff000) (14MB) [Mon Dec 9 06:17:03 2019][ 0.000000] efi: mem102: type=9, attr=0xf, range=[0x000000006fdff000-0x000000006ffff000) (2MB) [Mon Dec 9 06:17:03 2019][ 0.000000] efi: mem103: type=4, attr=0xf, range=[0x000000006ffff000-0x0000000070000000) (0MB) [Mon Dec 9 06:17:03 2019][ 0.000000] efi: mem104: type=7, attr=0xf, range=[0x0000000100000000-0x000000107f380000) (63475MB) [Mon Dec 9 06:17:03 2019][ 0.000000] efi: mem105: type=7, attr=0xf, range=[0x0000001080000000-0x000000207ff80000) (65535MB) [Mon Dec 9 06:17:03 2019][ 0.000000] efi: mem106: type=7, attr=0xf, range=[0x0000002080000000-0x000000307ff80000) (65535MB) [Mon Dec 9 06:17:03 2019][ 0.000000] efi: mem107: type=7, attr=0xf, range=[0x0000003080000000-0x000000407ff80000) (65535MB) [Mon Dec 9 06:17:03 2019][ 0.000000] efi: mem108: type=0, attr=0x9, range=[0x0000000070000000-0x0000000080000000) (256MB) [Mon Dec 9 06:17:03 2019][ 0.000000] efi: mem109: type=11, attr=0x800000000000000f, range=[0x0000000080000000-0x0000000090000000) (256MB) [Mon Dec 9 06:17:03 2019][ 0.000000] efi: mem110: type=11, attr=0x800000000000000f, range=[0x00000000fec10000-0x00000000fec11000) (0MB) [Mon Dec 9 06:17:03 2019][ 0.000000] efi: mem111: type=11, attr=0x800000000000000f, range=[0x00000000fed80000-0x00000000fed81000) (0MB) [Mon Dec 9 06:17:03 2019][ 0.000000] efi: mem112: type=0, attr=0x0, range=[0x000000107f380000-0x0000001080000000) (12MB) [Mon Dec 9 06:17:03 2019][ 0.000000] efi: mem113: type=0, attr=0x0, range=[0x000000207ff80000-0x0000002080000000) (0MB) [Mon Dec 9 06:17:03 2019][ 0.000000] efi: mem114: type=0, attr=0x0, range=[0x000000307ff80000-0x0000003080000000) (0MB) [Mon Dec 9 06:17:03 2019][ 0.000000] efi: mem115: type=0, attr=0x0, range=[0x000000407ff80000-0x0000004080000000) (0MB) [Mon Dec 9 06:17:03 2019][ 0.000000] SMBIOS 3.2.0 present. [Mon Dec 9 06:17:03 2019][ 0.000000] DMI: Dell Inc. 
PowerEdge R6415/07YXFK, BIOS 1.10.6 08/15/2019 [Mon Dec 9 06:17:03 2019][ 0.000000] e820: last_pfn = 0x407ff80 max_arch_pfn = 0x400000000 [Mon Dec 9 06:17:03 2019][ 0.000000] PAT configuration [0-7]: WB WC UC- UC WB WP UC- UC [Mon Dec 9 06:17:03 2019][ 0.000000] e820: last_pfn = 0x70000 max_arch_pfn = 0x400000000 [Mon Dec 9 06:17:03 2019][ 0.000000] Using GB pages for direct mapping [Mon Dec 9 06:17:03 2019][ 0.000000] RAMDISK: [mem 0x37c12000-0x38f51fff] [Mon Dec 9 06:17:03 2019][ 0.000000] Early table checksum verification disabled [Mon Dec 9 06:17:03 2019][ 0.000000] ACPI: RSDP 000000006fffe014 00024 (v02 DELL ) [Mon Dec 9 06:17:03 2019][ 0.000000] ACPI: XSDT 000000006fffd0e8 000AC (v01 DELL PE_SC3 00000002 DELL 00000001) [Mon Dec 9 06:17:03 2019][ 0.000000] ACPI: FACP 000000006fff0000 00114 (v06 DELL PE_SC3 00000002 DELL 00000001) [Mon Dec 9 06:17:03 2019][ 0.000000] ACPI: DSDT 000000006ffdc000 1038C (v02 DELL PE_SC3 00000002 DELL 00000001) [Mon Dec 9 06:17:03 2019][ 0.000000] ACPI: FACS 000000006fdd3000 00040 [Mon Dec 9 06:17:03 2019][ 0.000000] ACPI: SSDT 000000006fffc000 000D2 (v02 DELL PE_SC3 00000002 MSFT 04000000) [Mon Dec 9 06:17:03 2019][ 0.000000] ACPI: BERT 000000006fffb000 00030 (v01 DELL BERT 00000001 DELL 00000001) [Mon Dec 9 06:17:03 2019][ 0.000000] ACPI: HEST 000000006fffa000 006DC (v01 DELL HEST 00000001 DELL 00000001) [Mon Dec 9 06:17:03 2019][ 0.000000] ACPI: SSDT 000000006fff9000 00294 (v01 DELL PE_SC3 00000001 AMD 00000001) [Mon Dec 9 06:17:03 2019][ 0.000000] ACPI: SRAT 000000006fff8000 00420 (v03 DELL PE_SC3 00000001 AMD 00000001) [Mon Dec 9 06:17:03 2019][ 0.000000] ACPI: MSCT 000000006fff7000 0004E (v01 DELL PE_SC3 00000000 AMD 00000001) [Mon Dec 9 06:17:03 2019][ 0.000000] ACPI: SLIT 000000006fff6000 0003C (v01 DELL PE_SC3 00000001 AMD 00000001) [Mon Dec 9 06:17:03 2019][ 0.000000] ACPI: CRAT 000000006fff3000 02DC0 (v01 DELL PE_SC3 00000001 AMD 00000001) [Mon Dec 9 06:17:03 2019][ 0.000000] ACPI: EINJ 000000006fff2000 00150 (v01 DELL PE_SC3 00000001 AMD 00000001) [Mon Dec 9 06:17:03 2019][ 0.000000] ACPI: SLIC 000000006fff1000 00024 (v01 DELL PE_SC3 00000002 DELL 00000001) [Mon Dec 9 06:17:03 2019][ 0.000000] ACPI: HPET 000000006ffef000 00038 (v01 DELL PE_SC3 00000002 DELL 00000001) [Mon Dec 9 06:17:03 2019][ 0.000000] ACPI: APIC 000000006ffee000 004B2 (v03 DELL PE_SC3 00000002 DELL 00000001) [Mon Dec 9 06:17:03 2019][ 0.000000] ACPI: MCFG 000000006ffed000 0003C (v01 DELL PE_SC3 00000002 DELL 00000001) [Mon Dec 9 06:17:03 2019][ 0.000000] ACPI: SSDT 000000006ffdb000 00629 (v02 DELL xhc_port 00000001 INTL 20170119) [Mon Dec 9 06:17:03 2019][ 0.000000] ACPI: IVRS 000000006ffda000 00210 (v02 DELL PE_SC3 00000001 AMD 00000000) [Mon Dec 9 06:17:03 2019][ 0.000000] ACPI: SSDT 000000006ffd8000 01658 (v01 AMD CPMCMN 00000001 INTL 20170119) [Mon Dec 9 06:17:03 2019][ 0.000000] SRAT: PXM 0 -> APIC 0x00 -> Node 0 [Mon Dec 9 06:17:03 2019][ 0.000000] SRAT: PXM 0 -> APIC 0x01 -> Node 0 [Mon Dec 9 06:17:03 2019][ 0.000000] SRAT: PXM 0 -> APIC 0x02 -> Node 0 [Mon Dec 9 06:17:03 2019][ 0.000000] SRAT: PXM 0 -> APIC 0x03 -> Node 0 [Mon Dec 9 06:17:03 2019][ 0.000000] SRAT: PXM 0 -> APIC 0x04 -> Node 0 [Mon Dec 9 06:17:03 2019][ 0.000000] SRAT: PXM 0 -> APIC 0x05 -> Node 0 [Mon Dec 9 06:17:03 2019][ 0.000000] SRAT: PXM 0 -> APIC 0x08 -> Node 0 [Mon Dec 9 06:17:03 2019][ 0.000000] SRAT: PXM 0 -> APIC 0x09 -> Node 0 [Mon Dec 9 06:17:03 2019][ 0.000000] SRAT: PXM 0 -> APIC 0x0a -> Node 0 [Mon Dec 9 06:17:03 2019][ 0.000000] SRAT: PXM 0 -> APIC 0x0b -> Node 0 [Mon Dec 9 06:17:03 
2019][ 0.000000] SRAT: PXM 0 -> APIC 0x0c -> Node 0 [Mon Dec 9 06:17:03 2019][ 0.000000] SRAT: PXM 0 -> APIC 0x0d -> Node 0 [Mon Dec 9 06:17:03 2019][ 0.000000] SRAT: PXM 1 -> APIC 0x10 -> Node 1 [Mon Dec 9 06:17:03 2019][ 0.000000] SRAT: PXM 1 -> APIC 0x11 -> Node 1 [Mon Dec 9 06:17:03 2019][ 0.000000] SRAT: PXM 1 -> APIC 0x12 -> Node 1 [Mon Dec 9 06:17:03 2019][ 0.000000] SRAT: PXM 1 -> APIC 0x13 -> Node 1 [Mon Dec 9 06:17:03 2019][ 0.000000] SRAT: PXM 1 -> APIC 0x14 -> Node 1 [Mon Dec 9 06:17:03 2019][ 0.000000] SRAT: PXM 1 -> APIC 0x15 -> Node 1 [Mon Dec 9 06:17:03 2019][ 0.000000] SRAT: PXM 1 -> APIC 0x18 -> Node 1 [Mon Dec 9 06:17:03 2019][ 0.000000] SRAT: PXM 1 -> APIC 0x19 -> Node 1 [Mon Dec 9 06:17:03 2019][ 0.000000] SRAT: PXM 1 -> APIC 0x1a -> Node 1 [Mon Dec 9 06:17:03 2019][ 0.000000] SRAT: PXM 1 -> APIC 0x1b -> Node 1 [Mon Dec 9 06:17:03 2019][ 0.000000] SRAT: PXM 1 -> APIC 0x1c -> Node 1 [Mon Dec 9 06:17:03 2019][ 0.000000] SRAT: PXM 1 -> APIC 0x1d -> Node 1 [Mon Dec 9 06:17:03 2019][ 0.000000] SRAT: PXM 2 -> APIC 0x20 -> Node 2 [Mon Dec 9 06:17:03 2019][ 0.000000] SRAT: PXM 2 -> APIC 0x21 -> Node 2 [Mon Dec 9 06:17:03 2019][ 0.000000] SRAT: PXM 2 -> APIC 0x22 -> Node 2 [Mon Dec 9 06:17:03 2019][ 0.000000] SRAT: PXM 2 -> APIC 0x23 -> Node 2 [Mon Dec 9 06:17:03 2019][ 0.000000] SRAT: PXM 2 -> APIC 0x24 -> Node 2 [Mon Dec 9 06:17:03 2019][ 0.000000] SRAT: PXM 2 -> APIC 0x25 -> Node 2 [Mon Dec 9 06:17:03 2019][ 0.000000] SRAT: PXM 2 -> APIC 0x28 -> Node 2 [Mon Dec 9 06:17:03 2019][ 0.000000] SRAT: PXM 2 -> APIC 0x29 -> Node 2 [Mon Dec 9 06:17:03 2019][ 0.000000] SRAT: PXM 2 -> APIC 0x2a -> Node 2 [Mon Dec 9 06:17:03 2019][ 0.000000] SRAT: PXM 2 -> APIC 0x2b -> Node 2 [Mon Dec 9 06:17:03 2019][ 0.000000] SRAT: PXM 2 -> APIC 0x2c -> Node 2 [Mon Dec 9 06:17:03 2019][ 0.000000] SRAT: PXM 2 -> APIC 0x2d -> Node 2 [Mon Dec 9 06:17:03 2019][ 0.000000] SRAT: PXM 3 -> APIC 0x30 -> Node 3 [Mon Dec 9 06:17:03 2019][ 0.000000] SRAT: PXM 3 -> APIC 0x31 -> Node 3 [Mon Dec 9 06:17:03 2019][ 0.000000] SRAT: PXM 3 -> APIC 0x32 -> Node 3 [Mon Dec 9 06:17:03 2019][ 0.000000] SRAT: PXM 3 -> APIC 0x33 -> Node 3 [Mon Dec 9 06:17:03 2019][ 0.000000] SRAT: PXM 3 -> APIC 0x34 -> Node 3 [Mon Dec 9 06:17:03 2019][ 0.000000] SRAT: PXM 3 -> APIC 0x35 -> Node 3 [Mon Dec 9 06:17:03 2019][ 0.000000] SRAT: PXM 3 -> APIC 0x38 -> Node 3 [Mon Dec 9 06:17:03 2019][ 0.000000] SRAT: PXM 3 -> APIC 0x39 -> Node 3 [Mon Dec 9 06:17:03 2019][ 0.000000] SRAT: PXM 3 -> APIC 0x3a -> Node 3 [Mon Dec 9 06:17:03 2019][ 0.000000] SRAT: PXM 3 -> APIC 0x3b -> Node 3 [Mon Dec 9 06:17:03 2019][ 0.000000] SRAT: PXM 3 -> APIC 0x3c -> Node 3 [Mon Dec 9 06:17:03 2019][ 0.000000] SRAT: PXM 3 -> APIC 0x3d -> Node 3 [Mon Dec 9 06:17:03 2019][ 0.000000] SRAT: Node 0 PXM 0 [mem 0x00000000-0x0009ffff] [Mon Dec 9 06:17:03 2019][ 0.000000] SRAT: Node 0 PXM 0 [mem 0x00100000-0x7fffffff] [Mon Dec 9 06:17:03 2019][ 0.000000] SRAT: Node 0 PXM 0 [mem 0x100000000-0x107fffffff] [Mon Dec 9 06:17:03 2019][ 0.000000] SRAT: Node 1 PXM 1 [mem 0x1080000000-0x207fffffff] [Mon Dec 9 06:17:03 2019][ 0.000000] SRAT: Node 2 PXM 2 [mem 0x2080000000-0x307fffffff] [Mon Dec 9 06:17:03 2019][ 0.000000] SRAT: Node 3 PXM 3 [mem 0x3080000000-0x407fffffff] [Mon Dec 9 06:17:03 2019][ 0.000000] NUMA: Node 0 [mem 0x00000000-0x0009ffff] + [mem 0x00100000-0x7fffffff] -> [mem 0x00000000-0x7fffffff] [Mon Dec 9 06:17:03 2019][ 0.000000] NUMA: Node 0 [mem 0x00000000-0x7fffffff] + [mem 0x100000000-0x107fffffff] -> [mem 0x00000000-0x107fffffff] [Mon Dec 9 06:17:03 2019][ 0.000000] 
NODE_DATA(0) allocated [mem 0x107f359000-0x107f37ffff] [Mon Dec 9 06:17:03 2019][ 0.000000] NODE_DATA(1) allocated [mem 0x207ff59000-0x207ff7ffff] [Mon Dec 9 06:17:03 2019][ 0.000000] NODE_DATA(2) allocated [mem 0x307ff59000-0x307ff7ffff] [Mon Dec 9 06:17:03 2019][ 0.000000] NODE_DATA(3) allocated [mem 0x407ff58000-0x407ff7efff] [Mon Dec 9 06:17:03 2019][ 0.000000] Reserving 176MB of memory at 704MB for crashkernel (System RAM: 261692MB) [Mon Dec 9 06:17:03 2019][ 0.000000] Zone ranges: [Mon Dec 9 06:17:03 2019][ 0.000000] DMA [mem 0x00001000-0x00ffffff] [Mon Dec 9 06:17:03 2019][ 0.000000] DMA32 [mem 0x01000000-0xffffffff] [Mon Dec 9 06:17:03 2019][ 0.000000] Normal [mem 0x100000000-0x407ff7ffff] [Mon Dec 9 06:17:03 2019][ 0.000000] Movable zone start for each node [Mon Dec 9 06:17:03 2019][ 0.000000] Early memory node ranges [Mon Dec 9 06:17:03 2019][ 0.000000] node 0: [mem 0x00001000-0x0008efff] [Mon Dec 9 06:17:03 2019][ 0.000000] node 0: [mem 0x00090000-0x0009ffff] [Mon Dec 9 06:17:03 2019][ 0.000000] node 0: [mem 0x00100000-0x4f773fff] [Mon Dec 9 06:17:03 2019][ 0.000000] node 0: [mem 0x5777d000-0x6cacefff] [Mon Dec 9 06:17:03 2019][ 0.000000] node 0: [mem 0x6ffff000-0x6fffffff] [Mon Dec 9 06:17:03 2019][ 0.000000] node 0: [mem 0x100000000-0x107f37ffff] [Mon Dec 9 06:17:03 2019][ 0.000000] node 1: [mem 0x1080000000-0x207ff7ffff] [Mon Dec 9 06:17:03 2019][ 0.000000] node 2: [mem 0x2080000000-0x307ff7ffff] [Mon Dec 9 06:17:03 2019][ 0.000000] node 3: [mem 0x3080000000-0x407ff7ffff] [Mon Dec 9 06:17:03 2019][ 0.000000] Initmem setup node 0 [mem 0x00001000-0x107f37ffff] [Mon Dec 9 06:17:03 2019][ 0.000000] Initmem setup node 1 [mem 0x1080000000-0x207ff7ffff] [Mon Dec 9 06:17:03 2019][ 0.000000] Initmem setup node 2 [mem 0x2080000000-0x307ff7ffff] [Mon Dec 9 06:17:03 2019][ 0.000000] Initmem setup node 3 [mem 0x3080000000-0x407ff7ffff] [Mon Dec 9 06:17:03 2019][ 0.000000] ACPI: PM-Timer IO Port: 0x408 [Mon Dec 9 06:17:03 2019][ 0.000000] ACPI: LAPIC (acpi_id[0x00] lapic_id[0x00] enabled) [Mon Dec 9 06:17:03 2019][ 0.000000] ACPI: LAPIC (acpi_id[0x01] lapic_id[0x10] enabled) [Mon Dec 9 06:17:03 2019][ 0.000000] ACPI: LAPIC (acpi_id[0x02] lapic_id[0x20] enabled) [Mon Dec 9 06:17:03 2019][ 0.000000] ACPI: LAPIC (acpi_id[0x03] lapic_id[0x30] enabled) [Mon Dec 9 06:17:03 2019][ 0.000000] ACPI: LAPIC (acpi_id[0x04] lapic_id[0x08] enabled) [Mon Dec 9 06:17:03 2019][ 0.000000] ACPI: LAPIC (acpi_id[0x05] lapic_id[0x18] enabled) [Mon Dec 9 06:17:03 2019][ 0.000000] ACPI: LAPIC (acpi_id[0x06] lapic_id[0x28] enabled) [Mon Dec 9 06:17:03 2019][ 0.000000] ACPI: LAPIC (acpi_id[0x07] lapic_id[0x38] enabled) [Mon Dec 9 06:17:03 2019][ 0.000000] ACPI: LAPIC (acpi_id[0x08] lapic_id[0x02] enabled) [Mon Dec 9 06:17:03 2019][ 0.000000] ACPI: LAPIC (acpi_id[0x09] lapic_id[0x12] enabled) [Mon Dec 9 06:17:03 2019][ 0.000000] ACPI: LAPIC (acpi_id[0x0a] lapic_id[0x22] enabled) [Mon Dec 9 06:17:03 2019][ 0.000000] ACPI: LAPIC (acpi_id[0x0b] lapic_id[0x32] enabled) [Mon Dec 9 06:17:03 2019][ 0.000000] ACPI: LAPIC (acpi_id[0x0c] lapic_id[0x0a] enabled) [Mon Dec 9 06:17:03 2019][ 0.000000] ACPI: LAPIC (acpi_id[0x0d] lapic_id[0x1a] enabled) [Mon Dec 9 06:17:03 2019][ 0.000000] ACPI: LAPIC (acpi_id[0x0e] lapic_id[0x2a] enabled) [Mon Dec 9 06:17:03 2019][ 0.000000] ACPI: LAPIC (acpi_id[0x0f] lapic_id[0x3a] enabled) [Mon Dec 9 06:17:03 2019][ 0.000000] ACPI: LAPIC (acpi_id[0x10] lapic_id[0x04] enabled) [Mon Dec 9 06:17:03 2019][ 0.000000] ACPI: LAPIC (acpi_id[0x11] lapic_id[0x14] enabled) [Mon Dec 9 06:17:03 2019][ 
0.000000] ACPI: LAPIC (acpi_id[0x12] lapic_id[0x24] enabled) [Mon Dec 9 06:17:03 2019][ 0.000000] ACPI: LAPIC (acpi_id[0x13] lapic_id[0x34] enabled) [Mon Dec 9 06:17:03 2019][ 0.000000] ACPI: LAPIC (acpi_id[0x14] lapic_id[0x0c] enabled) [Mon Dec 9 06:17:03 2019][ 0.000000] ACPI: LAPIC (acpi_id[0x15] lapic_id[0x1c] enabled) [Mon Dec 9 06:17:03 2019][ 0.000000] ACPI: LAPIC (acpi_id[0x16] lapic_id[0x2c] enabled) [Mon Dec 9 06:17:03 2019][ 0.000000] ACPI: LAPIC (acpi_id[0x17] lapic_id[0x3c] enabled) [Mon Dec 9 06:17:03 2019][ 0.000000] ACPI: LAPIC (acpi_id[0x18] lapic_id[0x01] enabled) [Mon Dec 9 06:17:03 2019][ 0.000000] ACPI: LAPIC (acpi_id[0x19] lapic_id[0x11] enabled) [Mon Dec 9 06:17:03 2019][ 0.000000] ACPI: LAPIC (acpi_id[0x1a] lapic_id[0x21] enabled) [Mon Dec 9 06:17:03 2019][ 0.000000] ACPI: LAPIC (acpi_id[0x1b] lapic_id[0x31] enabled) [Mon Dec 9 06:17:04 2019][ 0.000000] ACPI: LAPIC (acpi_id[0x1c] lapic_id[0x09] enabled) [Mon Dec 9 06:17:04 2019][ 0.000000] ACPI: LAPIC (acpi_id[0x1d] lapic_id[0x19] enabled) [Mon Dec 9 06:17:04 2019][ 0.000000] ACPI: LAPIC (acpi_id[0x1e] lapic_id[0x29] enabled) [Mon Dec 9 06:17:04 2019][ 0.000000] ACPI: LAPIC (acpi_id[0x1f] lapic_id[0x39] enabled) [Mon Dec 9 06:17:04 2019][ 0.000000] ACPI: LAPIC (acpi_id[0x20] lapic_id[0x03] enabled) [Mon Dec 9 06:17:04 2019][ 0.000000] ACPI: LAPIC (acpi_id[0x21] lapic_id[0x13] enabled) [Mon Dec 9 06:17:04 2019][ 0.000000] ACPI: LAPIC (acpi_id[0x22] lapic_id[0x23] enabled) [Mon Dec 9 06:17:04 2019][ 0.000000] ACPI: LAPIC (acpi_id[0x23] lapic_id[0x33] enabled) [Mon Dec 9 06:17:04 2019][ 0.000000] ACPI: LAPIC (acpi_id[0x24] lapic_id[0x0b] enabled) [Mon Dec 9 06:17:04 2019][ 0.000000] ACPI: LAPIC (acpi_id[0x25] lapic_id[0x1b] enabled) [Mon Dec 9 06:17:04 2019][ 0.000000] ACPI: LAPIC (acpi_id[0x26] lapic_id[0x2b] enabled) [Mon Dec 9 06:17:04 2019][ 0.000000] ACPI: LAPIC (acpi_id[0x27] lapic_id[0x3b] enabled) [Mon Dec 9 06:17:04 2019][ 0.000000] ACPI: LAPIC (acpi_id[0x28] lapic_id[0x05] enabled) [Mon Dec 9 06:17:04 2019][ 0.000000] ACPI: LAPIC (acpi_id[0x29] lapic_id[0x15] enabled) [Mon Dec 9 06:17:04 2019][ 0.000000] ACPI: LAPIC (acpi_id[0x2a] lapic_id[0x25] enabled) [Mon Dec 9 06:17:04 2019][ 0.000000] ACPI: LAPIC (acpi_id[0x2b] lapic_id[0x35] enabled) [Mon Dec 9 06:17:04 2019][ 0.000000] ACPI: LAPIC (acpi_id[0x2c] lapic_id[0x0d] enabled) [Mon Dec 9 06:17:04 2019][ 0.000000] ACPI: LAPIC (acpi_id[0x2d] lapic_id[0x1d] enabled) [Mon Dec 9 06:17:04 2019][ 0.000000] ACPI: LAPIC (acpi_id[0x2e] lapic_id[0x2d] enabled) [Mon Dec 9 06:17:04 2019][ 0.000000] ACPI: LAPIC (acpi_id[0x2f] lapic_id[0x3d] enabled) [Mon Dec 9 06:17:04 2019][ 0.000000] ACPI: LAPIC (acpi_id[0x30] lapic_id[0x00] disabled) [Mon Dec 9 06:17:04 2019][ 0.000000] ACPI: LAPIC (acpi_id[0x31] lapic_id[0x00] disabled) [Mon Dec 9 06:17:04 2019][ 0.000000] ACPI: LAPIC (acpi_id[0x32] lapic_id[0x00] disabled) [Mon Dec 9 06:17:04 2019][ 0.000000] ACPI: LAPIC (acpi_id[0x33] lapic_id[0x00] disabled) [Mon Dec 9 06:17:04 2019][ 0.000000] ACPI: LAPIC (acpi_id[0x34] lapic_id[0x00] disabled) [Mon Dec 9 06:17:04 2019][ 0.000000] ACPI: LAPIC (acpi_id[0x35] lapic_id[0x00] disabled) [Mon Dec 9 06:17:04 2019][ 0.000000] ACPI: LAPIC (acpi_id[0x36] lapic_id[0x00] disabled) [Mon Dec 9 06:17:04 2019][ 0.000000] ACPI: LAPIC (acpi_id[0x37] lapic_id[0x00] disabled) [Mon Dec 9 06:17:04 2019][ 0.000000] ACPI: LAPIC (acpi_id[0x38] lapic_id[0x00] disabled) [Mon Dec 9 06:17:04 2019][ 0.000000] ACPI: LAPIC (acpi_id[0x39] lapic_id[0x00] disabled) [Mon Dec 9 06:17:04 2019][ 0.000000] ACPI: LAPIC 
(acpi_id[0x3a] lapic_id[0x00] disabled) [Mon Dec 9 06:17:04 2019][ 0.000000] ACPI: LAPIC (acpi_id[0x3b] lapic_id[0x00] disabled) [Mon Dec 9 06:17:04 2019][ 0.000000] ACPI: LAPIC (acpi_id[0x3c] lapic_id[0x00] disabled) [Mon Dec 9 06:17:04 2019][ 0.000000] ACPI: LAPIC (acpi_id[0x3d] lapic_id[0x00] disabled) [Mon Dec 9 06:17:04 2019][ 0.000000] ACPI: LAPIC (acpi_id[0x3e] lapic_id[0x00] disabled) [Mon Dec 9 06:17:04 2019][ 0.000000] ACPI: LAPIC (acpi_id[0x3f] lapic_id[0x00] disabled) [Mon Dec 9 06:17:04 2019][ 0.000000] ACPI: LAPIC (acpi_id[0x40] lapic_id[0x00] disabled) [Mon Dec 9 06:17:04 2019][ 0.000000] ACPI: LAPIC (acpi_id[0x41] lapic_id[0x00] disabled) [Mon Dec 9 06:17:04 2019][ 0.000000] ACPI: LAPIC (acpi_id[0x42] lapic_id[0x00] disabled) [Mon Dec 9 06:17:04 2019][ 0.000000] ACPI: LAPIC (acpi_id[0x43] lapic_id[0x00] disabled) [Mon Dec 9 06:17:04 2019][ 0.000000] ACPI: LAPIC (acpi_id[0x44] lapic_id[0x00] disabled) [Mon Dec 9 06:17:04 2019][ 0.000000] ACPI: LAPIC (acpi_id[0x45] lapic_id[0x00] disabled) [Mon Dec 9 06:17:04 2019][ 0.000000] ACPI: LAPIC (acpi_id[0x46] lapic_id[0x00] disabled) [Mon Dec 9 06:17:04 2019][ 0.000000] ACPI: LAPIC (acpi_id[0x47] lapic_id[0x00] disabled) [Mon Dec 9 06:17:04 2019][ 0.000000] ACPI: LAPIC (acpi_id[0x48] lapic_id[0x00] disabled) [Mon Dec 9 06:17:04 2019][ 0.000000] ACPI: LAPIC (acpi_id[0x49] lapic_id[0x00] disabled) [Mon Dec 9 06:17:04 2019][ 0.000000] ACPI: LAPIC (acpi_id[0x4a] lapic_id[0x00] disabled) [Mon Dec 9 06:17:04 2019][ 0.000000] ACPI: LAPIC (acpi_id[0x4b] lapic_id[0x00] disabled) [Mon Dec 9 06:17:04 2019][ 0.000000] ACPI: LAPIC (acpi_id[0x4c] lapic_id[0x00] disabled) [Mon Dec 9 06:17:04 2019][ 0.000000] ACPI: LAPIC (acpi_id[0x4d] lapic_id[0x00] disabled) [Mon Dec 9 06:17:04 2019][ 0.000000] ACPI: LAPIC (acpi_id[0x4e] lapic_id[0x00] disabled) [Mon Dec 9 06:17:04 2019][ 0.000000] ACPI: LAPIC (acpi_id[0x4f] lapic_id[0x00] disabled) [Mon Dec 9 06:17:04 2019][ 0.000000] ACPI: LAPIC (acpi_id[0x50] lapic_id[0x00] disabled) [Mon Dec 9 06:17:04 2019][ 0.000000] ACPI: LAPIC (acpi_id[0x51] lapic_id[0x00] disabled) [Mon Dec 9 06:17:04 2019][ 0.000000] ACPI: LAPIC (acpi_id[0x52] lapic_id[0x00] disabled) [Mon Dec 9 06:17:04 2019][ 0.000000] ACPI: LAPIC (acpi_id[0x53] lapic_id[0x00] disabled) [Mon Dec 9 06:17:04 2019][ 0.000000] ACPI: LAPIC (acpi_id[0x54] lapic_id[0x00] disabled) [Mon Dec 9 06:17:04 2019][ 0.000000] ACPI: LAPIC (acpi_id[0x55] lapic_id[0x00] disabled) [Mon Dec 9 06:17:04 2019][ 0.000000] ACPI: LAPIC (acpi_id[0x56] lapic_id[0x00] disabled) [Mon Dec 9 06:17:04 2019][ 0.000000] ACPI: LAPIC (acpi_id[0x57] lapic_id[0x00] disabled) [Mon Dec 9 06:17:04 2019][ 0.000000] ACPI: LAPIC (acpi_id[0x58] lapic_id[0x00] disabled) [Mon Dec 9 06:17:04 2019][ 0.000000] ACPI: LAPIC (acpi_id[0x59] lapic_id[0x00] disabled) [Mon Dec 9 06:17:04 2019][ 0.000000] ACPI: LAPIC (acpi_id[0x5a] lapic_id[0x00] disabled) [Mon Dec 9 06:17:04 2019][ 0.000000] ACPI: LAPIC (acpi_id[0x5b] lapic_id[0x00] disabled) [Mon Dec 9 06:17:04 2019][ 0.000000] ACPI: LAPIC (acpi_id[0x5c] lapic_id[0x00] disabled) [Mon Dec 9 06:17:04 2019][ 0.000000] ACPI: LAPIC (acpi_id[0x5d] lapic_id[0x00] disabled) [Mon Dec 9 06:17:04 2019][ 0.000000] ACPI: LAPIC (acpi_id[0x5e] lapic_id[0x00] disabled) [Mon Dec 9 06:17:04 2019][ 0.000000] ACPI: LAPIC (acpi_id[0x5f] lapic_id[0x00] disabled) [Mon Dec 9 06:17:04 2019][ 0.000000] ACPI: LAPIC (acpi_id[0x60] lapic_id[0x00] disabled) [Mon Dec 9 06:17:04 2019][ 0.000000] ACPI: LAPIC (acpi_id[0x61] lapic_id[0x00] disabled) [Mon Dec 9 06:17:04 2019][ 0.000000] ACPI: 
LAPIC (acpi_id[0x62] lapic_id[0x00] disabled) [Mon Dec 9 06:17:04 2019][ 0.000000] ACPI: LAPIC (acpi_id[0x63] lapic_id[0x00] disabled) [Mon Dec 9 06:17:04 2019][ 0.000000] ACPI: LAPIC (acpi_id[0x64] lapic_id[0x00] disabled) [Mon Dec 9 06:17:04 2019][ 0.000000] ACPI: LAPIC (acpi_id[0x65] lapic_id[0x00] disabled) [Mon Dec 9 06:17:04 2019][ 0.000000] ACPI: LAPIC (acpi_id[0x66] lapic_id[0x00] disabled) [Mon Dec 9 06:17:04 2019][ 0.000000] ACPI: LAPIC (acpi_id[0x67] lapic_id[0x00] disabled) [Mon Dec 9 06:17:04 2019][ 0.000000] ACPI: LAPIC (acpi_id[0x68] lapic_id[0x00] disabled) [Mon Dec 9 06:17:04 2019][ 0.000000] ACPI: LAPIC (acpi_id[0x69] lapic_id[0x00] disabled) [Mon Dec 9 06:17:04 2019][ 0.000000] ACPI: LAPIC (acpi_id[0x6a] lapic_id[0x00] disabled) [Mon Dec 9 06:17:04 2019][ 0.000000] ACPI: LAPIC (acpi_id[0x6b] lapic_id[0x00] disabled) [Mon Dec 9 06:17:04 2019][ 0.000000] ACPI: LAPIC (acpi_id[0x6c] lapic_id[0x00] disabled) [Mon Dec 9 06:17:04 2019][ 0.000000] ACPI: LAPIC (acpi_id[0x6d] lapic_id[0x00] disabled) [Mon Dec 9 06:17:04 2019][ 0.000000] ACPI: LAPIC (acpi_id[0x6e] lapic_id[0x00] disabled) [Mon Dec 9 06:17:04 2019][ 0.000000] ACPI: LAPIC (acpi_id[0x6f] lapic_id[0x00] disabled) [Mon Dec 9 06:17:04 2019][ 0.000000] ACPI: LAPIC (acpi_id[0x70] lapic_id[0x00] disabled) [Mon Dec 9 06:17:04 2019][ 0.000000] ACPI: LAPIC (acpi_id[0x71] lapic_id[0x00] disabled) [Mon Dec 9 06:17:04 2019][ 0.000000] ACPI: LAPIC (acpi_id[0x72] lapic_id[0x00] disabled) [Mon Dec 9 06:17:04 2019][ 0.000000] ACPI: LAPIC (acpi_id[0x73] lapic_id[0x00] disabled) [Mon Dec 9 06:17:04 2019][ 0.000000] ACPI: LAPIC (acpi_id[0x74] lapic_id[0x00] disabled) [Mon Dec 9 06:17:04 2019][ 0.000000] ACPI: LAPIC (acpi_id[0x75] lapic_id[0x00] disabled) [Mon Dec 9 06:17:04 2019][ 0.000000] ACPI: LAPIC (acpi_id[0x76] lapic_id[0x00] disabled) [Mon Dec 9 06:17:04 2019][ 0.000000] ACPI: LAPIC (acpi_id[0x77] lapic_id[0x00] disabled) [Mon Dec 9 06:17:04 2019][ 0.000000] ACPI: LAPIC (acpi_id[0x78] lapic_id[0x00] disabled) [Mon Dec 9 06:17:04 2019][ 0.000000] ACPI: LAPIC (acpi_id[0x79] lapic_id[0x00] disabled) [Mon Dec 9 06:17:04 2019][ 0.000000] ACPI: LAPIC (acpi_id[0x7a] lapic_id[0x00] disabled) [Mon Dec 9 06:17:04 2019][ 0.000000] ACPI: LAPIC (acpi_id[0x7b] lapic_id[0x00] disabled) [Mon Dec 9 06:17:04 2019][ 0.000000] ACPI: LAPIC (acpi_id[0x7c] lapic_id[0x00] disabled) [Mon Dec 9 06:17:04 2019][ 0.000000] ACPI: LAPIC (acpi_id[0x7d] lapic_id[0x00] disabled) [Mon Dec 9 06:17:04 2019][ 0.000000] ACPI: LAPIC (acpi_id[0x7e] lapic_id[0x00] disabled) [Mon Dec 9 06:17:04 2019][ 0.000000] ACPI: LAPIC (acpi_id[0x7f] lapic_id[0x00] disabled) [Mon Dec 9 06:17:04 2019][ 0.000000] ACPI: LAPIC_NMI (acpi_id[0xff] high edge lint[0x1]) [Mon Dec 9 06:17:04 2019][ 0.000000] ACPI: IOAPIC (id[0x80] address[0xfec00000] gsi_base[0]) [Mon Dec 9 06:17:04 2019][ 0.000000] IOAPIC[0]: apic_id 128, version 33, address 0xfec00000, GSI 0-23 [Mon Dec 9 06:17:04 2019][ 0.000000] ACPI: IOAPIC (id[0x81] address[0xfd880000] gsi_base[24]) [Mon Dec 9 06:17:04 2019][ 0.000000] IOAPIC[1]: apic_id 129, version 33, address 0xfd880000, GSI 24-55 [Mon Dec 9 06:17:04 2019][ 0.000000] ACPI: IOAPIC (id[0x82] address[0xe0900000] gsi_base[56]) [Mon Dec 9 06:17:04 2019][ 0.000000] IOAPIC[2]: apic_id 130, version 33, address 0xe0900000, GSI 56-87 [Mon Dec 9 06:17:04 2019][ 0.000000] ACPI: IOAPIC (id[0x83] address[0xc5900000] gsi_base[88]) [Mon Dec 9 06:17:04 2019][ 0.000000] IOAPIC[3]: apic_id 131, version 33, address 0xc5900000, GSI 88-119 [Mon Dec 9 06:17:04 2019][ 0.000000] ACPI: IOAPIC 
(id[0x84] address[0xaa900000] gsi_base[120]) [Mon Dec 9 06:17:04 2019][ 0.000000] IOAPIC[4]: apic_id 132, version 33, address 0xaa900000, GSI 120-151 [Mon Dec 9 06:17:04 2019][ 0.000000] ACPI: INT_SRC_OVR (bus 0 bus_irq 0 global_irq 2 dfl dfl) [Mon Dec 9 06:17:04 2019][ 0.000000] ACPI: INT_SRC_OVR (bus 0 bus_irq 9 global_irq 9 low level) [Mon Dec 9 06:17:04 2019][ 0.000000] Using ACPI (MADT) for SMP configuration information [Mon Dec 9 06:17:04 2019][ 0.000000] ACPI: HPET id: 0x10228201 base: 0xfed00000 [Mon Dec 9 06:17:04 2019][ 0.000000] smpboot: Allowing 128 CPUs, 80 hotplug CPUs [Mon Dec 9 06:17:04 2019][ 0.000000] PM: Registered nosave memory: [mem 0x0008f000-0x0008ffff] [Mon Dec 9 06:17:04 2019][ 0.000000] PM: Registered nosave memory: [mem 0x000a0000-0x000fffff] [Mon Dec 9 06:17:04 2019][ 0.000000] PM: Registered nosave memory: [mem 0x37ac0000-0x37ac0fff] [Mon Dec 9 06:17:04 2019][ 0.000000] PM: Registered nosave memory: [mem 0x37ad8000-0x37ad8fff] [Mon Dec 9 06:17:04 2019][ 0.000000] PM: Registered nosave memory: [mem 0x37ad9000-0x37ad9fff] [Mon Dec 9 06:17:04 2019][ 0.000000] PM: Registered nosave memory: [mem 0x37b02000-0x37b02fff] [Mon Dec 9 06:17:04 2019][ 0.000000] PM: Registered nosave memory: [mem 0x37b03000-0x37b03fff] [Mon Dec 9 06:17:04 2019][ 0.000000] PM: Registered nosave memory: [mem 0x37b0b000-0x37b0bfff] [Mon Dec 9 06:17:04 2019][ 0.000000] PM: Registered nosave memory: [mem 0x37b0c000-0x37b0cfff] [Mon Dec 9 06:17:04 2019][ 0.000000] PM: Registered nosave memory: [mem 0x37b3d000-0x37b3dfff] [Mon Dec 9 06:17:04 2019][ 0.000000] PM: Registered nosave memory: [mem 0x37b3e000-0x37b3efff] [Mon Dec 9 06:17:04 2019][ 0.000000] PM: Registered nosave memory: [mem 0x37b6f000-0x37b6ffff] [Mon Dec 9 06:17:04 2019][ 0.000000] PM: Registered nosave memory: [mem 0x37b70000-0x37b70fff] [Mon Dec 9 06:17:04 2019][ 0.000000] PM: Registered nosave memory: [mem 0x37c11000-0x37c11fff] [Mon Dec 9 06:17:04 2019][ 0.000000] PM: Registered nosave memory: [mem 0x4f774000-0x5777cfff] [Mon Dec 9 06:17:04 2019][ 0.000000] PM: Registered nosave memory: [mem 0x6cacf000-0x6efcefff] [Mon Dec 9 06:17:04 2019][ 0.000000] PM: Registered nosave memory: [mem 0x6efcf000-0x6fdfefff] [Mon Dec 9 06:17:04 2019][ 0.000000] PM: Registered nosave memory: [mem 0x6fdff000-0x6fffefff] [Mon Dec 9 06:17:04 2019][ 0.000000] PM: Registered nosave memory: [mem 0x70000000-0x8fffffff] [Mon Dec 9 06:17:04 2019][ 0.000000] PM: Registered nosave memory: [mem 0x90000000-0xfec0ffff] [Mon Dec 9 06:17:04 2019][ 0.000000] PM: Registered nosave memory: [mem 0xfec10000-0xfec10fff] [Mon Dec 9 06:17:04 2019][ 0.000000] PM: Registered nosave memory: [mem 0xfec11000-0xfed7ffff] [Mon Dec 9 06:17:04 2019][ 0.000000] PM: Registered nosave memory: [mem 0xfed80000-0xfed80fff] [Mon Dec 9 06:17:04 2019][ 0.000000] PM: Registered nosave memory: [mem 0xfed81000-0xffffffff] [Mon Dec 9 06:17:04 2019][ 0.000000] PM: Registered nosave memory: [mem 0x107f380000-0x107fffffff] [Mon Dec 9 06:17:04 2019][ 0.000000] PM: Registered nosave memory: [mem 0x207ff80000-0x207fffffff] [Mon Dec 9 06:17:04 2019][ 0.000000] PM: Registered nosave memory: [mem 0x307ff80000-0x307fffffff] [Mon Dec 9 06:17:04 2019][ 0.000000] e820: [mem 0x90000000-0xfec0ffff] available for PCI devices [Mon Dec 9 06:17:04 2019][ 0.000000] Booting paravirtualized kernel on bare hardware [Mon Dec 9 06:17:04 2019][ 0.000000] setup_percpu: NR_CPUS:5120 nr_cpumask_bits:128 nr_cpu_ids:128 nr_node_ids:4 [Mon Dec 9 06:17:04 2019][ 0.000000] PERCPU: Embedded 38 pages/cpu @ffff88f2fee00000 s118784 
r8192 d28672 u262144 [Mon Dec 9 06:17:04 2019][ 0.000000] Built 4 zonelists in Zone order, mobility grouping on. Total pages: 65945355 [Mon Dec 9 06:17:04 2019][ 0.000000] Policy zone: Normal [Mon Dec 9 06:17:04 2019][ 0.000000] Kernel command line: BOOT_IMAGE=/boot/vmlinuz-3.10.0-957.27.2.el7_lustre.pl2.x86_64 root=UUID=c4f754c4-e7db-49b7-baed-d6c7905c5cdc ro crashkernel=auto nomodeset console=ttyS0,115200 LANG=en_US.UTF-8 [Mon Dec 9 06:17:04 2019][ 0.000000] PID hash table entries: 4096 (order: 3, 32768 bytes) [Mon Dec 9 06:17:04 2019][ 0.000000] x86/fpu: xstate_offset[2]: 0240, xstate_sizes[2]: 0100 [Mon Dec 9 06:17:04 2019][ 0.000000] xsave: enabled xstate_bv 0x7, cntxt size 0x340 using standard form [Mon Dec 9 06:17:04 2019][ 0.000000] Memory: 9561188k/270532096k available (7676k kernel code, 2559084k absent, 4706776k reserved, 6045k data, 1876k init) [Mon Dec 9 06:17:04 2019][ 0.000000] SLUB: HWalign=64, Order=0-3, MinObjects=0, CPUs=128, Nodes=4 [Mon Dec 9 06:17:04 2019][ 0.000000] Hierarchical RCU implementation. [Mon Dec 9 06:17:04 2019][ 0.000000] RCU restricting CPUs from NR_CPUS=5120 to nr_cpu_ids=128. [Mon Dec 9 06:17:04 2019][ 0.000000] NR_IRQS:327936 nr_irqs:3624 0 [Mon Dec 9 06:17:05 2019][ 0.000000] Console: colour dummy device 80x25 [Mon Dec 9 06:17:05 2019][ 0.000000] console [ttyS0] enabled [Mon Dec 9 06:17:05 2019][ 0.000000] allocated 1072693248 bytes of page_cgroup [Mon Dec 9 06:17:05 2019][ 0.000000] please try 'cgroup_disable=memory' option if you don't want memory cgroups [Mon Dec 9 06:17:05 2019][ 0.000000] Enabling automatic NUMA balancing. Configure with numa_balancing= or the kernel.numa_balancing sysctl [Mon Dec 9 06:17:05 2019][ 0.000000] tsc: Fast TSC calibration using PIT [Mon Dec 9 06:17:05 2019][ 0.000000] tsc: Detected 1996.203 MHz processor [Mon Dec 9 06:17:05 2019][ 0.000054] Calibrating delay loop (skipped), value calculated using timer frequency.. 3992.40 BogoMIPS (lpj=1996203) [Mon Dec 9 06:17:05 2019][ 0.010696] pid_max: default: 131072 minimum: 1024 [Mon Dec 9 06:17:05 2019][ 0.016302] Security Framework initialized [Mon Dec 9 06:17:05 2019][ 0.020434] SELinux: Initializing. [Mon Dec 9 06:17:05 2019][ 0.023996] Yama: becoming mindful. 
[Mon Dec 9 06:17:05 2019][ 0.044183] Dentry cache hash table entries: 33554432 (order: 16, 268435456 bytes) [Mon Dec 9 06:17:05 2019][ 0.100110] Inode-cache hash table entries: 16777216 (order: 15, 134217728 bytes) [Mon Dec 9 06:17:05 2019][ 0.127915] Mount-cache hash table entries: 524288 (order: 10, 4194304 bytes) [Mon Dec 9 06:17:05 2019][ 0.135314] Mountpoint-cache hash table entries: 524288 (order: 10, 4194304 bytes) [Mon Dec 9 06:17:05 2019][ 0.144434] Initializing cgroup subsys memory [Mon Dec 9 06:17:05 2019][ 0.148828] Initializing cgroup subsys devices [Mon Dec 9 06:17:05 2019][ 0.153284] Initializing cgroup subsys freezer [Mon Dec 9 06:17:05 2019][ 0.157739] Initializing cgroup subsys net_cls [Mon Dec 9 06:17:05 2019][ 0.162192] Initializing cgroup subsys blkio [Mon Dec 9 06:17:05 2019][ 0.166473] Initializing cgroup subsys perf_event [Mon Dec 9 06:17:05 2019][ 0.171199] Initializing cgroup subsys hugetlb [Mon Dec 9 06:17:05 2019][ 0.175651] Initializing cgroup subsys pids [Mon Dec 9 06:17:05 2019][ 0.179848] Initializing cgroup subsys net_prio [Mon Dec 9 06:17:05 2019][ 0.190074] LVT offset 2 assigned for vector 0xf4 [Mon Dec 9 06:17:05 2019][ 0.194799] Last level iTLB entries: 4KB 1024, 2MB 1024, 4MB 512 [Mon Dec 9 06:17:05 2019][ 0.200811] Last level dTLB entries: 4KB 1536, 2MB 1536, 4MB 768 [Mon Dec 9 06:17:05 2019][ 0.206826] tlb_flushall_shift: 6 [Mon Dec 9 06:17:05 2019][ 0.210173] Speculative Store Bypass: Mitigation: Speculative Store Bypass disabled via prctl and seccomp [Mon Dec 9 06:17:05 2019][ 0.219739] FEATURE SPEC_CTRL Not Present [Mon Dec 9 06:17:05 2019][ 0.223761] FEATURE IBPB_SUPPORT Present [Mon Dec 9 06:17:05 2019][ 0.227698] Spectre V2 : Enabling Indirect Branch Prediction Barrier [Mon Dec 9 06:17:05 2019][ 0.234130] Spectre V2 : Mitigation: Full retpoline [Mon Dec 9 06:17:05 2019][ 0.239995] Freeing SMP alternatives: 28k freed [Mon Dec 9 06:17:05 2019][ 0.246175] ACPI: Core revision 20130517 [Mon Dec 9 06:17:05 2019][ 0.254932] ACPI: All ACPI Tables successfully acquired [Mon Dec 9 06:17:05 2019][ 0.266870] ftrace: allocating 29216 entries in 115 pages [Mon Dec 9 06:17:05 2019][ 0.606033] Switched APIC routing to physical flat. [Mon Dec 9 06:17:05 2019][ 0.612952] ..TIMER: vector=0x30 apic1=0 pin1=2 apic2=-1 pin2=-1 [Mon Dec 9 06:17:05 2019][ 0.628964] smpboot: CPU0: AMD EPYC 7401P 24-Core Processor (fam: 17, model: 01, stepping: 02) [Mon Dec 9 06:17:05 2019][ 0.713416] random: fast init done [Mon Dec 9 06:17:05 2019][ 0.741417] APIC calibration not consistent with PM-Timer: 101ms instead of 100ms [Mon Dec 9 06:17:05 2019][ 0.748893] APIC delta adjusted to PM-Timer: 623826 (636296) [Mon Dec 9 06:17:05 2019][ 0.754582] Performance Events: Fam17h core perfctr, AMD PMU driver. [Mon Dec 9 06:17:05 2019][ 0.761013] ... version: 0 [Mon Dec 9 06:17:05 2019][ 0.765022] ... bit width: 48 [Mon Dec 9 06:17:05 2019][ 0.769123] ... generic registers: 6 [Mon Dec 9 06:17:05 2019][ 0.773136] ... value mask: 0000ffffffffffff [Mon Dec 9 06:17:05 2019][ 0.778448] ... max period: 00007fffffffffff [Mon Dec 9 06:17:05 2019][ 0.783761] ... fixed-purpose events: 0 [Mon Dec 9 06:17:05 2019][ 0.787774] ... event mask: 000000000000003f [Mon Dec 9 06:17:06 2019][ 0.795988] NMI watchdog: enabled on all CPUs, permanently consumes one hw-PMU counter. 
[Mon Dec 9 06:17:06 2019][ 0.804070] smpboot: Booting Node 1, Processors #1 OK [Mon Dec 9 06:17:06 2019][ 0.817266] smpboot: Booting Node 2, Processors #2 OK [Mon Dec 9 06:17:06 2019][ 0.830467] smpboot: Booting Node 3, Processors #3 OK [Mon Dec 9 06:17:06 2019][ 0.843661] smpboot: Booting Node 0, Processors #4 OK [Mon Dec 9 06:17:06 2019][ 0.856842] smpboot: Booting Node 1, Processors #5 OK [Mon Dec 9 06:17:06 2019][ 0.870024] smpboot: Booting Node 2, Processors #6 OK [Mon Dec 9 06:17:06 2019][ 0.883205] smpboot: Booting Node 3, Processors #7 OK [Mon Dec 9 06:17:06 2019][ 0.896387] smpboot: Booting Node 0, Processors #8 OK [Mon Dec 9 06:17:06 2019][ 0.909790] smpboot: Booting Node 1, Processors #9 OK [Mon Dec 9 06:17:06 2019][ 0.922979] smpboot: Booting Node 2, Processors #10 OK [Mon Dec 9 06:17:06 2019][ 0.936263] smpboot: Booting Node 3, Processors #11 OK [Mon Dec 9 06:17:06 2019][ 0.949533] smpboot: Booting Node 0, Processors #12 OK [Mon Dec 9 06:17:06 2019][ 0.962805] smpboot: Booting Node 1, Processors #13 OK [Mon Dec 9 06:17:06 2019][ 0.976086] smpboot: Booting Node 2, Processors #14 OK [Mon Dec 9 06:17:06 2019][ 0.989356] smpboot: Booting Node 3, Processors #15 OK [Mon Dec 9 06:17:06 2019][ 1.002629] smpboot: Booting Node 0, Processors #16 OK [Mon Dec 9 06:17:06 2019][ 1.016009] smpboot: Booting Node 1, Processors #17 OK [Mon Dec 9 06:17:06 2019][ 1.029283] smpboot: Booting Node 2, Processors #18 OK [Mon Dec 9 06:17:06 2019][ 1.042562] smpboot: Booting Node 3, Processors #19 OK [Mon Dec 9 06:17:06 2019][ 1.055835] smpboot: Booting Node 0, Processors #20 OK [Mon Dec 9 06:17:06 2019][ 1.069101] smpboot: Booting Node 1, Processors #21 OK [Mon Dec 9 06:17:06 2019][ 1.082370] smpboot: Booting Node 2, Processors #22 OK [Mon Dec 9 06:17:06 2019][ 1.095651] smpboot: Booting Node 3, Processors #23 OK [Mon Dec 9 06:17:06 2019][ 1.108919] smpboot: Booting Node 0, Processors #24 OK [Mon Dec 9 06:17:06 2019][ 1.122639] smpboot: Booting Node 1, Processors #25 OK [Mon Dec 9 06:17:06 2019][ 1.135889] smpboot: Booting Node 2, Processors #26 OK [Mon Dec 9 06:17:06 2019][ 1.149138] smpboot: Booting Node 3, Processors #27 OK [Mon Dec 9 06:17:06 2019][ 1.162374] smpboot: Booting Node 0, Processors #28 OK [Mon Dec 9 06:17:06 2019][ 1.175603] smpboot: Booting Node 1, Processors #29 OK [Mon Dec 9 06:17:06 2019][ 1.188826] smpboot: Booting Node 2, Processors #30 OK [Mon Dec 9 06:17:06 2019][ 1.202060] smpboot: Booting Node 3, Processors #31 OK [Mon Dec 9 06:17:06 2019][ 1.215284] smpboot: Booting Node 0, Processors #32 OK [Mon Dec 9 06:17:06 2019][ 1.228617] smpboot: Booting Node 1, Processors #33 OK [Mon Dec 9 06:17:06 2019][ 1.241860] smpboot: Booting Node 2, Processors #34 OK [Mon Dec 9 06:17:06 2019][ 1.255109] smpboot: Booting Node 3, Processors #35 OK [Mon Dec 9 06:17:06 2019][ 1.268334] smpboot: Booting Node 0, Processors #36 OK [Mon Dec 9 06:17:06 2019][ 1.281561] smpboot: Booting Node 1, Processors #37 OK [Mon Dec 9 06:17:06 2019][ 1.294892] smpboot: Booting Node 2, Processors #38 OK [Mon Dec 9 06:17:06 2019][ 1.308133] smpboot: Booting Node 3, Processors #39 OK [Mon Dec 9 06:17:06 2019][ 1.321359] smpboot: Booting Node 0, Processors #40 OK [Mon Dec 9 06:17:06 2019][ 1.334690] smpboot: Booting Node 1, Processors #41 OK [Mon Dec 9 06:17:06 2019][ 1.348038] smpboot: Booting Node 2, Processors #42 OK [Mon Dec 9 06:17:06 2019][ 1.361269] smpboot: Booting Node 3, Processors #43 OK [Mon Dec 9 06:17:06 2019][ 1.374495] smpboot: Booting Node 0, Processors #44 OK [Mon Dec 9 06:17:06 2019][ 1.387731] 
smpboot: Booting Node 1, Processors #45 OK [Mon Dec 9 06:17:06 2019][ 1.400965] smpboot: Booting Node 2, Processors #46 OK [Mon Dec 9 06:17:06 2019][ 1.414198] smpboot: Booting Node 3, Processors #47 [Mon Dec 9 06:17:06 2019][ 1.426902] Brought up 48 CPUs [Mon Dec 9 06:17:06 2019][ 1.430165] smpboot: Max logical packages: 3 [Mon Dec 9 06:17:06 2019][ 1.434442] smpboot: Total of 48 processors activated (191635.48 BogoMIPS) [Mon Dec 9 06:17:06 2019][ 1.725902] node 0 initialised, 15458277 pages in 278ms [Mon Dec 9 06:17:06 2019][ 1.731707] node 2 initialised, 15989367 pages in 280ms [Mon Dec 9 06:17:06 2019][ 1.731964] node 1 initialised, 15989367 pages in 280ms [Mon Dec 9 06:17:06 2019][ 1.731979] node 3 initialised, 15989247 pages in 279ms [Mon Dec 9 06:17:06 2019][ 1.747929] devtmpfs: initialized [Mon Dec 9 06:17:06 2019][ 1.773779] EVM: security.selinux [Mon Dec 9 06:17:06 2019][ 1.777101] EVM: security.ima [Mon Dec 9 06:17:06 2019][ 1.780073] EVM: security.capability [Mon Dec 9 06:17:07 2019][ 1.783749] PM: Registering ACPI NVS region [mem 0x0008f000-0x0008ffff] (4096 bytes) [Mon Dec 9 06:17:07 2019][ 1.791491] PM: Registering ACPI NVS region [mem 0x6efcf000-0x6fdfefff] (14876672 bytes) [Mon Dec 9 06:17:07 2019][ 1.801105] atomic64 test passed for x86-64 platform with CX8 and with SSE [Mon Dec 9 06:17:07 2019][ 1.807981] pinctrl core: initialized pinctrl subsystem [Mon Dec 9 06:17:07 2019][ 1.813308] RTC time: 14:17:06, date: 12/09/19 [Mon Dec 9 06:17:07 2019][ 1.817906] NET: Registered protocol family 16 [Mon Dec 9 06:17:07 2019][ 1.822697] ACPI FADT declares the system doesn't support PCIe ASPM, so disable it [Mon Dec 9 06:17:07 2019][ 1.830263] ACPI: bus type PCI registered [Mon Dec 9 06:17:07 2019][ 1.834276] acpiphp: ACPI Hot Plug PCI Controller Driver version: 0.5 [Mon Dec 9 06:17:07 2019][ 1.840854] PCI: MMCONFIG for domain 0000 [bus 00-ff] at [mem 0x80000000-0x8fffffff] (base 0x80000000) [Mon Dec 9 06:17:07 2019][ 1.850155] PCI: MMCONFIG at [mem 0x80000000-0x8fffffff] reserved in E820 [Mon Dec 9 06:17:07 2019][ 1.856947] PCI: Using configuration type 1 for base access [Mon Dec 9 06:17:07 2019][ 1.862529] PCI: Dell System detected, enabling pci=bfsort. [Mon Dec 9 06:17:07 2019][ 1.877653] ACPI: Added _OSI(Module Device) [Mon Dec 9 06:17:07 2019][ 1.881842] ACPI: Added _OSI(Processor Device) [Mon Dec 9 06:17:07 2019][ 1.886284] ACPI: Added _OSI(3.0 _SCP Extensions) [Mon Dec 9 06:17:07 2019][ 1.890992] ACPI: Added _OSI(Processor Aggregator Device) [Mon Dec 9 06:17:07 2019][ 1.896391] ACPI: Added _OSI(Linux-Dell-Video) [Mon Dec 9 06:17:07 2019][ 1.902621] ACPI: Executed 2 blocks of module-level executable AML code [Mon Dec 9 06:17:07 2019][ 1.914664] ACPI: Interpreter enabled [Mon Dec 9 06:17:07 2019][ 1.918342] ACPI: (supports S0 S5) [Mon Dec 9 06:17:07 2019][ 1.921750] ACPI: Using IOAPIC for interrupt routing [Mon Dec 9 06:17:07 2019][ 1.926928] HEST: Table parsing has been initialized. 
[Mon Dec 9 06:17:07 2019][ 1.931988] PCI: Using host bridge windows from ACPI; if necessary, use "pci=nocrs" and report a bug [Mon Dec 9 06:17:07 2019][ 1.941136] ACPI: Enabled 1 GPEs in block 00 to 1F [Mon Dec 9 06:17:07 2019][ 1.952803] ACPI: PCI Interrupt Link [LNKA] (IRQs 4 5 7 10 11 14 15) *0 [Mon Dec 9 06:17:07 2019][ 1.959713] ACPI: PCI Interrupt Link [LNKB] (IRQs 4 5 7 10 11 14 15) *0 [Mon Dec 9 06:17:07 2019][ 1.966620] ACPI: PCI Interrupt Link [LNKC] (IRQs 4 5 7 10 11 14 15) *0 [Mon Dec 9 06:17:07 2019][ 1.973524] ACPI: PCI Interrupt Link [LNKD] (IRQs 4 5 7 10 11 14 15) *0 [Mon Dec 9 06:17:07 2019][ 1.980433] ACPI: PCI Interrupt Link [LNKE] (IRQs 4 5 7 10 11 14 15) *0 [Mon Dec 9 06:17:07 2019][ 1.987340] ACPI: PCI Interrupt Link [LNKF] (IRQs 4 5 7 10 11 14 15) *0 [Mon Dec 9 06:17:07 2019][ 1.994247] ACPI: PCI Interrupt Link [LNKG] (IRQs 4 5 7 10 11 14 15) *0 [Mon Dec 9 06:17:07 2019][ 2.001153] ACPI: PCI Interrupt Link [LNKH] (IRQs 4 5 7 10 11 14 15) *0 [Mon Dec 9 06:17:07 2019][ 2.008204] ACPI: PCI Root Bridge [PC00] (domain 0000 [bus 00-3f]) [Mon Dec 9 06:17:07 2019][ 2.014386] acpi PNP0A08:00: _OSC: OS supports [ExtendedConfig ASPM ClockPM Segments MSI] [Mon Dec 9 06:17:07 2019][ 2.022603] acpi PNP0A08:00: PCIe AER handled by firmware [Mon Dec 9 06:17:07 2019][ 2.028047] acpi PNP0A08:00: _OSC: platform does not support [SHPCHotplug] [Mon Dec 9 06:17:07 2019][ 2.034993] acpi PNP0A08:00: _OSC: OS now controls [PCIeHotplug PME PCIeCapability] [Mon Dec 9 06:17:07 2019][ 2.042646] acpi PNP0A08:00: FADT indicates ASPM is unsupported, using BIOS configuration [Mon Dec 9 06:17:07 2019][ 2.051101] PCI host bridge to bus 0000:00 [Mon Dec 9 06:17:07 2019][ 2.055205] pci_bus 0000:00: root bus resource [io 0x0000-0x03af window] [Mon Dec 9 06:17:07 2019][ 2.061990] pci_bus 0000:00: root bus resource [io 0x03e0-0x0cf7 window] [Mon Dec 9 06:17:07 2019][ 2.068777] pci_bus 0000:00: root bus resource [mem 0x000c0000-0x000c3fff window] [Mon Dec 9 06:17:07 2019][ 2.076255] pci_bus 0000:00: root bus resource [mem 0x000c4000-0x000c7fff window] [Mon Dec 9 06:17:07 2019][ 2.083734] pci_bus 0000:00: root bus resource [mem 0x000c8000-0x000cbfff window] [Mon Dec 9 06:17:07 2019][ 2.091213] pci_bus 0000:00: root bus resource [mem 0x000cc000-0x000cffff window] [Mon Dec 9 06:17:07 2019][ 2.098694] pci_bus 0000:00: root bus resource [mem 0x000d0000-0x000d3fff window] [Mon Dec 9 06:17:07 2019][ 2.106171] pci_bus 0000:00: root bus resource [mem 0x000d4000-0x000d7fff window] [Mon Dec 9 06:17:07 2019][ 2.113653] pci_bus 0000:00: root bus resource [mem 0x000d8000-0x000dbfff window] [Mon Dec 9 06:17:07 2019][ 2.121131] pci_bus 0000:00: root bus resource [mem 0x000dc000-0x000dffff window] [Mon Dec 9 06:17:07 2019][ 2.128611] pci_bus 0000:00: root bus resource [mem 0x000e0000-0x000e3fff window] [Mon Dec 9 06:17:07 2019][ 2.136091] pci_bus 0000:00: root bus resource [mem 0x000e4000-0x000e7fff window] [Mon Dec 9 06:17:07 2019][ 2.143571] pci_bus 0000:00: root bus resource [mem 0x000e8000-0x000ebfff window] [Mon Dec 9 06:17:07 2019][ 2.151051] pci_bus 0000:00: root bus resource [mem 0x000ec000-0x000effff window] [Mon Dec 9 06:17:07 2019][ 2.158529] pci_bus 0000:00: root bus resource [mem 0x000f0000-0x000fffff window] [Mon Dec 9 06:17:07 2019][ 2.166009] pci_bus 0000:00: root bus resource [io 0x0d00-0x3fff window] [Mon Dec 9 06:17:07 2019][ 2.172796] pci_bus 0000:00: root bus resource [mem 0xe1000000-0xfebfffff window] [Mon Dec 9 06:17:07 2019][ 2.180274] pci_bus 0000:00: root bus resource [mem 
0x10000000000-0x2bf3fffffff window] [Mon Dec 9 06:17:07 2019][ 2.188276] pci_bus 0000:00: root bus resource [bus 00-3f] [Mon Dec 9 06:17:07 2019][ 2.200937] pci 0000:00:03.1: PCI bridge to [bus 01] [Mon Dec 9 06:17:07 2019][ 2.206321] pci 0000:00:07.1: PCI bridge to [bus 02] [Mon Dec 9 06:17:07 2019][ 2.212092] pci 0000:00:08.1: PCI bridge to [bus 03] [Mon Dec 9 06:17:07 2019][ 2.217459] ACPI: PCI Root Bridge [PC01] (domain 0000 [bus 40-7f]) [Mon Dec 9 06:17:07 2019][ 2.223637] acpi PNP0A08:01: _OSC: OS supports [ExtendedConfig ASPM ClockPM Segments MSI] [Mon Dec 9 06:17:07 2019][ 2.231847] acpi PNP0A08:01: PCIe AER handled by firmware [Mon Dec 9 06:17:07 2019][ 2.237292] acpi PNP0A08:01: _OSC: platform does not support [SHPCHotplug] [Mon Dec 9 06:17:07 2019][ 2.244238] acpi PNP0A08:01: _OSC: OS now controls [PCIeHotplug PME PCIeCapability] [Mon Dec 9 06:17:07 2019][ 2.251889] acpi PNP0A08:01: FADT indicates ASPM is unsupported, using BIOS configuration [Mon Dec 9 06:17:07 2019][ 2.260311] PCI host bridge to bus 0000:40 [Mon Dec 9 06:17:07 2019][ 2.264413] pci_bus 0000:40: root bus resource [io 0x4000-0x7fff window] [Mon Dec 9 06:17:07 2019][ 2.271198] pci_bus 0000:40: root bus resource [mem 0xc6000000-0xe0ffffff window] [Mon Dec 9 06:17:07 2019][ 2.278679] pci_bus 0000:40: root bus resource [mem 0x2bf40000000-0x47e7fffffff window] [Mon Dec 9 06:17:07 2019][ 2.286679] pci_bus 0000:40: root bus resource [bus 40-7f] [Mon Dec 9 06:17:07 2019][ 2.294569] pci 0000:40:07.1: PCI bridge to [bus 41] [Mon Dec 9 06:17:07 2019][ 2.299887] pci 0000:40:08.1: PCI bridge to [bus 42] [Mon Dec 9 06:17:07 2019][ 2.305046] ACPI: PCI Root Bridge [PC02] (domain 0000 [bus 80-bf]) [Mon Dec 9 06:17:07 2019][ 2.311227] acpi PNP0A08:02: _OSC: OS supports [ExtendedConfig ASPM ClockPM Segments MSI] [Mon Dec 9 06:17:07 2019][ 2.319434] acpi PNP0A08:02: PCIe AER handled by firmware [Mon Dec 9 06:17:07 2019][ 2.324878] acpi PNP0A08:02: _OSC: platform does not support [SHPCHotplug] [Mon Dec 9 06:17:07 2019][ 2.331826] acpi PNP0A08:02: _OSC: OS now controls [PCIeHotplug PME PCIeCapability] [Mon Dec 9 06:17:07 2019][ 2.339475] acpi PNP0A08:02: FADT indicates ASPM is unsupported, using BIOS configuration [Mon Dec 9 06:17:07 2019][ 2.347918] PCI host bridge to bus 0000:80 [Mon Dec 9 06:17:07 2019][ 2.352018] pci_bus 0000:80: root bus resource [io 0x03b0-0x03df window] [Mon Dec 9 06:17:07 2019][ 2.358804] pci_bus 0000:80: root bus resource [mem 0x000a0000-0x000bffff window] [Mon Dec 9 06:17:07 2019][ 2.366284] pci_bus 0000:80: root bus resource [io 0x8000-0xbfff window] [Mon Dec 9 06:17:07 2019][ 2.373070] pci_bus 0000:80: root bus resource [mem 0xab000000-0xc5ffffff window] [Mon Dec 9 06:17:07 2019][ 2.380549] pci_bus 0000:80: root bus resource [mem 0x47e80000000-0x63dbfffffff window] [Mon Dec 9 06:17:07 2019][ 2.388549] pci_bus 0000:80: root bus resource [bus 80-bf] [Mon Dec 9 06:17:07 2019][ 2.397569] pci 0000:80:01.1: PCI bridge to [bus 81] [Mon Dec 9 06:17:07 2019][ 2.405307] pci 0000:80:01.2: PCI bridge to [bus 82-83] [Mon Dec 9 06:17:07 2019][ 2.410776] pci 0000:82:00.0: PCI bridge to [bus 83] [Mon Dec 9 06:17:07 2019][ 2.418306] pci 0000:80:03.1: PCI bridge to [bus 84] [Mon Dec 9 06:17:07 2019][ 2.423595] pci 0000:80:07.1: PCI bridge to [bus 85] [Mon Dec 9 06:17:07 2019][ 2.429315] pci 0000:80:08.1: PCI bridge to [bus 86] [Mon Dec 9 06:17:07 2019][ 2.434488] ACPI: PCI Root Bridge [PC03] (domain 0000 [bus c0-ff]) [Mon Dec 9 06:17:07 2019][ 2.440674] acpi PNP0A08:03: _OSC: OS supports [ExtendedConfig ASPM ClockPM 
Segments MSI] [Mon Dec 9 06:17:07 2019][ 2.448881] acpi PNP0A08:03: PCIe AER handled by firmware [Mon Dec 9 06:17:07 2019][ 2.454318] acpi PNP0A08:03: _OSC: platform does not support [SHPCHotplug] [Mon Dec 9 06:17:07 2019][ 2.461265] acpi PNP0A08:03: _OSC: OS now controls [PCIeHotplug PME PCIeCapability] [Mon Dec 9 06:17:07 2019][ 2.468918] acpi PNP0A08:03: FADT indicates ASPM is unsupported, using BIOS configuration [Mon Dec 9 06:17:07 2019][ 2.477248] acpi PNP0A08:03: host bridge window [mem 0x63dc0000000-0xffffffffffff window] ([0x80000000000-0xffffffffffff] ignored, not CPU addressable) [Mon Dec 9 06:17:07 2019][ 2.490882] PCI host bridge to bus 0000:c0 [Mon Dec 9 06:17:07 2019][ 2.494988] pci_bus 0000:c0: root bus resource [io 0xc000-0xffff window] [Mon Dec 9 06:17:07 2019][ 2.501772] pci_bus 0000:c0: root bus resource [mem 0x90000000-0xaaffffff window] [Mon Dec 9 06:17:07 2019][ 2.509254] pci_bus 0000:c0: root bus resource [mem 0x63dc0000000-0x7ffffffffff window] [Mon Dec 9 06:17:07 2019][ 2.517253] pci_bus 0000:c0: root bus resource [bus c0-ff] [Mon Dec 9 06:17:07 2019][ 2.524746] pci 0000:c0:01.1: PCI bridge to [bus c1] [Mon Dec 9 06:17:07 2019][ 2.530403] pci 0000:c0:07.1: PCI bridge to [bus c2] [Mon Dec 9 06:17:07 2019][ 2.535711] pci 0000:c0:08.1: PCI bridge to [bus c3] [Mon Dec 9 06:17:07 2019][ 2.542817] vgaarb: device added: PCI:0000:83:00.0,decodes=io+mem,owns=io+mem,locks=none [Mon Dec 9 06:17:07 2019][ 2.550909] vgaarb: loaded [Mon Dec 9 06:17:07 2019][ 2.553616] vgaarb: bridge control possible 0000:83:00.0 [Mon Dec 9 06:17:07 2019][ 2.559039] SCSI subsystem initialized [Mon Dec 9 06:17:07 2019][ 2.562823] ACPI: bus type USB registered [Mon Dec 9 06:17:07 2019][ 2.566850] usbcore: registered new interface driver usbfs [Mon Dec 9 06:17:07 2019][ 2.572343] usbcore: registered new interface driver hub [Mon Dec 9 06:17:07 2019][ 2.577857] usbcore: registered new device driver usb [Mon Dec 9 06:17:07 2019][ 2.583224] EDAC MC: Ver: 3.0.0 [Mon Dec 9 06:17:07 2019][ 2.586619] PCI: Using ACPI for IRQ routing [Mon Dec 9 06:17:07 2019][ 2.610186] NetLabel: Initializing [Mon Dec 9 06:17:07 2019][ 2.613594] NetLabel: domain hash size = 128 [Mon Dec 9 06:17:07 2019][ 2.617952] NetLabel: protocols = UNLABELED CIPSOv4 [Mon Dec 9 06:17:07 2019][ 2.622933] NetLabel: unlabeled traffic allowed by default [Mon Dec 9 06:17:07 2019][ 2.628701] hpet0: at MMIO 0xfed00000, IRQs 2, 8, 0 [Mon Dec 9 06:17:07 2019][ 2.633681] hpet0: 3 comparators, 32-bit 14.318180 MHz counter [Mon Dec 9 06:17:07 2019][ 2.641701] Switched to clocksource hpet [Mon Dec 9 06:17:07 2019][ 2.650499] pnp: PnP ACPI init [Mon Dec 9 06:17:07 2019][ 2.653591] ACPI: bus type PNP registered [Mon Dec 9 06:17:07 2019][ 2.657839] system 00:00: [mem 0x80000000-0x8fffffff] has been reserved [Mon Dec 9 06:17:07 2019][ 2.665065] pnp: PnP ACPI: found 4 devices [Mon Dec 9 06:17:07 2019][ 2.669174] ACPI: bus type PNP unregistered [Mon Dec 9 06:17:07 2019][ 2.680691] pci 0000:01:00.0: can't claim BAR 6 [mem 0xfff00000-0xffffffff pref]: no compatible bridge window [Mon Dec 9 06:17:07 2019][ 2.690611] pci 0000:81:00.0: can't claim BAR 6 [mem 0xfffc0000-0xffffffff pref]: no compatible bridge window [Mon Dec 9 06:17:07 2019][ 2.700530] pci 0000:81:00.1: can't claim BAR 6 [mem 0xfffc0000-0xffffffff pref]: no compatible bridge window [Mon Dec 9 06:17:07 2019][ 2.710447] pci 0000:84:00.0: can't claim BAR 6 [mem 0xfffc0000-0xffffffff pref]: no compatible bridge window [Mon Dec 9 06:17:07 2019][ 2.720362] pci 0000:c1:00.0: can't claim BAR 6 [mem 
0xfff00000-0xffffffff pref]: no compatible bridge window [Mon Dec 9 06:17:07 2019][ 2.730301] pci 0000:00:03.1: BAR 14: assigned [mem 0xe1000000-0xe10fffff] [Mon Dec 9 06:17:07 2019][ 2.737184] pci 0000:01:00.0: BAR 6: assigned [mem 0xe1000000-0xe10fffff pref] [Mon Dec 9 06:17:07 2019][ 2.744411] pci 0000:00:03.1: PCI bridge to [bus 01] [Mon Dec 9 06:17:07 2019][ 2.749389] pci 0000:00:03.1: bridge window [mem 0xe1000000-0xe10fffff] [Mon Dec 9 06:17:07 2019][ 2.756182] pci 0000:00:03.1: bridge window [mem 0xe2000000-0xe3ffffff 64bit pref] [Mon Dec 9 06:17:07 2019][ 2.763931] pci 0000:00:07.1: PCI bridge to [bus 02] [Mon Dec 9 06:17:07 2019][ 2.768908] pci 0000:00:07.1: bridge window [mem 0xf7200000-0xf74fffff] [Mon Dec 9 06:17:07 2019][ 2.775711] pci 0000:00:08.1: PCI bridge to [bus 03] [Mon Dec 9 06:17:07 2019][ 2.780691] pci 0000:00:08.1: bridge window [mem 0xf7000000-0xf71fffff] [Mon Dec 9 06:17:08 2019][ 2.787541] pci 0000:40:07.1: PCI bridge to [bus 41] [Mon Dec 9 06:17:08 2019][ 2.792514] pci 0000:40:07.1: bridge window [mem 0xdb200000-0xdb4fffff] [Mon Dec 9 06:17:08 2019][ 2.799310] pci 0000:40:08.1: PCI bridge to [bus 42] [Mon Dec 9 06:17:08 2019][ 2.804282] pci 0000:40:08.1: bridge window [mem 0xdb000000-0xdb1fffff] [Mon Dec 9 06:17:08 2019][ 2.811119] pci 0000:80:01.1: BAR 14: assigned [mem 0xac300000-0xac3fffff] [Mon Dec 9 06:17:08 2019][ 2.818002] pci 0000:81:00.0: BAR 6: assigned [mem 0xac300000-0xac33ffff pref] [Mon Dec 9 06:17:08 2019][ 2.825230] pci 0000:81:00.1: BAR 6: assigned [mem 0xac340000-0xac37ffff pref] [Mon Dec 9 06:17:08 2019][ 2.832458] pci 0000:80:01.1: PCI bridge to [bus 81] [Mon Dec 9 06:17:08 2019][ 2.837433] pci 0000:80:01.1: bridge window [mem 0xac300000-0xac3fffff] [Mon Dec 9 06:17:08 2019][ 2.844229] pci 0000:80:01.1: bridge window [mem 0xac200000-0xac2fffff 64bit pref] [Mon Dec 9 06:17:08 2019][ 2.851977] pci 0000:82:00.0: PCI bridge to [bus 83] [Mon Dec 9 06:17:08 2019][ 2.856954] pci 0000:82:00.0: bridge window [mem 0xc0000000-0xc08fffff] [Mon Dec 9 06:17:08 2019][ 2.863748] pci 0000:82:00.0: bridge window [mem 0xab000000-0xabffffff 64bit pref] [Mon Dec 9 06:17:08 2019][ 2.871498] pci 0000:80:01.2: PCI bridge to [bus 82-83] [Mon Dec 9 06:17:08 2019][ 2.876737] pci 0000:80:01.2: bridge window [mem 0xc0000000-0xc08fffff] [Mon Dec 9 06:17:08 2019][ 2.883533] pci 0000:80:01.2: bridge window [mem 0xab000000-0xabffffff 64bit pref] [Mon Dec 9 06:17:08 2019][ 2.891282] pci 0000:84:00.0: BAR 6: no space for [mem size 0x00040000 pref] [Mon Dec 9 06:17:08 2019][ 2.898337] pci 0000:84:00.0: BAR 6: failed to assign [mem size 0x00040000 pref] [Mon Dec 9 06:17:08 2019][ 2.905736] pci 0000:80:03.1: PCI bridge to [bus 84] [Mon Dec 9 06:17:08 2019][ 2.910710] pci 0000:80:03.1: bridge window [io 0x8000-0x8fff] [Mon Dec 9 06:17:08 2019][ 2.916813] pci 0000:80:03.1: bridge window [mem 0xc0d00000-0xc0dfffff] [Mon Dec 9 06:17:08 2019][ 2.923609] pci 0000:80:03.1: bridge window [mem 0xac000000-0xac1fffff 64bit pref] [Mon Dec 9 06:17:08 2019][ 2.931357] pci 0000:80:07.1: PCI bridge to [bus 85] [Mon Dec 9 06:17:08 2019][ 2.936332] pci 0000:80:07.1: bridge window [mem 0xc0b00000-0xc0cfffff] [Mon Dec 9 06:17:08 2019][ 2.943129] pci 0000:80:08.1: PCI bridge to [bus 86] [Mon Dec 9 06:17:08 2019][ 2.948109] pci 0000:80:08.1: bridge window [mem 0xc0900000-0xc0afffff] [Mon Dec 9 06:17:08 2019][ 2.954948] pci 0000:c1:00.0: BAR 6: no space for [mem size 0x00100000 pref] [Mon Dec 9 06:17:08 2019][ 2.962001] pci 0000:c1:00.0: BAR 6: failed to assign [mem size 0x00100000 pref] [Mon Dec 9 
06:17:08 2019][ 2.969404] pci 0000:c0:01.1: PCI bridge to [bus c1] [Mon Dec 9 06:17:08 2019][ 2.974378] pci 0000:c0:01.1: bridge window [io 0xc000-0xcfff] [Mon Dec 9 06:17:08 2019][ 2.980481] pci 0000:c0:01.1: bridge window [mem 0xa5400000-0xa55fffff] [Mon Dec 9 06:17:08 2019][ 2.987277] pci 0000:c0:07.1: PCI bridge to [bus c2] [Mon Dec 9 06:17:08 2019][ 2.992249] pci 0000:c0:07.1: bridge window [mem 0xa5200000-0xa53fffff] [Mon Dec 9 06:17:08 2019][ 2.999047] pci 0000:c0:08.1: PCI bridge to [bus c3] [Mon Dec 9 06:17:08 2019][ 3.004019] pci 0000:c0:08.1: bridge window [mem 0xa5000000-0xa51fffff] [Mon Dec 9 06:17:08 2019][ 3.010915] NET: Registered protocol family 2 [Mon Dec 9 06:17:08 2019][ 3.016003] TCP established hash table entries: 524288 (order: 10, 4194304 bytes) [Mon Dec 9 06:17:08 2019][ 3.024149] TCP bind hash table entries: 65536 (order: 8, 1048576 bytes) [Mon Dec 9 06:17:08 2019][ 3.030977] TCP: Hash tables configured (established 524288 bind 65536) [Mon Dec 9 06:17:08 2019][ 3.037640] TCP: reno registered [Mon Dec 9 06:17:08 2019][ 3.040983] UDP hash table entries: 65536 (order: 9, 2097152 bytes) [Mon Dec 9 06:17:08 2019][ 3.047582] UDP-Lite hash table entries: 65536 (order: 9, 2097152 bytes) [Mon Dec 9 06:17:08 2019][ 3.054784] NET: Registered protocol family 1 [Mon Dec 9 06:17:08 2019][ 3.059718] Unpacking initramfs... [Mon Dec 9 06:17:08 2019][ 3.329591] Freeing initrd memory: 19712k freed [Mon Dec 9 06:17:08 2019][ 3.336944] AMD-Vi: IOMMU performance counters supported [Mon Dec 9 06:17:08 2019][ 3.342348] AMD-Vi: IOMMU performance counters supported [Mon Dec 9 06:17:08 2019][ 3.347709] AMD-Vi: IOMMU performance counters supported [Mon Dec 9 06:17:08 2019][ 3.353070] AMD-Vi: IOMMU performance counters supported [Mon Dec 9 06:17:08 2019][ 3.359733] iommu: Adding device 0000:00:01.0 to group 0 [Mon Dec 9 06:17:08 2019][ 3.365769] iommu: Adding device 0000:00:02.0 to group 1 [Mon Dec 9 06:17:08 2019][ 3.371784] iommu: Adding device 0000:00:03.0 to group 2 [Mon Dec 9 06:17:08 2019][ 3.377877] iommu: Adding device 0000:00:03.1 to group 3 [Mon Dec 9 06:17:08 2019][ 3.383905] iommu: Adding device 0000:00:04.0 to group 4 [Mon Dec 9 06:17:08 2019][ 3.389926] iommu: Adding device 0000:00:07.0 to group 5 [Mon Dec 9 06:17:08 2019][ 3.395929] iommu: Adding device 0000:00:07.1 to group 6 [Mon Dec 9 06:17:08 2019][ 3.401936] iommu: Adding device 0000:00:08.0 to group 7 [Mon Dec 9 06:17:08 2019][ 3.407929] iommu: Adding device 0000:00:08.1 to group 8 [Mon Dec 9 06:17:08 2019][ 3.413919] iommu: Adding device 0000:00:14.0 to group 9 [Mon Dec 9 06:17:08 2019][ 3.419257] iommu: Adding device 0000:00:14.3 to group 9 [Mon Dec 9 06:17:08 2019][ 3.425367] iommu: Adding device 0000:00:18.0 to group 10 [Mon Dec 9 06:17:08 2019][ 3.430794] iommu: Adding device 0000:00:18.1 to group 10 [Mon Dec 9 06:17:08 2019][ 3.436218] iommu: Adding device 0000:00:18.2 to group 10 [Mon Dec 9 06:17:08 2019][ 3.441643] iommu: Adding device 0000:00:18.3 to group 10 [Mon Dec 9 06:17:08 2019][ 3.447067] iommu: Adding device 0000:00:18.4 to group 10 [Mon Dec 9 06:17:08 2019][ 3.452494] iommu: Adding device 0000:00:18.5 to group 10 [Mon Dec 9 06:17:08 2019][ 3.457920] iommu: Adding device 0000:00:18.6 to group 10 [Mon Dec 9 06:17:08 2019][ 3.463350] iommu: Adding device 0000:00:18.7 to group 10 [Mon Dec 9 06:17:08 2019][ 3.469514] iommu: Adding device 0000:00:19.0 to group 11 [Mon Dec 9 06:17:08 2019][ 3.474946] iommu: Adding device 0000:00:19.1 to group 11 [Mon Dec 9 06:17:08 2019][ 3.480366] iommu: Adding device 
0000:00:19.2 to group 11 [Mon Dec 9 06:17:08 2019][ 3.485791] iommu: Adding device 0000:00:19.3 to group 11 [Mon Dec 9 06:17:08 2019][ 3.491220] iommu: Adding device 0000:00:19.4 to group 11 [Mon Dec 9 06:17:08 2019][ 3.496644] iommu: Adding device 0000:00:19.5 to group 11 [Mon Dec 9 06:17:08 2019][ 3.502076] iommu: Adding device 0000:00:19.6 to group 11 [Mon Dec 9 06:17:08 2019][ 3.507502] iommu: Adding device 0000:00:19.7 to group 11 [Mon Dec 9 06:17:08 2019][ 3.513666] iommu: Adding device 0000:00:1a.0 to group 12 [Mon Dec 9 06:17:08 2019][ 3.519089] iommu: Adding device 0000:00:1a.1 to group 12 [Mon Dec 9 06:17:08 2019][ 3.524517] iommu: Adding device 0000:00:1a.2 to group 12 [Mon Dec 9 06:17:08 2019][ 3.529944] iommu: Adding device 0000:00:1a.3 to group 12 [Mon Dec 9 06:17:08 2019][ 3.535369] iommu: Adding device 0000:00:1a.4 to group 12 [Mon Dec 9 06:17:08 2019][ 3.540795] iommu: Adding device 0000:00:1a.5 to group 12 [Mon Dec 9 06:17:08 2019][ 3.546219] iommu: Adding device 0000:00:1a.6 to group 12 [Mon Dec 9 06:17:08 2019][ 3.551647] iommu: Adding device 0000:00:1a.7 to group 12 [Mon Dec 9 06:17:08 2019][ 3.557837] iommu: Adding device 0000:00:1b.0 to group 13 [Mon Dec 9 06:17:08 2019][ 3.563271] iommu: Adding device 0000:00:1b.1 to group 13 [Mon Dec 9 06:17:08 2019][ 3.568706] iommu: Adding device 0000:00:1b.2 to group 13 [Mon Dec 9 06:17:08 2019][ 3.574138] iommu: Adding device 0000:00:1b.3 to group 13 [Mon Dec 9 06:17:08 2019][ 3.579560] iommu: Adding device 0000:00:1b.4 to group 13 [Mon Dec 9 06:17:08 2019][ 3.584986] iommu: Adding device 0000:00:1b.5 to group 13 [Mon Dec 9 06:17:08 2019][ 3.590415] iommu: Adding device 0000:00:1b.6 to group 13 [Mon Dec 9 06:17:08 2019][ 3.595841] iommu: Adding device 0000:00:1b.7 to group 13 [Mon Dec 9 06:17:08 2019][ 3.601985] iommu: Adding device 0000:01:00.0 to group 14 [Mon Dec 9 06:17:08 2019][ 3.608074] iommu: Adding device 0000:02:00.0 to group 15 [Mon Dec 9 06:17:08 2019][ 3.614213] iommu: Adding device 0000:02:00.2 to group 16 [Mon Dec 9 06:17:08 2019][ 3.620289] iommu: Adding device 0000:02:00.3 to group 17 [Mon Dec 9 06:17:08 2019][ 3.626367] iommu: Adding device 0000:03:00.0 to group 18 [Mon Dec 9 06:17:08 2019][ 3.632463] iommu: Adding device 0000:03:00.1 to group 19 [Mon Dec 9 06:17:08 2019][ 3.638555] iommu: Adding device 0000:40:01.0 to group 20 [Mon Dec 9 06:17:08 2019][ 3.644605] iommu: Adding device 0000:40:02.0 to group 21 [Mon Dec 9 06:17:08 2019][ 3.650721] iommu: Adding device 0000:40:03.0 to group 22 [Mon Dec 9 06:17:08 2019][ 3.656830] iommu: Adding device 0000:40:04.0 to group 23 [Mon Dec 9 06:17:08 2019][ 3.662957] iommu: Adding device 0000:40:07.0 to group 24 [Mon Dec 9 06:17:08 2019][ 3.668968] iommu: Adding device 0000:40:07.1 to group 25 [Mon Dec 9 06:17:08 2019][ 3.675007] iommu: Adding device 0000:40:08.0 to group 26 [Mon Dec 9 06:17:08 2019][ 3.681028] iommu: Adding device 0000:40:08.1 to group 27 [Mon Dec 9 06:17:08 2019][ 3.687086] iommu: Adding device 0000:41:00.0 to group 28 [Mon Dec 9 06:17:08 2019][ 3.693146] iommu: Adding device 0000:41:00.2 to group 29 [Mon Dec 9 06:17:08 2019][ 3.699180] iommu: Adding device 0000:41:00.3 to group 30 [Mon Dec 9 06:17:08 2019][ 3.705263] iommu: Adding device 0000:42:00.0 to group 31 [Mon Dec 9 06:17:08 2019][ 3.711315] iommu: Adding device 0000:42:00.1 to group 32 [Mon Dec 9 06:17:08 2019][ 3.717362] iommu: Adding device 0000:80:01.0 to group 33 [Mon Dec 9 06:17:08 2019][ 3.723370] iommu: Adding device 0000:80:01.1 to group 34 [Mon Dec 9 06:17:08 2019][ 3.729543] iommu: 
Adding device 0000:80:01.2 to group 35 [Mon Dec 9 06:17:08 2019][ 3.735555] iommu: Adding device 0000:80:02.0 to group 36 [Mon Dec 9 06:17:08 2019][ 3.741617] iommu: Adding device 0000:80:03.0 to group 37 [Mon Dec 9 06:17:08 2019][ 3.747635] iommu: Adding device 0000:80:03.1 to group 38 [Mon Dec 9 06:17:08 2019][ 3.753682] iommu: Adding device 0000:80:04.0 to group 39 [Mon Dec 9 06:17:08 2019][ 3.759718] iommu: Adding device 0000:80:07.0 to group 40 [Mon Dec 9 06:17:08 2019][ 3.765734] iommu: Adding device 0000:80:07.1 to group 41 [Mon Dec 9 06:17:08 2019][ 3.771799] iommu: Adding device 0000:80:08.0 to group 42 [Mon Dec 9 06:17:08 2019][ 3.777819] iommu: Adding device 0000:80:08.1 to group 43 [Mon Dec 9 06:17:08 2019][ 3.783868] iommu: Adding device 0000:81:00.0 to group 44 [Mon Dec 9 06:17:09 2019][ 3.789313] iommu: Adding device 0000:81:00.1 to group 44 [Mon Dec 9 06:17:09 2019][ 3.795370] iommu: Adding device 0000:82:00.0 to group 45 [Mon Dec 9 06:17:09 2019][ 3.800792] iommu: Adding device 0000:83:00.0 to group 45 [Mon Dec 9 06:17:09 2019][ 3.806798] iommu: Adding device 0000:84:00.0 to group 46 [Mon Dec 9 06:17:09 2019][ 3.812810] iommu: Adding device 0000:85:00.0 to group 47 [Mon Dec 9 06:17:09 2019][ 3.818840] iommu: Adding device 0000:85:00.2 to group 48 [Mon Dec 9 06:17:09 2019][ 3.824857] iommu: Adding device 0000:86:00.0 to group 49 [Mon Dec 9 06:17:09 2019][ 3.830881] iommu: Adding device 0000:86:00.1 to group 50 [Mon Dec 9 06:17:09 2019][ 3.836908] iommu: Adding device 0000:86:00.2 to group 51 [Mon Dec 9 06:17:09 2019][ 3.842960] iommu: Adding device 0000:c0:01.0 to group 52 [Mon Dec 9 06:17:09 2019][ 3.849012] iommu: Adding device 0000:c0:01.1 to group 53 [Mon Dec 9 06:17:09 2019][ 3.855075] iommu: Adding device 0000:c0:02.0 to group 54 [Mon Dec 9 06:17:09 2019][ 3.861148] iommu: Adding device 0000:c0:03.0 to group 55 [Mon Dec 9 06:17:09 2019][ 3.867227] iommu: Adding device 0000:c0:04.0 to group 56 [Mon Dec 9 06:17:09 2019][ 3.873264] iommu: Adding device 0000:c0:07.0 to group 57 [Mon Dec 9 06:17:09 2019][ 3.879291] iommu: Adding device 0000:c0:07.1 to group 58 [Mon Dec 9 06:17:09 2019][ 3.885338] iommu: Adding device 0000:c0:08.0 to group 59 [Mon Dec 9 06:17:09 2019][ 3.891360] iommu: Adding device 0000:c0:08.1 to group 60 [Mon Dec 9 06:17:09 2019][ 3.899761] iommu: Adding device 0000:c1:00.0 to group 61 [Mon Dec 9 06:17:09 2019][ 3.905789] iommu: Adding device 0000:c2:00.0 to group 62 [Mon Dec 9 06:17:09 2019][ 3.911848] iommu: Adding device 0000:c2:00.2 to group 63 [Mon Dec 9 06:17:09 2019][ 3.917927] iommu: Adding device 0000:c3:00.0 to group 64 [Mon Dec 9 06:17:09 2019][ 3.924009] iommu: Adding device 0000:c3:00.1 to group 65 [Mon Dec 9 06:17:09 2019][ 3.929651] AMD-Vi: Found IOMMU at 0000:00:00.2 cap 0x40 [Mon Dec 9 06:17:09 2019][ 3.934971] AMD-Vi: Extended features (0xf77ef22294ada): [Mon Dec 9 06:17:09 2019][ 3.940292] PPR NX GT IA GA PC GA_vAPIC [Mon Dec 9 06:17:09 2019][ 3.944425] AMD-Vi: Found IOMMU at 0000:40:00.2 cap 0x40 [Mon Dec 9 06:17:09 2019][ 3.949749] AMD-Vi: Extended features (0xf77ef22294ada): [Mon Dec 9 06:17:09 2019][ 3.955071] PPR NX GT IA GA PC GA_vAPIC [Mon Dec 9 06:17:09 2019][ 3.959204] AMD-Vi: Found IOMMU at 0000:80:00.2 cap 0x40 [Mon Dec 9 06:17:09 2019][ 3.964524] AMD-Vi: Extended features (0xf77ef22294ada): [Mon Dec 9 06:17:09 2019][ 3.969847] PPR NX GT IA GA PC GA_vAPIC [Mon Dec 9 06:17:09 2019][ 3.973988] AMD-Vi: Found IOMMU at 0000:c0:00.2 cap 0x40 [Mon Dec 9 06:17:09 2019][ 3.979312] AMD-Vi: Extended features (0xf77ef22294ada): [Mon Dec 
9 06:17:09 2019][ 3.984632] PPR NX GT IA GA PC GA_vAPIC [Mon Dec 9 06:17:09 2019][ 3.988774] AMD-Vi: Interrupt remapping enabled [Mon Dec 9 06:17:09 2019][ 3.993315] AMD-Vi: virtual APIC enabled [Mon Dec 9 06:17:09 2019][ 3.997653] AMD-Vi: Lazy IO/TLB flushing enabled [Mon Dec 9 06:17:09 2019][ 4.003999] perf: AMD NB counters detected [Mon Dec 9 06:17:09 2019][ 4.008151] perf: AMD LLC counters detected [Mon Dec 9 06:17:09 2019][ 4.018402] sha1_ssse3: Using SHA-NI optimized SHA-1 implementation [Mon Dec 9 06:17:09 2019][ 4.024780] sha256_ssse3: Using SHA-256-NI optimized SHA-256 implementation [Mon Dec 9 06:17:09 2019][ 4.033366] futex hash table entries: 32768 (order: 9, 2097152 bytes) [Mon Dec 9 06:17:09 2019][ 4.040000] Initialise system trusted keyring [Mon Dec 9 06:17:09 2019][ 4.044403] audit: initializing netlink socket (disabled) [Mon Dec 9 06:17:09 2019][ 4.049824] type=2000 audit(1575901024.202:1): initialized [Mon Dec 9 06:17:09 2019][ 4.080727] HugeTLB registered 1 GB page size, pre-allocated 0 pages [Mon Dec 9 06:17:09 2019][ 4.087091] HugeTLB registered 2 MB page size, pre-allocated 0 pages [Mon Dec 9 06:17:09 2019][ 4.094745] zpool: loaded [Mon Dec 9 06:17:09 2019][ 4.097380] zbud: loaded [Mon Dec 9 06:17:09 2019][ 4.100291] VFS: Disk quotas dquot_6.6.0 [Mon Dec 9 06:17:09 2019][ 4.104320] Dquot-cache hash table entries: 512 (order 0, 4096 bytes) [Mon Dec 9 06:17:09 2019][ 4.111124] msgmni has been set to 32768 [Mon Dec 9 06:17:09 2019][ 4.115148] Key type big_key registered [Mon Dec 9 06:17:09 2019][ 4.121386] NET: Registered protocol family 38 [Mon Dec 9 06:17:09 2019][ 4.125848] Key type asymmetric registered [Mon Dec 9 06:17:09 2019][ 4.129960] Asymmetric key parser 'x509' registered [Mon Dec 9 06:17:09 2019][ 4.134897] Block layer SCSI generic (bsg) driver version 0.4 loaded (major 248) [Mon Dec 9 06:17:09 2019][ 4.142452] io scheduler noop registered [Mon Dec 9 06:17:09 2019][ 4.146390] io scheduler deadline registered (default) [Mon Dec 9 06:17:09 2019][ 4.151575] io scheduler cfq registered [Mon Dec 9 06:17:09 2019][ 4.155421] io scheduler mq-deadline registered [Mon Dec 9 06:17:09 2019][ 4.159963] io scheduler kyber registered [Mon Dec 9 06:17:09 2019][ 4.170680] pcieport 0000:00:03.1: Signaling PME through PCIe PME interrupt [Mon Dec 9 06:17:09 2019][ 4.177657] pci 0000:01:00.0: Signaling PME through PCIe PME interrupt [Mon Dec 9 06:17:09 2019][ 4.184205] pcieport 0000:00:07.1: Signaling PME through PCIe PME interrupt [Mon Dec 9 06:17:09 2019][ 4.191175] pci 0000:02:00.0: Signaling PME through PCIe PME interrupt [Mon Dec 9 06:17:09 2019][ 4.197718] pci 0000:02:00.2: Signaling PME through PCIe PME interrupt [Mon Dec 9 06:17:09 2019][ 4.204251] pci 0000:02:00.3: Signaling PME through PCIe PME interrupt [Mon Dec 9 06:17:09 2019][ 4.210798] pcieport 0000:00:08.1: Signaling PME through PCIe PME interrupt [Mon Dec 9 06:17:09 2019][ 4.217765] pci 0000:03:00.0: Signaling PME through PCIe PME interrupt [Mon Dec 9 06:17:09 2019][ 4.224300] pci 0000:03:00.1: Signaling PME through PCIe PME interrupt [Mon Dec 9 06:17:09 2019][ 4.230855] pcieport 0000:40:07.1: Signaling PME through PCIe PME interrupt [Mon Dec 9 06:17:09 2019][ 4.237818] pci 0000:41:00.0: Signaling PME through PCIe PME interrupt [Mon Dec 9 06:17:09 2019][ 4.244351] pci 0000:41:00.2: Signaling PME through PCIe PME interrupt [Mon Dec 9 06:17:09 2019][ 4.250888] pci 0000:41:00.3: Signaling PME through PCIe PME interrupt [Mon Dec 9 06:17:09 2019][ 4.257442] pcieport 0000:40:08.1: Signaling PME through PCIe PME 
interrupt [Mon Dec 9 06:17:09 2019][ 4.264409] pci 0000:42:00.0: Signaling PME through PCIe PME interrupt [Mon Dec 9 06:17:09 2019][ 4.270941] pci 0000:42:00.1: Signaling PME through PCIe PME interrupt [Mon Dec 9 06:17:09 2019][ 4.277492] pcieport 0000:80:01.1: Signaling PME through PCIe PME interrupt [Mon Dec 9 06:17:09 2019][ 4.284464] pci 0000:81:00.0: Signaling PME through PCIe PME interrupt [Mon Dec 9 06:17:09 2019][ 4.290999] pci 0000:81:00.1: Signaling PME through PCIe PME interrupt [Mon Dec 9 06:17:09 2019][ 4.297549] pcieport 0000:80:01.2: Signaling PME through PCIe PME interrupt [Mon Dec 9 06:17:09 2019][ 4.304516] pci 0000:82:00.0: Signaling PME through PCIe PME interrupt [Mon Dec 9 06:17:09 2019][ 4.311054] pci 0000:83:00.0: Signaling PME through PCIe PME interrupt [Mon Dec 9 06:17:09 2019][ 4.317602] pcieport 0000:80:03.1: Signaling PME through PCIe PME interrupt [Mon Dec 9 06:17:09 2019][ 4.324573] pci 0000:84:00.0: Signaling PME through PCIe PME interrupt [Mon Dec 9 06:17:09 2019][ 4.331119] pcieport 0000:80:07.1: Signaling PME through PCIe PME interrupt [Mon Dec 9 06:17:09 2019][ 4.338087] pci 0000:85:00.0: Signaling PME through PCIe PME interrupt [Mon Dec 9 06:17:09 2019][ 4.344622] pci 0000:85:00.2: Signaling PME through PCIe PME interrupt [Mon Dec 9 06:17:09 2019][ 4.351169] pcieport 0000:80:08.1: Signaling PME through PCIe PME interrupt [Mon Dec 9 06:17:09 2019][ 4.358130] pci 0000:86:00.0: Signaling PME through PCIe PME interrupt [Mon Dec 9 06:17:09 2019][ 4.364664] pci 0000:86:00.1: Signaling PME through PCIe PME interrupt [Mon Dec 9 06:17:09 2019][ 4.371201] pci 0000:86:00.2: Signaling PME through PCIe PME interrupt [Mon Dec 9 06:17:09 2019][ 4.377756] pcieport 0000:c0:01.1: Signaling PME through PCIe PME interrupt [Mon Dec 9 06:17:09 2019][ 4.384721] pci 0000:c1:00.0: Signaling PME through PCIe PME interrupt [Mon Dec 9 06:17:09 2019][ 4.391272] pcieport 0000:c0:07.1: Signaling PME through PCIe PME interrupt [Mon Dec 9 06:17:09 2019][ 4.398242] pci 0000:c2:00.0: Signaling PME through PCIe PME interrupt [Mon Dec 9 06:17:09 2019][ 4.404777] pci 0000:c2:00.2: Signaling PME through PCIe PME interrupt [Mon Dec 9 06:17:09 2019][ 4.411330] pcieport 0000:c0:08.1: Signaling PME through PCIe PME interrupt [Mon Dec 9 06:17:09 2019][ 4.418297] pci 0000:c3:00.0: Signaling PME through PCIe PME interrupt [Mon Dec 9 06:17:09 2019][ 4.424835] pci 0000:c3:00.1: Signaling PME through PCIe PME interrupt [Mon Dec 9 06:17:09 2019][ 4.431391] pci_hotplug: PCI Hot Plug PCI Core version: 0.5 [Mon Dec 9 06:17:09 2019][ 4.436981] pciehp: PCI Express Hot Plug Controller Driver version: 0.4 [Mon Dec 9 06:17:09 2019][ 4.443631] shpchp: Standard Hot Plug PCI Controller Driver version: 0.4 [Mon Dec 9 06:17:09 2019][ 4.450438] efifb: probing for efifb [Mon Dec 9 06:17:09 2019][ 4.454032] efifb: framebuffer at 0xab000000, mapped to 0xffffa48f59800000, using 3072k, total 3072k [Mon Dec 9 06:17:09 2019][ 4.463163] efifb: mode is 1024x768x32, linelength=4096, pages=1 [Mon Dec 9 06:17:09 2019][ 4.469178] efifb: scrolling: redraw [Mon Dec 9 06:17:09 2019][ 4.472771] efifb: Truecolor: size=8:8:8:8, shift=24:16:8:0 [Mon Dec 9 06:17:09 2019][ 4.494050] Console: switching to colour frame buffer device 128x48 [Mon Dec 9 06:17:09 2019][ 4.515784] fb0: EFI VGA frame buffer device [Mon Dec 9 06:17:09 2019][ 4.520168] input: Power Button as /devices/LNXSYSTM:00/device:00/PNP0C0C:00/input/input0 [Mon Dec 9 06:17:09 2019][ 4.528352] ACPI: Power Button [PWRB] [Mon Dec 9 06:17:09 2019][ 4.532073] input: Power Button 
as /devices/LNXSYSTM:00/LNXPWRBN:00/input/input1 [Mon Dec 9 06:17:09 2019][ 4.539478] ACPI: Power Button [PWRF] [Mon Dec 9 06:17:09 2019][ 4.544365] GHES: APEI firmware first mode is enabled by APEI bit and WHEA _OSC. [Mon Dec 9 06:17:09 2019][ 4.551850] Serial: 8250/16550 driver, 4 ports, IRQ sharing enabled [Mon Dec 9 06:17:09 2019][ 4.579045] 00:02: ttyS1 at I/O 0x2f8 (irq = 3) is a 16550A [Mon Dec 9 06:17:09 2019][ 4.605587] 00:03: ttyS0 at I/O 0x3f8 (irq = 4) is a 16550A [Mon Dec 9 06:17:09 2019][ 4.611666] Non-volatile memory driver v1.3 [Mon Dec 9 06:17:09 2019][ 4.615896] Linux agpgart interface v0.103 [Mon Dec 9 06:17:09 2019][ 4.622439] crash memory driver: version 1.1 [Mon Dec 9 06:17:09 2019][ 4.626950] rdac: device handler registered [Mon Dec 9 06:17:09 2019][ 4.631203] hp_sw: device handler registered [Mon Dec 9 06:17:09 2019][ 4.635483] emc: device handler registered [Mon Dec 9 06:17:09 2019][ 4.639743] alua: device handler registered [Mon Dec 9 06:17:09 2019][ 4.643975] libphy: Fixed MDIO Bus: probed [Mon Dec 9 06:17:09 2019][ 4.648134] ehci_hcd: USB 2.0 'Enhanced' Host Controller (EHCI) Driver [Mon Dec 9 06:17:09 2019][ 4.654672] ehci-pci: EHCI PCI platform driver [Mon Dec 9 06:17:09 2019][ 4.659140] ohci_hcd: USB 1.1 'Open' Host Controller (OHCI) Driver [Mon Dec 9 06:17:09 2019][ 4.665331] ohci-pci: OHCI PCI platform driver [Mon Dec 9 06:17:09 2019][ 4.669798] uhci_hcd: USB Universal Host Controller Interface driver [Mon Dec 9 06:17:09 2019][ 4.676292] xhci_hcd 0000:02:00.3: xHCI Host Controller [Mon Dec 9 06:17:09 2019][ 4.681595] xhci_hcd 0000:02:00.3: new USB bus registered, assigned bus number 1 [Mon Dec 9 06:17:09 2019][ 4.689101] xhci_hcd 0000:02:00.3: hcc params 0x0270f665 hci version 0x100 quirks 0x00000410 [Mon Dec 9 06:17:09 2019][ 4.697869] usb usb1: New USB device found, idVendor=1d6b, idProduct=0002 [Mon Dec 9 06:17:09 2019][ 4.704662] usb usb1: New USB device strings: Mfr=3, Product=2, SerialNumber=1 [Mon Dec 9 06:17:09 2019][ 4.711889] usb usb1: Product: xHCI Host Controller [Mon Dec 9 06:17:09 2019][ 4.716776] usb usb1: Manufacturer: Linux 3.10.0-957.27.2.el7_lustre.pl2.x86_64 xhci-hcd [Mon Dec 9 06:17:09 2019][ 4.724871] usb usb1: SerialNumber: 0000:02:00.3 [Mon Dec 9 06:17:09 2019][ 4.729607] hub 1-0:1.0: USB hub found [Mon Dec 9 06:17:09 2019][ 4.733368] hub 1-0:1.0: 2 ports detected [Mon Dec 9 06:17:09 2019][ 4.737616] xhci_hcd 0000:02:00.3: xHCI Host Controller [Mon Dec 9 06:17:09 2019][ 4.742908] xhci_hcd 0000:02:00.3: new USB bus registered, assigned bus number 2 [Mon Dec 9 06:17:09 2019][ 4.750334] usb usb2: We don't know the algorithms for LPM for this host, disabling LPM. 
[Mon Dec 9 06:17:09 2019][ 4.758443] usb usb2: New USB device found, idVendor=1d6b, idProduct=0003 [Mon Dec 9 06:17:09 2019][ 4.765234] usb usb2: New USB device strings: Mfr=3, Product=2, SerialNumber=1 [Mon Dec 9 06:17:09 2019][ 4.772461] usb usb2: Product: xHCI Host Controller [Mon Dec 9 06:17:09 2019][ 4.777350] usb usb2: Manufacturer: Linux 3.10.0-957.27.2.el7_lustre.pl2.x86_64 xhci-hcd [Mon Dec 9 06:17:09 2019][ 4.785444] usb usb2: SerialNumber: 0000:02:00.3 [Mon Dec 9 06:17:10 2019][ 4.790160] hub 2-0:1.0: USB hub found [Mon Dec 9 06:17:10 2019][ 4.793926] hub 2-0:1.0: 2 ports detected [Mon Dec 9 06:17:10 2019][ 4.798267] xhci_hcd 0000:41:00.3: xHCI Host Controller [Mon Dec 9 06:17:10 2019][ 4.803578] xhci_hcd 0000:41:00.3: new USB bus registered, assigned bus number 3 [Mon Dec 9 06:17:10 2019][ 4.811082] xhci_hcd 0000:41:00.3: hcc params 0x0270f665 hci version 0x100 quirks 0x00000410 [Mon Dec 9 06:17:10 2019][ 4.819869] usb usb3: New USB device found, idVendor=1d6b, idProduct=0002 [Mon Dec 9 06:17:10 2019][ 4.826664] usb usb3: New USB device strings: Mfr=3, Product=2, SerialNumber=1 [Mon Dec 9 06:17:10 2019][ 4.833893] usb usb3: Product: xHCI Host Controller [Mon Dec 9 06:17:10 2019][ 4.838780] usb usb3: Manufacturer: Linux 3.10.0-957.27.2.el7_lustre.pl2.x86_64 xhci-hcd [Mon Dec 9 06:17:10 2019][ 4.846874] usb usb3: SerialNumber: 0000:41:00.3 [Mon Dec 9 06:17:10 2019][ 4.851606] hub 3-0:1.0: USB hub found [Mon Dec 9 06:17:10 2019][ 4.855374] hub 3-0:1.0: 2 ports detected [Mon Dec 9 06:17:10 2019][ 4.859641] xhci_hcd 0000:41:00.3: xHCI Host Controller [Mon Dec 9 06:17:10 2019][ 4.864916] xhci_hcd 0000:41:00.3: new USB bus registered, assigned bus number 4 [Mon Dec 9 06:17:10 2019][ 4.872357] usb usb4: We don't know the algorithms for LPM for this host, disabling LPM. [Mon Dec 9 06:17:10 2019][ 4.880469] usb usb4: New USB device found, idVendor=1d6b, idProduct=0003 [Mon Dec 9 06:17:10 2019][ 4.887264] usb usb4: New USB device strings: Mfr=3, Product=2, SerialNumber=1 [Mon Dec 9 06:17:10 2019][ 4.894491] usb usb4: Product: xHCI Host Controller [Mon Dec 9 06:17:10 2019][ 4.899381] usb usb4: Manufacturer: Linux 3.10.0-957.27.2.el7_lustre.pl2.x86_64 xhci-hcd [Mon Dec 9 06:17:10 2019][ 4.907475] usb usb4: SerialNumber: 0000:41:00.3 [Mon Dec 9 06:17:10 2019][ 4.912191] hub 4-0:1.0: USB hub found [Mon Dec 9 06:17:10 2019][ 4.915953] hub 4-0:1.0: 2 ports detected [Mon Dec 9 06:17:10 2019][ 4.920215] usbcore: registered new interface driver usbserial_generic [Mon Dec 9 06:17:10 2019][ 4.926756] usbserial: USB Serial support registered for generic [Mon Dec 9 06:17:10 2019][ 4.932808] i8042: PNP: No PS/2 controller found. Probing ports directly. 
[Mon Dec 9 06:17:10 2019][ 5.048749] usb 1-1: new high-speed USB device number 2 using xhci_hcd [Mon Dec 9 06:17:10 2019][ 5.170748] usb 3-1: new high-speed USB device number 2 using xhci_hcd [Mon Dec 9 06:17:10 2019][ 5.180593] usb 1-1: New USB device found, idVendor=0424, idProduct=2744 [Mon Dec 9 06:17:10 2019][ 5.187307] usb 1-1: New USB device strings: Mfr=1, Product=2, SerialNumber=0 [Mon Dec 9 06:17:10 2019][ 5.194453] usb 1-1: Product: USB2734 [Mon Dec 9 06:17:10 2019][ 5.198123] usb 1-1: Manufacturer: Microchip Tech [Mon Dec 9 06:17:10 2019][ 5.230491] hub 1-1:1.0: USB hub found [Mon Dec 9 06:17:10 2019][ 5.234468] hub 1-1:1.0: 4 ports detected [Mon Dec 9 06:17:10 2019][ 5.291842] usb 2-1: new SuperSpeed USB device number 2 using xhci_hcd [Mon Dec 9 06:17:10 2019][ 5.300747] usb 3-1: New USB device found, idVendor=1604, idProduct=10c0 [Mon Dec 9 06:17:10 2019][ 5.307459] usb 3-1: New USB device strings: Mfr=0, Product=0, SerialNumber=0 [Mon Dec 9 06:17:10 2019][ 5.312969] usb 2-1: New USB device found, idVendor=0424, idProduct=5744 [Mon Dec 9 06:17:10 2019][ 5.312970] usb 2-1: New USB device strings: Mfr=2, Product=3, SerialNumber=0 [Mon Dec 9 06:17:10 2019][ 5.312971] usb 2-1: Product: USB5734 [Mon Dec 9 06:17:10 2019][ 5.312972] usb 2-1: Manufacturer: Microchip Tech [Mon Dec 9 06:17:10 2019][ 5.326487] hub 2-1:1.0: USB hub found [Mon Dec 9 06:17:10 2019][ 5.326843] hub 2-1:1.0: 4 ports detected [Mon Dec 9 06:17:10 2019][ 5.327898] usb: port power management may be unreliable [Mon Dec 9 06:17:10 2019][ 5.352513] hub 3-1:1.0: USB hub found [Mon Dec 9 06:17:10 2019][ 5.356496] hub 3-1:1.0: 4 ports detected [Mon Dec 9 06:17:11 2019][ 5.973232] i8042: No controller found [Mon Dec 9 06:17:11 2019][ 5.977008] tsc: Refined TSC clocksource calibration: 1996.249 MHz [Mon Dec 9 06:17:11 2019][ 5.977072] mousedev: PS/2 mouse device common for all mice [Mon Dec 9 06:17:11 2019][ 5.977261] rtc_cmos 00:01: RTC can wake from S4 [Mon Dec 9 06:17:11 2019][ 5.977606] rtc_cmos 00:01: rtc core: registered rtc_cmos as rtc0 [Mon Dec 9 06:17:11 2019][ 5.977706] rtc_cmos 00:01: alarms up to one month, y3k, 114 bytes nvram, hpet irqs [Mon Dec 9 06:17:11 2019][ 5.977768] cpuidle: using governor menu [Mon Dec 9 06:17:11 2019][ 5.978018] EFI Variables Facility v0.08 2004-May-17 [Mon Dec 9 06:17:11 2019][ 5.998627] hidraw: raw HID events driver (C) Jiri Kosina [Mon Dec 9 06:17:11 2019][ 5.998723] usbcore: registered new interface driver usbhid [Mon Dec 9 06:17:11 2019][ 5.998723] usbhid: USB HID core driver [Mon Dec 9 06:17:11 2019][ 5.998840] drop_monitor: Initializing network drop monitor service [Mon Dec 9 06:17:11 2019][ 5.998985] TCP: cubic registered [Mon Dec 9 06:17:11 2019][ 5.998990] Initializing XFRM netlink socket [Mon Dec 9 06:17:11 2019][ 5.999196] NET: Registered protocol family 10 [Mon Dec 9 06:17:11 2019][ 5.999731] NET: Registered protocol family 17 [Mon Dec 9 06:17:11 2019][ 5.999734] mpls_gso: MPLS GSO support [Mon Dec 9 06:17:11 2019][ 6.000786] mce: Using 23 MCE banks [Mon Dec 9 06:17:11 2019][ 6.000832] microcode: CPU0: patch_level=0x08001250 [Mon Dec 9 06:17:11 2019][ 6.000847] microcode: CPU1: patch_level=0x08001250 [Mon Dec 9 06:17:11 2019][ 6.000861] microcode: CPU2: patch_level=0x08001250 [Mon Dec 9 06:17:11 2019][ 6.000873] microcode: CPU3: patch_level=0x08001250 [Mon Dec 9 06:17:11 2019][ 6.000885] microcode: CPU4: patch_level=0x08001250 [Mon Dec 9 06:17:11 2019][ 6.000900] microcode: CPU5: patch_level=0x08001250 [Mon Dec 9 06:17:11 2019][ 6.000912] microcode: CPU6: 
patch_level=0x08001250 [Mon Dec 9 06:17:11 2019][ 6.000928] microcode: CPU7: patch_level=0x08001250 [Mon Dec 9 06:17:11 2019][ 6.000937] microcode: CPU8: patch_level=0x08001250 [Mon Dec 9 06:17:11 2019][ 6.000946] microcode: CPU9: patch_level=0x08001250 [Mon Dec 9 06:17:11 2019][ 6.000954] microcode: CPU10: patch_level=0x08001250 [Mon Dec 9 06:17:11 2019][ 6.000964] microcode: CPU11: patch_level=0x08001250 [Mon Dec 9 06:17:11 2019][ 6.000975] microcode: CPU12: patch_level=0x08001250 [Mon Dec 9 06:17:11 2019][ 6.000986] microcode: CPU13: patch_level=0x08001250 [Mon Dec 9 06:17:11 2019][ 6.000997] microcode: CPU14: patch_level=0x08001250 [Mon Dec 9 06:17:11 2019][ 6.001008] microcode: CPU15: patch_level=0x08001250 [Mon Dec 9 06:17:11 2019][ 6.001019] microcode: CPU16: patch_level=0x08001250 [Mon Dec 9 06:17:11 2019][ 6.001030] microcode: CPU17: patch_level=0x08001250 [Mon Dec 9 06:17:11 2019][ 6.001041] microcode: CPU18: patch_level=0x08001250 [Mon Dec 9 06:17:11 2019][ 6.001051] microcode: CPU19: patch_level=0x08001250 [Mon Dec 9 06:17:11 2019][ 6.001061] microcode: CPU20: patch_level=0x08001250 [Mon Dec 9 06:17:11 2019][ 6.001073] microcode: CPU21: patch_level=0x08001250 [Mon Dec 9 06:17:11 2019][ 6.001083] microcode: CPU22: patch_level=0x08001250 [Mon Dec 9 06:17:11 2019][ 6.001094] microcode: CPU23: patch_level=0x08001250 [Mon Dec 9 06:17:11 2019][ 6.001104] microcode: CPU24: patch_level=0x08001250 [Mon Dec 9 06:17:11 2019][ 6.001115] microcode: CPU25: patch_level=0x08001250 [Mon Dec 9 06:17:11 2019][ 6.001123] microcode: CPU26: patch_level=0x08001250 [Mon Dec 9 06:17:11 2019][ 6.001134] microcode: CPU27: patch_level=0x08001250 [Mon Dec 9 06:17:11 2019][ 6.001144] microcode: CPU28: patch_level=0x08001250 [Mon Dec 9 06:17:11 2019][ 6.001152] microcode: CPU29: patch_level=0x08001250 [Mon Dec 9 06:17:11 2019][ 6.001308] microcode: CPU30: patch_level=0x08001250 [Mon Dec 9 06:17:11 2019][ 6.001318] microcode: CPU31: patch_level=0x08001250 [Mon Dec 9 06:17:11 2019][ 6.001329] microcode: CPU32: patch_level=0x08001250 [Mon Dec 9 06:17:11 2019][ 6.001337] microcode: CPU33: patch_level=0x08001250 [Mon Dec 9 06:17:11 2019][ 6.001348] microcode: CPU34: patch_level=0x08001250 [Mon Dec 9 06:17:11 2019][ 6.001358] microcode: CPU35: patch_level=0x08001250 [Mon Dec 9 06:17:11 2019][ 6.001369] microcode: CPU36: patch_level=0x08001250 [Mon Dec 9 06:17:11 2019][ 6.001380] microcode: CPU37: patch_level=0x08001250 [Mon Dec 9 06:17:11 2019][ 6.001391] microcode: CPU38: patch_level=0x08001250 [Mon Dec 9 06:17:11 2019][ 6.001402] microcode: CPU39: patch_level=0x08001250 [Mon Dec 9 06:17:11 2019][ 6.001410] microcode: CPU40: patch_level=0x08001250 [Mon Dec 9 06:17:11 2019][ 6.001419] microcode: CPU41: patch_level=0x08001250 [Mon Dec 9 06:17:11 2019][ 6.001427] microcode: CPU42: patch_level=0x08001250 [Mon Dec 9 06:17:11 2019][ 6.001438] microcode: CPU43: patch_level=0x08001250 [Mon Dec 9 06:17:11 2019][ 6.001445] microcode: CPU44: patch_level=0x08001250 [Mon Dec 9 06:17:11 2019][ 6.001454] microcode: CPU45: patch_level=0x08001250 [Mon Dec 9 06:17:11 2019][ 6.001462] microcode: CPU46: patch_level=0x08001250 [Mon Dec 9 06:17:11 2019][ 6.001473] microcode: CPU47: patch_level=0x08001250 [Mon Dec 9 06:17:11 2019][ 6.001518] microcode: Microcode Update Driver: v2.01 , Peter Oruba [Mon Dec 9 06:17:11 2019][ 6.001659] Loading compiled-in X.509 certificates [Mon Dec 9 06:17:11 2019][ 6.001683] Loaded X.509 cert 'CentOS Linux kpatch signing key: ea0413152cde1d98ebdca3fe6f0230904c9ef717' [Mon Dec 9 06:17:11 2019][ 6.001697] 
Loaded X.509 cert 'CentOS Linux Driver update signing key: 7f421ee0ab69461574bb358861dbe77762a4201b' [Mon Dec 9 06:17:11 2019][ 6.002084] Loaded X.509 cert 'CentOS Linux kernel signing key: 468656045a39b52ff2152c315f6198c3e658f24d' [Mon Dec 9 06:17:11 2019][ 6.002097] registered taskstats version 1 [Mon Dec 9 06:17:11 2019][ 6.004213] Key type trusted registered [Mon Dec 9 06:17:11 2019][ 6.005787] Key type encrypted registered [Mon Dec 9 06:17:11 2019][ 6.005834] IMA: No TPM chip found, activating TPM-bypass! (rc=-19) [Mon Dec 9 06:17:11 2019][ 6.007298] Magic number: 15:915:282 [Mon Dec 9 06:17:11 2019][ 6.007380] tty tty19: hash matches [Mon Dec 9 06:17:11 2019][ 6.007519] memory memory148: hash matches [Mon Dec 9 06:17:11 2019][ 6.015092] rtc_cmos 00:01: setting system clock to 2019-12-09 14:17:10 UTC (1575901030) [Mon Dec 9 06:17:11 2019] usb 3-1.1: new high-speed USB device number 3 using xhci_hcd [Mon Dec 9 06:17:11 2019][ 6.387500] Switched to clocksource tsc [Mon Dec 9 06:17:11 2019][ 6.399037] Freeing unused kernel memory: 1876k freed [Mon Dec 9 06:17:11 2019][ 6.404317] Write protecting the kernel read-only data: 12288k [Mon Dec 9 06:17:11 2019][ 6.411541] Freeing unused kernel memory: 504k freed [Mon Dec 9 06:17:11 2019][ 6.417887] Freeing unused kernel memory: 596k freed [Mon Dec 9 06:17:11 2019][ 6.470116] systemd[1]: systemd 219 running in system mode. (+PAM +AUDIT +SELINUX +IMA -APPARMOR +SMACK +SYSVINIT +UTMP +LIBCRYPTSETUP +GCRYPT +GNUTLS +ACL +XZ +LZ4 -SECCOMP +BLKID +ELFUTILS +KMOD +IDN) [Mon Dec 9 06:17:11 2019][ 6.473772] usb 3-1.1: New USB device found, idVendor=1604, idProduct=10c0 [Mon Dec 9 06:17:11 2019][ 6.473773] usb 3-1.1: New USB device strings: Mfr=0, Product=0, SerialNumber=0 [Mon Dec 9 06:17:11 2019][ 6.503414] systemd[1]: Detected architecture x86-64. [Mon Dec 9 06:17:11 2019][ 6.504537] hub 3-1.1:1.0: USB hub found [Mon Dec 9 06:17:11 2019][ 6.504895] hub 3-1.1:1.0: 4 ports detected [Mon Dec 9 06:17:11 2019][ 6.516595] systemd[1]: Running in initial RAM disk. [Mon Dec 9 06:17:11 2019] [Mon Dec 9 06:17:11 2019]Welcome to CentOS Linux 7 (Core) dracut-033-554.el7 (Initramfs)! [Mon Dec 9 06:17:11 2019] [Mon Dec 9 06:17:11 2019][ 6.529901] systemd[1]: Set hostname to . [Mon Dec 9 06:17:11 2019][ 6.565168] systemd[1]: Reached target Local File Systems. [Mon Dec 9 06:17:11 2019][ 6.569767] usb 3-1.4: new high-speed USB device number 4 using xhci_hcd [Mon Dec 9 06:17:11 2019][ OK ] Reached target Local File Systems. [Mon Dec 9 06:17:11 2019][ 6.582855] systemd[1]: Reached target Swap. [Mon Dec 9 06:17:11 2019][ OK ] Reached target Swap. [Mon Dec 9 06:17:11 2019][ 6.591821] systemd[1]: Reached target Timers. [Mon Dec 9 06:17:11 2019][ OK ] Reached target Timers. [Mon Dec 9 06:17:11 2019][ 6.601036] systemd[1]: Created slice Root Slice. [Mon Dec 9 06:17:11 2019][ OK ] Created slice Root Slice. [Mon Dec 9 06:17:11 2019][ 6.611865] systemd[1]: Listening on udev Control Socket. [Mon Dec 9 06:17:11 2019][ OK ] Listening on udev Control Socket. [Mon Dec 9 06:17:11 2019][ 6.622848] systemd[1]: Listening on udev Kernel Socket. [Mon Dec 9 06:17:11 2019][ OK ] Listening on udev Kernel Socket. [Mon Dec 9 06:17:11 2019][ 6.633883] systemd[1]: Created slice System Slice. [Mon Dec 9 06:17:11 2019][ OK ] Created slice System Slice.
[Mon Dec 9 06:17:11 2019][ 6.643778] usb 3-1.4: New USB device found, idVendor=1604, idProduct=10c0 [Mon Dec 9 06:17:11 2019][ 6.650657] usb 3-1.4: New USB device strings: Mfr=0, Product=0, SerialNumber=0 [Mon Dec 9 06:17:11 2019][ 6.658073] systemd[1]: Listening on Journal Socket. [Mon Dec 9 06:17:11 2019][ OK ] Listening on J[ 6.664547] hub 3-1.4:1.0: USB hub found [Mon Dec 9 06:17:11 2019]ournal Socket. [Mon Dec 9 06:17:11 2019][ 6.670149] hub 3-1.4:1.0: 4 ports detected [Mon Dec 9 06:17:11 2019][ 6.675994] systemd[1]: Starting Setup Virtual Console... [Mon Dec 9 06:17:11 2019] Starting Setup Virtual Console... [Mon Dec 9 06:17:11 2019][ 6.686254] systemd[1]: Starting Create list of required static device nodes for the current kernel... [Mon Dec 9 06:17:11 2019] Starting Create list of required st... nodes for the current kernel... [Mon Dec 9 06:17:11 2019][ 6.717827] systemd[1]: Reached target Sockets. [Mon Dec 9 06:17:11 2019][ OK ] Reached target Sockets. [Mon Dec 9 06:17:11 2019][ 6.727301] systemd[1]: Starting Apply Kernel Variables... [Mon Dec 9 06:17:11 2019] Starting Apply Kernel Variables... [Mon Dec 9 06:17:11 2019][ 6.738860] systemd[1]: Reached target Slices. [Mon Dec 9 06:17:11 2019][ OK ] Reached target Slices. [Mon Dec 9 06:17:11 2019][ 6.748191] systemd[1]: Starting Journal Service... [Mon Dec 9 06:17:11 2019] Starting Journal Service... [Mon Dec 9 06:17:11 2019][ 6.758310] systemd[1]: Starting dracut cmdline hook... [Mon Dec 9 06:17:11 2019] Starting dracut cmdline hook... [Mon Dec 9 06:17:11 2019][ 6.768194] systemd[1]: Started Setup Virtual Console. [Mon Dec 9 06:17:11 2019][ OK ] Started Setup Virtual Console. [Mon Dec 9 06:17:12 2019][ 6.779123] systemd[1]: Started Create list of required static device nodes for the current kernel. [Mon Dec 9 06:17:12 2019][ OK ] Started Create list of required sta...ce nodes for the current kernel. [Mon Dec 9 06:17:12 2019][ 6.797008] systemd[1]: Started Apply Kernel Variables. [Mon Dec 9 06:17:12 2019][ OK ] Started Apply Kernel Variables. [Mon Dec 9 06:17:12 2019][ 6.808097] systemd[1]: Started Journal Service. [Mon Dec 9 06:17:12 2019][ OK ] Started Journal Service. [Mon Dec 9 06:17:12 2019] Starting Create Static Device Nodes in /dev... [Mon Dec 9 06:17:12 2019][ OK ] Started Create Static Device Nodes in /dev. [Mon Dec 9 06:17:12 2019][ OK ] Started dracut cmdline hook. [Mon Dec 9 06:17:12 2019] Starting dracut pre-udev hook... [Mon Dec 9 06:17:12 2019][ OK ] Started dracut pre-udev hook. [Mon Dec 9 06:17:12 2019] Starting udev Kernel Device Manager... [Mon Dec 9 06:17:12 2019][ OK ] Started udev Kernel Device Manager. [Mon Dec 9 06:17:12 2019] Starting udev Coldplug all Devices... [Mon Dec 9 06:17:12 2019] Mounting Configuration File System... [Mon Dec 9 06:17:12 2019][ OK ] Mounted Configuration File System. [Mon Dec 9 06:17:12 2019][ 6.943776] pps_core: LinuxPPS API ver. 1 registered [Mon Dec 9 06:17:12 2019][ 6.948745] pps_core: Software ver. 
5.3.6 - Copyright 2005-2007 Rodolfo Giometti [Mon Dec 9 06:17:12 2019][ 6.961430] megasas: 07.705.02.00-rh1 [Mon Dec 9 06:17:12 2019][ 6.965430] megaraid_sas 0000:c1:00.0: FW now in Ready state [Mon Dec 9 06:17:12 2019][ 6.971107] megaraid_sas 0000:c1:00.0: 64 bit DMA mask and 32 bit consistent mask [Mon Dec 9 06:17:12 2019][ 6.972575] megaraid_sas 0000:c1:00.0: firmware supports msix : (96) [Mon Dec 9 06:17:12 2019][ 6.972576] megaraid_sas 0000:c1:00.0: current msix/online cpus : (48/48) [Mon Dec 9 06:17:12 2019][ 6.972577] megaraid_sas 0000:c1:00.0: RDPQ mode : (disabled) [Mon Dec 9 06:17:12 2019][ 6.972580] megaraid_sas 0000:c1:00.0: Current firmware supports maximum commands: 928 LDIO threshold: 237 [Mon Dec 9 06:17:12 2019][ 6.972847] megaraid_sas 0000:c1:00.0: Configured max firmware commands: 927 [Mon Dec 9 06:17:12 2019][ 6.974885] megaraid_sas 0000:c1:00.0: FW supports sync cache : No [Mon Dec 9 06:17:12 2019][ 6.978810] PTP clock support registered [Mon Dec 9 06:17:12 2019][ OK ] Started udev Coldplug all Devi[ 7.028321] mpt3sas: loading out-of-tree module taints kernel. [Mon Dec 9 06:17:12 2019]ces. [Mon Dec 9 06:17:12 2019][ 7.039156] tg3.c:v3.137 (May 11, 2014) [Mon Dec 9 06:17:12 2019][ 7.039203] mlx_compat: module verification failed: signature and/or required key missing - tainting kernel [Mon Dec 9 06:17:12 2019] Starting Show Plymouth Boot Screen... [Mon Dec 9 06:17:12 2019][ 7.056141] Compat-mlnx-ofed backport release: 1c4bf42 [Mon Dec 9 06:17:12 2019][ 7.062229] Backport based on mlnx_ofed/mlnx-ofa_kernel-4.0.git 1c4bf42 [Mon Dec 9 06:17:12 2019][ 7.062230] compat.git: mlnx_ofed/mlnx-ofa_kernel-4.0.git [Mon Dec 9 06:17:12 2019][ 7.066699] tg3 0000:81:00.0 eth0: Tigon3 [partno(BCM95720) rev 5720000] (PCI Express) MAC address 4c:d9:8f:7d:ad:d7 [Mon Dec 9 06:17:12 2019][ 7.066703] tg3 0000:81:00.0 eth0: attached PHY is 5720C (10/100/1000Base-T Ethernet) (WireSpeed[1], EEE[1]) [Mon Dec 9 06:17:12 2019][ 7.066705] tg3 0000:81:00.0 eth0: RXcsums[1] LinkChgREG[0] MIirq[0] ASF[1] TSOcap[1] [Mon Dec 9 06:17:12 2019][ 7.066707] tg3 0000:81:00.0 eth0: dma_rwctrl[00000001] dma_mask[64-bit] [Mon Dec 9 06:17:12 2019][ 7.096872] tg3 0000:81:00.1 eth1: Tigon3 [partno(BCM95720) rev 5720000] (PCI Express) MAC address 4c:d9:8f:7d:ad:d8 [Mon Dec 9 06:17:12 2019][ 7.096875] tg3 0000:81:00.1 eth1: attached PHY is 5720C (10/100/1000Base-T Ethernet) (WireSpeed[1], EEE[1]) [Mon Dec 9 06:17:12 2019][ 7.096877] tg3 0000:81:00.1 eth1: RXcsums[1] LinkChgREG[0] MIirq[0] ASF[1] TSOcap[1] [Mon Dec 9 06:17:12 2019][ 7.096879] tg3 0000:81:00.1 eth1: dma_rwctrl[00000001] dma_mask[64-bit] [Mon Dec 9 06:17:12 2019][ 7.144503] mpt3sas version 31.00.00.00 loaded [Mon Dec 9 06:17:12 2019][ 7.150357] mpt3sas_cm0: 63 BIT PCI BUS DMA ADDRESSING SUPPORTED, total mem (263564416 kB) [Mon Dec 9 06:17:12 2019][ OK ] Reached target System Initiali[ 7.168033] ahci 0000:86:00.2: AHCI 0001.0301 32 slots 1 ports 6 Gbps 0x1 impl SATA mode [Mon Dec 9 06:17:12 2019]zation. [Mon Dec 9 06:17:12 2019] [ 7.177094] ahci 0000:86:00.2: flags: 64bit ncq sntf ilck pm led clo only pmp fbs pio slum part [Mon Dec 9 06:17:12 2019] Starting drac[ 7.187545] scsi host2: ahci [Mon Dec 9 06:17:12 2019]ut initqueue hoo[ 7.192196] ata1: SATA max UDMA/133 abar m4096@0xc0a02000 port 0xc0a02100 irq 120 [Mon Dec 9 06:17:12 2019]k...
[Mon Dec 9 06:17:12 2019][ OK ] Started[ 7.202263] mlx5_core 0000:01:00.0: firmware version: 20.26.1040 [Mon Dec 9 06:17:12 2019] Show Plymouth B[ 7.209296] mlx5_core 0000:01:00.0: 126.016 Gb/s available PCIe bandwidth, limited by 8 GT/s x16 link at 0000:00:03.1 (capable of 252.048 Gb/s with 16 GT/s x16 link) [Mon Dec 9 06:17:12 2019]oot Screen. [Mon Dec 9 06:17:12 2019][ OK ] Started Forward Password Requests to Plymouth Directory Watch. [Mon Dec 9 06:17:12 2019][ OK ] Reached targe[ 7.235790] mpt3sas_cm0: IOC Number : 0 [Mon Dec 9 06:17:12 2019]t Paths. [Mon Dec 9 06:17:12 2019][ OK ] Rea[ 7.242790] mpt3sas0-msix0: PCI-MSI-X enabled: IRQ 137 [Mon Dec 9 06:17:12 2019]ched target Basi[ 7.248115] mpt3sas0-msix1: PCI-MSI-X enabled: IRQ 138 [Mon Dec 9 06:17:12 2019]c System. [Mon Dec 9 06:17:12 2019][ 7.254588] mpt3sas0-msix2: PCI-MSI-X enabled: IRQ 139 [Mon Dec 9 06:17:12 2019][ 7.260763] mpt3sas0-msix3: PCI-MSI-X enabled: IRQ 140 [Mon Dec 9 06:17:12 2019][ 7.265911] mpt3sas0-msix4: PCI-MSI-X enabled: IRQ 141 [Mon Dec 9 06:17:12 2019][ 7.271062] mpt3sas0-msix5: PCI-MSI-X enabled: IRQ 142 [Mon Dec 9 06:17:12 2019][ 7.271064] mpt3sas0-msix6: PCI-MSI-X enabled: IRQ 143 [Mon Dec 9 06:17:12 2019][ 7.271065] mpt3sas0-msix7: PCI-MSI-X enabled: IRQ 144 [Mon Dec 9 06:17:12 2019][ 7.271068] mpt3sas0-msix8: PCI-MSI-X enabled: IRQ 145 [Mon Dec 9 06:17:12 2019][ 7.271071] mpt3sas0-msix9: PCI-MSI-X enabled: IRQ 146 [Mon Dec 9 06:17:12 2019][ 7.271072] mpt3sas0-msix10: PCI-MSI-X enabled: IRQ 147 [Mon Dec 9 06:17:12 2019][ 7.271072] mpt3sas0-msix11: PCI-MSI-X enabled: IRQ 148 [Mon Dec 9 06:17:12 2019][ 7.271073] mpt3sas0-msix12: PCI-MSI-X enabled: IRQ 149 [Mon Dec 9 06:17:12 2019][ 7.271074] mpt3sas0-msix13: PCI-MSI-X enabled: IRQ 150 [Mon Dec 9 06:17:12 2019][ 7.271074] mpt3sas0-msix14: PCI-MSI-X enabled: IRQ 151 [Mon Dec 9 06:17:12 2019][ 7.271075] mpt3sas0-msix15: PCI-MSI-X enabled: IRQ 152 [Mon Dec 9 06:17:12 2019][ 7.271075] mpt3sas0-msix16: PCI-MSI-X enabled: IRQ 153 [Mon Dec 9 06:17:12 2019][ 7.271076] mpt3sas0-msix17: PCI-MSI-X enabled: IRQ 154 [Mon Dec 9 06:17:12 2019][ 7.271076] mpt3sas0-msix18: PCI-MSI-X enabled: IRQ 155 [Mon Dec 9 06:17:12 2019][ 7.271077] mpt3sas0-msix19: PCI-MSI-X enabled: IRQ 156 [Mon Dec 9 06:17:12 2019][ 7.271077] mpt3sas0-msix20: PCI-MSI-X enabled: IRQ 157 [Mon Dec 9 06:17:12 2019][ 7.271079] mpt3sas0-msix21: PCI-MSI-X enabled: IRQ 158 [Mon Dec 9 06:17:12 2019][ 7.271080] mpt3sas0-msix22: PCI-MSI-X enabled: IRQ 159 [Mon Dec 9 06:17:12 2019][ 7.271082] mpt3sas0-msix23: PCI-MSI-X enabled: IRQ 160 [Mon Dec 9 06:17:12 2019][ 7.271083] mpt3sas0-msix24: PCI-MSI-X enabled: IRQ 161 [Mon Dec 9 06:17:12 2019][ 7.271083] mpt3sas0-msix25: PCI-MSI-X enabled: IRQ 162 [Mon Dec 9 06:17:12 2019][ 7.271084] mpt3sas0-msix26: PCI-MSI-X enabled: IRQ 163 [Mon Dec 9 06:17:12 2019][ 7.271084] mpt3sas0-msix27: PCI-MSI-X enabled: IRQ 164 [Mon Dec 9 06:17:12 2019][ 7.271085] mpt3sas0-msix28: PCI-MSI-X enabled: IRQ 165 [Mon Dec 9 06:17:12 2019][ 7.271085] mpt3sas0-msix29: PCI-MSI-X enabled: IRQ 166 [Mon Dec 9 06:17:12 2019][ 7.271086] mpt3sas0-msix30: PCI-MSI-X enabled: IRQ 167 [Mon Dec 9 06:17:12 2019][ 7.271086] mpt3sas0-msix31: PCI-MSI-X enabled: IRQ 168 [Mon Dec 9 06:17:12 2019][ 7.271087] mpt3sas0-msix32: PCI-MSI-X enabled: IRQ 169 [Mon Dec 9 06:17:12 2019][ 7.271087] mpt3sas0-msix33: PCI-MSI-X enabled: IRQ 170 [Mon Dec 9 06:17:12 2019][ 7.271088] mpt3sas0-msix34: PCI-MSI-X enabled: IRQ 171 [Mon Dec 9 06:17:12 2019][ 7.271089] mpt3sas0-msix35: PCI-MSI-X enabled: IRQ 172 [Mon Dec 9 06:17:12 
2019][ 7.271089] mpt3sas0-msix36: PCI-MSI-X enabled: IRQ 173 [Mon Dec 9 06:17:12 2019][ 7.271090] mpt3sas0-msix37: PCI-MSI-X enabled: IRQ 174 [Mon Dec 9 06:17:12 2019][ 7.271091] mpt3sas0-msix38: PCI-MSI-X enabled: IRQ 175 [Mon Dec 9 06:17:12 2019][ 7.271093] mpt3sas0-msix39: PCI-MSI-X enabled: IRQ 176 [Mon Dec 9 06:17:12 2019][ 7.271094] mpt3sas0-msix40: PCI-MSI-X enabled: IRQ 177 [Mon Dec 9 06:17:12 2019][ 7.271095] mpt3sas0-msix41: PCI-MSI-X enabled: IRQ 178 [Mon Dec 9 06:17:12 2019][ 7.271095] mpt3sas0-msix42: PCI-MSI-X enabled: IRQ 179 [Mon Dec 9 06:17:12 2019][ 7.271096] mpt3sas0-msix43: PCI-MSI-X enabled: IRQ 180 [Mon Dec 9 06:17:12 2019][ 7.271096] mpt3sas0-msix44: PCI-MSI-X enabled: IRQ 181 [Mon Dec 9 06:17:12 2019][ 7.271097] mpt3sas0-msix45: PCI-MSI-X enabled: IRQ 182 [Mon Dec 9 06:17:12 2019][ 7.271097] mpt3sas0-msix46: PCI-MSI-X enabled: IRQ 183 [Mon Dec 9 06:17:12 2019][ 7.271098] mpt3sas0-msix47: PCI-MSI-X enabled: IRQ 184 [Mon Dec 9 06:17:12 2019][ 7.271100] mpt3sas_cm0: iomem(0x00000000ac000000), mapped(0xffffa48f5a000000), size(1048576) [Mon Dec 9 06:17:12 2019][ 7.271101] mpt3sas_cm0: ioport(0x0000000000008000), size(256) [Mon Dec 9 06:17:12 2019][ 7.332790] megaraid_sas 0000:c1:00.0: Init cmd return status SUCCESS for SCSI host 0 [Mon Dec 9 06:17:12 2019][ 7.349787] mpt3sas_cm0: IOC Number : 0 [Mon Dec 9 06:17:12 2019][ 7.349790] mpt3sas_cm0: sending message unit reset !! [Mon Dec 9 06:17:12 2019][ 7.351785] mpt3sas_cm0: message unit reset: SUCCESS [Mon Dec 9 06:17:12 2019][ 7.353787] megaraid_sas 0000:c1:00.0: firmware type : Legacy(64 VD) firmware [Mon Dec 9 06:17:12 2019][ 7.353788] megaraid_sas 0000:c1:00.0: controller type : iMR(0MB) [Mon Dec 9 06:17:12 2019][ 7.353790] megaraid_sas 0000:c1:00.0: Online Controller Reset(OCR) : Enabled [Mon Dec 9 06:17:12 2019][ 7.353791] megaraid_sas 0000:c1:00.0: Secure JBOD support : No [Mon Dec 9 06:17:12 2019][ 7.353792] megaraid_sas 0000:c1:00.0: NVMe passthru support : No [Mon Dec 9 06:17:12 2019][ 7.375309] megaraid_sas 0000:c1:00.0: INIT adapter done [Mon Dec 9 06:17:12 2019][ 7.375311] megaraid_sas 0000:c1:00.0: Jbod map is not supported megasas_setup_jbod_map 5146 [Mon Dec 9 06:17:12 2019][ 7.401656] megaraid_sas 0000:c1:00.0: pci id : (0x1000)/(0x005f)/(0x1028)/(0x1f4b) [Mon Dec 9 06:17:12 2019][ 7.401657] megaraid_sas 0000:c1:00.0: unevenspan support : yes [Mon Dec 9 06:17:12 2019][ 7.401658] megaraid_sas 0000:c1:00.0: firmware crash dump : no [Mon Dec 9 06:17:12 2019][ 7.401659] megaraid_sas 0000:c1:00.0: jbod sync map : no [Mon Dec 9 06:17:12 2019][ 7.401664] scsi host0: Avago SAS based MegaRAID driver [Mon Dec 9 06:17:12 2019][ 7.421497] scsi 0:2:0:0: Direct-Access DELL PERC H330 Mini 4.30 PQ: 0 ANSI: 5 [Mon Dec 9 06:17:12 2019][ 7.487112] mlx5_core 0000:01:00.0: Port module event: module 0, Cable plugged [Mon Dec 9 06:17:12 2019][ 7.487375] mlx5_core 0000:01:00.0: mlx5_pcie_event:303:(pid 319): PCIe slot advertised sufficient power (27W). 
[Mon Dec 9 06:17:12 2019][ 7.494745] mlx5_core 0000:01:00.0: mlx5_fw_tracer_start:776:(pid 300): FWTracer: Ownership granted and active [Mon Dec 9 06:17:12 2019][ 7.504801] ata1: SATA link down (SStatus 0 SControl 300) [Mon Dec 9 06:17:12 2019][ 7.518414] mpt3sas_cm0: Allocated physical memory: size(38831 kB) [Mon Dec 9 06:17:12 2019][ 7.518416] mpt3sas_cm0: Current Controller Queue Depth(7564), Max Controller Queue Depth(7680) [Mon Dec 9 06:17:12 2019][ 7.518416] mpt3sas_cm0: Scatter Gather Elements per IO(128) [Mon Dec 9 06:17:12 2019][ 7.662398] mpt3sas_cm0: FW Package Version(12.00.00.00) [Mon Dec 9 06:17:12 2019][ 7.662689] mpt3sas_cm0: SAS3616: FWVersion(12.00.00.00), ChipRevision(0x02), BiosVersion(09.21.00.00) [Mon Dec 9 06:17:12 2019][ 7.662693] mpt3sas_cm0: Protocol=(Initiator,Target,NVMe), Capabilities=(TLR,EEDP,Diag Trace Buffer,Task Set Full,NCQ) [Mon Dec 9 06:17:12 2019][ 7.662761] mpt3sas 0000:84:00.0: Enabled Extended Tags as Controller Supports [Mon Dec 9 06:17:12 2019][ 7.662776] mpt3sas_cm0: : host protection capabilities enabled DIF1 DIF2 DIF3 [Mon Dec 9 06:17:12 2019][ 7.662791] scsi host1: Fusion MPT SAS Host [Mon Dec 9 06:17:12 2019][ 7.663034] mpt3sas_cm0: registering trace buffer support [Mon Dec 9 06:17:12 2019][ 7.667351] mpt3sas_cm0: Trace buffer memory 2048 KB allocated [Mon Dec 9 06:17:12 2019][ 7.667351] mpt3sas_cm0: sending port enable !! [Mon Dec 9 06:17:12 2019][ 7.667992] mpt3sas_cm0: hba_port entry: ffff8912eff13780, port: 255 is added to hba_port list [Mon Dec 9 06:17:12 2019][ 7.670505] mpt3sas_cm0: host_add: handle(0x0001), sas_addr(0x500605b00e718b40), phys(21) [Mon Dec 9 06:17:12 2019][ 7.672841] mpt3sas_cm0: detecting: handle(0x0011), sas_address(0x300705b00e718b40), phy(16) [Mon Dec 9 06:17:12 2019][ 7.672845] mpt3sas_cm0: REPORT_LUNS: handle(0x0011), retries(0) [Mon Dec 9 06:17:12 2019][ 7.672869] mpt3sas_cm0: TEST_UNIT_READY: handle(0x0011), lun(0) [Mon Dec 9 06:17:12 2019][ 7.673248] scsi 1:0:0:0: Enclosure LSI VirtualSES 03 PQ: 0 ANSI: 7 [Mon Dec 9 06:17:12 2019][ 7.673272] scsi 1:0:0:0: set ignore_delay_remove for handle(0x0011) [Mon Dec 9 06:17:12 2019][ 7.673275] scsi 1:0:0:0: SES: handle(0x0011), sas_addr(0x300705b00e718b40), phy(16), device_name(0x300705b00e718b40) [Mon Dec 9 06:17:13 2019][ 7.673276] scsi 1:0:0:0: enclosure logical id(0x300605b00e118b40), slot(16) [Mon Dec 9 06:17:13 2019][ 7.673278] scsi 1:0:0:0: enclosure level(0x0000), connector name( C3 ) [Mon Dec 9 06:17:13 2019][ 7.673279] scsi 1:0:0:0: serial_number(300605B00E118B40) [Mon Dec 9 06:17:13 2019][ 7.673281] scsi 1:0:0:0: qdepth(1), tagged(0), simple(0), ordered(0), scsi_level(8), cmd_que(0) [Mon Dec 9 06:17:13 2019][ 7.673302] mpt3sas_cm0: log_info(0x31200206): originator(PL), code(0x20), sub_code(0x0206) [Mon Dec 9 06:17:13 2019][ 7.875511] mpt3sas_cm0: expander_add: handle(0x0058), parent(0x0001), sas_addr(0x5000ccab040371fd), phys(49) [Mon Dec 9 06:17:13 2019][ 7.885870] mlx5_ib: Mellanox Connect-IB Infiniband driver v4.7-1.0.0 [Mon Dec 9 06:17:13 2019][ 7.896333] mpt3sas_cm0: detecting: handle(0x005c), sas_address(0x5000ccab040371fc), phy(48) [Mon Dec 9 06:17:13 2019][ 7.904779] mpt3sas_cm0: REPORT_LUNS: handle(0x005c), retries(0) [Mon Dec 9 06:17:13 2019][ 7.906143] mpt3sas_cm0: TEST_UNIT_READY: handle(0x005c), lun(0) [Mon Dec 9 06:17:13 2019][ 7.907155] scsi 1:0:1:0: Enclosure HGST H4060-J 2033 PQ: 0 ANSI: 6 [Mon Dec 9 06:17:13 2019][ 7.907348] scsi 1:0:1:0: set ignore_delay_remove for handle(0x005c) [Mon Dec 9 06:17:13 2019][ 7.907351] scsi 1:0:1:0: SES:
handle(0x005c), sas_addr(0x5000ccab040371fc), phy(48), device_name(0x0000000000000000) [Mon Dec 9 06:17:13 2019][ 7.907352] scsi 1:0:1:0: enclosure logical id(0x5000ccab04037180), slot(60) [Mon Dec 9 06:17:13 2019][ 7.907353] scsi 1:0:1:0: enclosure level(0x0000), connector name( C3 ) [Mon Dec 9 06:17:13 2019][ 7.907354] scsi 1:0:1:0: serial_number(USWSJ03918EZ0028 ) [Mon Dec 9 06:17:13 2019][ 7.907356] scsi 1:0:1:0: qdepth(1), tagged(0), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Mon Dec 9 06:17:13 2019][ 7.975890] sd 0:2:0:0: [sda] 467664896 512-byte logical blocks: (239 GB/223 GiB) [Mon Dec 9 06:17:13 2019][ 7.983560] sd 0:2:0:0: [sda] Write Protect is off [Mon Dec 9 06:17:13 2019][ 7.988423] sd 0:2:0:0: [sda] Write cache: disabled, read cache: disabled, supports DPO and FUA [Mon Dec 9 06:17:13 2019][ 7.999378] sda: sda1 sda2 sda3 [Mon Dec 9 06:17:13 2019][ 8.003064] sd 0:2:0:0: [sda] Attached SCSI disk [Mon Dec 9 06:17:13 2019][ 8.004409] mpt3sas_cm0: expander_add: handle(0x005a), parent(0x0058), sas_addr(0x5000ccab040371f9), phys(68) [Mon Dec 9 06:17:13 2019][ 8.014187] mpt3sas_cm0: detecting: handle(0x005d), sas_address(0x5000cca2525f2a26), phy(0) [Mon Dec 9 06:17:13 2019][ 8.014189] mpt3sas_cm0: REPORT_LUNS: handle(0x005d), retries(0) [Mon Dec 9 06:17:13 2019][ 8.014331] mpt3sas_cm0: TEST_UNIT_READY: handle(0x005d), lun(0) [Mon Dec 9 06:17:13 2019][ 8.015159] scsi 1:0:2:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Mon Dec 9 06:17:13 2019][ 8.015265] scsi 1:0:2:0: SSP: handle(0x005d), sas_addr(0x5000cca2525f2a26), phy(0), device_name(0x5000cca2525f2a27) [Mon Dec 9 06:17:13 2019][ 8.015266] scsi 1:0:2:0: enclosure logical id(0x5000ccab04037180), slot(0) [Mon Dec 9 06:17:13 2019][ 8.015268] scsi 1:0:2:0: enclosure level(0x0000), connector name( C3 ) [Mon Dec 9 06:17:13 2019][ 8.015269] scsi 1:0:2:0: serial_number( 7SHPAG1W) [Mon Dec 9 06:17:13 2019][ 8.015271] scsi 1:0:2:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Mon Dec 9 06:17:13 2019][ 8.028932] random: crng init done [Mon Dec 9 06:17:13 2019][ 8.101982] mpt3sas_cm0: detecting: handle(0x005e), sas_address(0x5000cca2525e977e), phy(1) [Mon Dec 9 06:17:13 2019][ 8.110337] mpt3sas_cm0: REPORT_LUNS: handle(0x005e), retries(0) [Mon Dec 9 06:17:13 2019][ 8.116464] mpt3sas_cm0: TEST_UNIT_READY: handle(0x005e), lun(0) [Mon Dec 9 06:17:13 2019][ 8.123114] scsi 1:0:3:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Mon Dec 9 06:17:13 2019][ 8.131363] scsi 1:0:3:0: SSP: handle(0x005e), sas_addr(0x5000cca2525e977e), phy(1), device_name(0x5000cca2525e977f) [Mon Dec 9 06:17:13 2019][ 8.141878] scsi 1:0:3:0: enclosure logical id(0x5000ccab04037180), slot(2) [Mon Dec 9 06:17:13 2019][ 8.148923] scsi 1:0:3:0: enclosure level(0x0000), connector name( C3 ) [Mon Dec 9 06:17:13 2019][ 8.155644] scsi 1:0:3:0: serial_number( 7SHP0P8W) [Mon Dec 9 06:17:13 2019][ 8.161042] scsi 1:0:3:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Mon Dec 9 06:17:13 2019][ 8.180986] mpt3sas_cm0: detecting: handle(0x005f), sas_address(0x5000cca2525ed2be), phy(2) [Mon Dec 9 06:17:13 2019][ 8.189343] mpt3sas_cm0: REPORT_LUNS: handle(0x005f), retries(0) [Mon Dec 9 06:17:13 2019][ 8.195505] mpt3sas_cm0: TEST_UNIT_READY: handle(0x005f), lun(0) [Mon Dec 9 06:17:13 2019][ 8.202347] scsi 1:0:4:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Mon Dec 9 06:17:13 2019][ 8.210576] scsi 1:0:4:0: SSP: handle(0x005f), sas_addr(0x5000cca2525ed2be), phy(2), device_name(0x5000cca2525ed2bf) [Mon 
Dec 9 06:17:13 2019][ 8.221092] scsi 1:0:4:0: enclosure logical id(0x5000ccab04037180), slot(11) [Mon Dec 9 06:17:13 2019][ 8.228226] scsi 1:0:4:0: enclosure level(0x0000), connector name( C3 ) [Mon Dec 9 06:17:13 2019][ 8.234945] scsi 1:0:4:0: serial_number( 7SHP4MLW) [Mon Dec 9 06:17:13 2019][ 8.240345] scsi 1:0:4:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Mon Dec 9 06:17:13 2019][ 8.259985] mpt3sas_cm0: detecting: handle(0x0060), sas_address(0x5000cca2525ec04a), phy(3) [Mon Dec 9 06:17:13 2019][ 8.268332] mpt3sas_cm0: REPORT_LUNS: handle(0x0060), retries(0) [Mon Dec 9 06:17:13 2019][ 8.274464] mpt3sas_cm0: TEST_UNIT_READY: handle(0x0060), lun(0) [Mon Dec 9 06:17:13 2019][ 8.281104] scsi 1:0:5:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Mon Dec 9 06:17:13 2019][ 8.289316] scsi 1:0:5:0: SSP: handle(0x0060), sas_addr(0x5000cca2525ec04a), phy(3), device_name(0x5000cca2525ec04b) [Mon Dec 9 06:17:13 2019][ 8.299832] scsi 1:0:5:0: enclosure logical id(0x5000ccab04037180), slot(12) [Mon Dec 9 06:17:13 2019][ 8.306963] scsi 1:0:5:0: enclosure level(0x0000), connector name( C3 ) [Mon Dec 9 06:17:13 2019][ 8.313683] scsi 1:0:5:0: serial_number( 7SHP3DHW) [Mon Dec 9 06:17:13 2019][ 8.319083] scsi 1:0:5:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Mon Dec 9 06:17:13 2019][ 8.342020] mpt3sas_cm0: detecting: handle(0x0061), sas_address(0x5000cca2525ff612), phy(4) [Mon Dec 9 06:17:13 2019][ 8.350373] mpt3sas_cm0: REPORT_LUNS: handle(0x0061), retries(0) [Mon Dec 9 06:17:13 2019][ 8.356520] mpt3sas_cm0: TEST_UNIT_READY: handle(0x0061), lun(0) [Mon Dec 9 06:17:13 2019][ 8.363432] scsi 1:0:6:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Mon Dec 9 06:17:13 2019][ 8.371654] scsi 1:0:6:0: SSP: handle(0x0061), sas_addr(0x5000cca2525ff612), phy(4), device_name(0x5000cca2525ff613) [Mon Dec 9 06:17:13 2019][ 8.382166] scsi 1:0:6:0: enclosure logical id(0x5000ccab04037180), slot(13) [Mon Dec 9 06:17:13 2019][ 8.389299] scsi 1:0:6:0: enclosure level(0x0000), connector name( C3 ) [Mon Dec 9 06:17:13 2019][ 8.396020] scsi 1:0:6:0: serial_number( 7SHPT11W) [Mon Dec 9 06:17:13 2019][ 8.401418] scsi 1:0:6:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Mon Dec 9 06:17:13 2019][ 8.424003] mpt3sas_cm0: detecting: handle(0x0062), sas_address(0x5000cca2526016ee), phy(5) [Mon Dec 9 06:17:13 2019][ 8.432355] mpt3sas_cm0: REPORT_LUNS: handle(0x0062), retries(0) [Mon Dec 9 06:17:13 2019][ 8.438525] mpt3sas_cm0: TEST_UNIT_READY: handle(0x0062), lun(0) [Mon Dec 9 06:17:13 2019][ 8.445150] scsi 1:0:7:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Mon Dec 9 06:17:13 2019][ 8.453366] scsi 1:0:7:0: SSP: handle(0x0062), sas_addr(0x5000cca2526016ee), phy(5), device_name(0x5000cca2526016ef) [Mon Dec 9 06:17:13 2019][ 8.463879] scsi 1:0:7:0: enclosure logical id(0x5000ccab04037180), slot(14) [Mon Dec 9 06:17:13 2019][ 8.471012] scsi 1:0:7:0: enclosure level(0x0000), connector name( C3 ) [Mon Dec 9 06:17:13 2019][ 8.477729] scsi 1:0:7:0: serial_number( 7SHPV6WW) [Mon Dec 9 06:17:13 2019][ 8.483129] scsi 1:0:7:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Mon Dec 9 06:17:13 2019][ 8.506002] mpt3sas_cm0: detecting: handle(0x0063), sas_address(0x5000cca2525f4872), phy(6) [Mon Dec 9 06:17:13 2019][ 8.514351] mpt3sas_cm0: REPORT_LUNS: handle(0x0063), retries(0) [Mon Dec 9 06:17:13 2019][ 8.520518] mpt3sas_cm0: TEST_UNIT_READY: handle(0x0063), lun(0) [Mon Dec 9 06:17:13 2019][ 
8.527324] scsi 1:0:8:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Mon Dec 9 06:17:13 2019][ 8.535539] scsi 1:0:8:0: SSP: handle(0x0063), sas_addr(0x5000cca2525f4872), phy(6), device_name(0x5000cca2525f4873) [Mon Dec 9 06:17:13 2019][ 8.546049] scsi 1:0:8:0: enclosure logical id(0x5000ccab04037180), slot(15) [Mon Dec 9 06:17:13 2019][ 8.553182] scsi 1:0:8:0: enclosure level(0x0000), connector name( C3 ) [Mon Dec 9 06:17:13 2019][ 8.559900] scsi 1:0:8:0: serial_number( 7SHPDGLW) [Mon Dec 9 06:17:13 2019][ 8.565302] scsi 1:0:8:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Mon Dec 9 06:17:13 2019][ 8.588007] mpt3sas_cm0: detecting: handle(0x0064), sas_address(0x5000cca2525f568e), phy(7) [Mon Dec 9 06:17:13 2019][ 8.596357] mpt3sas_cm0: REPORT_LUNS: handle(0x0064), retries(0) [Mon Dec 9 06:17:13 2019][ 8.602528] mpt3sas_cm0: TEST_UNIT_READY: handle(0x0064), lun(0) [Mon Dec 9 06:17:13 2019][ 8.609165] scsi 1:0:9:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Mon Dec 9 06:17:13 2019][ 8.617378] scsi 1:0:9:0: SSP: handle(0x0064), sas_addr(0x5000cca2525f568e), phy(7), device_name(0x5000cca2525f568f) [Mon Dec 9 06:17:13 2019][ 8.627891] scsi 1:0:9:0: enclosure logical id(0x5000ccab04037180), slot(16) [Mon Dec 9 06:17:13 2019][ 8.635023] scsi 1:0:9:0: enclosure level(0x0000), connector name( C3 ) [Mon Dec 9 06:17:13 2019][ 8.641740] scsi 1:0:9:0: serial_number( 7SHPEDRW) [Mon Dec 9 06:17:13 2019][ 8.647141] scsi 1:0:9:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Mon Dec 9 06:17:13 2019][ 8.666999] mpt3sas_cm0: detecting: handle(0x0065), sas_address(0x5000cca2525f6c26), phy(8) [Mon Dec 9 06:17:13 2019][ 8.675346] mpt3sas_cm0: REPORT_LUNS: handle(0x0065), retries(0) [Mon Dec 9 06:17:13 2019][ 8.681507] mpt3sas_cm0: TEST_UNIT_READY: handle(0x0065), lun(0) [Mon Dec 9 06:17:13 2019][ 8.688143] scsi 1:0:10:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Mon Dec 9 06:17:13 2019][ 8.696441] scsi 1:0:10:0: SSP: handle(0x0065), sas_addr(0x5000cca2525f6c26), phy(8), device_name(0x5000cca2525f6c27) [Mon Dec 9 06:17:13 2019][ 8.707035] scsi 1:0:10:0: enclosure logical id(0x5000ccab04037180), slot(17) [Mon Dec 9 06:17:13 2019][ 8.714255] scsi 1:0:10:0: enclosure level(0x0000), connector name( C3 ) [Mon Dec 9 06:17:13 2019][ 8.721059] scsi 1:0:10:0: serial_number( 7SHPGV9W) [Mon Dec 9 06:17:13 2019][ 8.726548] scsi 1:0:10:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Mon Dec 9 06:17:13 2019][ 8.748996] mpt3sas_cm0: detecting: handle(0x0066), sas_address(0x5000cca2525ed402), phy(9) [Mon Dec 9 06:17:13 2019][ 8.757344] mpt3sas_cm0: REPORT_LUNS: handle(0x0066), retries(0) [Mon Dec 9 06:17:13 2019][ 8.763511] mpt3sas_cm0: TEST_UNIT_READY: handle(0x0066), lun(0) [Mon Dec 9 06:17:14 2019][ 8.786731] scsi 1:0:11:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Mon Dec 9 06:17:14 2019][ 8.795017] scsi 1:0:11:0: SSP: handle(0x0066), sas_addr(0x5000cca2525ed402), phy(9), device_name(0x5000cca2525ed403) [Mon Dec 9 06:17:14 2019][ 8.805614] scsi 1:0:11:0: enclosure logical id(0x5000ccab04037180), slot(18) [Mon Dec 9 06:17:14 2019][ 8.812834] scsi 1:0:11:0: enclosure level(0x0000), connector name( C3 ) [Mon Dec 9 06:17:14 2019][ 8.819623] scsi 1:0:11:0: serial_number( 7SHP4R6W) [Mon Dec 9 06:17:14 2019][ 8.825107] scsi 1:0:11:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Mon Dec 9 06:17:14 2019][ 8.844998] mpt3sas_cm0: detecting: handle(0x0067), 
sas_address(0x5000cca2525e0406), phy(10) [Mon Dec 9 06:17:14 2019][ 8.853433] mpt3sas_cm0: REPORT_LUNS: handle(0x0067), retries(0) [Mon Dec 9 06:17:14 2019][ 8.859594] mpt3sas_cm0: TEST_UNIT_READY: handle(0x0067), lun(0) [Mon Dec 9 06:17:14 2019][ 8.866219] scsi 1:0:12:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Mon Dec 9 06:17:14 2019][ 8.874521] scsi 1:0:12:0: SSP: handle(0x0067), sas_addr(0x5000cca2525e0406), phy(10), device_name(0x5000cca2525e0407) [Mon Dec 9 06:17:14 2019][ 8.885211] scsi 1:0:12:0: enclosure logical id(0x5000ccab04037180), slot(19) [Mon Dec 9 06:17:14 2019][ 8.892431] scsi 1:0:12:0: enclosure level(0x0000), connector name( C3 ) [Mon Dec 9 06:17:14 2019][ 8.899234] scsi 1:0:12:0: serial_number( 7SHNPVUW) [Mon Dec 9 06:17:14 2019][ 8.904723] scsi 1:0:12:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Mon Dec 9 06:17:14 2019][ 8.928001] mpt3sas_cm0: detecting: handle(0x0068), sas_address(0x5000cca2525ea9e6), phy(11) [Mon Dec 9 06:17:14 2019][ 8.936436] mpt3sas_cm0: REPORT_LUNS: handle(0x0068), retries(0) [Mon Dec 9 06:17:14 2019][ 8.942610] mpt3sas_cm0: TEST_UNIT_READY: handle(0x0068), lun(0) [Mon Dec 9 06:17:14 2019][ 8.949232] scsi 1:0:13:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Mon Dec 9 06:17:14 2019][ 8.957533] scsi 1:0:13:0: SSP: handle(0x0068), sas_addr(0x5000cca2525ea9e6), phy(11), device_name(0x5000cca2525ea9e7) [Mon Dec 9 06:17:14 2019][ 8.968222] scsi 1:0:13:0: enclosure logical id(0x5000ccab04037180), slot(20) [Mon Dec 9 06:17:14 2019][ 8.975440] scsi 1:0:13:0: enclosure level(0x0000), connector name( C3 ) [Mon Dec 9 06:17:14 2019][ 8.982245] scsi 1:0:13:0: serial_number( 7SHP1X8W) [Mon Dec 9 06:17:14 2019][ 8.987734] scsi 1:0:13:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Mon Dec 9 06:17:14 2019][ 9.011014] mpt3sas_cm0: detecting: handle(0x0069), sas_address(0x5000cca2525f1d3a), phy(12) [Mon Dec 9 06:17:14 2019][ 9.019446] mpt3sas_cm0: REPORT_LUNS: handle(0x0069), retries(0) [Mon Dec 9 06:17:14 2019][ 9.025579] mpt3sas_cm0: TEST_UNIT_READY: handle(0x0069), lun(0) [Mon Dec 9 06:17:14 2019][ 9.032209] scsi 1:0:14:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Mon Dec 9 06:17:14 2019][ 9.040511] scsi 1:0:14:0: SSP: handle(0x0069), sas_addr(0x5000cca2525f1d3a), phy(12), device_name(0x5000cca2525f1d3b) [Mon Dec 9 06:17:14 2019][ 9.051199] scsi 1:0:14:0: enclosure logical id(0x5000ccab04037180), slot(21) [Mon Dec 9 06:17:14 2019][ 9.058418] scsi 1:0:14:0: enclosure level(0x0000), connector name( C3 ) [Mon Dec 9 06:17:14 2019][ 9.065227] scsi 1:0:14:0: serial_number( 7SHP9LBW) [Mon Dec 9 06:17:14 2019][ 9.070721] scsi 1:0:14:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Mon Dec 9 06:17:14 2019][ 9.093009] mpt3sas_cm0: detecting: handle(0x006a), sas_address(0x5000cca2525ea49a), phy(13) [Mon Dec 9 06:17:14 2019][ 9.101446] mpt3sas_cm0: REPORT_LUNS: handle(0x006a), retries(0) [Mon Dec 9 06:17:14 2019][ 9.107619] mpt3sas_cm0: TEST_UNIT_READY: handle(0x006a), lun(0) [Mon Dec 9 06:17:14 2019][ 9.114444] scsi 1:0:15:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Mon Dec 9 06:17:14 2019][ 9.122753] scsi 1:0:15:0: SSP: handle(0x006a), sas_addr(0x5000cca2525ea49a), phy(13), device_name(0x5000cca2525ea49b) [Mon Dec 9 06:17:14 2019][ 9.133438] scsi 1:0:15:0: enclosure logical id(0x5000ccab04037180), slot(22) [Mon Dec 9 06:17:14 2019][ 9.140657] scsi 1:0:15:0: enclosure level(0x0000), connector name( C3 ) [Mon Dec 9 06:17:14 2019][ 
9.147461] scsi 1:0:15:0: serial_number( 7SHP1KAW) [Mon Dec 9 06:17:14 2019][ 9.152951] scsi 1:0:15:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Mon Dec 9 06:17:14 2019][ 9.324024] mpt3sas_cm0: detecting: handle(0x006b), sas_address(0x5000cca2525fba06), phy(14) [Mon Dec 9 06:17:14 2019][ 9.332459] mpt3sas_cm0: REPORT_LUNS: handle(0x006b), retries(0) [Mon Dec 9 06:17:14 2019][ 9.338592] mpt3sas_cm0: TEST_UNIT_READY: handle(0x006b), lun(0) [Mon Dec 9 06:17:14 2019][ 9.355556] scsi 1:0:16:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Mon Dec 9 06:17:14 2019][ 9.363907] scsi 1:0:16:0: SSP: handle(0x006b), sas_addr(0x5000cca2525fba06), phy(14), device_name(0x5000cca2525fba07) [Mon Dec 9 06:17:14 2019][ 9.374594] scsi 1:0:16:0: enclosure logical id(0x5000ccab04037180), slot(23) [Mon Dec 9 06:17:14 2019][ 9.381814] scsi 1:0:16:0: enclosure level(0x0000), connector name( C3 ) [Mon Dec 9 06:17:14 2019][ 9.388619] scsi 1:0:16:0: serial_number( 7SHPN12W) [Mon Dec 9 06:17:14 2019][ 9.394105] scsi 1:0:16:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Mon Dec 9 06:17:14 2019][ 9.420024] mpt3sas_cm0: detecting: handle(0x006c), sas_address(0x5000cca2525e121e), phy(15) [Mon Dec 9 06:17:14 2019][ 9.428466] mpt3sas_cm0: REPORT_LUNS: handle(0x006c), retries(0) [Mon Dec 9 06:17:14 2019][ 9.434607] mpt3sas_cm0: TEST_UNIT_READY: handle(0x006c), lun(0) [Mon Dec 9 06:17:14 2019][ 9.441234] scsi 1:0:17:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Mon Dec 9 06:17:14 2019][ 9.449532] scsi 1:0:17:0: SSP: handle(0x006c), sas_addr(0x5000cca2525e121e), phy(15), device_name(0x5000cca2525e121f) [Mon Dec 9 06:17:14 2019][ 9.460214] scsi 1:0:17:0: enclosure logical id(0x5000ccab04037180), slot(24) [Mon Dec 9 06:17:14 2019][ 9.467433] scsi 1:0:17:0: enclosure level(0x0000), connector name( C3 ) [Mon Dec 9 06:17:14 2019][ 9.474238] scsi 1:0:17:0: serial_number( 7SHNRTXW) [Mon Dec 9 06:17:14 2019][ 9.479728] scsi 1:0:17:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Mon Dec 9 06:17:14 2019][ 9.500015] mpt3sas_cm0: detecting: handle(0x006d), sas_address(0x5000cca2525e98f6), phy(16) [Mon Dec 9 06:17:14 2019][ 9.508450] mpt3sas_cm0: REPORT_LUNS: handle(0x006d), retries(0) [Mon Dec 9 06:17:14 2019][ 9.514580] mpt3sas_cm0: TEST_UNIT_READY: handle(0x006d), lun(0) [Mon Dec 9 06:17:14 2019][ 9.528640] scsi 1:0:18:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Mon Dec 9 06:17:14 2019][ 9.536926] scsi 1:0:18:0: SSP: handle(0x006d), sas_addr(0x5000cca2525e98f6), phy(16), device_name(0x5000cca2525e98f7) [Mon Dec 9 06:17:14 2019][ 9.547611] scsi 1:0:18:0: enclosure logical id(0x5000ccab04037180), slot(25) [Mon Dec 9 06:17:14 2019][ 9.554831] scsi 1:0:18:0: enclosure level(0x0000), connector name( C3 ) [Mon Dec 9 06:17:14 2019][ 9.561624] scsi 1:0:18:0: serial_number( 7SHP0T9W) [Mon Dec 9 06:17:14 2019][ 9.567117] scsi 1:0:18:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Mon Dec 9 06:17:14 2019][ 9.590261] mpt3sas_cm0: detecting: handle(0x006e), sas_address(0x5000cca2525f8176), phy(17) [Mon Dec 9 06:17:14 2019][ 9.598700] mpt3sas_cm0: REPORT_LUNS: handle(0x006e), retries(0) [Mon Dec 9 06:17:14 2019][ 9.604839] mpt3sas_cm0: TEST_UNIT_READY: handle(0x006e), lun(0) [Mon Dec 9 06:17:14 2019][ 9.611468] scsi 1:0:19:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Mon Dec 9 06:17:14 2019][ 9.619774] scsi 1:0:19:0: SSP: handle(0x006e), sas_addr(0x5000cca2525f8176), phy(17), 
device_name(0x5000cca2525f8177) [Mon Dec 9 06:17:14 2019][ 9.630457] scsi 1:0:19:0: enclosure logical id(0x5000ccab04037180), slot(26) [Mon Dec 9 06:17:14 2019][ 9.637677] scsi 1:0:19:0: enclosure level(0x0000), connector name( C3 ) [Mon Dec 9 06:17:14 2019][ 9.644484] scsi 1:0:19:0: serial_number( 7SHPJ89W) [Mon Dec 9 06:17:14 2019][ 9.649970] scsi 1:0:19:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Mon Dec 9 06:17:14 2019][ 9.670019] mpt3sas_cm0: detecting: handle(0x006f), sas_address(0x5000cca2525fb01e), phy(18) [Mon Dec 9 06:17:14 2019][ 9.678459] mpt3sas_cm0: REPORT_LUNS: handle(0x006f), retries(0) [Mon Dec 9 06:17:14 2019][ 9.684593] mpt3sas_cm0: TEST_UNIT_READY: handle(0x006f), lun(0) [Mon Dec 9 06:17:14 2019][ 9.691231] scsi 1:0:20:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Mon Dec 9 06:17:14 2019][ 9.699531] scsi 1:0:20:0: SSP: handle(0x006f), sas_addr(0x5000cca2525fb01e), phy(18), device_name(0x5000cca2525fb01f) [Mon Dec 9 06:17:14 2019][ 9.710218] scsi 1:0:20:0: enclosure logical id(0x5000ccab04037180), slot(27) [Mon Dec 9 06:17:14 2019][ 9.717438] scsi 1:0:20:0: enclosure level(0x0000), connector name( C3 ) [Mon Dec 9 06:17:14 2019][ 9.724242] scsi 1:0:20:0: serial_number( 7SHPMBMW) [Mon Dec 9 06:17:14 2019][ 9.729732] scsi 1:0:20:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Mon Dec 9 06:17:14 2019][ 9.750021] mpt3sas_cm0: detecting: handle(0x0070), sas_address(0x5000cca2525ed54a), phy(19) [Mon Dec 9 06:17:14 2019][ 9.758455] mpt3sas_cm0: REPORT_LUNS: handle(0x0070), retries(0) [Mon Dec 9 06:17:14 2019][ 9.764619] mpt3sas_cm0: TEST_UNIT_READY: handle(0x0070), lun(0) [Mon Dec 9 06:17:14 2019][ 9.771265] scsi 1:0:21:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Mon Dec 9 06:17:14 2019][ 9.779567] scsi 1:0:21:0: SSP: handle(0x0070), sas_addr(0x5000cca2525ed54a), phy(19), device_name(0x5000cca2525ed54b) [Mon Dec 9 06:17:15 2019][ 9.790249] scsi 1:0:21:0: enclosure logical id(0x5000ccab04037180), slot(28) [Mon Dec 9 06:17:15 2019][ 9.797469] scsi 1:0:21:0: enclosure level(0x0000), connector name( C3 ) [Mon Dec 9 06:17:15 2019][ 9.804273] scsi 1:0:21:0: serial_number( 7SHP4TVW) [Mon Dec 9 06:17:15 2019][ 9.809762] scsi 1:0:21:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Mon Dec 9 06:17:15 2019][ 9.832022] mpt3sas_cm0: detecting: handle(0x0071), sas_address(0x5000cca2525fa036), phy(20) [Mon Dec 9 06:17:15 2019][ 9.840462] mpt3sas_cm0: REPORT_LUNS: handle(0x0071), retries(0) [Mon Dec 9 06:17:15 2019][ 9.846635] mpt3sas_cm0: TEST_UNIT_READY: handle(0x0071), lun(0) [Mon Dec 9 06:17:15 2019][ 9.853275] scsi 1:0:22:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Mon Dec 9 06:17:15 2019][ 9.861574] scsi 1:0:22:0: SSP: handle(0x0071), sas_addr(0x5000cca2525fa036), phy(20), device_name(0x5000cca2525fa037) [Mon Dec 9 06:17:15 2019][ 9.872255] scsi 1:0:22:0: enclosure logical id(0x5000ccab04037180), slot(29) [Mon Dec 9 06:17:15 2019][ 9.879475] scsi 1:0:22:0: enclosure level(0x0000), connector name( C3 ) [Mon Dec 9 06:17:15 2019][ 9.886279] scsi 1:0:22:0: serial_number( 7SHPL9TW) [Mon Dec 9 06:17:15 2019][ 9.891767] scsi 1:0:22:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Mon Dec 9 06:17:15 2019][ 9.912024] mpt3sas_cm0: detecting: handle(0x0072), sas_address(0x5000cca2525fb942), phy(21) [Mon Dec 9 06:17:15 2019][ 9.920464] mpt3sas_cm0: REPORT_LUNS: handle(0x0072), retries(0) [Mon Dec 9 06:17:15 2019][ 9.926599] mpt3sas_cm0: 
TEST_UNIT_READY: handle(0x0072), lun(0) [Mon Dec 9 06:17:15 2019][ 9.944464] scsi 1:0:23:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Mon Dec 9 06:17:15 2019][ 9.952764] scsi 1:0:23:0: SSP: handle(0x0072), sas_addr(0x5000cca2525fb942), phy(21), device_name(0x5000cca2525fb943) [Mon Dec 9 06:17:15 2019][ 9.963448] scsi 1:0:23:0: enclosure logical id(0x5000ccab04037180), slot(30) [Mon Dec 9 06:17:15 2019][ 9.970667] scsi 1:0:23:0: enclosure level(0x0000), connector name( C3 ) [Mon Dec 9 06:17:15 2019][ 9.977457] scsi 1:0:23:0: serial_number( 7SHPMZHW) [Mon Dec 9 06:17:15 2019][ 9.982944] scsi 1:0:23:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Mon Dec 9 06:17:15 2019][ 10.012617] mpt3sas_cm0: detecting: handle(0x0073), sas_address(0x5000cca2525e22e6), phy(22) [Mon Dec 9 06:17:15 2019][ 10.021052] mpt3sas_cm0: REPORT_LUNS: handle(0x0073), retries(0) [Mon Dec 9 06:17:15 2019][ 10.027184] mpt3sas_cm0: TEST_UNIT_READY: handle(0x0073), lun(0) [Mon Dec 9 06:17:15 2019][ 10.033814] scsi 1:0:24:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Mon Dec 9 06:17:15 2019][ 10.042120] scsi 1:0:24:0: SSP: handle(0x0073), sas_addr(0x5000cca2525e22e6), phy(22), device_name(0x5000cca2525e22e7) [Mon Dec 9 06:17:15 2019][ 10.052804] scsi 1:0:24:0: enclosure logical id(0x5000ccab04037180), slot(31) [Mon Dec 9 06:17:15 2019][ 10.060024] scsi 1:0:24:0: enclosure level(0x0000), connector name( C3 ) [Mon Dec 9 06:17:15 2019][ 10.066833] scsi 1:0:24:0: serial_number( 7SHNSXKW) [Mon Dec 9 06:17:15 2019][ 10.072326] scsi 1:0:24:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Mon Dec 9 06:17:15 2019][ 10.092033] mpt3sas_cm0: detecting: handle(0x0074), sas_address(0x5000cca2525fb5be), phy(23) [Mon Dec 9 06:17:15 2019][ 10.100470] mpt3sas_cm0: REPORT_LUNS: handle(0x0074), retries(0) [Mon Dec 9 06:17:15 2019][ 10.106611] mpt3sas_cm0: TEST_UNIT_READY: handle(0x0074), lun(0) [Mon Dec 9 06:17:15 2019][ 10.148418] scsi 1:0:25:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Mon Dec 9 06:17:15 2019][ 10.156702] scsi 1:0:25:0: SSP: handle(0x0074), sas_addr(0x5000cca2525fb5be), phy(23), device_name(0x5000cca2525fb5bf) [Mon Dec 9 06:17:15 2019][ 10.167389] scsi 1:0:25:0: enclosure logical id(0x5000ccab04037180), slot(32) [Mon Dec 9 06:17:15 2019][ 10.174610] scsi 1:0:25:0: enclosure level(0x0000), connector name( C3 ) [Mon Dec 9 06:17:15 2019][ 10.181397] scsi 1:0:25:0: serial_number( 7SHPMS7W) [Mon Dec 9 06:17:15 2019][ 10.186884] scsi 1:0:25:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Mon Dec 9 06:17:15 2019][ 10.210035] mpt3sas_cm0: detecting: handle(0x0075), sas_address(0x5000cca2525eb77e), phy(24) [Mon Dec 9 06:17:15 2019][ 10.218475] mpt3sas_cm0: REPORT_LUNS: handle(0x0075), retries(0) [Mon Dec 9 06:17:15 2019][ 10.224608] mpt3sas_cm0: TEST_UNIT_READY: handle(0x0075), lun(0) [Mon Dec 9 06:17:15 2019][ 10.231244] scsi 1:0:26:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Mon Dec 9 06:17:15 2019][ 10.239553] scsi 1:0:26:0: SSP: handle(0x0075), sas_addr(0x5000cca2525eb77e), phy(24), device_name(0x5000cca2525eb77f) [Mon Dec 9 06:17:15 2019][ 10.250237] scsi 1:0:26:0: enclosure logical id(0x5000ccab04037180), slot(33) [Mon Dec 9 06:17:15 2019][ 10.257454] scsi 1:0:26:0: enclosure level(0x0000), connector name( C3 ) [Mon Dec 9 06:17:15 2019][ 10.264260] scsi 1:0:26:0: serial_number( 7SHP2UAW) [Mon Dec 9 06:17:15 2019][ 10.269749] scsi 1:0:26:0: qdepth(254), tagged(1), simple(0), ordered(0), 
scsi_level(7), cmd_que(1) [Mon Dec 9 06:17:15 2019][ 10.290043] mpt3sas_cm0: detecting: handle(0x0076), sas_address(0x5000cca2525e113a), phy(25) [Mon Dec 9 06:17:15 2019][ 10.298480] mpt3sas_cm0: REPORT_LUNS: handle(0x0076), retries(0) [Mon Dec 9 06:17:15 2019][ 10.304624] mpt3sas_cm0: TEST_UNIT_READY: handle(0x0076), lun(0) [Mon Dec 9 06:17:15 2019][ 10.311472] scsi 1:0:27:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Mon Dec 9 06:17:15 2019][ 10.319785] scsi 1:0:27:0: SSP: handle(0x0076), sas_addr(0x5000cca2525e113a), phy(25), device_name(0x5000cca2525e113b) [Mon Dec 9 06:17:15 2019][ 10.330475] scsi 1:0:27:0: enclosure logical id(0x5000ccab04037180), slot(34) [Mon Dec 9 06:17:15 2019][ 10.337692] scsi 1:0:27:0: enclosure level(0x0000), connector name( C3 ) [Mon Dec 9 06:17:15 2019][ 10.344498] scsi 1:0:27:0: serial_number( 7SHNRS2W) [Mon Dec 9 06:17:15 2019][ 10.349986] scsi 1:0:27:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Mon Dec 9 06:17:15 2019][ 10.370037] mpt3sas_cm0: detecting: handle(0x0077), sas_address(0x5000cca2526014fa), phy(26) [Mon Dec 9 06:17:15 2019][ 10.378477] mpt3sas_cm0: REPORT_LUNS: handle(0x0077), retries(0) [Mon Dec 9 06:17:15 2019][ 10.384643] mpt3sas_cm0: TEST_UNIT_READY: handle(0x0077), lun(0) [Mon Dec 9 06:17:15 2019][ 10.391254] scsi 1:0:28:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Mon Dec 9 06:17:15 2019][ 10.399570] scsi 1:0:28:0: SSP: handle(0x0077), sas_addr(0x5000cca2526014fa), phy(26), device_name(0x5000cca2526014fb) [Mon Dec 9 06:17:15 2019][ 10.410253] scsi 1:0:28:0: enclosure logical id(0x5000ccab04037180), slot(35) [Mon Dec 9 06:17:15 2019][ 10.417473] scsi 1:0:28:0: enclosure level(0x0000), connector name( C3 ) [Mon Dec 9 06:17:15 2019][ 10.424276] scsi 1:0:28:0: serial_number( 7SHPV2VW) [Mon Dec 9 06:17:15 2019][ 10.429765] scsi 1:0:28:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Mon Dec 9 06:17:15 2019][ 10.452040] mpt3sas_cm0: detecting: handle(0x0078), sas_address(0x5000cca252598786), phy(27) [Mon Dec 9 06:17:15 2019][ 10.460481] mpt3sas_cm0: REPORT_LUNS: handle(0x0078), retries(0) [Mon Dec 9 06:17:15 2019][ 10.466613] mpt3sas_cm0: TEST_UNIT_READY: handle(0x0078), lun(0) [Mon Dec 9 06:17:15 2019][ 10.473260] scsi 1:0:29:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Mon Dec 9 06:17:15 2019][ 10.481560] scsi 1:0:29:0: SSP: handle(0x0078), sas_addr(0x5000cca252598786), phy(27), device_name(0x5000cca252598787) [Mon Dec 9 06:17:15 2019][ 10.492250] scsi 1:0:29:0: enclosure logical id(0x5000ccab04037180), slot(36) [Mon Dec 9 06:17:15 2019][ 10.499470] scsi 1:0:29:0: enclosure level(0x0000), connector name( C3 ) [Mon Dec 9 06:17:15 2019][ 10.506275] scsi 1:0:29:0: serial_number( 7SHL7BRW) [Mon Dec 9 06:17:15 2019][ 10.511761] scsi 1:0:29:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Mon Dec 9 06:17:15 2019][ 10.535039] mpt3sas_cm0: detecting: handle(0x0079), sas_address(0x5000cca2525f5366), phy(28) [Mon Dec 9 06:17:15 2019][ 10.543476] mpt3sas_cm0: REPORT_LUNS: handle(0x0079), retries(0) [Mon Dec 9 06:17:15 2019][ 10.549642] mpt3sas_cm0: TEST_UNIT_READY: handle(0x0079), lun(0) [Mon Dec 9 06:17:15 2019][ 10.556254] scsi 1:0:30:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Mon Dec 9 06:17:15 2019][ 10.564556] scsi 1:0:30:0: SSP: handle(0x0079), sas_addr(0x5000cca2525f5366), phy(28), device_name(0x5000cca2525f5367) [Mon Dec 9 06:17:15 2019][ 10.575244] scsi 1:0:30:0: enclosure logical id(0x5000ccab04037180), 
slot(37) [Mon Dec 9 06:17:15 2019][ 10.582463] scsi 1:0:30:0: enclosure level(0x0000), connector name( C3 ) [Mon Dec 9 06:17:15 2019][ 10.589274] scsi 1:0:30:0: serial_number( 7SHPE66W) [Mon Dec 9 06:17:15 2019][ 10.594765] scsi 1:0:30:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Mon Dec 9 06:17:15 2019][ 10.622825] mpt3sas_cm0: detecting: handle(0x007a), sas_address(0x5000cca2525e263e), phy(29) [Mon Dec 9 06:17:15 2019][ 10.631262] mpt3sas_cm0: REPORT_LUNS: handle(0x007a), retries(0) [Mon Dec 9 06:17:15 2019][ 10.637394] mpt3sas_cm0: TEST_UNIT_READY: handle(0x007a), lun(0) [Mon Dec 9 06:17:15 2019][ 10.643989] scsi 1:0:31:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Mon Dec 9 06:17:15 2019][ 10.652293] scsi 1:0:31:0: SSP: handle(0x007a), sas_addr(0x5000cca2525e263e), phy(29), device_name(0x5000cca2525e263f) [Mon Dec 9 06:17:15 2019][ 10.662980] scsi 1:0:31:0: enclosure logical id(0x5000ccab04037180), slot(38) [Mon Dec 9 06:17:15 2019][ 10.670197] scsi 1:0:31:0: enclosure level(0x0000), connector name( C3 ) [Mon Dec 9 06:17:15 2019][ 10.677002] scsi 1:0:31:0: serial_number( 7SHNT4GW) [Mon Dec 9 06:17:15 2019][ 10.682491] scsi 1:0:31:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Mon Dec 9 06:17:15 2019][ 10.705041] mpt3sas_cm0: detecting: handle(0x007b), sas_address(0x5000cca2525f6082), phy(30) [Mon Dec 9 06:17:15 2019][ 10.713475] mpt3sas_cm0: REPORT_LUNS: handle(0x007b), retries(0) [Mon Dec 9 06:17:15 2019][ 10.723431] mpt3sas_cm0: TEST_UNIT_READY: handle(0x007b), lun(0) [Mon Dec 9 06:17:15 2019][ 10.732342] scsi 1:0:32:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Mon Dec 9 06:17:15 2019][ 10.740647] scsi 1:0:32:0: SSP: handle(0x007b), sas_addr(0x5000cca2525f6082), phy(30), device_name(0x5000cca2525f6083) [Mon Dec 9 06:17:15 2019][ 10.751329] scsi 1:0:32:0: enclosure logical id(0x5000ccab04037180), slot(39) [Mon Dec 9 06:17:15 2019][ 10.758548] scsi 1:0:32:0: enclosure level(0x0000), connector name( C3 ) [Mon Dec 9 06:17:15 2019][ 10.765354] scsi 1:0:32:0: serial_number( 7SHPG28W) [Mon Dec 9 06:17:15 2019][ 10.770840] scsi 1:0:32:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Mon Dec 9 06:17:16 2019][ 10.795056] mpt3sas_cm0: detecting: handle(0x007c), sas_address(0x5000cca2525ec83e), phy(31) [Mon Dec 9 06:17:16 2019][ 10.803490] mpt3sas_cm0: REPORT_LUNS: handle(0x007c), retries(0) [Mon Dec 9 06:17:16 2019][ 10.809653] mpt3sas_cm0: TEST_UNIT_READY: handle(0x007c), lun(0) [Mon Dec 9 06:17:16 2019][ 10.816252] scsi 1:0:33:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Mon Dec 9 06:17:16 2019][ 10.824552] scsi 1:0:33:0: SSP: handle(0x007c), sas_addr(0x5000cca2525ec83e), phy(31), device_name(0x5000cca2525ec83f) [Mon Dec 9 06:17:16 2019][ 10.835241] scsi 1:0:33:0: enclosure logical id(0x5000ccab04037180), slot(40) [Mon Dec 9 06:17:16 2019][ 10.842461] scsi 1:0:33:0: enclosure level(0x0000), connector name( C3 ) [Mon Dec 9 06:17:16 2019][ 10.849265] scsi 1:0:33:0: serial_number( 7SHP3XXW) [Mon Dec 9 06:17:16 2019][ 10.854753] scsi 1:0:33:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Mon Dec 9 06:17:16 2019][ 10.877045] mpt3sas_cm0: detecting: handle(0x007d), sas_address(0x5000cca2525ec01a), phy(32) [Mon Dec 9 06:17:16 2019][ 10.885479] mpt3sas_cm0: REPORT_LUNS: handle(0x007d), retries(0) [Mon Dec 9 06:17:16 2019][ 10.891641] mpt3sas_cm0: TEST_UNIT_READY: handle(0x007d), lun(0) [Mon Dec 9 06:17:16 2019][ 10.898245] scsi 1:0:34:0: 
Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Mon Dec 9 06:17:16 2019][ 10.906541] scsi 1:0:34:0: SSP: handle(0x007d), sas_addr(0x5000cca2525ec01a), phy(32), device_name(0x5000cca2525ec01b) [Mon Dec 9 06:17:16 2019][ 10.917231] scsi 1:0:34:0: enclosure logical id(0x5000ccab04037180), slot(41) [Mon Dec 9 06:17:16 2019][ 10.924450] scsi 1:0:34:0: enclosure level(0x0000), connector name( C3 ) [Mon Dec 9 06:17:16 2019][ 10.931254] scsi 1:0:34:0: serial_number( 7SHP3D3W) [Mon Dec 9 06:17:16 2019][ 10.936743] scsi 1:0:34:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Mon Dec 9 06:17:16 2019][ 10.963054] mpt3sas_cm0: detecting: handle(0x007e), sas_address(0x5000cca2525ec55a), phy(33) [Mon Dec 9 06:17:16 2019][ 10.971491] mpt3sas_cm0: REPORT_LUNS: handle(0x007e), retries(0) [Mon Dec 9 06:17:16 2019][ 10.977824] mpt3sas_cm0: TEST_UNIT_READY: handle(0x007e), lun(0) [Mon Dec 9 06:17:16 2019][ 10.985415] scsi 1:0:35:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Mon Dec 9 06:17:16 2019][ 10.993749] scsi 1:0:35:0: SSP: handle(0x007e), sas_addr(0x5000cca2525ec55a), phy(33), device_name(0x5000cca2525ec55b) [Mon Dec 9 06:17:16 2019][ 11.004436] scsi 1:0:35:0: enclosure logical id(0x5000ccab04037180), slot(42) [Mon Dec 9 06:17:16 2019][ 11.011656] scsi 1:0:35:0: enclosure level(0x0000), connector name( C3 ) [Mon Dec 9 06:17:16 2019][ 11.018461] scsi 1:0:35:0: serial_number( 7SHP3RYW) [Mon Dec 9 06:17:16 2019][ 11.023948] scsi 1:0:35:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Mon Dec 9 06:17:16 2019][ 11.044054] mpt3sas_cm0: detecting: handle(0x007f), sas_address(0x5000cca2525fd4a2), phy(34) [Mon Dec 9 06:17:16 2019][ 11.052493] mpt3sas_cm0: REPORT_LUNS: handle(0x007f), retries(0) [Mon Dec 9 06:17:16 2019][ 11.058632] mpt3sas_cm0: TEST_UNIT_READY: handle(0x007f), lun(0) [Mon Dec 9 06:17:16 2019][ 11.065390] scsi 1:0:36:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Mon Dec 9 06:17:16 2019][ 11.080944] scsi 1:0:36:0: SSP: handle(0x007f), sas_addr(0x5000cca2525fd4a2), phy(34), device_name(0x5000cca2525fd4a3) [Mon Dec 9 06:17:16 2019][ 11.091634] scsi 1:0:36:0: enclosure logical id(0x5000ccab04037180), slot(43) [Mon Dec 9 06:17:16 2019][ 11.098853] scsi 1:0:36:0: enclosure level(0x0000), connector name( C3 ) [Mon Dec 9 06:17:16 2019][ 11.105661] scsi 1:0:36:0: serial_number( 7SHPPU0W) [Mon Dec 9 06:17:16 2019][ 11.111145] scsi 1:0:36:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Mon Dec 9 06:17:16 2019][ 11.131054] mpt3sas_cm0: detecting: handle(0x0080), sas_address(0x5000cca2525eb5f6), phy(35) [Mon Dec 9 06:17:16 2019][ 11.139488] mpt3sas_cm0: REPORT_LUNS: handle(0x0080), retries(0) [Mon Dec 9 06:17:16 2019][ 11.145650] mpt3sas_cm0: TEST_UNIT_READY: handle(0x0080), lun(0) [Mon Dec 9 06:17:16 2019][ 11.152338] scsi 1:0:37:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Mon Dec 9 06:17:16 2019][ 11.160646] scsi 1:0:37:0: SSP: handle(0x0080), sas_addr(0x5000cca2525eb5f6), phy(35), device_name(0x5000cca2525eb5f7) [Mon Dec 9 06:17:16 2019][ 11.171335] scsi 1:0:37:0: enclosure logical id(0x5000ccab04037180), slot(44) [Mon Dec 9 06:17:16 2019][ 11.178553] scsi 1:0:37:0: enclosure level(0x0000), connector name( C3 ) [Mon Dec 9 06:17:16 2019][ 11.185358] scsi 1:0:37:0: serial_number( 7SHP2R5W) [Mon Dec 9 06:17:16 2019][ 11.190847] scsi 1:0:37:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Mon Dec 9 06:17:16 2019][ 11.211056] mpt3sas_cm0: detecting: 
handle(0x0081), sas_address(0x5000cca2525ebeb2), phy(36) [Mon Dec 9 06:17:16 2019][ 11.219496] mpt3sas_cm0: REPORT_LUNS: handle(0x0081), retries(0) [Mon Dec 9 06:17:16 2019][ 11.225633] mpt3sas_cm0: TEST_UNIT_READY: handle(0x0081), lun(0) [Mon Dec 9 06:17:16 2019][ 11.232233] scsi 1:0:38:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Mon Dec 9 06:17:16 2019][ 11.240537] scsi 1:0:38:0: SSP: handle(0x0081), sas_addr(0x5000cca2525ebeb2), phy(36), device_name(0x5000cca2525ebeb3) [Mon Dec 9 06:17:16 2019][ 11.251227] scsi 1:0:38:0: enclosure logical id(0x5000ccab04037180), slot(45) [Mon Dec 9 06:17:16 2019][ 11.258446] scsi 1:0:38:0: enclosure level(0x0000), connector name( C3 ) [Mon Dec 9 06:17:16 2019][ 11.265250] scsi 1:0:38:0: serial_number( 7SHP396W) [Mon Dec 9 06:17:16 2019][ 11.270739] scsi 1:0:38:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Mon Dec 9 06:17:16 2019][ 11.293054] mpt3sas_cm0: detecting: handle(0x0082), sas_address(0x5000cca2525f291a), phy(37) [Mon Dec 9 06:17:16 2019][ 11.301491] mpt3sas_cm0: REPORT_LUNS: handle(0x0082), retries(0) [Mon Dec 9 06:17:16 2019][ 11.307654] mpt3sas_cm0: TEST_UNIT_READY: handle(0x0082), lun(0) [Mon Dec 9 06:17:16 2019][ 11.314407] scsi 1:0:39:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Mon Dec 9 06:17:16 2019][ 11.322746] scsi 1:0:39:0: SSP: handle(0x0082), sas_addr(0x5000cca2525f291a), phy(37), device_name(0x5000cca2525f291b) [Mon Dec 9 06:17:16 2019][ 11.333433] scsi 1:0:39:0: enclosure logical id(0x5000ccab04037180), slot(46) [Mon Dec 9 06:17:16 2019][ 11.340652] scsi 1:0:39:0: enclosure level(0x0000), connector name( C3 ) [Mon Dec 9 06:17:16 2019][ 11.347457] scsi 1:0:39:0: serial_number( 7SHPABWW) [Mon Dec 9 06:17:16 2019][ 11.352944] scsi 1:0:39:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Mon Dec 9 06:17:16 2019][ 11.373069] mpt3sas_cm0: detecting: handle(0x0083), sas_address(0x5000cca252602c0e), phy(38) [Mon Dec 9 06:17:16 2019][ 11.381503] mpt3sas_cm0: REPORT_LUNS: handle(0x0083), retries(0) [Mon Dec 9 06:17:16 2019][ 11.387645] mpt3sas_cm0: TEST_UNIT_READY: handle(0x0083), lun(0) [Mon Dec 9 06:17:16 2019][ 11.394246] scsi 1:0:40:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Mon Dec 9 06:17:16 2019][ 11.402548] scsi 1:0:40:0: SSP: handle(0x0083), sas_addr(0x5000cca252602c0e), phy(38), device_name(0x5000cca252602c0f) [Mon Dec 9 06:17:16 2019][ 11.413235] scsi 1:0:40:0: enclosure logical id(0x5000ccab04037180), slot(47) [Mon Dec 9 06:17:16 2019][ 11.420455] scsi 1:0:40:0: enclosure level(0x0000), connector name( C3 ) [Mon Dec 9 06:17:16 2019][ 11.427259] scsi 1:0:40:0: serial_number( 7SHPWMHW) [Mon Dec 9 06:17:16 2019][ 11.432748] scsi 1:0:40:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Mon Dec 9 06:17:16 2019][ 11.455071] mpt3sas_cm0: detecting: handle(0x0084), sas_address(0x5000cca2525e7cfe), phy(39) [Mon Dec 9 06:17:16 2019][ 11.463507] mpt3sas_cm0: REPORT_LUNS: handle(0x0084), retries(0) [Mon Dec 9 06:17:16 2019][ 11.469641] mpt3sas_cm0: TEST_UNIT_READY: handle(0x0084), lun(0) [Mon Dec 9 06:17:16 2019][ 11.476236] scsi 1:0:41:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Mon Dec 9 06:17:16 2019][ 11.484533] scsi 1:0:41:0: SSP: handle(0x0084), sas_addr(0x5000cca2525e7cfe), phy(39), device_name(0x5000cca2525e7cff) [Mon Dec 9 06:17:16 2019][ 11.495217] scsi 1:0:41:0: enclosure logical id(0x5000ccab04037180), slot(48) [Mon Dec 9 06:17:16 2019][ 11.502437] scsi 1:0:41:0: enclosure level(0x0000), 
connector name( C3 ) [Mon Dec 9 06:17:16 2019][ 11.509241] scsi 1:0:41:0: serial_number( 7SHNYXKW) [Mon Dec 9 06:17:16 2019][ 11.514729] scsi 1:0:41:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Mon Dec 9 06:17:16 2019][ 11.537061] mpt3sas_cm0: detecting: handle(0x0085), sas_address(0x5000cca2525f6a32), phy(40) [Mon Dec 9 06:17:16 2019][ 11.545498] mpt3sas_cm0: REPORT_LUNS: handle(0x0085), retries(0) [Mon Dec 9 06:17:16 2019][ 11.551973] mpt3sas_cm0: TEST_UNIT_READY: handle(0x0085), lun(0) [Mon Dec 9 06:17:16 2019][ 11.570231] scsi 1:0:42:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Mon Dec 9 06:17:16 2019][ 11.578526] scsi 1:0:42:0: SSP: handle(0x0085), sas_addr(0x5000cca2525f6a32), phy(40), device_name(0x5000cca2525f6a33) [Mon Dec 9 06:17:16 2019][ 11.589208] scsi 1:0:42:0: enclosure logical id(0x5000ccab04037180), slot(49) [Mon Dec 9 06:17:16 2019][ 11.596427] scsi 1:0:42:0: enclosure level(0x0000), connector name( C3 ) [Mon Dec 9 06:17:16 2019][ 11.603232] scsi 1:0:42:0: serial_number( 7SHPGR8W) [Mon Dec 9 06:17:16 2019][ 11.608721] scsi 1:0:42:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Mon Dec 9 06:17:16 2019][ 11.631070] mpt3sas_cm0: detecting: handle(0x0086), sas_address(0x5000cca2525f7f26), phy(41) [Mon Dec 9 06:17:16 2019][ 11.639507] mpt3sas_cm0: REPORT_LUNS: handle(0x0086), retries(0) [Mon Dec 9 06:17:16 2019][ 11.645641] mpt3sas_cm0: TEST_UNIT_READY: handle(0x0086), lun(0) [Mon Dec 9 06:17:16 2019][ 11.652244] scsi 1:0:43:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Mon Dec 9 06:17:16 2019][ 11.660565] scsi 1:0:43:0: SSP: handle(0x0086), sas_addr(0x5000cca2525f7f26), phy(41), device_name(0x5000cca2525f7f27) [Mon Dec 9 06:17:16 2019][ 11.671249] scsi 1:0:43:0: enclosure logical id(0x5000ccab04037180), slot(50) [Mon Dec 9 06:17:16 2019][ 11.678469] scsi 1:0:43:0: enclosure level(0x0000), connector name( C3 ) [Mon Dec 9 06:17:16 2019][ 11.685275] scsi 1:0:43:0: serial_number( 7SHPJ3JW) [Mon Dec 9 06:17:16 2019][ 11.690762] scsi 1:0:43:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Mon Dec 9 06:17:16 2019][ 11.711072] mpt3sas_cm0: detecting: handle(0x0087), sas_address(0x5000cca2525eb4b2), phy(42) [Mon Dec 9 06:17:16 2019][ 11.719510] mpt3sas_cm0: REPORT_LUNS: handle(0x0087), retries(0) [Mon Dec 9 06:17:16 2019][ 11.725669] mpt3sas_cm0: TEST_UNIT_READY: handle(0x0087), lun(0) [Mon Dec 9 06:17:16 2019][ 11.732371] scsi 1:0:44:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Mon Dec 9 06:17:16 2019][ 11.740677] scsi 1:0:44:0: SSP: handle(0x0087), sas_addr(0x5000cca2525eb4b2), phy(42), device_name(0x5000cca2525eb4b3) [Mon Dec 9 06:17:16 2019][ 11.751366] scsi 1:0:44:0: enclosure logical id(0x5000ccab04037180), slot(51) [Mon Dec 9 06:17:16 2019][ 11.758586] scsi 1:0:44:0: enclosure level(0x0000), connector name( C3 ) [Mon Dec 9 06:17:16 2019][ 11.765391] scsi 1:0:44:0: serial_number( 7SHP2MKW) [Mon Dec 9 06:17:16 2019][ 11.770877] scsi 1:0:44:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Mon Dec 9 06:17:16 2019][ 11.791067] mpt3sas_cm0: detecting: handle(0x0088), sas_address(0x5000cca2525e1f9e), phy(43) [Mon Dec 9 06:17:17 2019][ 11.799508] mpt3sas_cm0: REPORT_LUNS: handle(0x0088), retries(0) [Mon Dec 9 06:17:17 2019][ 11.805646] mpt3sas_cm0: TEST_UNIT_READY: handle(0x0088), lun(0) [Mon Dec 9 06:17:17 2019][ 11.812246] scsi 1:0:45:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Mon Dec 9 06:17:17 2019][ 11.820543] 
scsi 1:0:45:0: SSP: handle(0x0088), sas_addr(0x5000cca2525e1f9e), phy(43), device_name(0x5000cca2525e1f9f) [Mon Dec 9 06:17:17 2019][ 11.831230] scsi 1:0:45:0: enclosure logical id(0x5000ccab04037180), slot(52) [Mon Dec 9 06:17:17 2019][ 11.838450] scsi 1:0:45:0: enclosure level(0x0000), connector name( C3 ) [Mon Dec 9 06:17:17 2019][ 11.845255] scsi 1:0:45:0: serial_number( 7SHNSPTW) [Mon Dec 9 06:17:17 2019][ 11.850744] scsi 1:0:45:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Mon Dec 9 06:17:17 2019][ 11.871662] mpt3sas_cm0: detecting: handle(0x0089), sas_address(0x5000cca2525e52fe), phy(44) [Mon Dec 9 06:17:17 2019][ 11.880100] mpt3sas_cm0: REPORT_LUNS: handle(0x0089), retries(0) [Mon Dec 9 06:17:17 2019][ 11.886256] mpt3sas_cm0: TEST_UNIT_READY: handle(0x0089), lun(0) [Mon Dec 9 06:17:17 2019][ 11.892848] scsi 1:0:46:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Mon Dec 9 06:17:17 2019][ 11.901148] scsi 1:0:46:0: SSP: handle(0x0089), sas_addr(0x5000cca2525e52fe), phy(44), device_name(0x5000cca2525e52ff) [Mon Dec 9 06:17:17 2019][ 11.911833] scsi 1:0:46:0: enclosure logical id(0x5000ccab04037180), slot(53) [Mon Dec 9 06:17:17 2019][ 11.919052] scsi 1:0:46:0: enclosure level(0x0000), connector name( C3 ) [Mon Dec 9 06:17:17 2019][ 11.925860] scsi 1:0:46:0: serial_number( 7SHNW3VW) [Mon Dec 9 06:17:17 2019][ 11.931346] scsi 1:0:46:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Mon Dec 9 06:17:17 2019][ 11.951075] mpt3sas_cm0: detecting: handle(0x008a), sas_address(0x5000cca2525f4e72), phy(45) [Mon Dec 9 06:17:17 2019][ 11.959520] mpt3sas_cm0: REPORT_LUNS: handle(0x008a), retries(0) [Mon Dec 9 06:17:17 2019][ 11.965681] mpt3sas_cm0: TEST_UNIT_READY: handle(0x008a), lun(0) [Mon Dec 9 06:17:17 2019][ 11.972270] scsi 1:0:47:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Mon Dec 9 06:17:17 2019][ 11.980568] scsi 1:0:47:0: SSP: handle(0x008a), sas_addr(0x5000cca2525f4e72), phy(45), device_name(0x5000cca2525f4e73) [Mon Dec 9 06:17:17 2019][ 11.991258] scsi 1:0:47:0: enclosure logical id(0x5000ccab04037180), slot(54) [Mon Dec 9 06:17:17 2019][ 11.998477] scsi 1:0:47:0: enclosure level(0x0000), connector name( C3 ) [Mon Dec 9 06:17:17 2019][ 12.005281] scsi 1:0:47:0: serial_number( 7SHPDVZW) [Mon Dec 9 06:17:17 2019][ 12.010767] scsi 1:0:47:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Mon Dec 9 06:17:17 2019][ 12.033091] mpt3sas_cm0: detecting: handle(0x008b), sas_address(0x5000cca2525fd49a), phy(46) [Mon Dec 9 06:17:17 2019][ 12.041529] mpt3sas_cm0: REPORT_LUNS: handle(0x008b), retries(0) [Mon Dec 9 06:17:17 2019][ 12.047688] mpt3sas_cm0: TEST_UNIT_READY: handle(0x008b), lun(0) [Mon Dec 9 06:17:17 2019][ 12.054320] scsi 1:0:48:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Mon Dec 9 06:17:17 2019][ 12.062620] scsi 1:0:48:0: SSP: handle(0x008b), sas_addr(0x5000cca2525fd49a), phy(46), device_name(0x5000cca2525fd49b) [Mon Dec 9 06:17:17 2019][ 12.073307] scsi 1:0:48:0: enclosure logical id(0x5000ccab04037180), slot(55) [Mon Dec 9 06:17:17 2019][ 12.080527] scsi 1:0:48:0: enclosure level(0x0000), connector name( C3 ) [Mon Dec 9 06:17:17 2019][ 12.087331] scsi 1:0:48:0: serial_number( 7SHPPTYW) [Mon Dec 9 06:17:17 2019][ 12.092817] scsi 1:0:48:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Mon Dec 9 06:17:17 2019][ 12.115074] mpt3sas_cm0: detecting: handle(0x008c), sas_address(0x5000cca2525e787a), phy(47) [Mon Dec 9 06:17:17 2019][ 12.123509] 
mpt3sas_cm0: REPORT_LUNS: handle(0x008c), retries(0) [Mon Dec 9 06:17:17 2019][ 12.129669] mpt3sas_cm0: TEST_UNIT_READY: handle(0x008c), lun(0) [Mon Dec 9 06:17:17 2019][ 12.136257] scsi 1:0:49:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Mon Dec 9 06:17:17 2019][ 12.144554] scsi 1:0:49:0: SSP: handle(0x008c), sas_addr(0x5000cca2525e787a), phy(47), device_name(0x5000cca2525e787b) [Mon Dec 9 06:17:17 2019][ 12.155244] scsi 1:0:49:0: enclosure logical id(0x5000ccab04037180), slot(56) [Mon Dec 9 06:17:17 2019][ 12.162463] scsi 1:0:49:0: enclosure level(0x0000), connector name( C3 ) [Mon Dec 9 06:17:17 2019][ 12.169269] scsi 1:0:49:0: serial_number( 7SHNYM7W) [Mon Dec 9 06:17:17 2019][ 12.174756] scsi 1:0:49:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Mon Dec 9 06:17:17 2019][ 12.202979] mpt3sas_cm0: detecting: handle(0x008d), sas_address(0x5000cca2525ca19a), phy(48) [Mon Dec 9 06:17:17 2019][ 12.211416] mpt3sas_cm0: REPORT_LUNS: handle(0x008d), retries(0) [Mon Dec 9 06:17:17 2019][ 12.217550] mpt3sas_cm0: TEST_UNIT_READY: handle(0x008d), lun(0) [Mon Dec 9 06:17:17 2019][ 12.273401] scsi 1:0:50:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Mon Dec 9 06:17:17 2019][ 12.281695] scsi 1:0:50:0: SSP: handle(0x008d), sas_addr(0x5000cca2525ca19a), phy(48), device_name(0x5000cca2525ca19b) [Mon Dec 9 06:17:17 2019][ 12.292379] scsi 1:0:50:0: enclosure logical id(0x5000ccab04037180), slot(57) [Mon Dec 9 06:17:17 2019][ 12.299598] scsi 1:0:50:0: enclosure level(0x0000), connector name( C3 ) [Mon Dec 9 06:17:17 2019][ 12.306389] scsi 1:0:50:0: serial_number( 7SHMY83W) [Mon Dec 9 06:17:17 2019][ 12.311872] scsi 1:0:50:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Mon Dec 9 06:17:17 2019][ 12.349544] mpt3sas_cm0: detecting: handle(0x008e), sas_address(0x5000cca2525ffb8a), phy(49) [Mon Dec 9 06:17:17 2019][ 12.357984] mpt3sas_cm0: REPORT_LUNS: handle(0x008e), retries(0) [Mon Dec 9 06:17:17 2019][ 12.364163] mpt3sas_cm0: TEST_UNIT_READY: handle(0x008e), lun(0) [Mon Dec 9 06:17:17 2019][ 12.371027] scsi 1:0:51:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Mon Dec 9 06:17:17 2019][ 12.379344] scsi 1:0:51:0: SSP: handle(0x008e), sas_addr(0x5000cca2525ffb8a), phy(49), device_name(0x5000cca2525ffb8b) [Mon Dec 9 06:17:17 2019][ 12.390029] scsi 1:0:51:0: enclosure logical id(0x5000ccab04037180), slot(58) [Mon Dec 9 06:17:17 2019][ 12.397248] scsi 1:0:51:0: enclosure level(0x0000), connector name( C3 ) [Mon Dec 9 06:17:17 2019][ 12.404052] scsi 1:0:51:0: serial_number( 7SHPTDAW) [Mon Dec 9 06:17:17 2019][ 12.409541] scsi 1:0:51:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Mon Dec 9 06:17:17 2019][ 12.432103] mpt3sas_cm0: detecting: handle(0x008f), sas_address(0x5000cca2525f266a), phy(50) [Mon Dec 9 06:17:17 2019][ 12.440542] mpt3sas_cm0: REPORT_LUNS: handle(0x008f), retries(0) [Mon Dec 9 06:17:17 2019][ 12.446685] mpt3sas_cm0: TEST_UNIT_READY: handle(0x008f), lun(0) [Mon Dec 9 06:17:17 2019][ 12.453503] scsi 1:0:52:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Mon Dec 9 06:17:17 2019][ 12.461817] scsi 1:0:52:0: SSP: handle(0x008f), sas_addr(0x5000cca2525f266a), phy(50), device_name(0x5000cca2525f266b) [Mon Dec 9 06:17:17 2019][ 12.472502] scsi 1:0:52:0: enclosure logical id(0x5000ccab04037180), slot(59) [Mon Dec 9 06:17:17 2019][ 12.479722] scsi 1:0:52:0: enclosure level(0x0000), connector name( C3 ) [Mon Dec 9 06:17:17 2019][ 12.486527] scsi 1:0:52:0: serial_number( 
7SHPA6AW) [Mon Dec 9 06:17:17 2019][ 12.492015] scsi 1:0:52:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Mon Dec 9 06:17:17 2019][ 12.535558] mpt3sas_cm0: expander_add: handle(0x005b), parent(0x0058), sas_addr(0x5000ccab040371fb), phys(68) [Mon Dec 9 06:17:17 2019][ 12.554902] mpt3sas_cm0: detecting: handle(0x0090), sas_address(0x5000cca2525eacc2), phy(42) [Mon Dec 9 06:17:17 2019][ 12.563352] mpt3sas_cm0: REPORT_LUNS: handle(0x0090), retries(0) [Mon Dec 9 06:17:17 2019][ 12.569509] mpt3sas_cm0: TEST_UNIT_READY: handle(0x0090), lun(0) [Mon Dec 9 06:17:17 2019][ 12.576175] scsi 1:0:53:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Mon Dec 9 06:17:17 2019][ 12.584481] scsi 1:0:53:0: SSP: handle(0x0090), sas_addr(0x5000cca2525eacc2), phy(42), device_name(0x5000cca2525eacc3) [Mon Dec 9 06:17:17 2019][ 12.595166] scsi 1:0:53:0: enclosure logical id(0x5000ccab04037180), slot(1) [Mon Dec 9 06:17:17 2019][ 12.602298] scsi 1:0:53:0: enclosure level(0x0000), connector name( C3 ) [Mon Dec 9 06:17:17 2019][ 12.609101] scsi 1:0:53:0: serial_number( 7SHP235W) [Mon Dec 9 06:17:17 2019][ 12.614590] scsi 1:0:53:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Mon Dec 9 06:17:17 2019][ 12.637095] mpt3sas_cm0: detecting: handle(0x0091), sas_address(0x5000cca2525f8152), phy(43) [Mon Dec 9 06:17:17 2019][ 12.645532] mpt3sas_cm0: REPORT_LUNS: handle(0x0091), retries(0) [Mon Dec 9 06:17:17 2019][ 12.651665] mpt3sas_cm0: TEST_UNIT_READY: handle(0x0091), lun(0) [Mon Dec 9 06:17:17 2019][ 12.658264] scsi 1:0:54:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Mon Dec 9 06:17:17 2019][ 12.666562] scsi 1:0:54:0: SSP: handle(0x0091), sas_addr(0x5000cca2525f8152), phy(43), device_name(0x5000cca2525f8153) [Mon Dec 9 06:17:17 2019][ 12.677249] scsi 1:0:54:0: enclosure logical id(0x5000ccab04037180), slot(3) [Mon Dec 9 06:17:17 2019][ 12.684381] scsi 1:0:54:0: enclosure level(0x0000), connector name( C3 ) [Mon Dec 9 06:17:17 2019][ 12.691185] scsi 1:0:54:0: serial_number( 7SHPJ80W) [Mon Dec 9 06:17:17 2019][ 12.696673] scsi 1:0:54:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Mon Dec 9 06:17:17 2019][ 12.719681] mpt3sas_cm0: detecting: handle(0x0092), sas_address(0x5000cca2525ef83a), phy(44) [Mon Dec 9 06:17:17 2019][ 12.728119] mpt3sas_cm0: REPORT_LUNS: handle(0x0092), retries(0) [Mon Dec 9 06:17:17 2019][ 12.734251] mpt3sas_cm0: TEST_UNIT_READY: handle(0x0092), lun(0) [Mon Dec 9 06:17:17 2019][ 12.740857] scsi 1:0:55:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Mon Dec 9 06:17:17 2019][ 12.749158] scsi 1:0:55:0: SSP: handle(0x0092), sas_addr(0x5000cca2525ef83a), phy(44), device_name(0x5000cca2525ef83b) [Mon Dec 9 06:17:17 2019][ 12.759845] scsi 1:0:55:0: enclosure logical id(0x5000ccab04037180), slot(4) [Mon Dec 9 06:17:17 2019][ 12.766979] scsi 1:0:55:0: enclosure level(0x0000), connector name( C3 ) [Mon Dec 9 06:17:17 2019][ 12.773786] scsi 1:0:55:0: serial_number( 7SHP73ZW) [Mon Dec 9 06:17:17 2019][ 12.779270] scsi 1:0:55:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Mon Dec 9 06:17:18 2019][ 12.799089] mpt3sas_cm0: detecting: handle(0x0093), sas_address(0x5000cca2525e72aa), phy(45) [Mon Dec 9 06:17:18 2019][ 12.807527] mpt3sas_cm0: REPORT_LUNS: handle(0x0093), retries(0) [Mon Dec 9 06:17:18 2019][ 12.813698] mpt3sas_cm0: TEST_UNIT_READY: handle(0x0093), lun(0) [Mon Dec 9 06:17:18 2019][ 12.820484] scsi 1:0:56:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 
6 [Mon Dec 9 06:17:18 2019][ 12.829509] scsi 1:0:56:0: SSP: handle(0x0093), sas_addr(0x5000cca2525e72aa), phy(45), device_name(0x5000cca2525e72ab) [Mon Dec 9 06:17:18 2019][ 12.840196] scsi 1:0:56:0: enclosure logical id(0x5000ccab04037180), slot(5) [Mon Dec 9 06:17:18 2019][ 12.847329] scsi 1:0:56:0: enclosure level(0x0000), connector name( C3 ) [Mon Dec 9 06:17:18 2019][ 12.854134] scsi 1:0:56:0: serial_number( 7SHNY77W) [Mon Dec 9 06:17:18 2019][ 12.859621] scsi 1:0:56:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Mon Dec 9 06:17:18 2019][ 12.882095] mpt3sas_cm0: detecting: handle(0x0094), sas_address(0x5000cca2525d3c8a), phy(46) [Mon Dec 9 06:17:18 2019][ 12.890532] mpt3sas_cm0: REPORT_LUNS: handle(0x0094), retries(0) [Mon Dec 9 06:17:18 2019][ 12.896670] mpt3sas_cm0: TEST_UNIT_READY: handle(0x0094), lun(0) [Mon Dec 9 06:17:18 2019][ 12.903291] scsi 1:0:57:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Mon Dec 9 06:17:18 2019][ 12.911597] scsi 1:0:57:0: SSP: handle(0x0094), sas_addr(0x5000cca2525d3c8a), phy(46), device_name(0x5000cca2525d3c8b) [Mon Dec 9 06:17:18 2019][ 12.922281] scsi 1:0:57:0: enclosure logical id(0x5000ccab04037180), slot(6) [Mon Dec 9 06:17:18 2019][ 12.929413] scsi 1:0:57:0: enclosure level(0x0000), connector name( C3 ) [Mon Dec 9 06:17:18 2019][ 12.936216] scsi 1:0:57:0: serial_number( 7SHN8KZW) [Mon Dec 9 06:17:18 2019][ 12.941705] scsi 1:0:57:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Mon Dec 9 06:17:18 2019][ 12.964105] mpt3sas_cm0: detecting: handle(0x0095), sas_address(0x5000cca2525fae0e), phy(47) [Mon Dec 9 06:17:18 2019][ 12.972543] mpt3sas_cm0: REPORT_LUNS: handle(0x0095), retries(0) [Mon Dec 9 06:17:18 2019][ 12.978676] mpt3sas_cm0: TEST_UNIT_READY: handle(0x0095), lun(0) [Mon Dec 9 06:17:18 2019][ 12.985254] scsi 1:0:58:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Mon Dec 9 06:17:18 2019][ 12.993553] scsi 1:0:58:0: SSP: handle(0x0095), sas_addr(0x5000cca2525fae0e), phy(47), device_name(0x5000cca2525fae0f) [Mon Dec 9 06:17:18 2019][ 13.004242] scsi 1:0:58:0: enclosure logical id(0x5000ccab04037180), slot(7) [Mon Dec 9 06:17:18 2019][ 13.011375] scsi 1:0:58:0: enclosure level(0x0000), connector name( C3 ) [Mon Dec 9 06:17:18 2019][ 13.018179] scsi 1:0:58:0: serial_number( 7SHPM7BW) [Mon Dec 9 06:17:18 2019][ 13.023667] scsi 1:0:58:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Mon Dec 9 06:17:18 2019][ 13.049097] mpt3sas_cm0: detecting: handle(0x0096), sas_address(0x5000cca2525efdae), phy(48) [Mon Dec 9 06:17:18 2019][ 13.057532] mpt3sas_cm0: REPORT_LUNS: handle(0x0096), retries(0) [Mon Dec 9 06:17:18 2019][ 13.063687] mpt3sas_cm0: TEST_UNIT_READY: handle(0x0096), lun(0) [Mon Dec 9 06:17:18 2019][ 13.073156] scsi 1:0:59:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Mon Dec 9 06:17:18 2019][ 13.083763] scsi 1:0:59:0: SSP: handle(0x0096), sas_addr(0x5000cca2525efdae), phy(48), device_name(0x5000cca2525efdaf) [Mon Dec 9 06:17:18 2019][ 13.094448] scsi 1:0:59:0: enclosure logical id(0x5000ccab04037180), slot(8) [Mon Dec 9 06:17:18 2019][ 13.101580] scsi 1:0:59:0: enclosure level(0x0000), connector name( C3 ) [Mon Dec 9 06:17:18 2019][ 13.108385] scsi 1:0:59:0: serial_number( 7SHP7H7W) [Mon Dec 9 06:17:18 2019][ 13.113871] scsi 1:0:59:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Mon Dec 9 06:17:18 2019][ 13.228484] mpt3sas_cm0: detecting: handle(0x0097), sas_address(0x5000cca2525fa302), phy(49) [Mon 
Dec 9 06:17:18 2019][ 13.236918] mpt3sas_cm0: REPORT_LUNS: handle(0x0097), retries(0) [Mon Dec 9 06:17:18 2019][ 13.243322] mpt3sas_cm0: TEST_UNIT_READY: handle(0x0097), lun(0) [Mon Dec 9 06:17:18 2019][ 13.256804] scsi 1:0:60:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Mon Dec 9 06:17:18 2019][ 13.265098] scsi 1:0:60:0: SSP: handle(0x0097), sas_addr(0x5000cca2525fa302), phy(49), device_name(0x5000cca2525fa303) [Mon Dec 9 06:17:18 2019][ 13.275785] scsi 1:0:60:0: enclosure logical id(0x5000ccab04037180), slot(9) [Mon Dec 9 06:17:18 2019][ 13.282917] scsi 1:0:60:0: enclosure level(0x0000), connector name( C3 ) [Mon Dec 9 06:17:18 2019][ 13.289707] scsi 1:0:60:0: serial_number( 7SHPLHKW) [Mon Dec 9 06:17:18 2019][ 13.295191] scsi 1:0:60:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Mon Dec 9 06:17:18 2019][ 13.316118] mpt3sas_cm0: detecting: handle(0x0098), sas_address(0x5000cca2525fb4be), phy(50) [Mon Dec 9 06:17:18 2019][ 13.324558] mpt3sas_cm0: REPORT_LUNS: handle(0x0098), retries(0) [Mon Dec 9 06:17:18 2019][ 13.330725] mpt3sas_cm0: TEST_UNIT_READY: handle(0x0098), lun(0) [Mon Dec 9 06:17:18 2019][ 13.337397] scsi 1:0:61:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Mon Dec 9 06:17:18 2019][ 13.345697] scsi 1:0:61:0: SSP: handle(0x0098), sas_addr(0x5000cca2525fb4be), phy(50), device_name(0x5000cca2525fb4bf) [Mon Dec 9 06:17:18 2019][ 13.356387] scsi 1:0:61:0: enclosure logical id(0x5000ccab04037180), slot(10) [Mon Dec 9 06:17:18 2019][ 13.363607] scsi 1:0:61:0: enclosure level(0x0000), connector name( C3 ) [Mon Dec 9 06:17:18 2019][ 13.370410] scsi 1:0:61:0: serial_number( 7SHPMP5W) [Mon Dec 9 06:17:18 2019][ 13.375900] scsi 1:0:61:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Mon Dec 9 06:17:18 2019][ 13.398625] mpt3sas_cm0: expander_add: handle(0x00da), parent(0x0002), sas_addr(0x5000ccab040371bd), phys(49) [Mon Dec 9 06:17:18 2019][ 13.419014] mpt3sas_cm0: detecting: handle(0x00de), sas_address(0x5000ccab040371bc), phy(48) [Mon Dec 9 06:17:18 2019][ 13.427450] mpt3sas_cm0: REPORT_LUNS: handle(0x00de), retries(0) [Mon Dec 9 06:17:18 2019][ 13.435324] mpt3sas_cm0: TEST_UNIT_READY: handle(0x00de), lun(0) [Mon Dec 9 06:17:18 2019][ 13.442184] scsi 1:0:62:0: Enclosure HGST H4060-J 2033 PQ: 0 ANSI: 6 [Mon Dec 9 06:17:18 2019][ 13.450673] scsi 1:0:62:0: set ignore_delay_remove for handle(0x00de) [Mon Dec 9 06:17:18 2019][ 13.457114] scsi 1:0:62:0: SES: handle(0x00de), sas_addr(0x5000ccab040371bc), phy(48), device_name(0x0000000000000000) [Mon Dec 9 06:17:18 2019][ 13.467800] scsi 1:0:62:0: enclosure logical id(0x5000ccab04037180), slot(60) [Mon Dec 9 06:17:18 2019][ 13.475018] scsi 1:0:62:0: enclosure level(0x0000), connector name( C2 ) [Mon Dec 9 06:17:18 2019][ 13.481822] scsi 1:0:62:0: serial_number(USWSJ03918EZ0028 ) [Mon Dec 9 06:17:18 2019][ 13.487659] scsi 1:0:62:0: qdepth(1), tagged(0), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Mon Dec 9 06:17:18 2019][ 13.513530] mpt3sas_cm0: expander_add: handle(0x00dc), parent(0x00da), sas_addr(0x5000ccab040371bf), phys(68) [Mon Dec 9 06:17:18 2019][ 13.534517] mpt3sas_cm0: detecting: handle(0x00df), sas_address(0x5000cca2525f2a25), phy(0) [Mon Dec 9 06:17:18 2019][ 13.542870] mpt3sas_cm0: REPORT_LUNS: handle(0x00df), retries(0) [Mon Dec 9 06:17:18 2019][ 13.548998] mpt3sas_cm0: TEST_UNIT_READY: handle(0x00df), lun(0) [Mon Dec 9 06:17:18 2019][ 13.555809] scsi 1:0:63:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Mon Dec 9 06:17:18 2019][ 
13.564123] scsi 1:0:63:0: SSP: handle(0x00df), sas_addr(0x5000cca2525f2a25), phy(0), device_name(0x5000cca2525f2a27) [Mon Dec 9 06:17:18 2019][ 13.574724] scsi 1:0:63:0: enclosure logical id(0x5000ccab04037180), slot(0) [Mon Dec 9 06:17:18 2019][ 13.581855] scsi 1:0:63:0: enclosure level(0x0000), connector name( C2 ) [Mon Dec 9 06:17:18 2019][ 13.588663] scsi 1:0:63:0: serial_number( 7SHPAG1W) [Mon Dec 9 06:17:18 2019][ 13.594148] scsi 1:0:63:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Mon Dec 9 06:17:18 2019][ 13.614122] mpt3sas_cm0: detecting: handle(0x00e0), sas_address(0x5000cca2525e977d), phy(1) [Mon Dec 9 06:17:18 2019][ 13.622472] mpt3sas_cm0: REPORT_LUNS: handle(0x00e0), retries(0) [Mon Dec 9 06:17:18 2019][ 13.628632] mpt3sas_cm0: TEST_UNIT_READY: handle(0x00e0), lun(0) [Mon Dec 9 06:17:18 2019][ 13.655932] scsi 1:0:64:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Mon Dec 9 06:17:18 2019][ 13.664219] scsi 1:0:64:0: SSP: handle(0x00e0), sas_addr(0x5000cca2525e977d), phy(1), device_name(0x5000cca2525e977f) [Mon Dec 9 06:17:18 2019][ 13.674816] scsi 1:0:64:0: enclosure logical id(0x5000ccab04037180), slot(2) [Mon Dec 9 06:17:18 2019][ 13.681949] scsi 1:0:64:0: enclosure level(0x0000), connector name( C2 ) [Mon Dec 9 06:17:18 2019][ 13.688738] scsi 1:0:64:0: serial_number( 7SHP0P8W) [Mon Dec 9 06:17:18 2019][ 13.694223] scsi 1:0:64:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Mon Dec 9 06:17:18 2019][ 13.714119] mpt3sas_cm0: detecting: handle(0x00e1), sas_address(0x5000cca2525ed2bd), phy(2) [Mon Dec 9 06:17:18 2019][ 13.722472] mpt3sas_cm0: REPORT_LUNS: handle(0x00e1), retries(0) [Mon Dec 9 06:17:18 2019][ 13.728617] mpt3sas_cm0: TEST_UNIT_READY: handle(0x00e1), lun(0) [Mon Dec 9 06:17:18 2019][ 13.735337] scsi 1:0:65:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Mon Dec 9 06:17:18 2019][ 13.743641] scsi 1:0:65:0: SSP: handle(0x00e1), sas_addr(0x5000cca2525ed2bd), phy(2), device_name(0x5000cca2525ed2bf) [Mon Dec 9 06:17:18 2019][ 13.754241] scsi 1:0:65:0: enclosure logical id(0x5000ccab04037180), slot(11) [Mon Dec 9 06:17:18 2019][ 13.761461] scsi 1:0:65:0: enclosure level(0x0000), connector name( C2 ) [Mon Dec 9 06:17:18 2019][ 13.768264] scsi 1:0:65:0: serial_number( 7SHP4MLW) [Mon Dec 9 06:17:19 2019][ 13.773754] scsi 1:0:65:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Mon Dec 9 06:17:19 2019][ 13.796138] mpt3sas_cm0: detecting: handle(0x00e2), sas_address(0x5000cca2525ec049), phy(3) [Mon Dec 9 06:17:19 2019][ 13.804485] mpt3sas_cm0: REPORT_LUNS: handle(0x00e2), retries(0) [Mon Dec 9 06:17:19 2019][ 13.810617] mpt3sas_cm0: TEST_UNIT_READY: handle(0x00e2), lun(0) [Mon Dec 9 06:17:19 2019][ 13.817238] scsi 1:0:66:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Mon Dec 9 06:17:19 2019][ 13.825540] scsi 1:0:66:0: SSP: handle(0x00e2), sas_addr(0x5000cca2525ec049), phy(3), device_name(0x5000cca2525ec04b) [Mon Dec 9 06:17:19 2019][ 13.836142] scsi 1:0:66:0: enclosure logical id(0x5000ccab04037180), slot(12) [Mon Dec 9 06:17:19 2019][ 13.843361] scsi 1:0:66:0: enclosure level(0x0000), connector name( C2 ) [Mon Dec 9 06:17:19 2019][ 13.850166] scsi 1:0:66:0: serial_number( 7SHP3DHW) [Mon Dec 9 06:17:19 2019][ 13.855653] scsi 1:0:66:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Mon Dec 9 06:17:19 2019][ 13.878127] mpt3sas_cm0: detecting: handle(0x00e3), sas_address(0x5000cca2525ff611), phy(4) [Mon Dec 9 06:17:19 2019][ 13.886474] 
mpt3sas_cm0: REPORT_LUNS: handle(0x00e3), retries(0) [Mon Dec 9 06:17:19 2019][ 13.892637] mpt3sas_cm0: TEST_UNIT_READY: handle(0x00e3), lun(0) [Mon Dec 9 06:17:19 2019][ 13.899409] scsi 1:0:67:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Mon Dec 9 06:17:19 2019][ 13.907720] scsi 1:0:67:0: SSP: handle(0x00e3), sas_addr(0x5000cca2525ff611), phy(4), device_name(0x5000cca2525ff613) [Mon Dec 9 06:17:19 2019][ 13.918320] scsi 1:0:67:0: enclosure logical id(0x5000ccab04037180), slot(13) [Mon Dec 9 06:17:19 2019][ 13.925540] scsi 1:0:67:0: enclosure level(0x0000), connector name( C2 ) [Mon Dec 9 06:17:19 2019][ 13.932344] scsi 1:0:67:0: serial_number( 7SHPT11W) [Mon Dec 9 06:17:19 2019][ 13.937833] scsi 1:0:67:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Mon Dec 9 06:17:19 2019][ 13.959124] mpt3sas_cm0: detecting: handle(0x00e4), sas_address(0x5000cca2526016ed), phy(5) [Mon Dec 9 06:17:19 2019][ 13.967477] mpt3sas_cm0: REPORT_LUNS: handle(0x00e4), retries(0) [Mon Dec 9 06:17:19 2019][ 13.973660] mpt3sas_cm0: TEST_UNIT_READY: handle(0x00e4), lun(0) [Mon Dec 9 06:17:19 2019][ 14.005489] scsi 1:0:68:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Mon Dec 9 06:17:19 2019][ 14.013799] scsi 1:0:68:0: SSP: handle(0x00e4), sas_addr(0x5000cca2526016ed), phy(5), device_name(0x5000cca2526016ef) [Mon Dec 9 06:17:19 2019][ 14.024396] scsi 1:0:68:0: enclosure logical id(0x5000ccab04037180), slot(14) [Mon Dec 9 06:17:19 2019][ 14.031616] scsi 1:0:68:0: enclosure level(0x0000), connector name( C2 ) [Mon Dec 9 06:17:19 2019][ 14.038407] scsi 1:0:68:0: serial_number( 7SHPV6WW) [Mon Dec 9 06:17:19 2019][ 14.043899] scsi 1:0:68:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Mon Dec 9 06:17:19 2019][ 14.064130] mpt3sas_cm0: detecting: handle(0x00e5), sas_address(0x5000cca2525f4871), phy(6) [Mon Dec 9 06:17:19 2019][ 14.072485] mpt3sas_cm0: REPORT_LUNS: handle(0x00e5), retries(0) [Mon Dec 9 06:17:19 2019][ 14.078627] mpt3sas_cm0: TEST_UNIT_READY: handle(0x00e5), lun(0) [Mon Dec 9 06:17:19 2019][ 14.085235] scsi 1:0:69:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Mon Dec 9 06:17:19 2019][ 14.093540] scsi 1:0:69:0: SSP: handle(0x00e5), sas_addr(0x5000cca2525f4871), phy(6), device_name(0x5000cca2525f4873) [Mon Dec 9 06:17:19 2019][ 14.104140] scsi 1:0:69:0: enclosure logical id(0x5000ccab04037180), slot(15) [Mon Dec 9 06:17:19 2019][ 14.111360] scsi 1:0:69:0: enclosure level(0x0000), connector name( C2 ) [Mon Dec 9 06:17:19 2019][ 14.118161] scsi 1:0:69:0: serial_number( 7SHPDGLW) [Mon Dec 9 06:17:19 2019][ 14.123650] scsi 1:0:69:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Mon Dec 9 06:17:19 2019][ 14.146130] mpt3sas_cm0: detecting: handle(0x00e6), sas_address(0x5000cca2525f568d), phy(7) [Mon Dec 9 06:17:19 2019][ 14.154483] mpt3sas_cm0: REPORT_LUNS: handle(0x00e6), retries(0) [Mon Dec 9 06:17:19 2019][ 14.160652] mpt3sas_cm0: TEST_UNIT_READY: handle(0x00e6), lun(0) [Mon Dec 9 06:17:19 2019][ 14.167273] scsi 1:0:70:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Mon Dec 9 06:17:19 2019][ 14.175580] scsi 1:0:70:0: SSP: handle(0x00e6), sas_addr(0x5000cca2525f568d), phy(7), device_name(0x5000cca2525f568f) [Mon Dec 9 06:17:19 2019][ 14.186180] scsi 1:0:70:0: enclosure logical id(0x5000ccab04037180), slot(16) [Mon Dec 9 06:17:19 2019][ 14.193399] scsi 1:0:70:0: enclosure level(0x0000), connector name( C2 ) [Mon Dec 9 06:17:19 2019][ 14.200206] scsi 1:0:70:0: serial_number( 7SHPEDRW) [Mon 
Dec 9 06:17:19 2019][ 14.205692] scsi 1:0:70:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Mon Dec 9 06:17:19 2019][ 14.228132] mpt3sas_cm0: detecting: handle(0x00e7), sas_address(0x5000cca2525f6c25), phy(8) [Mon Dec 9 06:17:19 2019][ 14.236481] mpt3sas_cm0: REPORT_LUNS: handle(0x00e7), retries(0) [Mon Dec 9 06:17:19 2019][ 14.242622] mpt3sas_cm0: TEST_UNIT_READY: handle(0x00e7), lun(0) [Mon Dec 9 06:17:19 2019][ 14.254153] scsi 1:0:71:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Mon Dec 9 06:17:19 2019][ 14.262443] scsi 1:0:71:0: SSP: handle(0x00e7), sas_addr(0x5000cca2525f6c25), phy(8), device_name(0x5000cca2525f6c27) [Mon Dec 9 06:17:19 2019][ 14.273040] scsi 1:0:71:0: enclosure logical id(0x5000ccab04037180), slot(17) [Mon Dec 9 06:17:19 2019][ 14.280259] scsi 1:0:71:0: enclosure level(0x0000), connector name( C2 ) [Mon Dec 9 06:17:19 2019][ 14.287050] scsi 1:0:71:0: serial_number( 7SHPGV9W) [Mon Dec 9 06:17:19 2019][ 14.292533] scsi 1:0:71:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Mon Dec 9 06:17:19 2019][ 14.315135] mpt3sas_cm0: detecting: handle(0x00e8), sas_address(0x5000cca2525ed401), phy(9) [Mon Dec 9 06:17:19 2019][ 14.323488] mpt3sas_cm0: REPORT_LUNS: handle(0x00e8), retries(0) [Mon Dec 9 06:17:19 2019][ 14.329628] mpt3sas_cm0: TEST_UNIT_READY: handle(0x00e8), lun(0) [Mon Dec 9 06:17:19 2019][ 14.336258] scsi 1:0:72:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Mon Dec 9 06:17:19 2019][ 14.344556] scsi 1:0:72:0: SSP: handle(0x00e8), sas_addr(0x5000cca2525ed401), phy(9), device_name(0x5000cca2525ed403) [Mon Dec 9 06:17:19 2019][ 14.355158] scsi 1:0:72:0: enclosure logical id(0x5000ccab04037180), slot(18) [Mon Dec 9 06:17:19 2019][ 14.362378] scsi 1:0:72:0: enclosure level(0x0000), connector name( C2 ) [Mon Dec 9 06:17:19 2019][ 14.369182] scsi 1:0:72:0: serial_number( 7SHP4R6W) [Mon Dec 9 06:17:19 2019][ 14.374671] scsi 1:0:72:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Mon Dec 9 06:17:19 2019][ 14.397136] mpt3sas_cm0: detecting: handle(0x00e9), sas_address(0x5000cca2525e0405), phy(10) [Mon Dec 9 06:17:19 2019][ 14.405570] mpt3sas_cm0: REPORT_LUNS: handle(0x00e9), retries(0) [Mon Dec 9 06:17:19 2019][ 14.411737] mpt3sas_cm0: TEST_UNIT_READY: handle(0x00e9), lun(0) [Mon Dec 9 06:17:19 2019][ 14.418352] scsi 1:0:73:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Mon Dec 9 06:17:19 2019][ 14.426667] scsi 1:0:73:0: SSP: handle(0x00e9), sas_addr(0x5000cca2525e0405), phy(10), device_name(0x5000cca2525e0407) [Mon Dec 9 06:17:19 2019][ 14.437355] scsi 1:0:73:0: enclosure logical id(0x5000ccab04037180), slot(19) [Mon Dec 9 06:17:19 2019][ 14.444575] scsi 1:0:73:0: enclosure level(0x0000), connector name( C2 ) [Mon Dec 9 06:17:19 2019][ 14.451380] scsi 1:0:73:0: serial_number( 7SHNPVUW) [Mon Dec 9 06:17:19 2019][ 14.456868] scsi 1:0:73:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Mon Dec 9 06:17:19 2019][ 14.478140] mpt3sas_cm0: detecting: handle(0x00ea), sas_address(0x5000cca2525ea9e5), phy(11) [Mon Dec 9 06:17:19 2019][ 14.486577] mpt3sas_cm0: REPORT_LUNS: handle(0x00ea), retries(0) [Mon Dec 9 06:17:19 2019][ 14.493255] mpt3sas_cm0: TEST_UNIT_READY: handle(0x00ea), lun(0) [Mon Dec 9 06:17:19 2019][ 14.499876] scsi 1:0:74:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Mon Dec 9 06:17:19 2019][ 14.508186] scsi 1:0:74:0: SSP: handle(0x00ea), sas_addr(0x5000cca2525ea9e5), phy(11), device_name(0x5000cca2525ea9e7) [Mon 
Dec 9 06:17:19 2019][ 14.518875] scsi 1:0:74:0: enclosure logical id(0x5000ccab04037180), slot(20) [Mon Dec 9 06:17:19 2019][ 14.526095] scsi 1:0:74:0: enclosure level(0x0000), connector name( C2 ) [Mon Dec 9 06:17:19 2019][ 14.532904] scsi 1:0:74:0: serial_number( 7SHP1X8W) [Mon Dec 9 06:17:19 2019][ 14.538397] scsi 1:0:74:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Mon Dec 9 06:17:19 2019][ 14.558139] mpt3sas_cm0: detecting: handle(0x00eb), sas_address(0x5000cca2525f1d39), phy(12) [Mon Dec 9 06:17:19 2019][ 14.566584] mpt3sas_cm0: REPORT_LUNS: handle(0x00eb), retries(0) [Mon Dec 9 06:17:19 2019][ 14.572745] mpt3sas_cm0: TEST_UNIT_READY: handle(0x00eb), lun(0) [Mon Dec 9 06:17:19 2019][ 14.579363] scsi 1:0:75:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Mon Dec 9 06:17:19 2019][ 14.587665] scsi 1:0:75:0: SSP: handle(0x00eb), sas_addr(0x5000cca2525f1d39), phy(12), device_name(0x5000cca2525f1d3b) [Mon Dec 9 06:17:19 2019][ 14.598352] scsi 1:0:75:0: enclosure logical id(0x5000ccab04037180), slot(21) [Mon Dec 9 06:17:19 2019][ 14.605569] scsi 1:0:75:0: enclosure level(0x0000), connector name( C2 ) [Mon Dec 9 06:17:19 2019][ 14.612376] scsi 1:0:75:0: serial_number( 7SHP9LBW) [Mon Dec 9 06:17:19 2019][ 14.617862] scsi 1:0:75:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Mon Dec 9 06:17:19 2019][ 14.640140] mpt3sas_cm0: detecting: handle(0x00ec), sas_address(0x5000cca2525ea499), phy(13) [Mon Dec 9 06:17:19 2019][ 14.648580] mpt3sas_cm0: REPORT_LUNS: handle(0x00ec), retries(0) [Mon Dec 9 06:17:19 2019][ 14.654746] mpt3sas_cm0: TEST_UNIT_READY: handle(0x00ec), lun(0) [Mon Dec 9 06:17:19 2019][ 14.661569] scsi 1:0:76:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Mon Dec 9 06:17:19 2019][ 14.669878] scsi 1:0:76:0: SSP: handle(0x00ec), sas_addr(0x5000cca2525ea499), phy(13), device_name(0x5000cca2525ea49b) [Mon Dec 9 06:17:19 2019][ 14.680565] scsi 1:0:76:0: enclosure logical id(0x5000ccab04037180), slot(22) [Mon Dec 9 06:17:19 2019][ 14.687784] scsi 1:0:76:0: enclosure level(0x0000), connector name( C2 ) [Mon Dec 9 06:17:19 2019][ 14.694592] scsi 1:0:76:0: serial_number( 7SHP1KAW) [Mon Dec 9 06:17:19 2019][ 14.700079] scsi 1:0:76:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Mon Dec 9 06:17:19 2019][ 14.725144] mpt3sas_cm0: detecting: handle(0x00ed), sas_address(0x5000cca2525fba05), phy(14) [Mon Dec 9 06:17:19 2019][ 14.733584] mpt3sas_cm0: REPORT_LUNS: handle(0x00ed), retries(0) [Mon Dec 9 06:17:19 2019][ 14.739718] mpt3sas_cm0: TEST_UNIT_READY: handle(0x00ed), lun(0) [Mon Dec 9 06:17:19 2019][ 14.789076] scsi 1:0:77:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Mon Dec 9 06:17:20 2019][ 14.797376] scsi 1:0:77:0: SSP: handle(0x00ed), sas_addr(0x5000cca2525fba05), phy(14), device_name(0x5000cca2525fba07) [Mon Dec 9 06:17:20 2019][ 14.808063] scsi 1:0:77:0: enclosure logical id(0x5000ccab04037180), slot(23) [Mon Dec 9 06:17:20 2019][ 14.815283] scsi 1:0:77:0: enclosure level(0x0000), connector name( C2 ) [Mon Dec 9 06:17:20 2019][ 14.822073] scsi 1:0:77:0: serial_number( 7SHPN12W) [Mon Dec 9 06:17:20 2019][ 14.827558] scsi 1:0:77:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Mon Dec 9 06:17:20 2019][ 14.856154] mpt3sas_cm0: detecting: handle(0x00ee), sas_address(0x5000cca2525e121d), phy(15) [Mon Dec 9 06:17:20 2019][ 14.864595] mpt3sas_cm0: REPORT_LUNS: handle(0x00ee), retries(0) [Mon Dec 9 06:17:20 2019][ 14.870737] mpt3sas_cm0: 
TEST_UNIT_READY: handle(0x00ee), lun(0) [Mon Dec 9 06:17:20 2019][ 14.877356] scsi 1:0:78:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Mon Dec 9 06:17:20 2019][ 14.885656] scsi 1:0:78:0: SSP: handle(0x00ee), sas_addr(0x5000cca2525e121d), phy(15), device_name(0x5000cca2525e121f) [Mon Dec 9 06:17:20 2019][ 14.896346] scsi 1:0:78:0: enclosure logical id(0x5000ccab04037180), slot(24) [Mon Dec 9 06:17:20 2019][ 14.903563] scsi 1:0:78:0: enclosure level(0x0000), connector name( C2 ) [Mon Dec 9 06:17:20 2019][ 14.910369] scsi 1:0:78:0: serial_number( 7SHNRTXW) [Mon Dec 9 06:17:20 2019][ 14.915855] scsi 1:0:78:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Mon Dec 9 06:17:20 2019][ 14.938146] mpt3sas_cm0: detecting: handle(0x00ef), sas_address(0x5000cca2525e98f5), phy(16) [Mon Dec 9 06:17:20 2019][ 14.946585] mpt3sas_cm0: REPORT_LUNS: handle(0x00ef), retries(0) [Mon Dec 9 06:17:20 2019][ 14.952725] mpt3sas_cm0: TEST_UNIT_READY: handle(0x00ef), lun(0) [Mon Dec 9 06:17:20 2019][ 14.959338] scsi 1:0:79:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Mon Dec 9 06:17:20 2019][ 14.967638] scsi 1:0:79:0: SSP: handle(0x00ef), sas_addr(0x5000cca2525e98f5), phy(16), device_name(0x5000cca2525e98f7) [Mon Dec 9 06:17:20 2019][ 14.978326] scsi 1:0:79:0: enclosure logical id(0x5000ccab04037180), slot(25) [Mon Dec 9 06:17:20 2019][ 14.985543] scsi 1:0:79:0: enclosure level(0x0000), connector name( C2 ) [Mon Dec 9 06:17:20 2019][ 14.992349] scsi 1:0:79:0: serial_number( 7SHP0T9W) [Mon Dec 9 06:17:20 2019][ 14.997838] scsi 1:0:79:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Mon Dec 9 06:17:20 2019][ 15.021146] mpt3sas_cm0: detecting: handle(0x00f0), sas_address(0x5000cca2525f8175), phy(17) [Mon Dec 9 06:17:20 2019][ 15.029585] mpt3sas_cm0: REPORT_LUNS: handle(0x00f0), retries(0) [Mon Dec 9 06:17:20 2019][ 15.035715] mpt3sas_cm0: TEST_UNIT_READY: handle(0x00f0), lun(0) [Mon Dec 9 06:17:20 2019][ 15.042329] scsi 1:0:80:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Mon Dec 9 06:17:20 2019][ 15.050627] scsi 1:0:80:0: SSP: handle(0x00f0), sas_addr(0x5000cca2525f8175), phy(17), device_name(0x5000cca2525f8177) [Mon Dec 9 06:17:20 2019][ 15.061309] scsi 1:0:80:0: enclosure logical id(0x5000ccab04037180), slot(26) [Mon Dec 9 06:17:20 2019][ 15.068531] scsi 1:0:80:0: enclosure level(0x0000), connector name( C2 ) [Mon Dec 9 06:17:20 2019][ 15.075336] scsi 1:0:80:0: serial_number( 7SHPJ89W) [Mon Dec 9 06:17:20 2019][ 15.080822] scsi 1:0:80:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Mon Dec 9 06:17:20 2019][ 15.103149] mpt3sas_cm0: detecting: handle(0x00f1), sas_address(0x5000cca2525fb01d), phy(18) [Mon Dec 9 06:17:20 2019][ 15.111581] mpt3sas_cm0: REPORT_LUNS: handle(0x00f1), retries(0) [Mon Dec 9 06:17:20 2019][ 15.117715] mpt3sas_cm0: TEST_UNIT_READY: handle(0x00f1), lun(0) [Mon Dec 9 06:17:20 2019][ 15.124332] scsi 1:0:81:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Mon Dec 9 06:17:20 2019][ 15.132636] scsi 1:0:81:0: SSP: handle(0x00f1), sas_addr(0x5000cca2525fb01d), phy(18), device_name(0x5000cca2525fb01f) [Mon Dec 9 06:17:20 2019][ 15.143326] scsi 1:0:81:0: enclosure logical id(0x5000ccab04037180), slot(27) [Mon Dec 9 06:17:20 2019][ 15.150545] scsi 1:0:81:0: enclosure level(0x0000), connector name( C2 ) [Mon Dec 9 06:17:20 2019][ 15.157349] scsi 1:0:81:0: serial_number( 7SHPMBMW) [Mon Dec 9 06:17:20 2019][ 15.162835] scsi 1:0:81:0: qdepth(254), tagged(1), simple(0), ordered(0), 
scsi_level(7), cmd_que(1) [Mon Dec 9 06:17:20 2019][ 15.186155] mpt3sas_cm0: detecting: handle(0x00f2), sas_address(0x5000cca2525ed549), phy(19) [Mon Dec 9 06:17:20 2019][ 15.194593] mpt3sas_cm0: REPORT_LUNS: handle(0x00f2), retries(0) [Mon Dec 9 06:17:20 2019][ 15.200727] mpt3sas_cm0: TEST_UNIT_READY: handle(0x00f2), lun(0) [Mon Dec 9 06:17:20 2019][ 15.207334] scsi 1:0:82:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Mon Dec 9 06:17:20 2019][ 15.215630] scsi 1:0:82:0: SSP: handle(0x00f2), sas_addr(0x5000cca2525ed549), phy(19), device_name(0x5000cca2525ed54b) [Mon Dec 9 06:17:20 2019][ 15.226320] scsi 1:0:82:0: enclosure logical id(0x5000ccab04037180), slot(28) [Mon Dec 9 06:17:20 2019][ 15.233539] scsi 1:0:82:0: enclosure level(0x0000), connector name( C2 ) [Mon Dec 9 06:17:20 2019][ 15.240345] scsi 1:0:82:0: serial_number( 7SHP4TVW) [Mon Dec 9 06:17:20 2019][ 15.245831] scsi 1:0:82:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Mon Dec 9 06:17:20 2019][ 15.266153] mpt3sas_cm0: detecting: handle(0x00f3), sas_address(0x5000cca2525fa035), phy(20) [Mon Dec 9 06:17:20 2019][ 15.274590] mpt3sas_cm0: REPORT_LUNS: handle(0x00f3), retries(0) [Mon Dec 9 06:17:20 2019][ 15.280733] mpt3sas_cm0: TEST_UNIT_READY: handle(0x00f3), lun(0) [Mon Dec 9 06:17:20 2019][ 15.287347] scsi 1:0:83:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Mon Dec 9 06:17:20 2019][ 15.295655] scsi 1:0:83:0: SSP: handle(0x00f3), sas_addr(0x5000cca2525fa035), phy(20), device_name(0x5000cca2525fa037) [Mon Dec 9 06:17:20 2019][ 15.306340] scsi 1:0:83:0: enclosure logical id(0x5000ccab04037180), slot(29) [Mon Dec 9 06:17:20 2019][ 15.313560] scsi 1:0:83:0: enclosure level(0x0000), connector name( C2 ) [Mon Dec 9 06:17:20 2019][ 15.320366] scsi 1:0:83:0: serial_number( 7SHPL9TW) [Mon Dec 9 06:17:20 2019][ 15.325854] scsi 1:0:83:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Mon Dec 9 06:17:20 2019][ 15.349158] mpt3sas_cm0: detecting: handle(0x00f4), sas_address(0x5000cca2525fb941), phy(21) [Mon Dec 9 06:17:20 2019][ 15.357593] mpt3sas_cm0: REPORT_LUNS: handle(0x00f4), retries(0) [Mon Dec 9 06:17:20 2019][ 15.363731] mpt3sas_cm0: TEST_UNIT_READY: handle(0x00f4), lun(0) [Mon Dec 9 06:17:20 2019][ 15.370345] scsi 1:0:84:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Mon Dec 9 06:17:20 2019][ 15.378645] scsi 1:0:84:0: SSP: handle(0x00f4), sas_addr(0x5000cca2525fb941), phy(21), device_name(0x5000cca2525fb943) [Mon Dec 9 06:17:20 2019][ 15.389335] scsi 1:0:84:0: enclosure logical id(0x5000ccab04037180), slot(30) [Mon Dec 9 06:17:20 2019][ 15.396555] scsi 1:0:84:0: enclosure level(0x0000), connector name( C2 ) [Mon Dec 9 06:17:20 2019][ 15.403359] scsi 1:0:84:0: serial_number( 7SHPMZHW) [Mon Dec 9 06:17:20 2019][ 15.408847] scsi 1:0:84:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Mon Dec 9 06:17:20 2019][ 15.429747] mpt3sas_cm0: detecting: handle(0x00f5), sas_address(0x5000cca2525e22e5), phy(22) [Mon Dec 9 06:17:20 2019][ 15.438185] mpt3sas_cm0: REPORT_LUNS: handle(0x00f5), retries(0) [Mon Dec 9 06:17:20 2019][ 15.444315] mpt3sas_cm0: TEST_UNIT_READY: handle(0x00f5), lun(0) [Mon Dec 9 06:17:20 2019][ 15.450935] scsi 1:0:85:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Mon Dec 9 06:17:20 2019][ 15.459241] scsi 1:0:85:0: SSP: handle(0x00f5), sas_addr(0x5000cca2525e22e5), phy(22), device_name(0x5000cca2525e22e7) [Mon Dec 9 06:17:20 2019][ 15.469927] scsi 1:0:85:0: enclosure logical id(0x5000ccab04037180), 
slot(31) [Mon Dec 9 06:17:20 2019][ 15.477147] scsi 1:0:85:0: enclosure level(0x0000), connector name( C2 ) [Mon Dec 9 06:17:20 2019][ 15.483951] scsi 1:0:85:0: serial_number( 7SHNSXKW) [Mon Dec 9 06:17:20 2019][ 15.489442] scsi 1:0:85:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Mon Dec 9 06:17:20 2019][ 15.509164] mpt3sas_cm0: detecting: handle(0x00f6), sas_address(0x5000cca2525fb5bd), phy(23) [Mon Dec 9 06:17:20 2019][ 15.517601] mpt3sas_cm0: REPORT_LUNS: handle(0x00f6), retries(0) [Mon Dec 9 06:17:20 2019][ 15.523788] mpt3sas_cm0: TEST_UNIT_READY: handle(0x00f6), lun(0) [Mon Dec 9 06:17:20 2019][ 15.530401] scsi 1:0:86:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Mon Dec 9 06:17:20 2019][ 15.538705] scsi 1:0:86:0: SSP: handle(0x00f6), sas_addr(0x5000cca2525fb5bd), phy(23), device_name(0x5000cca2525fb5bf) [Mon Dec 9 06:17:20 2019][ 15.549394] scsi 1:0:86:0: enclosure logical id(0x5000ccab04037180), slot(32) [Mon Dec 9 06:17:20 2019][ 15.556613] scsi 1:0:86:0: enclosure level(0x0000), connector name( C2 ) [Mon Dec 9 06:17:20 2019][ 15.563421] scsi 1:0:86:0: serial_number( 7SHPMS7W) [Mon Dec 9 06:17:20 2019][ 15.568909] scsi 1:0:86:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Mon Dec 9 06:17:20 2019][ 15.589160] mpt3sas_cm0: detecting: handle(0x00f7), sas_address(0x5000cca2525eb77d), phy(24) [Mon Dec 9 06:17:20 2019][ 15.597596] mpt3sas_cm0: REPORT_LUNS: handle(0x00f7), retries(0) [Mon Dec 9 06:17:20 2019][ 15.603763] mpt3sas_cm0: TEST_UNIT_READY: handle(0x00f7), lun(0) [Mon Dec 9 06:17:20 2019][ 15.610376] scsi 1:0:87:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Mon Dec 9 06:17:20 2019][ 15.618679] scsi 1:0:87:0: SSP: handle(0x00f7), sas_addr(0x5000cca2525eb77d), phy(24), device_name(0x5000cca2525eb77f) [Mon Dec 9 06:17:20 2019][ 15.629363] scsi 1:0:87:0: enclosure logical id(0x5000ccab04037180), slot(33) [Mon Dec 9 06:17:20 2019][ 15.636582] scsi 1:0:87:0: enclosure level(0x0000), connector name( C2 ) [Mon Dec 9 06:17:20 2019][ 15.643386] scsi 1:0:87:0: serial_number( 7SHP2UAW) [Mon Dec 9 06:17:20 2019][ 15.648875] scsi 1:0:87:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Mon Dec 9 06:17:20 2019][ 15.671164] mpt3sas_cm0: detecting: handle(0x00f8), sas_address(0x5000cca2525e1139), phy(25) [Mon Dec 9 06:17:20 2019][ 15.679601] mpt3sas_cm0: REPORT_LUNS: handle(0x00f8), retries(0) [Mon Dec 9 06:17:20 2019][ 15.685766] mpt3sas_cm0: TEST_UNIT_READY: handle(0x00f8), lun(0) [Mon Dec 9 06:17:20 2019][ 15.729390] scsi 1:0:88:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Mon Dec 9 06:17:20 2019][ 15.737813] scsi 1:0:88:0: SSP: handle(0x00f8), sas_addr(0x5000cca2525e1139), phy(25), device_name(0x5000cca2525e113b) [Mon Dec 9 06:17:20 2019][ 15.748500] scsi 1:0:88:0: enclosure logical id(0x5000ccab04037180), slot(34) [Mon Dec 9 06:17:20 2019][ 15.755717] scsi 1:0:88:0: enclosure level(0x0000), connector name( C2 ) [Mon Dec 9 06:17:20 2019][ 15.762509] scsi 1:0:88:0: serial_number( 7SHNRS2W) [Mon Dec 9 06:17:20 2019][ 15.767994] scsi 1:0:88:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Mon Dec 9 06:17:20 2019][ 15.788166] mpt3sas_cm0: detecting: handle(0x00f9), sas_address(0x5000cca2526014f9), phy(26) [Mon Dec 9 06:17:21 2019][ 15.796605] mpt3sas_cm0: REPORT_LUNS: handle(0x00f9), retries(0) [Mon Dec 9 06:17:21 2019][ 15.802735] mpt3sas_cm0: TEST_UNIT_READY: handle(0x00f9), lun(0) [Mon Dec 9 06:17:21 2019][ 15.810510] scsi 1:0:89:0: 
Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Mon Dec 9 06:17:21 2019][ 15.818854] scsi 1:0:89:0: SSP: handle(0x00f9), sas_addr(0x5000cca2526014f9), phy(26), device_name(0x5000cca2526014fb) [Mon Dec 9 06:17:21 2019][ 15.829544] scsi 1:0:89:0: enclosure logical id(0x5000ccab04037180), slot(35) [Mon Dec 9 06:17:21 2019][ 15.836763] scsi 1:0:89:0: enclosure level(0x0000), connector name( C2 ) [Mon Dec 9 06:17:21 2019][ 15.843567] scsi 1:0:89:0: serial_number( 7SHPV2VW) [Mon Dec 9 06:17:21 2019][ 15.849054] scsi 1:0:89:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Mon Dec 9 06:17:21 2019][ 15.873233] mpt3sas_cm0: detecting: handle(0x00fa), sas_address(0x5000cca252598785), phy(27) [Mon Dec 9 06:17:21 2019][ 15.881670] mpt3sas_cm0: REPORT_LUNS: handle(0x00fa), retries(0) [Mon Dec 9 06:17:21 2019][ 15.887834] mpt3sas_cm0: TEST_UNIT_READY: handle(0x00fa), lun(0) [Mon Dec 9 06:17:21 2019][ 15.894455] scsi 1:0:90:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Mon Dec 9 06:17:21 2019][ 15.902803] scsi 1:0:90:0: SSP: handle(0x00fa), sas_addr(0x5000cca252598785), phy(27), device_name(0x5000cca252598787) [Mon Dec 9 06:17:21 2019][ 15.913490] scsi 1:0:90:0: enclosure logical id(0x5000ccab04037180), slot(36) [Mon Dec 9 06:17:21 2019][ 15.920710] scsi 1:0:90:0: enclosure level(0x0000), connector name( C2 ) [Mon Dec 9 06:17:21 2019][ 15.927515] scsi 1:0:90:0: serial_number( 7SHL7BRW) [Mon Dec 9 06:17:21 2019][ 15.933003] scsi 1:0:90:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Mon Dec 9 06:17:21 2019][ 15.953179] mpt3sas_cm0: detecting: handle(0x00fb), sas_address(0x5000cca2525f5365), phy(28) [Mon Dec 9 06:17:21 2019][ 15.961615] mpt3sas_cm0: REPORT_LUNS: handle(0x00fb), retries(0) [Mon Dec 9 06:17:21 2019][ 15.967778] mpt3sas_cm0: TEST_UNIT_READY: handle(0x00fb), lun(0) [Mon Dec 9 06:17:21 2019][ 15.974542] scsi 1:0:91:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Mon Dec 9 06:17:21 2019][ 15.982847] scsi 1:0:91:0: SSP: handle(0x00fb), sas_addr(0x5000cca2525f5365), phy(28), device_name(0x5000cca2525f5367) [Mon Dec 9 06:17:21 2019][ 15.993529] scsi 1:0:91:0: enclosure logical id(0x5000ccab04037180), slot(37) [Mon Dec 9 06:17:21 2019][ 16.000749] scsi 1:0:91:0: enclosure level(0x0000), connector name( C2 ) [Mon Dec 9 06:17:21 2019][ 16.007553] scsi 1:0:91:0: serial_number( 7SHPE66W) [Mon Dec 9 06:17:21 2019][ 16.013042] scsi 1:0:91:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Mon Dec 9 06:17:21 2019][ 16.033172] mpt3sas_cm0: detecting: handle(0x00fc), sas_address(0x5000cca2525e263d), phy(29) [Mon Dec 9 06:17:21 2019][ 16.041609] mpt3sas_cm0: REPORT_LUNS: handle(0x00fc), retries(0) [Mon Dec 9 06:17:21 2019][ 16.047767] mpt3sas_cm0: TEST_UNIT_READY: handle(0x00fc), lun(0) [Mon Dec 9 06:17:21 2019][ 16.054436] scsi 1:0:92:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Mon Dec 9 06:17:21 2019][ 16.062741] scsi 1:0:92:0: SSP: handle(0x00fc), sas_addr(0x5000cca2525e263d), phy(29), device_name(0x5000cca2525e263f) [Mon Dec 9 06:17:21 2019][ 16.073428] scsi 1:0:92:0: enclosure logical id(0x5000ccab04037180), slot(38) [Mon Dec 9 06:17:21 2019][ 16.080647] scsi 1:0:92:0: enclosure level(0x0000), connector name( C2 ) [Mon Dec 9 06:17:21 2019][ 16.087455] scsi 1:0:92:0: serial_number( 7SHNT4GW) [Mon Dec 9 06:17:21 2019][ 16.092941] scsi 1:0:92:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Mon Dec 9 06:17:21 2019][ 16.127183] mpt3sas_cm0: detecting: 
handle(0x00fd), sas_address(0x5000cca2525f6081), phy(30) [Mon Dec 9 06:17:21 2019][ 16.135618] mpt3sas_cm0: REPORT_LUNS: handle(0x00fd), retries(0) [Mon Dec 9 06:17:21 2019][ 16.142279] mpt3sas_cm0: TEST_UNIT_READY: handle(0x00fd), lun(0) [Mon Dec 9 06:17:21 2019][ 16.186312] scsi 1:0:93:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Mon Dec 9 06:17:21 2019][ 16.194635] scsi 1:0:93:0: SSP: handle(0x00fd), sas_addr(0x5000cca2525f6081), phy(30), device_name(0x5000cca2525f6083) [Mon Dec 9 06:17:21 2019][ 16.205323] scsi 1:0:93:0: enclosure logical id(0x5000ccab04037180), slot(39) [Mon Dec 9 06:17:21 2019][ 16.212540] scsi 1:0:93:0: enclosure level(0x0000), connector name( C2 ) [Mon Dec 9 06:17:21 2019][ 16.219331] scsi 1:0:93:0: serial_number( 7SHPG28W) [Mon Dec 9 06:17:21 2019][ 16.224817] scsi 1:0:93:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Mon Dec 9 06:17:21 2019][ 16.253189] mpt3sas_cm0: detecting: handle(0x00fe), sas_address(0x5000cca2525ec83d), phy(31) [Mon Dec 9 06:17:21 2019][ 16.261629] mpt3sas_cm0: REPORT_LUNS: handle(0x00fe), retries(0) [Mon Dec 9 06:17:21 2019][ 16.267770] mpt3sas_cm0: TEST_UNIT_READY: handle(0x00fe), lun(0) [Mon Dec 9 06:17:21 2019][ 16.274376] scsi 1:0:94:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Mon Dec 9 06:17:21 2019][ 16.282685] scsi 1:0:94:0: SSP: handle(0x00fe), sas_addr(0x5000cca2525ec83d), phy(31), device_name(0x5000cca2525ec83f) [Mon Dec 9 06:17:21 2019][ 16.293367] scsi 1:0:94:0: enclosure logical id(0x5000ccab04037180), slot(40) [Mon Dec 9 06:17:21 2019][ 16.300589] scsi 1:0:94:0: enclosure level(0x0000), connector name( C2 ) [Mon Dec 9 06:17:21 2019][ 16.307392] scsi 1:0:94:0: serial_number( 7SHP3XXW) [Mon Dec 9 06:17:21 2019][ 16.312881] scsi 1:0:94:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Mon Dec 9 06:17:21 2019][ 16.342179] mpt3sas_cm0: detecting: handle(0x00ff), sas_address(0x5000cca2525ec019), phy(32) [Mon Dec 9 06:17:21 2019][ 16.350634] mpt3sas_cm0: REPORT_LUNS: handle(0x00ff), retries(0) [Mon Dec 9 06:17:21 2019][ 16.356755] mpt3sas_cm0: TEST_UNIT_READY: handle(0x00ff), lun(0) [Mon Dec 9 06:17:21 2019][ 16.363537] scsi 1:0:95:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Mon Dec 9 06:17:21 2019][ 16.372787] scsi 1:0:95:0: SSP: handle(0x00ff), sas_addr(0x5000cca2525ec019), phy(32), device_name(0x5000cca2525ec01b) [Mon Dec 9 06:17:21 2019][ 16.383469] scsi 1:0:95:0: enclosure logical id(0x5000ccab04037180), slot(41) [Mon Dec 9 06:17:21 2019][ 16.390689] scsi 1:0:95:0: enclosure level(0x0000), connector name( C2 ) [Mon Dec 9 06:17:21 2019][ 16.397496] scsi 1:0:95:0: serial_number( 7SHP3D3W) [Mon Dec 9 06:17:21 2019][ 16.402982] scsi 1:0:95:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Mon Dec 9 06:17:21 2019][ 16.423183] mpt3sas_cm0: detecting: handle(0x0100), sas_address(0x5000cca2525ec559), phy(33) [Mon Dec 9 06:17:21 2019][ 16.431618] mpt3sas_cm0: REPORT_LUNS: handle(0x0100), retries(0) [Mon Dec 9 06:17:21 2019][ 16.437783] mpt3sas_cm0: TEST_UNIT_READY: handle(0x0100), lun(0) [Mon Dec 9 06:17:21 2019][ 16.444556] scsi 1:0:96:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Mon Dec 9 06:17:21 2019][ 16.452863] scsi 1:0:96:0: SSP: handle(0x0100), sas_addr(0x5000cca2525ec559), phy(33), device_name(0x5000cca2525ec55b) [Mon Dec 9 06:17:21 2019][ 16.463552] scsi 1:0:96:0: enclosure logical id(0x5000ccab04037180), slot(42) [Mon Dec 9 06:17:21 2019][ 16.470772] scsi 1:0:96:0: enclosure level(0x0000), 
connector name( C2 ) [Mon Dec 9 06:17:21 2019][ 16.477577] scsi 1:0:96:0: serial_number( 7SHP3RYW) [Mon Dec 9 06:17:21 2019][ 16.483064] scsi 1:0:96:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Mon Dec 9 06:17:21 2019][ 16.503185] mpt3sas_cm0: detecting: handle(0x0101), sas_address(0x5000cca2525fd4a1), phy(34) [Mon Dec 9 06:17:21 2019][ 16.511624] mpt3sas_cm0: REPORT_LUNS: handle(0x0101), retries(0) [Mon Dec 9 06:17:21 2019][ 16.517799] mpt3sas_cm0: TEST_UNIT_READY: handle(0x0101), lun(0) [Mon Dec 9 06:17:21 2019][ 16.524587] scsi 1:0:97:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Mon Dec 9 06:17:21 2019][ 16.532892] scsi 1:0:97:0: SSP: handle(0x0101), sas_addr(0x5000cca2525fd4a1), phy(34), device_name(0x5000cca2525fd4a3) [Mon Dec 9 06:17:21 2019][ 16.543582] scsi 1:0:97:0: enclosure logical id(0x5000ccab04037180), slot(43) [Mon Dec 9 06:17:21 2019][ 16.550801] scsi 1:0:97:0: enclosure level(0x0000), connector name( C2 ) [Mon Dec 9 06:17:21 2019][ 16.557607] scsi 1:0:97:0: serial_number( 7SHPPU0W) [Mon Dec 9 06:17:21 2019][ 16.563099] scsi 1:0:97:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Mon Dec 9 06:17:21 2019][ 16.583186] mpt3sas_cm0: detecting: handle(0x0102), sas_address(0x5000cca2525eb5f5), phy(35) [Mon Dec 9 06:17:21 2019][ 16.591629] mpt3sas_cm0: REPORT_LUNS: handle(0x0102), retries(0) [Mon Dec 9 06:17:21 2019][ 16.597771] mpt3sas_cm0: TEST_UNIT_READY: handle(0x0102), lun(0) [Mon Dec 9 06:17:21 2019][ 16.604392] scsi 1:0:98:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Mon Dec 9 06:17:21 2019][ 16.612692] scsi 1:0:98:0: SSP: handle(0x0102), sas_addr(0x5000cca2525eb5f5), phy(35), device_name(0x5000cca2525eb5f7) [Mon Dec 9 06:17:21 2019][ 16.623378] scsi 1:0:98:0: enclosure logical id(0x5000ccab04037180), slot(44) [Mon Dec 9 06:17:21 2019][ 16.630598] scsi 1:0:98:0: enclosure level(0x0000), connector name( C2 ) [Mon Dec 9 06:17:21 2019][ 16.637402] scsi 1:0:98:0: serial_number( 7SHP2R5W) [Mon Dec 9 06:17:21 2019][ 16.642889] scsi 1:0:98:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Mon Dec 9 06:17:21 2019][ 16.666186] mpt3sas_cm0: detecting: handle(0x0103), sas_address(0x5000cca2525ebeb1), phy(36) [Mon Dec 9 06:17:21 2019][ 16.674621] mpt3sas_cm0: REPORT_LUNS: handle(0x0103), retries(0) [Mon Dec 9 06:17:21 2019][ 16.680754] mpt3sas_cm0: TEST_UNIT_READY: handle(0x0103), lun(0) [Mon Dec 9 06:17:21 2019][ 16.689115] scsi 1:0:99:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Mon Dec 9 06:17:21 2019][ 16.697422] scsi 1:0:99:0: SSP: handle(0x0103), sas_addr(0x5000cca2525ebeb1), phy(36), device_name(0x5000cca2525ebeb3) [Mon Dec 9 06:17:21 2019][ 16.708106] scsi 1:0:99:0: enclosure logical id(0x5000ccab04037180), slot(45) [Mon Dec 9 06:17:21 2019][ 16.715323] scsi 1:0:99:0: enclosure level(0x0000), connector name( C2 ) [Mon Dec 9 06:17:21 2019][ 16.722128] scsi 1:0:99:0: serial_number( 7SHP396W) [Mon Dec 9 06:17:21 2019][ 16.727617] scsi 1:0:99:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Mon Dec 9 06:17:21 2019][ 16.750191] mpt3sas_cm0: detecting: handle(0x0104), sas_address(0x5000cca2525f2919), phy(37) [Mon Dec 9 06:17:21 2019][ 16.758628] mpt3sas_cm0: REPORT_LUNS: handle(0x0104), retries(0) [Mon Dec 9 06:17:21 2019][ 16.764805] mpt3sas_cm0: TEST_UNIT_READY: handle(0x0104), lun(0) [Mon Dec 9 06:17:21 2019][ 16.783723] scsi 1:0:100:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Mon Dec 9 06:17:21 2019][ 16.792107] 
scsi 1:0:100:0: SSP: handle(0x0104), sas_addr(0x5000cca2525f2919), phy(37), device_name(0x5000cca2525f291b) [Mon Dec 9 06:17:22 2019][ 16.802879] scsi 1:0:100:0: enclosure logical id(0x5000ccab04037180), slot(46) [Mon Dec 9 06:17:22 2019][ 16.810183] scsi 1:0:100:0: enclosure level(0x0000), connector name( C2 ) [Mon Dec 9 06:17:22 2019][ 16.817060] scsi 1:0:100:0: serial_number( 7SHPABWW) [Mon Dec 9 06:17:22 2019][ 16.822631] scsi 1:0:100:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Mon Dec 9 06:17:22 2019][ 16.845209] mpt3sas_cm0: detecting: handle(0x0105), sas_address(0x5000cca252602c0d), phy(38) [Mon Dec 9 06:17:22 2019][ 16.853644] mpt3sas_cm0: REPORT_LUNS: handle(0x0105), retries(0) [Mon Dec 9 06:17:22 2019][ 16.859778] mpt3sas_cm0: TEST_UNIT_READY: handle(0x0105), lun(0) [Mon Dec 9 06:17:22 2019][ 16.879826] scsi 1:0:101:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Mon Dec 9 06:17:22 2019][ 16.888194] scsi 1:0:101:0: SSP: handle(0x0105), sas_addr(0x5000cca252602c0d), phy(38), device_name(0x5000cca252602c0f) [Mon Dec 9 06:17:22 2019][ 16.898968] scsi 1:0:101:0: enclosure logical id(0x5000ccab04037180), slot(47) [Mon Dec 9 06:17:22 2019][ 16.906272] scsi 1:0:101:0: enclosure level(0x0000), connector name( C2 ) [Mon Dec 9 06:17:22 2019][ 16.913151] scsi 1:0:101:0: serial_number( 7SHPWMHW) [Mon Dec 9 06:17:22 2019][ 16.918721] scsi 1:0:101:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Mon Dec 9 06:17:22 2019][ 16.939195] mpt3sas_cm0: detecting: handle(0x0106), sas_address(0x5000cca2525e7cfd), phy(39) [Mon Dec 9 06:17:22 2019][ 16.947635] mpt3sas_cm0: REPORT_LUNS: handle(0x0106), retries(0) [Mon Dec 9 06:17:22 2019][ 16.953767] mpt3sas_cm0: TEST_UNIT_READY: handle(0x0106), lun(0) [Mon Dec 9 06:17:22 2019][ 16.960380] scsi 1:0:102:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Mon Dec 9 06:17:22 2019][ 16.968765] scsi 1:0:102:0: SSP: handle(0x0106), sas_addr(0x5000cca2525e7cfd), phy(39), device_name(0x5000cca2525e7cff) [Mon Dec 9 06:17:22 2019][ 16.979535] scsi 1:0:102:0: enclosure logical id(0x5000ccab04037180), slot(48) [Mon Dec 9 06:17:22 2019][ 16.986839] scsi 1:0:102:0: enclosure level(0x0000), connector name( C2 ) [Mon Dec 9 06:17:22 2019][ 16.993733] scsi 1:0:102:0: serial_number( 7SHNYXKW) [Mon Dec 9 06:17:22 2019][ 16.999308] scsi 1:0:102:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Mon Dec 9 06:17:22 2019][ 17.023198] mpt3sas_cm0: detecting: handle(0x0107), sas_address(0x5000cca2525f6a31), phy(40) [Mon Dec 9 06:17:22 2019][ 17.031635] mpt3sas_cm0: REPORT_LUNS: handle(0x0107), retries(0) [Mon Dec 9 06:17:22 2019][ 17.037794] mpt3sas_cm0: TEST_UNIT_READY: handle(0x0107), lun(0) [Mon Dec 9 06:17:22 2019][ 17.045237] scsi 1:0:103:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Mon Dec 9 06:17:22 2019][ 17.053623] scsi 1:0:103:0: SSP: handle(0x0107), sas_addr(0x5000cca2525f6a31), phy(40), device_name(0x5000cca2525f6a33) [Mon Dec 9 06:17:22 2019][ 17.064393] scsi 1:0:103:0: enclosure logical id(0x5000ccab04037180), slot(49) [Mon Dec 9 06:17:22 2019][ 17.071697] scsi 1:0:103:0: enclosure level(0x0000), connector name( C2 ) [Mon Dec 9 06:17:22 2019][ 17.078594] scsi 1:0:103:0: serial_number( 7SHPGR8W) [Mon Dec 9 06:17:22 2019][ 17.084173] scsi 1:0:103:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Mon Dec 9 06:17:22 2019][ 17.104197] mpt3sas_cm0: detecting: handle(0x0108), sas_address(0x5000cca2525f7f25), phy(41) [Mon Dec 9 
06:17:22 2019][ 17.112636] mpt3sas_cm0: REPORT_LUNS: handle(0x0108), retries(0) [Mon Dec 9 06:17:22 2019][ 17.118778] mpt3sas_cm0: TEST_UNIT_READY: handle(0x0108), lun(0) [Mon Dec 9 06:17:22 2019][ 17.125418] scsi 1:0:104:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Mon Dec 9 06:17:22 2019][ 17.133823] scsi 1:0:104:0: SSP: handle(0x0108), sas_addr(0x5000cca2525f7f25), phy(41), device_name(0x5000cca2525f7f27) [Mon Dec 9 06:17:22 2019][ 17.144595] scsi 1:0:104:0: enclosure logical id(0x5000ccab04037180), slot(50) [Mon Dec 9 06:17:22 2019][ 17.151902] scsi 1:0:104:0: enclosure level(0x0000), connector name( C2 ) [Mon Dec 9 06:17:22 2019][ 17.158793] scsi 1:0:104:0: serial_number( 7SHPJ3JW) [Mon Dec 9 06:17:22 2019][ 17.164367] scsi 1:0:104:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Mon Dec 9 06:17:22 2019][ 17.191206] mpt3sas_cm0: detecting: handle(0x0109), sas_address(0x5000cca2525eb4b1), phy(42) [Mon Dec 9 06:17:22 2019][ 17.199658] mpt3sas_cm0: REPORT_LUNS: handle(0x0109), retries(0) [Mon Dec 9 06:17:22 2019][ 17.205822] mpt3sas_cm0: TEST_UNIT_READY: handle(0x0109), lun(0) [Mon Dec 9 06:17:22 2019][ 17.212648] scsi 1:0:105:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Mon Dec 9 06:17:22 2019][ 17.221083] scsi 1:0:105:0: SSP: handle(0x0109), sas_addr(0x5000cca2525eb4b1), phy(42), device_name(0x5000cca2525eb4b3) [Mon Dec 9 06:17:22 2019][ 17.231852] scsi 1:0:105:0: enclosure logical id(0x5000ccab04037180), slot(51) [Mon Dec 9 06:17:22 2019][ 17.239160] scsi 1:0:105:0: enclosure level(0x0000), connector name( C2 ) [Mon Dec 9 06:17:22 2019][ 17.246053] scsi 1:0:105:0: serial_number( 7SHP2MKW) [Mon Dec 9 06:17:22 2019][ 17.251624] scsi 1:0:105:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Mon Dec 9 06:17:22 2019][ 17.274209] mpt3sas_cm0: detecting: handle(0x010a), sas_address(0x5000cca2525e1f9d), phy(43) [Mon Dec 9 06:17:22 2019][ 17.282646] mpt3sas_cm0: REPORT_LUNS: handle(0x010a), retries(0) [Mon Dec 9 06:17:22 2019][ 17.288782] mpt3sas_cm0: TEST_UNIT_READY: handle(0x010a), lun(0) [Mon Dec 9 06:17:22 2019][ 17.295399] scsi 1:0:106:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Mon Dec 9 06:17:22 2019][ 17.303786] scsi 1:0:106:0: SSP: handle(0x010a), sas_addr(0x5000cca2525e1f9d), phy(43), device_name(0x5000cca2525e1f9f) [Mon Dec 9 06:17:22 2019][ 17.314563] scsi 1:0:106:0: enclosure logical id(0x5000ccab04037180), slot(52) [Mon Dec 9 06:17:22 2019][ 17.321867] scsi 1:0:106:0: enclosure level(0x0000), connector name( C2 ) [Mon Dec 9 06:17:22 2019][ 17.328758] scsi 1:0:106:0: serial_number( 7SHNSPTW) [Mon Dec 9 06:17:22 2019][ 17.334334] scsi 1:0:106:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Mon Dec 9 06:17:22 2019][ 17.354793] mpt3sas_cm0: detecting: handle(0x010b), sas_address(0x5000cca2525e52fd), phy(44) [Mon Dec 9 06:17:22 2019][ 17.363231] mpt3sas_cm0: REPORT_LUNS: handle(0x010b), retries(0) [Mon Dec 9 06:17:22 2019][ 17.369394] mpt3sas_cm0: TEST_UNIT_READY: handle(0x010b), lun(0) [Mon Dec 9 06:17:22 2019][ 17.376080] scsi 1:0:107:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Mon Dec 9 06:17:22 2019][ 17.384470] scsi 1:0:107:0: SSP: handle(0x010b), sas_addr(0x5000cca2525e52fd), phy(44), device_name(0x5000cca2525e52ff) [Mon Dec 9 06:17:22 2019][ 17.395242] scsi 1:0:107:0: enclosure logical id(0x5000ccab04037180), slot(53) [Mon Dec 9 06:17:22 2019][ 17.402549] scsi 1:0:107:0: enclosure level(0x0000), connector name( C2 ) [Mon Dec 9 06:17:22 2019][ 
17.409437] scsi 1:0:107:0: serial_number( 7SHNW3VW) [Mon Dec 9 06:17:22 2019][ 17.415014] scsi 1:0:107:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Mon Dec 9 06:17:22 2019][ 17.435209] mpt3sas_cm0: detecting: handle(0x010c), sas_address(0x5000cca2525f4e71), phy(45) [Mon Dec 9 06:17:22 2019][ 17.443651] mpt3sas_cm0: REPORT_LUNS: handle(0x010c), retries(0) [Mon Dec 9 06:17:22 2019][ 17.449804] mpt3sas_cm0: TEST_UNIT_READY: handle(0x010c), lun(0) [Mon Dec 9 06:17:22 2019][ 17.456405] scsi 1:0:108:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Mon Dec 9 06:17:22 2019][ 17.464795] scsi 1:0:108:0: SSP: handle(0x010c), sas_addr(0x5000cca2525f4e71), phy(45), device_name(0x5000cca2525f4e73) [Mon Dec 9 06:17:22 2019][ 17.475567] scsi 1:0:108:0: enclosure logical id(0x5000ccab04037180), slot(54) [Mon Dec 9 06:17:22 2019][ 17.482871] scsi 1:0:108:0: enclosure level(0x0000), connector name( C2 ) [Mon Dec 9 06:17:22 2019][ 17.489763] scsi 1:0:108:0: serial_number( 7SHPDVZW) [Mon Dec 9 06:17:22 2019][ 17.495339] scsi 1:0:108:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Mon Dec 9 06:17:22 2019][ 17.515209] mpt3sas_cm0: detecting: handle(0x010d), sas_address(0x5000cca2525fd499), phy(46) [Mon Dec 9 06:17:22 2019][ 17.523648] mpt3sas_cm0: REPORT_LUNS: handle(0x010d), retries(0) [Mon Dec 9 06:17:22 2019][ 17.529788] mpt3sas_cm0: TEST_UNIT_READY: handle(0x010d), lun(0) [Mon Dec 9 06:17:22 2019][ 17.536397] scsi 1:0:109:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Mon Dec 9 06:17:22 2019][ 17.544789] scsi 1:0:109:0: SSP: handle(0x010d), sas_addr(0x5000cca2525fd499), phy(46), device_name(0x5000cca2525fd49b) [Mon Dec 9 06:17:22 2019][ 17.555562] scsi 1:0:109:0: enclosure logical id(0x5000ccab04037180), slot(55) [Mon Dec 9 06:17:22 2019][ 17.562866] scsi 1:0:109:0: enclosure level(0x0000), connector name( C2 ) [Mon Dec 9 06:17:22 2019][ 17.569763] scsi 1:0:109:0: serial_number( 7SHPPTYW) [Mon Dec 9 06:17:22 2019][ 17.575335] scsi 1:0:109:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Mon Dec 9 06:17:22 2019][ 17.598208] mpt3sas_cm0: detecting: handle(0x010e), sas_address(0x5000cca2525e7879), phy(47) [Mon Dec 9 06:17:22 2019][ 17.606649] mpt3sas_cm0: REPORT_LUNS: handle(0x010e), retries(0) [Mon Dec 9 06:17:22 2019][ 17.612788] mpt3sas_cm0: TEST_UNIT_READY: handle(0x010e), lun(0) [Mon Dec 9 06:17:22 2019][ 17.619434] scsi 1:0:110:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Mon Dec 9 06:17:22 2019][ 17.627828] scsi 1:0:110:0: SSP: handle(0x010e), sas_addr(0x5000cca2525e7879), phy(47), device_name(0x5000cca2525e787b) [Mon Dec 9 06:17:22 2019][ 17.638599] scsi 1:0:110:0: enclosure logical id(0x5000ccab04037180), slot(56) [Mon Dec 9 06:17:22 2019][ 17.645906] scsi 1:0:110:0: enclosure level(0x0000), connector name( C2 ) [Mon Dec 9 06:17:22 2019][ 17.652798] scsi 1:0:110:0: serial_number( 7SHNYM7W) [Mon Dec 9 06:17:22 2019][ 17.658370] scsi 1:0:110:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Mon Dec 9 06:17:22 2019][ 17.678216] mpt3sas_cm0: detecting: handle(0x010f), sas_address(0x5000cca2525ca199), phy(48) [Mon Dec 9 06:17:22 2019][ 17.686653] mpt3sas_cm0: REPORT_LUNS: handle(0x010f), retries(0) [Mon Dec 9 06:17:22 2019][ 17.692792] mpt3sas_cm0: TEST_UNIT_READY: handle(0x010f), lun(0) [Mon Dec 9 06:17:22 2019][ 17.699410] scsi 1:0:111:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Mon Dec 9 06:17:22 2019][ 17.707796] scsi 1:0:111:0: SSP: 
handle(0x010f), sas_addr(0x5000cca2525ca199), phy(48), device_name(0x5000cca2525ca19b) [Mon Dec 9 06:17:22 2019][ 17.718568] scsi 1:0:111:0: enclosure logical id(0x5000ccab04037180), slot(57) [Mon Dec 9 06:17:22 2019][ 17.725874] scsi 1:0:111:0: enclosure level(0x0000), connector name( C2 ) [Mon Dec 9 06:17:22 2019][ 17.732765] scsi 1:0:111:0: serial_number( 7SHMY83W) [Mon Dec 9 06:17:22 2019][ 17.738341] scsi 1:0:111:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Mon Dec 9 06:17:22 2019][ 17.758213] mpt3sas_cm0: detecting: handle(0x0110), sas_address(0x5000cca2525ffb89), phy(49) [Mon Dec 9 06:17:22 2019][ 17.766648] mpt3sas_cm0: REPORT_LUNS: handle(0x0110), retries(0) [Mon Dec 9 06:17:22 2019][ 17.772782] mpt3sas_cm0: TEST_UNIT_READY: handle(0x0110), lun(0) [Mon Dec 9 06:17:22 2019][ 17.779564] scsi 1:0:112:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Mon Dec 9 06:17:22 2019][ 17.787965] scsi 1:0:112:0: SSP: handle(0x0110), sas_addr(0x5000cca2525ffb89), phy(49), device_name(0x5000cca2525ffb8b) [Mon Dec 9 06:17:23 2019][ 17.798736] scsi 1:0:112:0: enclosure logical id(0x5000ccab04037180), slot(58) [Mon Dec 9 06:17:23 2019][ 17.806043] scsi 1:0:112:0: enclosure level(0x0000), connector name( C2 ) [Mon Dec 9 06:17:23 2019][ 17.812936] scsi 1:0:112:0: serial_number( 7SHPTDAW) [Mon Dec 9 06:17:23 2019][ 17.818510] scsi 1:0:112:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Mon Dec 9 06:17:23 2019][ 17.842215] mpt3sas_cm0: detecting: handle(0x0111), sas_address(0x5000cca2525f2669), phy(50) [Mon Dec 9 06:17:23 2019][ 17.850647] mpt3sas_cm0: REPORT_LUNS: handle(0x0111), retries(0) [Mon Dec 9 06:17:23 2019][ 17.856779] mpt3sas_cm0: TEST_UNIT_READY: handle(0x0111), lun(0) [Mon Dec 9 06:17:23 2019][ 17.863397] scsi 1:0:113:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Mon Dec 9 06:17:23 2019][ 17.871788] scsi 1:0:113:0: SSP: handle(0x0111), sas_addr(0x5000cca2525f2669), phy(50), device_name(0x5000cca2525f266b) [Mon Dec 9 06:17:23 2019][ 17.882562] scsi 1:0:113:0: enclosure logical id(0x5000ccab04037180), slot(59) [Mon Dec 9 06:17:23 2019][ 17.889869] scsi 1:0:113:0: enclosure level(0x0000), connector name( C2 ) [Mon Dec 9 06:17:23 2019][ 17.896761] scsi 1:0:113:0: serial_number( 7SHPA6AW) [Mon Dec 9 06:17:23 2019][ 17.902335] scsi 1:0:113:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Mon Dec 9 06:17:23 2019][ 17.925711] mpt3sas_cm0: expander_add: handle(0x00dd), parent(0x00da), sas_addr(0x5000ccab040371ff), phys(68) [Mon Dec 9 06:17:23 2019][ 17.946147] mpt3sas_cm0: detecting: handle(0x0112), sas_address(0x5000cca2525eacc1), phy(42) [Mon Dec 9 06:17:23 2019][ 17.954580] mpt3sas_cm0: REPORT_LUNS: handle(0x0112), retries(0) [Mon Dec 9 06:17:23 2019][ 17.960701] mpt3sas_cm0: TEST_UNIT_READY: handle(0x0112), lun(0) [Mon Dec 9 06:17:23 2019][ 17.967324] scsi 1:0:114:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Mon Dec 9 06:17:23 2019][ 17.975719] scsi 1:0:114:0: SSP: handle(0x0112), sas_addr(0x5000cca2525eacc1), phy(42), device_name(0x5000cca2525eacc3) [Mon Dec 9 06:17:23 2019][ 17.986488] scsi 1:0:114:0: enclosure logical id(0x5000ccab04037180), slot(1) [Mon Dec 9 06:17:23 2019][ 17.993708] scsi 1:0:114:0: enclosure level(0x0000), connector name( C2 ) [Mon Dec 9 06:17:23 2019][ 18.000597] scsi 1:0:114:0: serial_number( 7SHP235W) [Mon Dec 9 06:17:23 2019][ 18.006172] scsi 1:0:114:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Mon Dec 9 06:17:23 
2019][ 18.026222] mpt3sas_cm0: detecting: handle(0x0113), sas_address(0x5000cca2525f8151), phy(43) [Mon Dec 9 06:17:23 2019][ 18.034662] mpt3sas_cm0: REPORT_LUNS: handle(0x0113), retries(0) [Mon Dec 9 06:17:23 2019][ 18.040793] mpt3sas_cm0: TEST_UNIT_READY: handle(0x0113), lun(0) [Mon Dec 9 06:17:23 2019][ 18.047431] scsi 1:0:115:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Mon Dec 9 06:17:23 2019][ 18.055824] scsi 1:0:115:0: SSP: handle(0x0113), sas_addr(0x5000cca2525f8151), phy(43), device_name(0x5000cca2525f8153) [Mon Dec 9 06:17:23 2019][ 18.066596] scsi 1:0:115:0: enclosure logical id(0x5000ccab04037180), slot(3) [Mon Dec 9 06:17:23 2019][ 18.073815] scsi 1:0:115:0: enclosure level(0x0000), connector name( C2 ) [Mon Dec 9 06:17:23 2019][ 18.080711] scsi 1:0:115:0: serial_number( 7SHPJ80W) [Mon Dec 9 06:17:23 2019][ 18.086282] scsi 1:0:115:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Mon Dec 9 06:17:23 2019][ 18.106812] mpt3sas_cm0: detecting: handle(0x0114), sas_address(0x5000cca2525ef839), phy(44) [Mon Dec 9 06:17:23 2019][ 18.115250] mpt3sas_cm0: REPORT_LUNS: handle(0x0114), retries(0) [Mon Dec 9 06:17:23 2019][ 18.121390] mpt3sas_cm0: TEST_UNIT_READY: handle(0x0114), lun(0) [Mon Dec 9 06:17:23 2019][ 18.128000] scsi 1:0:116:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Mon Dec 9 06:17:23 2019][ 18.136391] scsi 1:0:116:0: SSP: handle(0x0114), sas_addr(0x5000cca2525ef839), phy(44), device_name(0x5000cca2525ef83b) [Mon Dec 9 06:17:23 2019][ 18.147163] scsi 1:0:116:0: enclosure logical id(0x5000ccab04037180), slot(4) [Mon Dec 9 06:17:23 2019][ 18.154383] scsi 1:0:116:0: enclosure level(0x0000), connector name( C2 ) [Mon Dec 9 06:17:23 2019][ 18.161277] scsi 1:0:116:0: serial_number( 7SHP73ZW) [Mon Dec 9 06:17:23 2019][ 18.166848] scsi 1:0:116:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Mon Dec 9 06:17:23 2019][ 18.189227] mpt3sas_cm0: detecting: handle(0x0115), sas_address(0x5000cca2525e72a9), phy(45) [Mon Dec 9 06:17:23 2019][ 18.197662] mpt3sas_cm0: REPORT_LUNS: handle(0x0115), retries(0) [Mon Dec 9 06:17:23 2019][ 18.203835] mpt3sas_cm0: TEST_UNIT_READY: handle(0x0115), lun(0) [Mon Dec 9 06:17:23 2019][ 18.210577] scsi 1:0:117:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Mon Dec 9 06:17:23 2019][ 18.225366] scsi 1:0:117:0: SSP: handle(0x0115), sas_addr(0x5000cca2525e72a9), phy(45), device_name(0x5000cca2525e72ab) [Mon Dec 9 06:17:23 2019][ 18.236138] scsi 1:0:117:0: enclosure logical id(0x5000ccab04037180), slot(5) [Mon Dec 9 06:17:23 2019][ 18.243356] scsi 1:0:117:0: enclosure level(0x0000), connector name( C2 ) [Mon Dec 9 06:17:23 2019][ 18.250250] scsi 1:0:117:0: serial_number( 7SHNY77W) [Mon Dec 9 06:17:23 2019][ 18.255824] scsi 1:0:117:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Mon Dec 9 06:17:23 2019][ 18.278226] mpt3sas_cm0: detecting: handle(0x0116), sas_address(0x5000cca2525d3c89), phy(46) [Mon Dec 9 06:17:23 2019][ 18.286663] mpt3sas_cm0: REPORT_LUNS: handle(0x0116), retries(0) [Mon Dec 9 06:17:23 2019][ 18.292829] mpt3sas_cm0: TEST_UNIT_READY: handle(0x0116), lun(0) [Mon Dec 9 06:17:23 2019][ 18.299537] scsi 1:0:118:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Mon Dec 9 06:17:23 2019][ 18.319935] scsi 1:0:118:0: SSP: handle(0x0116), sas_addr(0x5000cca2525d3c89), phy(46), device_name(0x5000cca2525d3c8b) [Mon Dec 9 06:17:23 2019][ 18.330710] scsi 1:0:118:0: enclosure logical id(0x5000ccab04037180), slot(6) [Mon Dec 9 06:17:23 
2019][ 18.337930] scsi 1:0:118:0: enclosure level(0x0000), connector name( C2 ) [Mon Dec 9 06:17:23 2019][ 18.344822] scsi 1:0:118:0: serial_number( 7SHN8KZW) [Mon Dec 9 06:17:23 2019][ 18.350395] scsi 1:0:118:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Mon Dec 9 06:17:23 2019][ 18.370231] mpt3sas_cm0: detecting: handle(0x0117), sas_address(0x5000cca2525fae0d), phy(47) [Mon Dec 9 06:17:23 2019][ 18.378669] mpt3sas_cm0: REPORT_LUNS: handle(0x0117), retries(0) [Mon Dec 9 06:17:23 2019][ 18.384834] mpt3sas_cm0: TEST_UNIT_READY: handle(0x0117), lun(0) [Mon Dec 9 06:17:23 2019][ 18.411287] scsi 1:0:119:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Mon Dec 9 06:17:23 2019][ 18.419660] scsi 1:0:119:0: SSP: handle(0x0117), sas_addr(0x5000cca2525fae0d), phy(47), device_name(0x5000cca2525fae0f) [Mon Dec 9 06:17:23 2019][ 18.430432] scsi 1:0:119:0: enclosure logical id(0x5000ccab04037180), slot(7) [Mon Dec 9 06:17:23 2019][ 18.437652] scsi 1:0:119:0: enclosure level(0x0000), connector name( C2 ) [Mon Dec 9 06:17:23 2019][ 18.444526] scsi 1:0:119:0: serial_number( 7SHPM7BW) [Mon Dec 9 06:17:23 2019][ 18.450099] scsi 1:0:119:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Mon Dec 9 06:17:23 2019][ 18.470263] mpt3sas_cm0: detecting: handle(0x0118), sas_address(0x5000cca2525efdad), phy(48) [Mon Dec 9 06:17:23 2019][ 18.478701] mpt3sas_cm0: REPORT_LUNS: handle(0x0118), retries(0) [Mon Dec 9 06:17:23 2019][ 18.484863] mpt3sas_cm0: TEST_UNIT_READY: handle(0x0118), lun(0) [Mon Dec 9 06:17:23 2019][ 18.491674] scsi 1:0:120:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Mon Dec 9 06:17:23 2019][ 18.500073] scsi 1:0:120:0: SSP: handle(0x0118), sas_addr(0x5000cca2525efdad), phy(48), device_name(0x5000cca2525efdaf) [Mon Dec 9 06:17:23 2019][ 18.510845] scsi 1:0:120:0: enclosure logical id(0x5000ccab04037180), slot(8) [Mon Dec 9 06:17:23 2019][ 18.518062] scsi 1:0:120:0: enclosure level(0x0000), connector name( C2 ) [Mon Dec 9 06:17:23 2019][ 18.524957] scsi 1:0:120:0: serial_number( 7SHP7H7W) [Mon Dec 9 06:17:23 2019][ 18.530539] scsi 1:0:120:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Mon Dec 9 06:17:23 2019][ 18.554234] mpt3sas_cm0: detecting: handle(0x0119), sas_address(0x5000cca2525fa301), phy(49) [Mon Dec 9 06:17:23 2019][ 18.562674] mpt3sas_cm0: REPORT_LUNS: handle(0x0119), retries(0) [Mon Dec 9 06:17:23 2019][ 18.568840] mpt3sas_cm0: TEST_UNIT_READY: handle(0x0119), lun(0) [Mon Dec 9 06:17:23 2019][ 18.575693] scsi 1:0:121:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Mon Dec 9 06:17:23 2019][ 18.584095] scsi 1:0:121:0: SSP: handle(0x0119), sas_addr(0x5000cca2525fa301), phy(49), device_name(0x5000cca2525fa303) [Mon Dec 9 06:17:23 2019][ 18.594869] scsi 1:0:121:0: enclosure logical id(0x5000ccab04037180), slot(9) [Mon Dec 9 06:17:23 2019][ 18.602089] scsi 1:0:121:0: enclosure level(0x0000), connector name( C2 ) [Mon Dec 9 06:17:23 2019][ 18.608978] scsi 1:0:121:0: serial_number( 7SHPLHKW) [Mon Dec 9 06:17:23 2019][ 18.614554] scsi 1:0:121:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Mon Dec 9 06:17:23 2019][ 18.651238] mpt3sas_cm0: detecting: handle(0x011a), sas_address(0x5000cca2525fb4bd), phy(50) [Mon Dec 9 06:17:23 2019][ 18.659676] mpt3sas_cm0: REPORT_LUNS: handle(0x011a), retries(0) [Mon Dec 9 06:17:23 2019][ 18.665820] mpt3sas_cm0: TEST_UNIT_READY: handle(0x011a), lun(0) [Mon Dec 9 06:17:23 2019][ 18.750803] scsi 1:0:122:0: Direct-Access HGST 
HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Mon Dec 9 06:17:23 2019][ 18.759178] scsi 1:0:122:0: SSP: handle(0x011a), sas_addr(0x5000cca2525fb4bd), phy(50), device_name(0x5000cca2525fb4bf) [Mon Dec 9 06:17:23 2019][ 18.769948] scsi 1:0:122:0: enclosure logical id(0x5000ccab04037180), slot(10) [Mon Dec 9 06:17:24 2019][ 18.777255] scsi 1:0:122:0: enclosure level(0x0000), connector name( C2 ) [Mon Dec 9 06:17:24 2019][ 18.784131] scsi 1:0:122:0: serial_number( 7SHPMP5W) [Mon Dec 9 06:17:24 2019][ 18.789703] scsi 1:0:122:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Mon Dec 9 06:17:24 2019][ 18.815750] mpt3sas_cm0: expander_add: handle(0x0017), parent(0x0009), sas_addr(0x5000ccab0405db7d), phys(49) [Mon Dec 9 06:17:24 2019][ 18.836307] mpt3sas_cm0: detecting: handle(0x001b), sas_address(0x5000ccab0405db7c), phy(48) [Mon Dec 9 06:17:24 2019][ 18.844749] mpt3sas_cm0: REPORT_LUNS: handle(0x001b), retries(0) [Mon Dec 9 06:17:24 2019][ 18.851187] mpt3sas_cm0: TEST_UNIT_READY: handle(0x001b), lun(0) [Mon Dec 9 06:17:24 2019][ 18.858040] scsi 1:0:123:0: Enclosure HGST H4060-J 2033 PQ: 0 ANSI: 6 [Mon Dec 9 06:17:24 2019][ 18.866618] scsi 1:0:123:0: set ignore_delay_remove for handle(0x001b) [Mon Dec 9 06:17:24 2019][ 18.873146] scsi 1:0:123:0: SES: handle(0x001b), sas_addr(0x5000ccab0405db7c), phy(48), device_name(0x0000000000000000) [Mon Dec 9 06:17:24 2019][ 18.883917] scsi 1:0:123:0: enclosure logical id(0x5000ccab0405db00), slot(60) [Mon Dec 9 06:17:24 2019][ 18.891224] scsi 1:0:123:0: enclosure level(0x0000), connector name( C1 ) [Mon Dec 9 06:17:24 2019][ 18.898118] scsi 1:0:123:0: serial_number(USWSJ03918EZ0069 ) [Mon Dec 9 06:17:24 2019][ 18.904038] scsi 1:0:123:0: qdepth(1), tagged(0), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Mon Dec 9 06:17:24 2019][ 18.928605] mpt3sas_cm0: expander_add: handle(0x0019), parent(0x0017), sas_addr(0x5000ccab0405db79), phys(68) [Mon Dec 9 06:17:24 2019][ 18.949726] mpt3sas_cm0: detecting: handle(0x001c), sas_address(0x5000cca252550a76), phy(0) [Mon Dec 9 06:17:24 2019][ 18.958079] mpt3sas_cm0: REPORT_LUNS: handle(0x001c), retries(0) [Mon Dec 9 06:17:24 2019][ 18.964939] mpt3sas_cm0: TEST_UNIT_READY: handle(0x001c), lun(0) [Mon Dec 9 06:17:24 2019][ 18.972854] scsi 1:0:124:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Mon Dec 9 06:17:24 2019][ 18.981251] scsi 1:0:124:0: SSP: handle(0x001c), sas_addr(0x5000cca252550a76), phy(0), device_name(0x5000cca252550a77) [Mon Dec 9 06:17:24 2019][ 18.991933] scsi 1:0:124:0: enclosure logical id(0x5000ccab0405db00), slot(0) [Mon Dec 9 06:17:24 2019][ 18.999152] scsi 1:0:124:0: enclosure level(0x0000), connector name( C1 ) [Mon Dec 9 06:17:24 2019][ 19.006043] scsi 1:0:124:0: serial_number( 7SHHSVGG) [Mon Dec 9 06:17:24 2019][ 19.011620] scsi 1:0:124:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Mon Dec 9 06:17:24 2019][ 19.034230] mpt3sas_cm0: detecting: handle(0x001d), sas_address(0x5000cca25253eb32), phy(1) [Mon Dec 9 06:17:24 2019][ 19.042578] mpt3sas_cm0: REPORT_LUNS: handle(0x001d), retries(0) [Mon Dec 9 06:17:24 2019][ 19.048713] mpt3sas_cm0: TEST_UNIT_READY: handle(0x001d), lun(0) [Mon Dec 9 06:17:24 2019][ 19.055363] scsi 1:0:125:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Mon Dec 9 06:17:24 2019][ 19.063753] scsi 1:0:125:0: SSP: handle(0x001d), sas_addr(0x5000cca25253eb32), phy(1), device_name(0x5000cca25253eb33) [Mon Dec 9 06:17:24 2019][ 19.074441] scsi 1:0:125:0: enclosure logical id(0x5000ccab0405db00), slot(2) [Mon Dec 9 
06:17:24 2019][ 19.081661] scsi 1:0:125:0: enclosure level(0x0000), connector name( C1 ) [Mon Dec 9 06:17:24 2019][ 19.088554] scsi 1:0:125:0: serial_number( 7SHH4RDG) [Mon Dec 9 06:17:24 2019][ 19.094127] scsi 1:0:125:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Mon Dec 9 06:17:24 2019][ 19.114232] mpt3sas_cm0: detecting: handle(0x001e), sas_address(0x5000cca26b950bb6), phy(2) [Mon Dec 9 06:17:24 2019][ 19.122582] mpt3sas_cm0: REPORT_LUNS: handle(0x001e), retries(0) [Mon Dec 9 06:17:24 2019][ 19.128741] mpt3sas_cm0: TEST_UNIT_READY: handle(0x001e), lun(0) [Mon Dec 9 06:17:24 2019][ 19.135446] scsi 1:0:126:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Mon Dec 9 06:17:24 2019][ 19.146152] scsi 1:0:126:0: SSP: handle(0x001e), sas_addr(0x5000cca26b950bb6), phy(2), device_name(0x5000cca26b950bb7) [Mon Dec 9 06:17:24 2019][ 19.156839] scsi 1:0:126:0: enclosure logical id(0x5000ccab0405db00), slot(11) [Mon Dec 9 06:17:24 2019][ 19.164144] scsi 1:0:126:0: enclosure level(0x0000), connector name( C1 ) [Mon Dec 9 06:17:24 2019][ 19.171038] scsi 1:0:126:0: serial_number( 1SJMZ22Z) [Mon Dec 9 06:17:24 2019][ 19.176611] scsi 1:0:126:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Mon Dec 9 06:17:24 2019][ 19.199239] mpt3sas_cm0: detecting: handle(0x001f), sas_address(0x5000cca25253f3be), phy(3) [Mon Dec 9 06:17:24 2019][ 19.207591] mpt3sas_cm0: REPORT_LUNS: handle(0x001f), retries(0) [Mon Dec 9 06:17:24 2019][ 19.213727] mpt3sas_cm0: TEST_UNIT_READY: handle(0x001f), lun(0) [Mon Dec 9 06:17:24 2019][ 19.234740] scsi 1:0:127:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Mon Dec 9 06:17:24 2019][ 19.243128] scsi 1:0:127:0: SSP: handle(0x001f), sas_addr(0x5000cca25253f3be), phy(3), device_name(0x5000cca25253f3bf) [Mon Dec 9 06:17:24 2019][ 19.253813] scsi 1:0:127:0: enclosure logical id(0x5000ccab0405db00), slot(12) [Mon Dec 9 06:17:24 2019][ 19.261117] scsi 1:0:127:0: enclosure level(0x0000), connector name( C1 ) [Mon Dec 9 06:17:24 2019][ 19.267995] scsi 1:0:127:0: serial_number( 7SHH591G) [Mon Dec 9 06:17:24 2019][ 19.273567] scsi 1:0:127:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Mon Dec 9 06:17:24 2019][ 19.312240] mpt3sas_cm0: detecting: handle(0x0020), sas_address(0x5000cca26a2ac3da), phy(4) [Mon Dec 9 06:17:24 2019][ 19.320593] mpt3sas_cm0: REPORT_LUNS: handle(0x0020), retries(0) [Mon Dec 9 06:17:24 2019][ 19.326724] mpt3sas_cm0: TEST_UNIT_READY: handle(0x0020), lun(0) [Mon Dec 9 06:17:24 2019][ 19.333344] scsi 1:0:128:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Mon Dec 9 06:17:24 2019][ 19.341734] scsi 1:0:128:0: SSP: handle(0x0020), sas_addr(0x5000cca26a2ac3da), phy(4), device_name(0x5000cca26a2ac3db) [Mon Dec 9 06:17:24 2019][ 19.352423] scsi 1:0:128:0: enclosure logical id(0x5000ccab0405db00), slot(13) [Mon Dec 9 06:17:24 2019][ 19.359730] scsi 1:0:128:0: enclosure level(0x0000), connector name( C1 ) [Mon Dec 9 06:17:24 2019][ 19.366621] scsi 1:0:128:0: serial_number( 2TGSJ30D) [Mon Dec 9 06:17:24 2019][ 19.372194] scsi 1:0:128:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Mon Dec 9 06:17:24 2019][ 19.395241] mpt3sas_cm0: detecting: handle(0x0021), sas_address(0x5000cca25254102a), phy(5) [Mon Dec 9 06:17:24 2019][ 19.403588] mpt3sas_cm0: REPORT_LUNS: handle(0x0021), retries(0) [Mon Dec 9 06:17:24 2019][ 19.409719] mpt3sas_cm0: TEST_UNIT_READY: handle(0x0021), lun(0) [Mon Dec 9 06:17:24 2019][ 19.416340] scsi 1:0:129:0: Direct-Access 
HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Mon Dec 9 06:17:24 2019][ 19.424743] scsi 1:0:129:0: SSP: handle(0x0021), sas_addr(0x5000cca25254102a), phy(5), device_name(0x5000cca25254102b) [Mon Dec 9 06:17:24 2019][ 19.435425] scsi 1:0:129:0: enclosure logical id(0x5000ccab0405db00), slot(14) [Mon Dec 9 06:17:24 2019][ 19.442731] scsi 1:0:129:0: enclosure level(0x0000), connector name( C1 ) [Mon Dec 9 06:17:24 2019][ 19.449623] scsi 1:0:129:0: serial_number( 7SHH75RG) [Mon Dec 9 06:17:24 2019][ 19.455197] scsi 1:0:129:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Mon Dec 9 06:17:24 2019][ 19.475280] mpt3sas_cm0: detecting: handle(0x0022), sas_address(0x5000cca25254534a), phy(6) [Mon Dec 9 06:17:24 2019][ 19.483627] mpt3sas_cm0: REPORT_LUNS: handle(0x0022), retries(0) [Mon Dec 9 06:17:24 2019][ 19.489787] mpt3sas_cm0: TEST_UNIT_READY: handle(0x0022), lun(0) [Mon Dec 9 06:17:24 2019][ 19.496536] scsi 1:0:130:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Mon Dec 9 06:17:24 2019][ 19.504934] scsi 1:0:130:0: SSP: handle(0x0022), sas_addr(0x5000cca25254534a), phy(6), device_name(0x5000cca25254534b) [Mon Dec 9 06:17:24 2019][ 19.515621] scsi 1:0:130:0: enclosure logical id(0x5000ccab0405db00), slot(15) [Mon Dec 9 06:17:24 2019][ 19.522928] scsi 1:0:130:0: enclosure level(0x0000), connector name( C1 ) [Mon Dec 9 06:17:24 2019][ 19.529817] scsi 1:0:130:0: serial_number( 7SHHBN9G) [Mon Dec 9 06:17:24 2019][ 19.535394] scsi 1:0:130:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Mon Dec 9 06:17:24 2019][ 19.555247] mpt3sas_cm0: detecting: handle(0x0023), sas_address(0x5000cca2525430c6), phy(7) [Mon Dec 9 06:17:24 2019][ 19.563598] mpt3sas_cm0: REPORT_LUNS: handle(0x0023), retries(0) [Mon Dec 9 06:17:24 2019][ 19.569746] mpt3sas_cm0: TEST_UNIT_READY: handle(0x0023), lun(0) [Mon Dec 9 06:17:24 2019][ 19.595611] scsi 1:0:131:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Mon Dec 9 06:17:24 2019][ 19.605795] scsi 1:0:131:0: SSP: handle(0x0023), sas_addr(0x5000cca2525430c6), phy(7), device_name(0x5000cca2525430c7) [Mon Dec 9 06:17:24 2019][ 19.616477] scsi 1:0:131:0: enclosure logical id(0x5000ccab0405db00), slot(16) [Mon Dec 9 06:17:24 2019][ 19.623784] scsi 1:0:131:0: enclosure level(0x0000), connector name( C1 ) [Mon Dec 9 06:17:24 2019][ 19.630661] scsi 1:0:131:0: serial_number( 7SHH9B1G) [Mon Dec 9 06:17:24 2019][ 19.636232] scsi 1:0:131:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Mon Dec 9 06:17:24 2019][ 19.683250] mpt3sas_cm0: detecting: handle(0x0024), sas_address(0x5000cca25254385e), phy(8) [Mon Dec 9 06:17:24 2019][ 19.691599] mpt3sas_cm0: REPORT_LUNS: handle(0x0024), retries(0) [Mon Dec 9 06:17:24 2019][ 19.697761] mpt3sas_cm0: TEST_UNIT_READY: handle(0x0024), lun(0) [Mon Dec 9 06:17:24 2019][ 19.705623] scsi 1:0:132:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Mon Dec 9 06:17:24 2019][ 19.716476] scsi 1:0:132:0: SSP: handle(0x0024), sas_addr(0x5000cca25254385e), phy(8), device_name(0x5000cca25254385f) [Mon Dec 9 06:17:24 2019][ 19.727164] scsi 1:0:132:0: enclosure logical id(0x5000ccab0405db00), slot(17) [Mon Dec 9 06:17:24 2019][ 19.734468] scsi 1:0:132:0: enclosure level(0x0000), connector name( C1 ) [Mon Dec 9 06:17:24 2019][ 19.741362] scsi 1:0:132:0: serial_number( 7SHH9VRG) [Mon Dec 9 06:17:24 2019][ 19.746935] scsi 1:0:132:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Mon Dec 9 06:17:24 2019][ 19.794295] mpt3sas_cm0: detecting: 
handle(0x0025), sas_address(0x5000cca25253f30e), phy(9) [Mon Dec 9 06:17:25 2019][ 19.802648] mpt3sas_cm0: REPORT_LUNS: handle(0x0025), retries(0) [Mon Dec 9 06:17:25 2019][ 19.808792] mpt3sas_cm0: TEST_UNIT_READY: handle(0x0025), lun(0) [Mon Dec 9 06:17:25 2019][ 19.815553] scsi 1:0:133:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Mon Dec 9 06:17:25 2019][ 19.823953] scsi 1:0:133:0: SSP: handle(0x0025), sas_addr(0x5000cca25253f30e), phy(9), device_name(0x5000cca25253f30f) [Mon Dec 9 06:17:25 2019][ 19.834640] scsi 1:0:133:0: enclosure logical id(0x5000ccab0405db00), slot(18) [Mon Dec 9 06:17:25 2019][ 19.841948] scsi 1:0:133:0: enclosure level(0x0000), connector name( C1 ) [Mon Dec 9 06:17:25 2019][ 19.848839] scsi 1:0:133:0: serial_number( 7SHH57MG) [Mon Dec 9 06:17:25 2019][ 19.854412] scsi 1:0:133:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Mon Dec 9 06:17:25 2019][ 19.874253] mpt3sas_cm0: detecting: handle(0x0026), sas_address(0x5000cca252545f66), phy(10) [Mon Dec 9 06:17:25 2019][ 19.882694] mpt3sas_cm0: REPORT_LUNS: handle(0x0026), retries(0) [Mon Dec 9 06:17:25 2019][ 19.888831] mpt3sas_cm0: TEST_UNIT_READY: handle(0x0026), lun(0) [Mon Dec 9 06:17:25 2019][ 19.895602] scsi 1:0:134:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Mon Dec 9 06:17:25 2019][ 19.904210] scsi 1:0:134:0: SSP: handle(0x0026), sas_addr(0x5000cca252545f66), phy(10), device_name(0x5000cca252545f67) [Mon Dec 9 06:17:25 2019][ 19.914981] scsi 1:0:134:0: enclosure logical id(0x5000ccab0405db00), slot(19) [Mon Dec 9 06:17:25 2019][ 19.922289] scsi 1:0:134:0: enclosure level(0x0000), connector name( C1 ) [Mon Dec 9 06:17:25 2019][ 19.929183] scsi 1:0:134:0: serial_number( 7SHHDG9G) [Mon Dec 9 06:17:25 2019][ 19.934755] scsi 1:0:134:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Mon Dec 9 06:17:25 2019][ 19.963256] mpt3sas_cm0: detecting: handle(0x0027), sas_address(0x5000cca266daa4e6), phy(11) [Mon Dec 9 06:17:25 2019][ 19.971695] mpt3sas_cm0: REPORT_LUNS: handle(0x0027), retries(0) [Mon Dec 9 06:17:25 2019][ 19.977839] mpt3sas_cm0: TEST_UNIT_READY: handle(0x0027), lun(0) [Mon Dec 9 06:17:25 2019][ 19.984445] scsi 1:0:135:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Mon Dec 9 06:17:25 2019][ 19.992836] scsi 1:0:135:0: SSP: handle(0x0027), sas_addr(0x5000cca266daa4e6), phy(11), device_name(0x5000cca266daa4e7) [Mon Dec 9 06:17:25 2019][ 20.003610] scsi 1:0:135:0: enclosure logical id(0x5000ccab0405db00), slot(20) [Mon Dec 9 06:17:25 2019][ 20.010917] scsi 1:0:135:0: enclosure level(0x0000), connector name( C1 ) [Mon Dec 9 06:17:25 2019][ 20.017807] scsi 1:0:135:0: serial_number( 7JKW7MYK) [Mon Dec 9 06:17:25 2019][ 20.023381] scsi 1:0:135:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Mon Dec 9 06:17:25 2019][ 20.043254] mpt3sas_cm0: detecting: handle(0x0028), sas_address(0x5000cca26a25167e), phy(12) [Mon Dec 9 06:17:25 2019][ 20.051689] mpt3sas_cm0: REPORT_LUNS: handle(0x0028), retries(0) [Mon Dec 9 06:17:25 2019][ 20.057850] mpt3sas_cm0: TEST_UNIT_READY: handle(0x0028), lun(0) [Mon Dec 9 06:17:25 2019][ 20.064457] scsi 1:0:136:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Mon Dec 9 06:17:25 2019][ 20.072850] scsi 1:0:136:0: SSP: handle(0x0028), sas_addr(0x5000cca26a25167e), phy(12), device_name(0x5000cca26a25167f) [Mon Dec 9 06:17:25 2019][ 20.083621] scsi 1:0:136:0: enclosure logical id(0x5000ccab0405db00), slot(21) [Mon Dec 9 06:17:25 2019][ 20.090928] scsi 1:0:136:0: 
enclosure level(0x0000), connector name( C1 ) [Mon Dec 9 06:17:25 2019][ 20.097820] scsi 1:0:136:0: serial_number( 2TGND9JD) [Mon Dec 9 06:17:25 2019][ 20.103394] scsi 1:0:136:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Mon Dec 9 06:17:25 2019][ 20.123257] mpt3sas_cm0: detecting: handle(0x0029), sas_address(0x5000cca25253edaa), phy(13) [Mon Dec 9 06:17:25 2019][ 20.131694] mpt3sas_cm0: REPORT_LUNS: handle(0x0029), retries(0) [Mon Dec 9 06:17:25 2019][ 20.137823] mpt3sas_cm0: TEST_UNIT_READY: handle(0x0029), lun(0) [Mon Dec 9 06:17:25 2019][ 20.144426] scsi 1:0:137:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Mon Dec 9 06:17:25 2019][ 20.152818] scsi 1:0:137:0: SSP: handle(0x0029), sas_addr(0x5000cca25253edaa), phy(13), device_name(0x5000cca25253edab) [Mon Dec 9 06:17:25 2019][ 20.163593] scsi 1:0:137:0: enclosure logical id(0x5000ccab0405db00), slot(22) [Mon Dec 9 06:17:25 2019][ 20.170899] scsi 1:0:137:0: enclosure level(0x0000), connector name( C1 ) [Mon Dec 9 06:17:25 2019][ 20.177788] scsi 1:0:137:0: serial_number( 7SHH4WHG) [Mon Dec 9 06:17:25 2019][ 20.183365] scsi 1:0:137:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Mon Dec 9 06:17:25 2019][ 20.204257] mpt3sas_cm0: detecting: handle(0x002a), sas_address(0x5000cca266d491a2), phy(14) [Mon Dec 9 06:17:25 2019][ 20.212694] mpt3sas_cm0: REPORT_LUNS: handle(0x002a), retries(0) [Mon Dec 9 06:17:25 2019][ 20.218822] mpt3sas_cm0: TEST_UNIT_READY: handle(0x002a), lun(0) [Mon Dec 9 06:17:25 2019][ 20.225451] scsi 1:0:138:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Mon Dec 9 06:17:25 2019][ 20.233835] scsi 1:0:138:0: SSP: handle(0x002a), sas_addr(0x5000cca266d491a2), phy(14), device_name(0x5000cca266d491a3) [Mon Dec 9 06:17:25 2019][ 20.244609] scsi 1:0:138:0: enclosure logical id(0x5000ccab0405db00), slot(23) [Mon Dec 9 06:17:25 2019][ 20.251916] scsi 1:0:138:0: enclosure level(0x0000), connector name( C1 ) [Mon Dec 9 06:17:25 2019][ 20.258806] scsi 1:0:138:0: serial_number( 7JKSX22K) [Mon Dec 9 06:17:25 2019][ 20.264383] scsi 1:0:138:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Mon Dec 9 06:17:25 2019][ 20.284260] mpt3sas_cm0: detecting: handle(0x002b), sas_address(0x5000cca26b9a709a), phy(15) [Mon Dec 9 06:17:25 2019][ 20.292697] mpt3sas_cm0: REPORT_LUNS: handle(0x002b), retries(0) [Mon Dec 9 06:17:25 2019][ 20.298833] mpt3sas_cm0: TEST_UNIT_READY: handle(0x002b), lun(0) [Mon Dec 9 06:17:25 2019][ 20.468617] scsi 1:0:139:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Mon Dec 9 06:17:25 2019][ 20.477001] scsi 1:0:139:0: SSP: handle(0x002b), sas_addr(0x5000cca26b9a709a), phy(15), device_name(0x5000cca26b9a709b) [Mon Dec 9 06:17:25 2019][ 20.487778] scsi 1:0:139:0: enclosure logical id(0x5000ccab0405db00), slot(24) [Mon Dec 9 06:17:25 2019][ 20.495082] scsi 1:0:139:0: enclosure level(0x0000), connector name( C1 ) [Mon Dec 9 06:17:25 2019][ 20.501974] scsi 1:0:139:0: serial_number( 1SJRY0YZ) [Mon Dec 9 06:17:25 2019][ 20.507547] scsi 1:0:139:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Mon Dec 9 06:17:25 2019][ 20.533272] mpt3sas_cm0: detecting: handle(0x002c), sas_address(0x5000cca25253f832), phy(16) [Mon Dec 9 06:17:25 2019][ 20.541707] mpt3sas_cm0: REPORT_LUNS: handle(0x002c), retries(0) [Mon Dec 9 06:17:25 2019][ 20.548005] mpt3sas_cm0: TEST_UNIT_READY: handle(0x002c), lun(0) [Mon Dec 9 06:17:25 2019][ 20.728121] scsi 1:0:140:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 
ANSI: 6 [Mon Dec 9 06:17:25 2019][ 20.736500] scsi 1:0:140:0: SSP: handle(0x002c), sas_addr(0x5000cca25253f832), phy(16), device_name(0x5000cca25253f833) [Mon Dec 9 06:17:25 2019][ 20.747272] scsi 1:0:140:0: enclosure logical id(0x5000ccab0405db00), slot(25) [Mon Dec 9 06:17:25 2019][ 20.754578] scsi 1:0:140:0: enclosure level(0x0000), connector name( C1 ) [Mon Dec 9 06:17:25 2019][ 20.761455] scsi 1:0:140:0: serial_number( 7SHH5L7G) [Mon Dec 9 06:17:25 2019][ 20.767025] scsi 1:0:140:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Mon Dec 9 06:17:26 2019][ 20.787273] mpt3sas_cm0: detecting: handle(0x002d), sas_address(0x5000cca26a2ab23e), phy(17) [Mon Dec 9 06:17:26 2019][ 20.795705] mpt3sas_cm0: REPORT_LUNS: handle(0x002d), retries(0) [Mon Dec 9 06:17:26 2019][ 20.801867] mpt3sas_cm0: TEST_UNIT_READY: handle(0x002d), lun(0) [Mon Dec 9 06:17:26 2019][ 20.808471] scsi 1:0:141:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Mon Dec 9 06:17:26 2019][ 20.816856] scsi 1:0:141:0: SSP: handle(0x002d), sas_addr(0x5000cca26a2ab23e), phy(17), device_name(0x5000cca26a2ab23f) [Mon Dec 9 06:17:26 2019][ 20.827632] scsi 1:0:141:0: enclosure logical id(0x5000ccab0405db00), slot(26) [Mon Dec 9 06:17:26 2019][ 20.834937] scsi 1:0:141:0: enclosure level(0x0000), connector name( C1 ) [Mon Dec 9 06:17:26 2019][ 20.841828] scsi 1:0:141:0: serial_number( 2TGSGXND) [Mon Dec 9 06:17:26 2019][ 20.847404] scsi 1:0:141:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Mon Dec 9 06:17:26 2019][ 20.867275] mpt3sas_cm0: detecting: handle(0x002e), sas_address(0x5000cca26b9b9696), phy(18) [Mon Dec 9 06:17:26 2019][ 20.875710] mpt3sas_cm0: REPORT_LUNS: handle(0x002e), retries(0) [Mon Dec 9 06:17:26 2019][ 20.881974] mpt3sas_cm0: TEST_UNIT_READY: handle(0x002e), lun(0) [Mon Dec 9 06:17:26 2019][ 20.889131] scsi 1:0:142:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Mon Dec 9 06:17:26 2019][ 20.897525] scsi 1:0:142:0: SSP: handle(0x002e), sas_addr(0x5000cca26b9b9696), phy(18), device_name(0x5000cca26b9b9697) [Mon Dec 9 06:17:26 2019][ 20.908294] scsi 1:0:142:0: enclosure logical id(0x5000ccab0405db00), slot(27) [Mon Dec 9 06:17:26 2019][ 20.915601] scsi 1:0:142:0: enclosure level(0x0000), connector name( C1 ) [Mon Dec 9 06:17:26 2019][ 20.922492] scsi 1:0:142:0: serial_number( 1SJSKLWZ) [Mon Dec 9 06:17:26 2019][ 20.928065] scsi 1:0:142:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Mon Dec 9 06:17:26 2019][ 20.948277] mpt3sas_cm0: detecting: handle(0x002f), sas_address(0x5000cca252559472), phy(19) [Mon Dec 9 06:17:26 2019][ 20.956713] mpt3sas_cm0: REPORT_LUNS: handle(0x002f), retries(0) [Mon Dec 9 06:17:26 2019][ 20.962861] mpt3sas_cm0: TEST_UNIT_READY: handle(0x002f), lun(0) [Mon Dec 9 06:17:26 2019][ 20.975815] scsi 1:0:143:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Mon Dec 9 06:17:26 2019][ 20.984197] scsi 1:0:143:0: SSP: handle(0x002f), sas_addr(0x5000cca252559472), phy(19), device_name(0x5000cca252559473) [Mon Dec 9 06:17:26 2019][ 20.994972] scsi 1:0:143:0: enclosure logical id(0x5000ccab0405db00), slot(28) [Mon Dec 9 06:17:26 2019][ 21.002277] scsi 1:0:143:0: enclosure level(0x0000), connector name( C1 ) [Mon Dec 9 06:17:26 2019][ 21.009156] scsi 1:0:143:0: serial_number( 7SHJ21AG) [Mon Dec 9 06:17:26 2019][ 21.014735] scsi 1:0:143:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Mon Dec 9 06:17:26 2019][ 21.046290] mpt3sas_cm0: detecting: handle(0x0030), 
sas_address(0x5000cca25253f94e), phy(20) [Mon Dec 9 06:17:26 2019][ 21.054727] mpt3sas_cm0: REPORT_LUNS: handle(0x0030), retries(0) [Mon Dec 9 06:17:26 2019][ 21.060874] mpt3sas_cm0: TEST_UNIT_READY: handle(0x0030), lun(0) [Mon Dec 9 06:17:26 2019][ 21.067593] scsi 1:0:144:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Mon Dec 9 06:17:26 2019][ 21.077404] scsi 1:0:144:0: SSP: handle(0x0030), sas_addr(0x5000cca25253f94e), phy(20), device_name(0x5000cca25253f94f) [Mon Dec 9 06:17:26 2019][ 21.088177] scsi 1:0:144:0: enclosure logical id(0x5000ccab0405db00), slot(29) [Mon Dec 9 06:17:26 2019][ 21.095481] scsi 1:0:144:0: enclosure level(0x0000), connector name( C1 ) [Mon Dec 9 06:17:26 2019][ 21.102376] scsi 1:0:144:0: serial_number( 7SHH5NJG) [Mon Dec 9 06:17:26 2019][ 21.107947] scsi 1:0:144:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Mon Dec 9 06:17:26 2019][ 21.138284] mpt3sas_cm0: detecting: handle(0x0031), sas_address(0x5000cca25253e69a), phy(21) [Mon Dec 9 06:17:26 2019][ 21.146726] mpt3sas_cm0: REPORT_LUNS: handle(0x0031), retries(0) [Mon Dec 9 06:17:26 2019][ 21.152865] mpt3sas_cm0: TEST_UNIT_READY: handle(0x0031), lun(0) [Mon Dec 9 06:17:26 2019][ 21.162001] scsi 1:0:145:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Mon Dec 9 06:17:26 2019][ 21.170392] scsi 1:0:145:0: SSP: handle(0x0031), sas_addr(0x5000cca25253e69a), phy(21), device_name(0x5000cca25253e69b) [Mon Dec 9 06:17:26 2019][ 21.181164] scsi 1:0:145:0: enclosure logical id(0x5000ccab0405db00), slot(30) [Mon Dec 9 06:17:26 2019][ 21.188468] scsi 1:0:145:0: enclosure level(0x0000), connector name( C1 ) [Mon Dec 9 06:17:26 2019][ 21.195359] scsi 1:0:145:0: serial_number( 7SHH4DXG) [Mon Dec 9 06:17:26 2019][ 21.200935] scsi 1:0:145:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Mon Dec 9 06:17:26 2019][ 21.231883] mpt3sas_cm0: detecting: handle(0x0032), sas_address(0x5000cca252543cc2), phy(22) [Mon Dec 9 06:17:26 2019][ 21.240318] mpt3sas_cm0: REPORT_LUNS: handle(0x0032), retries(0) [Mon Dec 9 06:17:26 2019][ 21.246449] mpt3sas_cm0: TEST_UNIT_READY: handle(0x0032), lun(0) [Mon Dec 9 06:17:26 2019][ 21.253048] scsi 1:0:146:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Mon Dec 9 06:17:26 2019][ 21.261436] scsi 1:0:146:0: SSP: handle(0x0032), sas_addr(0x5000cca252543cc2), phy(22), device_name(0x5000cca252543cc3) [Mon Dec 9 06:17:26 2019][ 21.272208] scsi 1:0:146:0: enclosure logical id(0x5000ccab0405db00), slot(31) [Mon Dec 9 06:17:26 2019][ 21.279514] scsi 1:0:146:0: enclosure level(0x0000), connector name( C1 ) [Mon Dec 9 06:17:26 2019][ 21.286404] scsi 1:0:146:0: serial_number( 7SHHA4TG) [Mon Dec 9 06:17:26 2019][ 21.291980] scsi 1:0:146:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Mon Dec 9 06:17:26 2019][ 21.322287] mpt3sas_cm0: detecting: handle(0x0033), sas_address(0x5000cca26a24fcde), phy(23) [Mon Dec 9 06:17:26 2019][ 21.330737] mpt3sas_cm0: REPORT_LUNS: handle(0x0033), retries(0) [Mon Dec 9 06:17:26 2019][ 21.336859] mpt3sas_cm0: TEST_UNIT_READY: handle(0x0033), lun(0) [Mon Dec 9 06:17:26 2019][ 21.343466] scsi 1:0:147:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Mon Dec 9 06:17:26 2019][ 21.351852] scsi 1:0:147:0: SSP: handle(0x0033), sas_addr(0x5000cca26a24fcde), phy(23), device_name(0x5000cca26a24fcdf) [Mon Dec 9 06:17:26 2019][ 21.362622] scsi 1:0:147:0: enclosure logical id(0x5000ccab0405db00), slot(32) [Mon Dec 9 06:17:26 2019][ 21.369927] scsi 1:0:147:0: enclosure 
level(0x0000), connector name( C1 ) [Mon Dec 9 06:17:26 2019][ 21.376819] scsi 1:0:147:0: serial_number( 2TGNALMD) [Mon Dec 9 06:17:26 2019][ 21.382393] scsi 1:0:147:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Mon Dec 9 06:17:26 2019][ 21.402285] mpt3sas_cm0: detecting: handle(0x0034), sas_address(0x5000cca252543bce), phy(24) [Mon Dec 9 06:17:26 2019][ 21.410725] mpt3sas_cm0: REPORT_LUNS: handle(0x0034), retries(0) [Mon Dec 9 06:17:26 2019][ 21.416860] mpt3sas_cm0: TEST_UNIT_READY: handle(0x0034), lun(0) [Mon Dec 9 06:17:26 2019][ 21.423461] scsi 1:0:148:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Mon Dec 9 06:17:26 2019][ 21.431850] scsi 1:0:148:0: SSP: handle(0x0034), sas_addr(0x5000cca252543bce), phy(24), device_name(0x5000cca252543bcf) [Mon Dec 9 06:17:26 2019][ 21.442625] scsi 1:0:148:0: enclosure logical id(0x5000ccab0405db00), slot(33) [Mon Dec 9 06:17:26 2019][ 21.449932] scsi 1:0:148:0: enclosure level(0x0000), connector name( C1 ) [Mon Dec 9 06:17:26 2019][ 21.456822] scsi 1:0:148:0: serial_number( 7SHHA2UG) [Mon Dec 9 06:17:26 2019][ 21.462396] scsi 1:0:148:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Mon Dec 9 06:17:26 2019][ 21.482289] mpt3sas_cm0: detecting: handle(0x0035), sas_address(0x5000cca252551266), phy(25) [Mon Dec 9 06:17:26 2019][ 21.490723] mpt3sas_cm0: REPORT_LUNS: handle(0x0035), retries(0) [Mon Dec 9 06:17:26 2019][ 21.496861] mpt3sas_cm0: TEST_UNIT_READY: handle(0x0035), lun(0) [Mon Dec 9 06:17:26 2019][ 21.505027] scsi 1:0:149:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Mon Dec 9 06:17:26 2019][ 21.513524] scsi 1:0:149:0: SSP: handle(0x0035), sas_addr(0x5000cca252551266), phy(25), device_name(0x5000cca252551267) [Mon Dec 9 06:17:26 2019][ 21.524294] scsi 1:0:149:0: enclosure logical id(0x5000ccab0405db00), slot(34) [Mon Dec 9 06:17:26 2019][ 21.531599] scsi 1:0:149:0: enclosure level(0x0000), connector name( C1 ) [Mon Dec 9 06:17:26 2019][ 21.538491] scsi 1:0:149:0: serial_number( 7SHHTBVG) [Mon Dec 9 06:17:26 2019][ 21.544064] scsi 1:0:149:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Mon Dec 9 06:17:26 2019][ 21.572298] mpt3sas_cm0: detecting: handle(0x0036), sas_address(0x5000cca252555fca), phy(26) [Mon Dec 9 06:17:26 2019][ 21.580737] mpt3sas_cm0: REPORT_LUNS: handle(0x0036), retries(0) [Mon Dec 9 06:17:26 2019][ 21.586878] mpt3sas_cm0: TEST_UNIT_READY: handle(0x0036), lun(0) [Mon Dec 9 06:17:26 2019][ 21.620949] scsi 1:0:150:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Mon Dec 9 06:17:26 2019][ 21.629344] scsi 1:0:150:0: SSP: handle(0x0036), sas_addr(0x5000cca252555fca), phy(26), device_name(0x5000cca252555fcb) [Mon Dec 9 06:17:26 2019][ 21.640116] scsi 1:0:150:0: enclosure logical id(0x5000ccab0405db00), slot(35) [Mon Dec 9 06:17:26 2019][ 21.647423] scsi 1:0:150:0: enclosure level(0x0000), connector name( C1 ) [Mon Dec 9 06:17:26 2019][ 21.654315] scsi 1:0:150:0: serial_number( 7SHHYJMG) [Mon Dec 9 06:17:26 2019][ 21.659889] scsi 1:0:150:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Mon Dec 9 06:17:26 2019][ 21.698909] mpt3sas_cm0: detecting: handle(0x0037), sas_address(0x5000cca252559f7e), phy(27) [Mon Dec 9 06:17:26 2019][ 21.707342] mpt3sas_cm0: REPORT_LUNS: handle(0x0037), retries(0) [Mon Dec 9 06:17:26 2019][ 21.713482] mpt3sas_cm0: TEST_UNIT_READY: handle(0x0037), lun(0) [Mon Dec 9 06:17:26 2019][ 21.720293] scsi 1:0:151:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Mon 
Dec 9 06:17:26 2019][ 21.733357] scsi 1:0:151:0: SSP: handle(0x0037), sas_addr(0x5000cca252559f7e), phy(27), device_name(0x5000cca252559f7f) [Mon Dec 9 06:17:26 2019][ 21.744130] scsi 1:0:151:0: enclosure logical id(0x5000ccab0405db00), slot(36) [Mon Dec 9 06:17:26 2019][ 21.751434] scsi 1:0:151:0: enclosure level(0x0000), connector name( C1 ) [Mon Dec 9 06:17:26 2019][ 21.758326] scsi 1:0:151:0: serial_number( 7SHJ2T4G) [Mon Dec 9 06:17:26 2019][ 21.763899] scsi 1:0:151:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Mon Dec 9 06:17:27 2019][ 21.786332] mpt3sas_cm0: detecting: handle(0x0038), sas_address(0x5000cca26c244bce), phy(28) [Mon Dec 9 06:17:27 2019][ 21.794772] mpt3sas_cm0: REPORT_LUNS: handle(0x0038), retries(0) [Mon Dec 9 06:17:27 2019][ 21.800904] mpt3sas_cm0: TEST_UNIT_READY: handle(0x0038), lun(0) [Mon Dec 9 06:17:27 2019][ 21.807711] scsi 1:0:152:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Mon Dec 9 06:17:27 2019][ 21.816098] scsi 1:0:152:0: SSP: handle(0x0038), sas_addr(0x5000cca26c244bce), phy(28), device_name(0x5000cca26c244bcf) [Mon Dec 9 06:17:27 2019][ 21.826870] scsi 1:0:152:0: enclosure logical id(0x5000ccab0405db00), slot(37) [Mon Dec 9 06:17:27 2019][ 21.834177] scsi 1:0:152:0: enclosure level(0x0000), connector name( C1 ) [Mon Dec 9 06:17:27 2019][ 21.841068] scsi 1:0:152:0: serial_number( 1DGMYU2Z) [Mon Dec 9 06:17:27 2019][ 21.846644] scsi 1:0:152:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Mon Dec 9 06:17:27 2019][ 21.867296] mpt3sas_cm0: detecting: handle(0x0039), sas_address(0x5000cca26a2aa10e), phy(29) [Mon Dec 9 06:17:27 2019][ 21.875731] mpt3sas_cm0: REPORT_LUNS: handle(0x0039), retries(0) [Mon Dec 9 06:17:27 2019][ 21.881865] mpt3sas_cm0: TEST_UNIT_READY: handle(0x0039), lun(0) [Mon Dec 9 06:17:27 2019][ 21.888654] scsi 1:0:153:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Mon Dec 9 06:17:27 2019][ 21.897047] scsi 1:0:153:0: SSP: handle(0x0039), sas_addr(0x5000cca26a2aa10e), phy(29), device_name(0x5000cca26a2aa10f) [Mon Dec 9 06:17:27 2019][ 21.907819] scsi 1:0:153:0: enclosure logical id(0x5000ccab0405db00), slot(38) [Mon Dec 9 06:17:27 2019][ 21.915126] scsi 1:0:153:0: enclosure level(0x0000), connector name( C1 ) [Mon Dec 9 06:17:27 2019][ 21.922020] scsi 1:0:153:0: serial_number( 2TGSET5D) [Mon Dec 9 06:17:27 2019][ 21.927591] scsi 1:0:153:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Mon Dec 9 06:17:27 2019][ 21.950304] mpt3sas_cm0: detecting: handle(0x003a), sas_address(0x5000cca25254e236), phy(30) [Mon Dec 9 06:17:27 2019][ 21.958743] mpt3sas_cm0: REPORT_LUNS: handle(0x003a), retries(0) [Mon Dec 9 06:17:27 2019][ 21.964884] mpt3sas_cm0: TEST_UNIT_READY: handle(0x003a), lun(0) [Mon Dec 9 06:17:27 2019][ 21.982927] scsi 1:0:154:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Mon Dec 9 06:17:27 2019][ 21.991500] scsi 1:0:154:0: SSP: handle(0x003a), sas_addr(0x5000cca25254e236), phy(30), device_name(0x5000cca25254e237) [Mon Dec 9 06:17:27 2019][ 22.002273] scsi 1:0:154:0: enclosure logical id(0x5000ccab0405db00), slot(39) [Mon Dec 9 06:17:27 2019][ 22.009577] scsi 1:0:154:0: enclosure level(0x0000), connector name( C1 ) [Mon Dec 9 06:17:27 2019][ 22.016471] scsi 1:0:154:0: serial_number( 7SHHP5BG) [Mon Dec 9 06:17:27 2019][ 22.022043] scsi 1:0:154:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Mon Dec 9 06:17:27 2019][ 22.069313] mpt3sas_cm0: detecting: handle(0x003b), 
sas_address(0x5000cca25254df96), phy(31) [Mon Dec 9 06:17:27 2019][ 22.077749] mpt3sas_cm0: REPORT_LUNS: handle(0x003b), retries(0) [Mon Dec 9 06:17:27 2019][ 22.084709] mpt3sas_cm0: TEST_UNIT_READY: handle(0x003b), lun(0) [Mon Dec 9 06:17:27 2019][ 22.109363] scsi 1:0:155:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Mon Dec 9 06:17:27 2019][ 22.117809] scsi 1:0:155:0: SSP: handle(0x003b), sas_addr(0x5000cca25254df96), phy(31), device_name(0x5000cca25254df97) [Mon Dec 9 06:17:27 2019][ 22.128584] scsi 1:0:155:0: enclosure logical id(0x5000ccab0405db00), slot(40) [Mon Dec 9 06:17:27 2019][ 22.135889] scsi 1:0:155:0: enclosure level(0x0000), connector name( C1 ) [Mon Dec 9 06:17:27 2019][ 22.142780] scsi 1:0:155:0: serial_number( 7SHHNZYG) [Mon Dec 9 06:17:27 2019][ 22.148353] scsi 1:0:155:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Mon Dec 9 06:17:27 2019][ 22.183319] mpt3sas_cm0: detecting: handle(0x003c), sas_address(0x5000cca25254e9d2), phy(32) [Mon Dec 9 06:17:27 2019][ 22.191760] mpt3sas_cm0: REPORT_LUNS: handle(0x003c), retries(0) [Mon Dec 9 06:17:27 2019][ 22.197889] mpt3sas_cm0: TEST_UNIT_READY: handle(0x003c), lun(0) [Mon Dec 9 06:17:27 2019][ 22.239975] scsi 1:0:156:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Mon Dec 9 06:17:27 2019][ 22.248362] scsi 1:0:156:0: SSP: handle(0x003c), sas_addr(0x5000cca25254e9d2), phy(32), device_name(0x5000cca25254e9d3) [Mon Dec 9 06:17:27 2019][ 22.259132] scsi 1:0:156:0: enclosure logical id(0x5000ccab0405db00), slot(41) [Mon Dec 9 06:17:27 2019][ 22.266440] scsi 1:0:156:0: enclosure level(0x0000), connector name( C1 ) [Mon Dec 9 06:17:27 2019][ 22.273330] scsi 1:0:156:0: serial_number( 7SHHPP2G) [Mon Dec 9 06:17:27 2019][ 22.278903] scsi 1:0:156:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Mon Dec 9 06:17:27 2019][ 22.400322] mpt3sas_cm0: detecting: handle(0x003d), sas_address(0x5000cca26a24008a), phy(33) [Mon Dec 9 06:17:27 2019][ 22.408763] mpt3sas_cm0: REPORT_LUNS: handle(0x003d), retries(0) [Mon Dec 9 06:17:27 2019][ 22.414903] mpt3sas_cm0: TEST_UNIT_READY: handle(0x003d), lun(0) [Mon Dec 9 06:17:27 2019][ 22.421525] scsi 1:0:157:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Mon Dec 9 06:17:27 2019][ 22.429905] scsi 1:0:157:0: SSP: handle(0x003d), sas_addr(0x5000cca26a24008a), phy(33), device_name(0x5000cca26a24008b) [Mon Dec 9 06:17:27 2019][ 22.440679] scsi 1:0:157:0: enclosure logical id(0x5000ccab0405db00), slot(42) [Mon Dec 9 06:17:27 2019][ 22.447983] scsi 1:0:157:0: enclosure level(0x0000), connector name( C1 ) [Mon Dec 9 06:17:27 2019][ 22.454877] scsi 1:0:157:0: serial_number( 2TGMTTPD) [Mon Dec 9 06:17:27 2019][ 22.460450] scsi 1:0:157:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Mon Dec 9 06:17:27 2019][ 22.483317] mpt3sas_cm0: detecting: handle(0x003e), sas_address(0x5000cca26a24b9ea), phy(34) [Mon Dec 9 06:17:27 2019][ 22.491758] mpt3sas_cm0: REPORT_LUNS: handle(0x003e), retries(0) [Mon Dec 9 06:17:27 2019][ 22.497891] mpt3sas_cm0: TEST_UNIT_READY: handle(0x003e), lun(0) [Mon Dec 9 06:17:27 2019][ 22.506984] scsi 1:0:158:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Mon Dec 9 06:17:27 2019][ 22.515363] scsi 1:0:158:0: SSP: handle(0x003e), sas_addr(0x5000cca26a24b9ea), phy(34), device_name(0x5000cca26a24b9eb) [Mon Dec 9 06:17:27 2019][ 22.526134] scsi 1:0:158:0: enclosure logical id(0x5000ccab0405db00), slot(43) [Mon Dec 9 06:17:27 2019][ 22.533438] scsi 1:0:158:0: enclosure 
level(0x0000), connector name( C1 ) [Mon Dec 9 06:17:27 2019][ 22.540331] scsi 1:0:158:0: serial_number( 2TGN64DD) [Mon Dec 9 06:17:27 2019][ 22.545905] scsi 1:0:158:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Mon Dec 9 06:17:27 2019][ 22.572373] mpt3sas_cm0: detecting: handle(0x003f), sas_address(0x5000cca26a25aed6), phy(35) [Mon Dec 9 06:17:27 2019][ 22.580812] mpt3sas_cm0: REPORT_LUNS: handle(0x003f), retries(0) [Mon Dec 9 06:17:27 2019][ 22.586962] mpt3sas_cm0: TEST_UNIT_READY: handle(0x003f), lun(0) [Mon Dec 9 06:17:27 2019][ 22.593574] scsi 1:0:159:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Mon Dec 9 06:17:27 2019][ 22.601988] scsi 1:0:159:0: SSP: handle(0x003f), sas_addr(0x5000cca26a25aed6), phy(35), device_name(0x5000cca26a25aed7) [Mon Dec 9 06:17:27 2019][ 22.612759] scsi 1:0:159:0: enclosure logical id(0x5000ccab0405db00), slot(44) [Mon Dec 9 06:17:27 2019][ 22.620066] scsi 1:0:159:0: enclosure level(0x0000), connector name( C1 ) [Mon Dec 9 06:17:27 2019][ 22.626956] scsi 1:0:159:0: serial_number( 2TGNRG1D) [Mon Dec 9 06:17:27 2019][ 22.632530] scsi 1:0:159:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Mon Dec 9 06:17:27 2019][ 22.652320] mpt3sas_cm0: detecting: handle(0x0040), sas_address(0x5000cca266d32b6a), phy(36) [Mon Dec 9 06:17:27 2019][ 22.660761] mpt3sas_cm0: REPORT_LUNS: handle(0x0040), retries(0) [Mon Dec 9 06:17:27 2019][ 22.666902] mpt3sas_cm0: TEST_UNIT_READY: handle(0x0040), lun(0) [Mon Dec 9 06:17:27 2019][ 22.674761] scsi 1:0:160:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Mon Dec 9 06:17:27 2019][ 22.683145] scsi 1:0:160:0: SSP: handle(0x0040), sas_addr(0x5000cca266d32b6a), phy(36), device_name(0x5000cca266d32b6b) [Mon Dec 9 06:17:27 2019][ 22.693916] scsi 1:0:160:0: enclosure logical id(0x5000ccab0405db00), slot(45) [Mon Dec 9 06:17:27 2019][ 22.701222] scsi 1:0:160:0: enclosure level(0x0000), connector name( C1 ) [Mon Dec 9 06:17:27 2019][ 22.708113] scsi 1:0:160:0: serial_number( 7JKS46JK) [Mon Dec 9 06:17:27 2019][ 22.713687] scsi 1:0:160:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Mon Dec 9 06:17:27 2019][ 22.736429] mpt3sas_cm0: detecting: handle(0x0041), sas_address(0x5000cca26b9bf886), phy(37) [Mon Dec 9 06:17:27 2019][ 22.744866] mpt3sas_cm0: REPORT_LUNS: handle(0x0041), retries(0) [Mon Dec 9 06:17:27 2019][ 22.751007] mpt3sas_cm0: TEST_UNIT_READY: handle(0x0041), lun(0) [Mon Dec 9 06:17:27 2019][ 22.757858] scsi 1:0:161:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Mon Dec 9 06:17:27 2019][ 22.766243] scsi 1:0:161:0: SSP: handle(0x0041), sas_addr(0x5000cca26b9bf886), phy(37), device_name(0x5000cca26b9bf887) [Mon Dec 9 06:17:27 2019][ 22.777013] scsi 1:0:161:0: enclosure logical id(0x5000ccab0405db00), slot(46) [Mon Dec 9 06:17:27 2019][ 22.784318] scsi 1:0:161:0: enclosure level(0x0000), connector name( C1 ) [Mon Dec 9 06:17:27 2019][ 22.791211] scsi 1:0:161:0: serial_number( 1SJST42Z) [Mon Dec 9 06:17:28 2019][ 22.796785] scsi 1:0:161:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Mon Dec 9 06:17:28 2019][ 22.819333] mpt3sas_cm0: detecting: handle(0x0042), sas_address(0x5000cca26b9b24ca), phy(38) [Mon Dec 9 06:17:28 2019][ 22.827773] mpt3sas_cm0: REPORT_LUNS: handle(0x0042), retries(0) [Mon Dec 9 06:17:28 2019][ 22.833946] mpt3sas_cm0: TEST_UNIT_READY: handle(0x0042), lun(0) [Mon Dec 9 06:17:28 2019][ 22.840765] scsi 1:0:162:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Mon 
Dec 9 06:17:28 2019][ 22.849150] scsi 1:0:162:0: SSP: handle(0x0042), sas_addr(0x5000cca26b9b24ca), phy(38), device_name(0x5000cca26b9b24cb) [Mon Dec 9 06:17:28 2019][ 22.859921] scsi 1:0:162:0: enclosure logical id(0x5000ccab0405db00), slot(47) [Mon Dec 9 06:17:28 2019][ 22.867225] scsi 1:0:162:0: enclosure level(0x0000), connector name( C1 ) [Mon Dec 9 06:17:28 2019][ 22.874119] scsi 1:0:162:0: serial_number( 1SJSA0YZ) [Mon Dec 9 06:17:28 2019][ 22.879693] scsi 1:0:162:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Mon Dec 9 06:17:28 2019][ 22.902325] mpt3sas_cm0: detecting: handle(0x0043), sas_address(0x5000cca26a21d742), phy(39) [Mon Dec 9 06:17:28 2019][ 22.910768] mpt3sas_cm0: REPORT_LUNS: handle(0x0043), retries(0) [Mon Dec 9 06:17:28 2019][ 22.916934] mpt3sas_cm0: TEST_UNIT_READY: handle(0x0043), lun(0) [Mon Dec 9 06:17:28 2019][ 22.923724] scsi 1:0:163:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Mon Dec 9 06:17:28 2019][ 22.932114] scsi 1:0:163:0: SSP: handle(0x0043), sas_addr(0x5000cca26a21d742), phy(39), device_name(0x5000cca26a21d743) [Mon Dec 9 06:17:28 2019][ 22.942890] scsi 1:0:163:0: enclosure logical id(0x5000ccab0405db00), slot(48) [Mon Dec 9 06:17:28 2019][ 22.950194] scsi 1:0:163:0: enclosure level(0x0000), connector name( C1 ) [Mon Dec 9 06:17:28 2019][ 22.957087] scsi 1:0:163:0: serial_number( 2TGLLYED) [Mon Dec 9 06:17:28 2019][ 22.962661] scsi 1:0:163:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Mon Dec 9 06:17:28 2019][ 22.985328] mpt3sas_cm0: detecting: handle(0x0044), sas_address(0x5000cca26a27af5e), phy(40) [Mon Dec 9 06:17:28 2019][ 22.993771] mpt3sas_cm0: REPORT_LUNS: handle(0x0044), retries(0) [Mon Dec 9 06:17:28 2019][ 22.999938] mpt3sas_cm0: TEST_UNIT_READY: handle(0x0044), lun(0) [Mon Dec 9 06:17:28 2019][ 23.006708] scsi 1:0:164:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Mon Dec 9 06:17:28 2019][ 23.015105] scsi 1:0:164:0: SSP: handle(0x0044), sas_addr(0x5000cca26a27af5e), phy(40), device_name(0x5000cca26a27af5f) [Mon Dec 9 06:17:28 2019][ 23.025875] scsi 1:0:164:0: enclosure logical id(0x5000ccab0405db00), slot(49) [Mon Dec 9 06:17:28 2019][ 23.033179] scsi 1:0:164:0: enclosure level(0x0000), connector name( C1 ) [Mon Dec 9 06:17:28 2019][ 23.040072] scsi 1:0:164:0: serial_number( 2TGPUL5D) [Mon Dec 9 06:17:28 2019][ 23.045645] scsi 1:0:164:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Mon Dec 9 06:17:28 2019][ 23.068333] mpt3sas_cm0: detecting: handle(0x0045), sas_address(0x5000cca2525552e6), phy(41) [Mon Dec 9 06:17:28 2019][ 23.076773] mpt3sas_cm0: REPORT_LUNS: handle(0x0045), retries(0) [Mon Dec 9 06:17:28 2019][ 23.082912] mpt3sas_cm0: TEST_UNIT_READY: handle(0x0045), lun(0) [Mon Dec 9 06:17:28 2019][ 23.142655] scsi 1:0:165:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Mon Dec 9 06:17:28 2019][ 23.151033] scsi 1:0:165:0: SSP: handle(0x0045), sas_addr(0x5000cca2525552e6), phy(41), device_name(0x5000cca2525552e7) [Mon Dec 9 06:17:28 2019][ 23.161806] scsi 1:0:165:0: enclosure logical id(0x5000ccab0405db00), slot(50) [Mon Dec 9 06:17:28 2019][ 23.169111] scsi 1:0:165:0: enclosure level(0x0000), connector name( C1 ) [Mon Dec 9 06:17:28 2019][ 23.176004] scsi 1:0:165:0: serial_number( 7SHHXP0G) [Mon Dec 9 06:17:28 2019][ 23.181578] scsi 1:0:165:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Mon Dec 9 06:17:28 2019][ 23.227333] mpt3sas_cm0: detecting: handle(0x0046), 
sas_address(0x5000cca26a26dff2), phy(42) [Mon Dec 9 06:17:28 2019][ 23.235774] mpt3sas_cm0: REPORT_LUNS: handle(0x0046), retries(0) [Mon Dec 9 06:17:28 2019][ 23.241931] mpt3sas_cm0: TEST_UNIT_READY: handle(0x0046), lun(0) [Mon Dec 9 06:17:28 2019][ 23.248548] scsi 1:0:166:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Mon Dec 9 06:17:28 2019][ 23.256933] scsi 1:0:166:0: SSP: handle(0x0046), sas_addr(0x5000cca26a26dff2), phy(42), device_name(0x5000cca26a26dff3) [Mon Dec 9 06:17:28 2019][ 23.267707] scsi 1:0:166:0: enclosure logical id(0x5000ccab0405db00), slot(51) [Mon Dec 9 06:17:28 2019][ 23.275011] scsi 1:0:166:0: enclosure level(0x0000), connector name( C1 ) [Mon Dec 9 06:17:28 2019][ 23.281905] scsi 1:0:166:0: serial_number( 2TGPBSYD) [Mon Dec 9 06:17:28 2019][ 23.287479] scsi 1:0:166:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Mon Dec 9 06:17:28 2019][ 23.307329] mpt3sas_cm0: detecting: handle(0x0047), sas_address(0x5000cca26b9c5d52), phy(43) [Mon Dec 9 06:17:28 2019][ 23.315772] mpt3sas_cm0: REPORT_LUNS: handle(0x0047), retries(0) [Mon Dec 9 06:17:28 2019][ 23.321910] mpt3sas_cm0: TEST_UNIT_READY: handle(0x0047), lun(0) [Mon Dec 9 06:17:28 2019][ 23.328636] scsi 1:0:167:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Mon Dec 9 06:17:28 2019][ 23.337013] scsi 1:0:167:0: SSP: handle(0x0047), sas_addr(0x5000cca26b9c5d52), phy(43), device_name(0x5000cca26b9c5d53) [Mon Dec 9 06:17:28 2019][ 23.347789] scsi 1:0:167:0: enclosure logical id(0x5000ccab0405db00), slot(52) [Mon Dec 9 06:17:28 2019][ 23.355093] scsi 1:0:167:0: enclosure level(0x0000), connector name( C1 ) [Mon Dec 9 06:17:28 2019][ 23.361987] scsi 1:0:167:0: serial_number( 1SJSZV5Z) [Mon Dec 9 06:17:28 2019][ 23.367561] scsi 1:0:167:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Mon Dec 9 06:17:28 2019][ 23.390963] mpt3sas_cm0: detecting: handle(0x0048), sas_address(0x5000cca26b9602c6), phy(44) [Mon Dec 9 06:17:28 2019][ 23.399405] mpt3sas_cm0: REPORT_LUNS: handle(0x0048), retries(0) [Mon Dec 9 06:17:28 2019][ 23.405550] mpt3sas_cm0: TEST_UNIT_READY: handle(0x0048), lun(0) [Mon Dec 9 06:17:28 2019][ 23.421839] scsi 1:0:168:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Mon Dec 9 06:17:28 2019][ 23.430254] scsi 1:0:168:0: SSP: handle(0x0048), sas_addr(0x5000cca26b9602c6), phy(44), device_name(0x5000cca26b9602c7) [Mon Dec 9 06:17:28 2019][ 23.441026] scsi 1:0:168:0: enclosure logical id(0x5000ccab0405db00), slot(53) [Mon Dec 9 06:17:28 2019][ 23.448333] scsi 1:0:168:0: enclosure level(0x0000), connector name( C1 ) [Mon Dec 9 06:17:28 2019][ 23.455227] scsi 1:0:168:0: serial_number( 1SJNHJ4Z) [Mon Dec 9 06:17:28 2019][ 23.460798] scsi 1:0:168:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Mon Dec 9 06:17:28 2019][ 23.483345] mpt3sas_cm0: detecting: handle(0x0049), sas_address(0x5000cca252544a02), phy(45) [Mon Dec 9 06:17:28 2019][ 23.491788] mpt3sas_cm0: REPORT_LUNS: handle(0x0049), retries(0) [Mon Dec 9 06:17:28 2019][ 23.497955] mpt3sas_cm0: TEST_UNIT_READY: handle(0x0049), lun(0) [Mon Dec 9 06:17:28 2019][ 23.521496] scsi 1:0:169:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Mon Dec 9 06:17:28 2019][ 23.529884] scsi 1:0:169:0: SSP: handle(0x0049), sas_addr(0x5000cca252544a02), phy(45), device_name(0x5000cca252544a03) [Mon Dec 9 06:17:28 2019][ 23.540660] scsi 1:0:169:0: enclosure logical id(0x5000ccab0405db00), slot(54) [Mon Dec 9 06:17:28 2019][ 23.547968] scsi 1:0:169:0: enclosure 
level(0x0000), connector name( C1 ) [Mon Dec 9 06:17:28 2019][ 23.554862] scsi 1:0:169:0: serial_number( 7SHHB14G) [Mon Dec 9 06:17:28 2019][ 23.560434] scsi 1:0:169:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Mon Dec 9 06:17:28 2019][ 23.580344] mpt3sas_cm0: detecting: handle(0x004a), sas_address(0x5000cca252559f9e), phy(46) [Mon Dec 9 06:17:28 2019][ 23.588787] mpt3sas_cm0: REPORT_LUNS: handle(0x004a), retries(0) [Mon Dec 9 06:17:28 2019][ 23.594926] mpt3sas_cm0: TEST_UNIT_READY: handle(0x004a), lun(0) [Mon Dec 9 06:17:28 2019][ 23.609780] scsi 1:0:170:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Mon Dec 9 06:17:28 2019][ 23.618175] scsi 1:0:170:0: SSP: handle(0x004a), sas_addr(0x5000cca252559f9e), phy(46), device_name(0x5000cca252559f9f) [Mon Dec 9 06:17:28 2019][ 23.628950] scsi 1:0:170:0: enclosure logical id(0x5000ccab0405db00), slot(55) [Mon Dec 9 06:17:28 2019][ 23.636257] scsi 1:0:170:0: enclosure level(0x0000), connector name( C1 ) [Mon Dec 9 06:17:28 2019][ 23.643151] scsi 1:0:170:0: serial_number( 7SHJ2TDG) [Mon Dec 9 06:17:28 2019][ 23.648723] scsi 1:0:170:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Mon Dec 9 06:17:28 2019][ 23.678375] mpt3sas_cm0: detecting: handle(0x004b), sas_address(0x5000cca25255571e), phy(47) [Mon Dec 9 06:17:28 2019][ 23.686816] mpt3sas_cm0: REPORT_LUNS: handle(0x004b), retries(0) [Mon Dec 9 06:17:28 2019][ 23.692956] mpt3sas_cm0: TEST_UNIT_READY: handle(0x004b), lun(0) [Mon Dec 9 06:17:28 2019][ 23.707569] scsi 1:0:171:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Mon Dec 9 06:17:28 2019][ 23.715945] scsi 1:0:171:0: SSP: handle(0x004b), sas_addr(0x5000cca25255571e), phy(47), device_name(0x5000cca25255571f) [Mon Dec 9 06:17:28 2019][ 23.726723] scsi 1:0:171:0: enclosure logical id(0x5000ccab0405db00), slot(56) [Mon Dec 9 06:17:28 2019][ 23.734028] scsi 1:0:171:0: enclosure level(0x0000), connector name( C1 ) [Mon Dec 9 06:17:28 2019][ 23.740921] scsi 1:0:171:0: serial_number( 7SHHXYRG) [Mon Dec 9 06:17:28 2019][ 23.746494] scsi 1:0:171:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Mon Dec 9 06:17:28 2019][ 23.766347] mpt3sas_cm0: detecting: handle(0x004c), sas_address(0x5000cca26b9bf57e), phy(48) [Mon Dec 9 06:17:28 2019][ 23.774786] mpt3sas_cm0: REPORT_LUNS: handle(0x004c), retries(0) [Mon Dec 9 06:17:29 2019][ 23.780923] mpt3sas_cm0: TEST_UNIT_READY: handle(0x004c), lun(0) [Mon Dec 9 06:17:29 2019][ 23.805687] scsi 1:0:172:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Mon Dec 9 06:17:29 2019][ 23.814070] scsi 1:0:172:0: SSP: handle(0x004c), sas_addr(0x5000cca26b9bf57e), phy(48), device_name(0x5000cca26b9bf57f) [Mon Dec 9 06:17:29 2019][ 23.824840] scsi 1:0:172:0: enclosure logical id(0x5000ccab0405db00), slot(57) [Mon Dec 9 06:17:29 2019][ 23.832147] scsi 1:0:172:0: enclosure level(0x0000), connector name( C1 ) [Mon Dec 9 06:17:29 2019][ 23.839039] scsi 1:0:172:0: serial_number( 1SJSSXUZ) [Mon Dec 9 06:17:29 2019][ 23.844613] scsi 1:0:172:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Mon Dec 9 06:17:29 2019][ 23.867347] mpt3sas_cm0: detecting: handle(0x004d), sas_address(0x5000cca252555372), phy(49) [Mon Dec 9 06:17:29 2019][ 23.875788] mpt3sas_cm0: REPORT_LUNS: handle(0x004d), retries(0) [Mon Dec 9 06:17:29 2019][ 23.881952] mpt3sas_cm0: TEST_UNIT_READY: handle(0x004d), lun(0) [Mon Dec 9 06:17:29 2019][ 23.888580] scsi 1:0:173:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Mon 
Dec 9 06:17:29 2019][ 23.896952] scsi 1:0:173:0: SSP: handle(0x004d), sas_addr(0x5000cca252555372), phy(49), device_name(0x5000cca252555373) [Mon Dec 9 06:17:29 2019][ 23.907722] scsi 1:0:173:0: enclosure logical id(0x5000ccab0405db00), slot(58) [Mon Dec 9 06:17:29 2019][ 23.915026] scsi 1:0:173:0: enclosure level(0x0000), connector name( C1 ) [Mon Dec 9 06:17:29 2019][ 23.921920] scsi 1:0:173:0: serial_number( 7SHHXR4G) [Mon Dec 9 06:17:29 2019][ 23.927494] scsi 1:0:173:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Mon Dec 9 06:17:29 2019][ 23.947349] mpt3sas_cm0: detecting: handle(0x004e), sas_address(0x5000cca25253eefe), phy(50) [Mon Dec 9 06:17:29 2019][ 23.955786] mpt3sas_cm0: REPORT_LUNS: handle(0x004e), retries(0) [Mon Dec 9 06:17:29 2019][ 23.961956] mpt3sas_cm0: TEST_UNIT_READY: handle(0x004e), lun(0) [Mon Dec 9 06:17:29 2019][ 23.968592] scsi 1:0:174:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Mon Dec 9 06:17:29 2019][ 23.976965] scsi 1:0:174:0: SSP: handle(0x004e), sas_addr(0x5000cca25253eefe), phy(50), device_name(0x5000cca25253eeff) [Mon Dec 9 06:17:29 2019][ 23.987734] scsi 1:0:174:0: enclosure logical id(0x5000ccab0405db00), slot(59) [Mon Dec 9 06:17:29 2019][ 23.995039] scsi 1:0:174:0: enclosure level(0x0000), connector name( C1 ) [Mon Dec 9 06:17:29 2019][ 24.001933] scsi 1:0:174:0: serial_number( 7SHH4Z7G) [Mon Dec 9 06:17:29 2019][ 24.007506] scsi 1:0:174:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Mon Dec 9 06:17:29 2019][ 24.029845] mpt3sas_cm0: expander_add: handle(0x001a), parent(0x0017), sas_addr(0x5000ccab0405db7b), phys(68) [Mon Dec 9 06:17:29 2019][ 24.050568] mpt3sas_cm0: detecting: handle(0x004f), sas_address(0x5000cca26b9cbb06), phy(42) [Mon Dec 9 06:17:29 2019][ 24.059024] mpt3sas_cm0: REPORT_LUNS: handle(0x004f), retries(0) [Mon Dec 9 06:17:29 2019][ 24.065149] mpt3sas_cm0: TEST_UNIT_READY: handle(0x004f), lun(0) [Mon Dec 9 06:17:29 2019][ 24.071795] scsi 1:0:175:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Mon Dec 9 06:17:29 2019][ 24.080191] scsi 1:0:175:0: SSP: handle(0x004f), sas_addr(0x5000cca26b9cbb06), phy(42), device_name(0x5000cca26b9cbb07) [Mon Dec 9 06:17:29 2019][ 24.090967] scsi 1:0:175:0: enclosure logical id(0x5000ccab0405db00), slot(1) [Mon Dec 9 06:17:29 2019][ 24.098184] scsi 1:0:175:0: enclosure level(0x0000), connector name( C1 ) [Mon Dec 9 06:17:29 2019][ 24.105078] scsi 1:0:175:0: serial_number( 1SJT62MZ) [Mon Dec 9 06:17:29 2019][ 24.110649] scsi 1:0:175:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Mon Dec 9 06:17:29 2019][ 24.147349] mpt3sas_cm0: detecting: handle(0x0050), sas_address(0x5000cca252544476), phy(43) [Mon Dec 9 06:17:29 2019][ 24.155805] mpt3sas_cm0: REPORT_LUNS: handle(0x0050), retries(0) [Mon Dec 9 06:17:29 2019][ 24.161950] mpt3sas_cm0: TEST_UNIT_READY: handle(0x0050), lun(0) [Mon Dec 9 06:17:29 2019][ 24.168678] scsi 1:0:176:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Mon Dec 9 06:17:29 2019][ 24.184932] scsi 1:0:176:0: SSP: handle(0x0050), sas_addr(0x5000cca252544476), phy(43), device_name(0x5000cca252544477) [Mon Dec 9 06:17:29 2019][ 24.195704] scsi 1:0:176:0: enclosure logical id(0x5000ccab0405db00), slot(3) [Mon Dec 9 06:17:29 2019][ 24.202924] scsi 1:0:176:0: enclosure level(0x0000), connector name( C1 ) [Mon Dec 9 06:17:29 2019][ 24.209815] scsi 1:0:176:0: serial_number( 7SHHANPG) [Mon Dec 9 06:17:29 2019][ 24.215389] scsi 1:0:176:0: qdepth(254), tagged(1), simple(0), 
ordered(0), scsi_level(7), cmd_que(1) [Mon Dec 9 06:17:29 2019][ 24.235948] mpt3sas_cm0: detecting: handle(0x0051), sas_address(0x5000cca26a26173e), phy(44) [Mon Dec 9 06:17:29 2019][ 24.244384] mpt3sas_cm0: REPORT_LUNS: handle(0x0051), retries(0) [Mon Dec 9 06:17:29 2019][ 24.250543] mpt3sas_cm0: TEST_UNIT_READY: handle(0x0051), lun(0) [Mon Dec 9 06:17:29 2019][ 24.275589] scsi 1:0:177:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Mon Dec 9 06:17:29 2019][ 24.283977] scsi 1:0:177:0: SSP: handle(0x0051), sas_addr(0x5000cca26a26173e), phy(44), device_name(0x5000cca26a26173f) [Mon Dec 9 06:17:29 2019][ 24.294751] scsi 1:0:177:0: enclosure logical id(0x5000ccab0405db00), slot(4) [Mon Dec 9 06:17:29 2019][ 24.301968] scsi 1:0:177:0: enclosure level(0x0000), connector name( C1 ) [Mon Dec 9 06:17:29 2019][ 24.308861] scsi 1:0:177:0: serial_number( 2TGNYDLD) [Mon Dec 9 06:17:29 2019][ 24.314435] scsi 1:0:177:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Mon Dec 9 06:17:29 2019][ 24.334358] mpt3sas_cm0: detecting: handle(0x0052), sas_address(0x5000cca252544cb6), phy(45) [Mon Dec 9 06:17:29 2019][ 24.342796] mpt3sas_cm0: REPORT_LUNS: handle(0x0052), retries(0) [Mon Dec 9 06:17:29 2019][ 24.348954] mpt3sas_cm0: TEST_UNIT_READY: handle(0x0052), lun(0) [Mon Dec 9 06:17:29 2019][ 24.355768] scsi 1:0:178:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Mon Dec 9 06:17:29 2019][ 24.364145] scsi 1:0:178:0: SSP: handle(0x0052), sas_addr(0x5000cca252544cb6), phy(45), device_name(0x5000cca252544cb7) [Mon Dec 9 06:17:29 2019][ 24.374919] scsi 1:0:178:0: enclosure logical id(0x5000ccab0405db00), slot(5) [Mon Dec 9 06:17:29 2019][ 24.382136] scsi 1:0:178:0: enclosure level(0x0000), connector name( C1 ) [Mon Dec 9 06:17:29 2019][ 24.389029] scsi 1:0:178:0: serial_number( 7SHHB6RG) [Mon Dec 9 06:17:29 2019][ 24.394602] scsi 1:0:178:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Mon Dec 9 06:17:29 2019][ 24.446367] mpt3sas_cm0: detecting: handle(0x0053), sas_address(0x5000cca26c238692), phy(46) [Mon Dec 9 06:17:29 2019][ 24.454805] mpt3sas_cm0: REPORT_LUNS: handle(0x0053), retries(0) [Mon Dec 9 06:17:29 2019][ 24.460952] mpt3sas_cm0: TEST_UNIT_READY: handle(0x0053), lun(0) [Mon Dec 9 06:17:29 2019][ 24.706352] scsi 1:0:179:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Mon Dec 9 06:17:29 2019][ 24.714726] scsi 1:0:179:0: SSP: handle(0x0053), sas_addr(0x5000cca26c238692), phy(46), device_name(0x5000cca26c238693) [Mon Dec 9 06:17:29 2019][ 24.725502] scsi 1:0:179:0: enclosure logical id(0x5000ccab0405db00), slot(6) [Mon Dec 9 06:17:29 2019][ 24.732722] scsi 1:0:179:0: enclosure level(0x0000), connector name( C1 ) [Mon Dec 9 06:17:29 2019][ 24.739615] scsi 1:0:179:0: serial_number( 1DGMJNWZ) [Mon Dec 9 06:17:29 2019][ 24.745188] scsi 1:0:179:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Mon Dec 9 06:17:30 2019][ 24.931367] mpt3sas_cm0: detecting: handle(0x0054), sas_address(0x5000cca26a2ac96a), phy(47) [Mon Dec 9 06:17:30 2019][ 24.939820] mpt3sas_cm0: REPORT_LUNS: handle(0x0054), retries(0) [Mon Dec 9 06:17:30 2019][ 24.945945] mpt3sas_cm0: TEST_UNIT_READY: handle(0x0054), lun(0) [Mon Dec 9 06:17:30 2019][ 24.999733] scsi 1:0:180:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Mon Dec 9 06:17:30 2019][ 25.008148] scsi 1:0:180:0: SSP: handle(0x0054), sas_addr(0x5000cca26a2ac96a), phy(47), device_name(0x5000cca26a2ac96b) [Mon Dec 9 06:17:30 2019][ 25.018921] scsi 1:0:180:0: enclosure 
logical id(0x5000ccab0405db00), slot(7) [Mon Dec 9 06:17:30 2019][ 25.026138] scsi 1:0:180:0: enclosure level(0x0000), connector name( C1 ) [Mon Dec 9 06:17:30 2019][ 25.033030] scsi 1:0:180:0: serial_number( 2TGSJGHD) [Mon Dec 9 06:17:30 2019][ 25.038603] scsi 1:0:180:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Mon Dec 9 06:17:30 2019][ 25.155379] mpt3sas_cm0: detecting: handle(0x0055), sas_address(0x5000cca25253e61a), phy(48) [Mon Dec 9 06:17:30 2019][ 25.163819] mpt3sas_cm0: REPORT_LUNS: handle(0x0055), retries(0) [Mon Dec 9 06:17:30 2019][ 25.169993] mpt3sas_cm0: TEST_UNIT_READY: handle(0x0055), lun(0) [Mon Dec 9 06:17:30 2019][ 25.176816] scsi 1:0:181:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Mon Dec 9 06:17:30 2019][ 25.185205] scsi 1:0:181:0: SSP: handle(0x0055), sas_addr(0x5000cca25253e61a), phy(48), device_name(0x5000cca25253e61b) [Mon Dec 9 06:17:30 2019][ 25.195976] scsi 1:0:181:0: enclosure logical id(0x5000ccab0405db00), slot(8) [Mon Dec 9 06:17:30 2019][ 25.203195] scsi 1:0:181:0: enclosure level(0x0000), connector name( C1 ) [Mon Dec 9 06:17:30 2019][ 25.210094] scsi 1:0:181:0: serial_number( 7SHH4BWG) [Mon Dec 9 06:17:30 2019][ 25.215670] scsi 1:0:181:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Mon Dec 9 06:17:30 2019][ 25.238382] mpt3sas_cm0: detecting: handle(0x0056), sas_address(0x5000cca252542cfe), phy(49) [Mon Dec 9 06:17:30 2019][ 25.246821] mpt3sas_cm0: REPORT_LUNS: handle(0x0056), retries(0) [Mon Dec 9 06:17:30 2019][ 25.252993] mpt3sas_cm0: TEST_UNIT_READY: handle(0x0056), lun(0) [Mon Dec 9 06:17:30 2019][ 25.259851] scsi 1:0:182:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Mon Dec 9 06:17:30 2019][ 25.268284] scsi 1:0:182:0: SSP: handle(0x0056), sas_addr(0x5000cca252542cfe), phy(49), device_name(0x5000cca252542cff) [Mon Dec 9 06:17:30 2019][ 25.279056] scsi 1:0:182:0: enclosure logical id(0x5000ccab0405db00), slot(9) [Mon Dec 9 06:17:30 2019][ 25.286276] scsi 1:0:182:0: enclosure level(0x0000), connector name( C1 ) [Mon Dec 9 06:17:30 2019][ 25.293168] scsi 1:0:182:0: serial_number( 7SHH937G) [Mon Dec 9 06:17:30 2019][ 25.298744] scsi 1:0:182:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Mon Dec 9 06:17:30 2019][ 25.321384] mpt3sas_cm0: detecting: handle(0x0057), sas_address(0x5000cca26a3181fe), phy(50) [Mon Dec 9 06:17:30 2019][ 25.329825] mpt3sas_cm0: REPORT_LUNS: handle(0x0057), retries(0) [Mon Dec 9 06:17:30 2019][ 25.335991] mpt3sas_cm0: TEST_UNIT_READY: handle(0x0057), lun(0) [Mon Dec 9 06:17:30 2019][ 25.342817] scsi 1:0:183:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Mon Dec 9 06:17:30 2019][ 25.351209] scsi 1:0:183:0: SSP: handle(0x0057), sas_addr(0x5000cca26a3181fe), phy(50), device_name(0x5000cca26a3181ff) [Mon Dec 9 06:17:30 2019][ 25.361981] scsi 1:0:183:0: enclosure logical id(0x5000ccab0405db00), slot(10) [Mon Dec 9 06:17:30 2019][ 25.369287] scsi 1:0:183:0: enclosure level(0x0000), connector name( C1 ) [Mon Dec 9 06:17:30 2019][ 25.376181] scsi 1:0:183:0: serial_number( 2TGW71ND) [Mon Dec 9 06:17:30 2019][ 25.381753] scsi 1:0:183:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Mon Dec 9 06:17:30 2019][ 25.406892] mpt3sas_cm0: expander_add: handle(0x0099), parent(0x000a), sas_addr(0x5000ccab0405db3d), phys(49) [Mon Dec 9 06:17:30 2019][ 25.428327] mpt3sas_cm0: detecting: handle(0x009d), sas_address(0x5000ccab0405db3c), phy(48) [Mon Dec 9 06:17:30 2019][ 25.436763] mpt3sas_cm0: 
REPORT_LUNS: handle(0x009d), retries(0) [Mon Dec 9 06:17:30 2019][ 25.444026] mpt3sas_cm0: TEST_UNIT_READY: handle(0x009d), lun(0) [Mon Dec 9 06:17:30 2019][ 25.450935] scsi 1:0:184:0: Enclosure HGST H4060-J 2033 PQ: 0 ANSI: 6 [Mon Dec 9 06:17:30 2019][ 25.459522] scsi 1:0:184:0: set ignore_delay_remove for handle(0x009d) [Mon Dec 9 06:17:30 2019][ 25.466046] scsi 1:0:184:0: SES: handle(0x009d), sas_addr(0x5000ccab0405db3c), phy(48), device_name(0x0000000000000000) [Mon Dec 9 06:17:30 2019][ 25.476817] scsi 1:0:184:0: enclosure logical id(0x5000ccab0405db00), slot(60) [Mon Dec 9 06:17:30 2019][ 25.484122] scsi 1:0:184:0: enclosure level(0x0000), connector name( C0 ) [Mon Dec 9 06:17:30 2019][ 25.491014] scsi 1:0:184:0: serial_number(USWSJ03918EZ0069 ) [Mon Dec 9 06:17:30 2019][ 25.496936] scsi 1:0:184:0: qdepth(1), tagged(0), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Mon Dec 9 06:17:30 2019][ 25.521756] mpt3sas_cm0: expander_add: handle(0x009b), parent(0x0099), sas_addr(0x5000ccab0405db3f), phys(68) [Mon Dec 9 06:17:30 2019][ 25.543977] mpt3sas_cm0: detecting: handle(0x009e), sas_address(0x5000cca252550a75), phy(0) [Mon Dec 9 06:17:30 2019][ 25.552334] mpt3sas_cm0: REPORT_LUNS: handle(0x009e), retries(0) [Mon Dec 9 06:17:30 2019][ 25.558455] mpt3sas_cm0: TEST_UNIT_READY: handle(0x009e), lun(0) [Mon Dec 9 06:17:30 2019][ 25.565084] scsi 1:0:185:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Mon Dec 9 06:17:30 2019][ 25.573487] scsi 1:0:185:0: SSP: handle(0x009e), sas_addr(0x5000cca252550a75), phy(0), device_name(0x5000cca252550a77) [Mon Dec 9 06:17:30 2019][ 25.584174] scsi 1:0:185:0: enclosure logical id(0x5000ccab0405db00), slot(0) [Mon Dec 9 06:17:30 2019][ 25.591393] scsi 1:0:185:0: enclosure level(0x0000), connector name( C0 ) [Mon Dec 9 06:17:30 2019][ 25.598287] scsi 1:0:185:0: serial_number( 7SHHSVGG) [Mon Dec 9 06:17:30 2019][ 25.603860] scsi 1:0:185:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Mon Dec 9 06:17:30 2019][ 25.626393] mpt3sas_cm0: detecting: handle(0x009f), sas_address(0x5000cca25253eb31), phy(1) [Mon Dec 9 06:17:30 2019][ 25.634742] mpt3sas_cm0: REPORT_LUNS: handle(0x009f), retries(0) [Mon Dec 9 06:17:30 2019][ 25.640883] mpt3sas_cm0: TEST_UNIT_READY: handle(0x009f), lun(0) [Mon Dec 9 06:17:30 2019][ 25.647533] scsi 1:0:186:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Mon Dec 9 06:17:30 2019][ 25.655910] scsi 1:0:186:0: SSP: handle(0x009f), sas_addr(0x5000cca25253eb31), phy(1), device_name(0x5000cca25253eb33) [Mon Dec 9 06:17:30 2019][ 25.666596] scsi 1:0:186:0: enclosure logical id(0x5000ccab0405db00), slot(2) [Mon Dec 9 06:17:30 2019][ 25.673816] scsi 1:0:186:0: enclosure level(0x0000), connector name( C0 ) [Mon Dec 9 06:17:30 2019][ 25.680709] scsi 1:0:186:0: serial_number( 7SHH4RDG) [Mon Dec 9 06:17:30 2019][ 25.686280] scsi 1:0:186:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Mon Dec 9 06:17:30 2019][ 25.706392] mpt3sas_cm0: detecting: handle(0x00a0), sas_address(0x5000cca26b950bb5), phy(2) [Mon Dec 9 06:17:30 2019][ 25.714746] mpt3sas_cm0: REPORT_LUNS: handle(0x00a0), retries(0) [Mon Dec 9 06:17:30 2019][ 25.720892] mpt3sas_cm0: TEST_UNIT_READY: handle(0x00a0), lun(0) [Mon Dec 9 06:17:30 2019][ 25.764588] scsi 1:0:187:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Mon Dec 9 06:17:30 2019][ 25.772971] scsi 1:0:187:0: SSP: handle(0x00a0), sas_addr(0x5000cca26b950bb5), phy(2), device_name(0x5000cca26b950bb7) [Mon Dec 9 06:17:30 2019][ 25.783659] scsi 
1:0:187:0: enclosure logical id(0x5000ccab0405db00), slot(11) [Mon Dec 9 06:17:31 2019][ 25.790967] scsi 1:0:187:0: enclosure level(0x0000), connector name( C0 ) [Mon Dec 9 06:17:31 2019][ 25.797860] scsi 1:0:187:0: serial_number( 1SJMZ22Z) [Mon Dec 9 06:17:31 2019][ 25.803431] scsi 1:0:187:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Mon Dec 9 06:17:31 2019][ 25.832397] mpt3sas_cm0: detecting: handle(0x00a1), sas_address(0x5000cca25253f3bd), phy(3) [Mon Dec 9 06:17:31 2019][ 25.840745] mpt3sas_cm0: REPORT_LUNS: handle(0x00a1), retries(0) [Mon Dec 9 06:17:31 2019][ 25.846885] mpt3sas_cm0: TEST_UNIT_READY: handle(0x00a1), lun(0) [Mon Dec 9 06:17:31 2019][ 25.853530] scsi 1:0:188:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Mon Dec 9 06:17:31 2019][ 25.861915] scsi 1:0:188:0: SSP: handle(0x00a1), sas_addr(0x5000cca25253f3bd), phy(3), device_name(0x5000cca25253f3bf) [Mon Dec 9 06:17:31 2019][ 25.872598] scsi 1:0:188:0: enclosure logical id(0x5000ccab0405db00), slot(12) [Mon Dec 9 06:17:31 2019][ 25.879905] scsi 1:0:188:0: enclosure level(0x0000), connector name( C0 ) [Mon Dec 9 06:17:31 2019][ 25.886798] scsi 1:0:188:0: serial_number( 7SHH591G) [Mon Dec 9 06:17:31 2019][ 25.892372] scsi 1:0:188:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Mon Dec 9 06:17:31 2019][ 25.912398] mpt3sas_cm0: detecting: handle(0x00a2), sas_address(0x5000cca26a2ac3d9), phy(4) [Mon Dec 9 06:17:31 2019][ 25.920749] mpt3sas_cm0: REPORT_LUNS: handle(0x00a2), retries(0) [Mon Dec 9 06:17:31 2019][ 25.926922] mpt3sas_cm0: TEST_UNIT_READY: handle(0x00a2), lun(0) [Mon Dec 9 06:17:31 2019][ 25.933712] scsi 1:0:189:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Mon Dec 9 06:17:31 2019][ 25.942091] scsi 1:0:189:0: SSP: handle(0x00a2), sas_addr(0x5000cca26a2ac3d9), phy(4), device_name(0x5000cca26a2ac3db) [Mon Dec 9 06:17:31 2019][ 25.952776] scsi 1:0:189:0: enclosure logical id(0x5000ccab0405db00), slot(13) [Mon Dec 9 06:17:31 2019][ 25.960084] scsi 1:0:189:0: enclosure level(0x0000), connector name( C0 ) [Mon Dec 9 06:17:31 2019][ 25.966974] scsi 1:0:189:0: serial_number( 2TGSJ30D) [Mon Dec 9 06:17:31 2019][ 25.972548] scsi 1:0:189:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Mon Dec 9 06:17:31 2019][ 25.992400] mpt3sas_cm0: detecting: handle(0x00a3), sas_address(0x5000cca252541029), phy(5) [Mon Dec 9 06:17:31 2019][ 26.000753] mpt3sas_cm0: REPORT_LUNS: handle(0x00a3), retries(0) [Mon Dec 9 06:17:31 2019][ 26.006920] mpt3sas_cm0: TEST_UNIT_READY: handle(0x00a3), lun(0) [Mon Dec 9 06:17:31 2019][ 26.013633] scsi 1:0:190:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Mon Dec 9 06:17:31 2019][ 26.022009] scsi 1:0:190:0: SSP: handle(0x00a3), sas_addr(0x5000cca252541029), phy(5), device_name(0x5000cca25254102b) [Mon Dec 9 06:17:31 2019][ 26.032695] scsi 1:0:190:0: enclosure logical id(0x5000ccab0405db00), slot(14) [Mon Dec 9 06:17:31 2019][ 26.040000] scsi 1:0:190:0: enclosure level(0x0000), connector name( C0 ) [Mon Dec 9 06:17:31 2019][ 26.046892] scsi 1:0:190:0: serial_number( 7SHH75RG) [Mon Dec 9 06:17:31 2019][ 26.052465] scsi 1:0:190:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Mon Dec 9 06:17:31 2019][ 26.072401] mpt3sas_cm0: detecting: handle(0x00a4), sas_address(0x5000cca252545349), phy(6) [Mon Dec 9 06:17:31 2019][ 26.080749] mpt3sas_cm0: REPORT_LUNS: handle(0x00a4), retries(0) [Mon Dec 9 06:17:31 2019][ 26.086889] mpt3sas_cm0: TEST_UNIT_READY: handle(0x00a4), 
lun(0) [Mon Dec 9 06:17:31 2019][ 26.093539] scsi 1:0:191:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Mon Dec 9 06:17:31 2019][ 26.101914] scsi 1:0:191:0: SSP: handle(0x00a4), sas_addr(0x5000cca252545349), phy(6), device_name(0x5000cca25254534b) [Mon Dec 9 06:17:31 2019][ 26.112602] scsi 1:0:191:0: enclosure logical id(0x5000ccab0405db00), slot(15) [Mon Dec 9 06:17:31 2019][ 26.119909] scsi 1:0:191:0: enclosure level(0x0000), connector name( C0 ) [Mon Dec 9 06:17:31 2019][ 26.126801] scsi 1:0:191:0: serial_number( 7SHHBN9G) [Mon Dec 9 06:17:31 2019][ 26.132376] scsi 1:0:191:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Mon Dec 9 06:17:31 2019][ 26.152404] mpt3sas_cm0: detecting: handle(0x00a5), sas_address(0x5000cca2525430c5), phy(7) [Mon Dec 9 06:17:31 2019][ 26.160752] mpt3sas_cm0: REPORT_LUNS: handle(0x00a5), retries(0) [Mon Dec 9 06:17:31 2019][ 26.167075] mpt3sas_cm0: TEST_UNIT_READY: handle(0x00a5), lun(0) [Mon Dec 9 06:17:31 2019][ 26.178924] scsi 1:0:192:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Mon Dec 9 06:17:31 2019][ 26.187308] scsi 1:0:192:0: SSP: handle(0x00a5), sas_addr(0x5000cca2525430c5), phy(7), device_name(0x5000cca2525430c7) [Mon Dec 9 06:17:31 2019][ 26.197997] scsi 1:0:192:0: enclosure logical id(0x5000ccab0405db00), slot(16) [Mon Dec 9 06:17:31 2019][ 26.205304] scsi 1:0:192:0: enclosure level(0x0000), connector name( C0 ) [Mon Dec 9 06:17:31 2019][ 26.212197] scsi 1:0:192:0: serial_number( 7SHH9B1G) [Mon Dec 9 06:17:31 2019][ 26.217770] scsi 1:0:192:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Mon Dec 9 06:17:31 2019][ 26.240414] mpt3sas_cm0: detecting: handle(0x00a6), sas_address(0x5000cca25254385d), phy(8) [Mon Dec 9 06:17:31 2019][ 26.248764] mpt3sas_cm0: REPORT_LUNS: handle(0x00a6), retries(0) [Mon Dec 9 06:17:31 2019][ 26.254897] mpt3sas_cm0: TEST_UNIT_READY: handle(0x00a6), lun(0) [Mon Dec 9 06:17:31 2019][ 26.266038] scsi 1:0:193:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Mon Dec 9 06:17:31 2019][ 26.274419] scsi 1:0:193:0: SSP: handle(0x00a6), sas_addr(0x5000cca25254385d), phy(8), device_name(0x5000cca25254385f) [Mon Dec 9 06:17:31 2019][ 26.285107] scsi 1:0:193:0: enclosure logical id(0x5000ccab0405db00), slot(17) [Mon Dec 9 06:17:31 2019][ 26.292414] scsi 1:0:193:0: enclosure level(0x0000), connector name( C0 ) [Mon Dec 9 06:17:31 2019][ 26.299306] scsi 1:0:193:0: serial_number( 7SHH9VRG) [Mon Dec 9 06:17:31 2019][ 26.304879] scsi 1:0:193:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Mon Dec 9 06:17:31 2019][ 26.333425] mpt3sas_cm0: detecting: handle(0x00a7), sas_address(0x5000cca25253f30d), phy(9) [Mon Dec 9 06:17:31 2019][ 26.341777] mpt3sas_cm0: REPORT_LUNS: handle(0x00a7), retries(0) [Mon Dec 9 06:17:31 2019][ 26.347919] mpt3sas_cm0: TEST_UNIT_READY: handle(0x00a7), lun(0) [Mon Dec 9 06:17:31 2019][ 26.354730] scsi 1:0:194:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Mon Dec 9 06:17:31 2019][ 26.363122] scsi 1:0:194:0: SSP: handle(0x00a7), sas_addr(0x5000cca25253f30d), phy(9), device_name(0x5000cca25253f30f) [Mon Dec 9 06:17:31 2019][ 26.373805] scsi 1:0:194:0: enclosure logical id(0x5000ccab0405db00), slot(18) [Mon Dec 9 06:17:31 2019][ 26.381110] scsi 1:0:194:0: enclosure level(0x0000), connector name( C0 ) [Mon Dec 9 06:17:31 2019][ 26.388004] scsi 1:0:194:0: serial_number( 7SHH57MG) [Mon Dec 9 06:17:31 2019][ 26.393578] scsi 1:0:194:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), 
cmd_que(1) [Mon Dec 9 06:17:31 2019][ 26.413444] mpt3sas_cm0: detecting: handle(0x00a8), sas_address(0x5000cca252545f65), phy(10) [Mon Dec 9 06:17:31 2019][ 26.421884] mpt3sas_cm0: REPORT_LUNS: handle(0x00a8), retries(0) [Mon Dec 9 06:17:31 2019][ 26.428044] mpt3sas_cm0: TEST_UNIT_READY: handle(0x00a8), lun(0) [Mon Dec 9 06:17:31 2019][ 26.434872] scsi 1:0:195:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Mon Dec 9 06:17:31 2019][ 26.443275] scsi 1:0:195:0: SSP: handle(0x00a8), sas_addr(0x5000cca252545f65), phy(10), device_name(0x5000cca252545f67) [Mon Dec 9 06:17:31 2019][ 26.454044] scsi 1:0:195:0: enclosure logical id(0x5000ccab0405db00), slot(19) [Mon Dec 9 06:17:31 2019][ 26.461350] scsi 1:0:195:0: enclosure level(0x0000), connector name( C0 ) [Mon Dec 9 06:17:31 2019][ 26.468240] scsi 1:0:195:0: serial_number( 7SHHDG9G) [Mon Dec 9 06:17:31 2019][ 26.473815] scsi 1:0:195:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Mon Dec 9 06:17:31 2019][ 26.496412] mpt3sas_cm0: detecting: handle(0x00a9), sas_address(0x5000cca266daa4e5), phy(11) [Mon Dec 9 06:17:31 2019][ 26.504852] mpt3sas_cm0: REPORT_LUNS: handle(0x00a9), retries(0) [Mon Dec 9 06:17:31 2019][ 26.510986] mpt3sas_cm0: TEST_UNIT_READY: handle(0x00a9), lun(0) [Mon Dec 9 06:17:31 2019][ 26.525513] scsi 1:0:196:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Mon Dec 9 06:17:31 2019][ 26.533900] scsi 1:0:196:0: SSP: handle(0x00a9), sas_addr(0x5000cca266daa4e5), phy(11), device_name(0x5000cca266daa4e7) [Mon Dec 9 06:17:31 2019][ 26.544673] scsi 1:0:196:0: enclosure logical id(0x5000ccab0405db00), slot(20) [Mon Dec 9 06:17:31 2019][ 26.551978] scsi 1:0:196:0: enclosure level(0x0000), connector name( C0 ) [Mon Dec 9 06:17:31 2019][ 26.558872] scsi 1:0:196:0: serial_number( 7JKW7MYK) [Mon Dec 9 06:17:31 2019][ 26.564450] scsi 1:0:196:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Mon Dec 9 06:17:31 2019][ 26.594415] mpt3sas_cm0: detecting: handle(0x00aa), sas_address(0x5000cca26a25167d), phy(12) [Mon Dec 9 06:17:31 2019][ 26.602849] mpt3sas_cm0: REPORT_LUNS: handle(0x00aa), retries(0) [Mon Dec 9 06:17:31 2019][ 26.609007] mpt3sas_cm0: TEST_UNIT_READY: handle(0x00aa), lun(0) [Mon Dec 9 06:17:31 2019][ 26.615817] scsi 1:0:197:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Mon Dec 9 06:17:31 2019][ 26.624208] scsi 1:0:197:0: SSP: handle(0x00aa), sas_addr(0x5000cca26a25167d), phy(12), device_name(0x5000cca26a25167f) [Mon Dec 9 06:17:31 2019][ 26.634981] scsi 1:0:197:0: enclosure logical id(0x5000ccab0405db00), slot(21) [Mon Dec 9 06:17:31 2019][ 26.642288] scsi 1:0:197:0: enclosure level(0x0000), connector name( C0 ) [Mon Dec 9 06:17:31 2019][ 26.649182] scsi 1:0:197:0: serial_number( 2TGND9JD) [Mon Dec 9 06:17:31 2019][ 26.654752] scsi 1:0:197:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Mon Dec 9 06:17:31 2019][ 26.677415] mpt3sas_cm0: detecting: handle(0x00ab), sas_address(0x5000cca25253eda9), phy(13) [Mon Dec 9 06:17:31 2019][ 26.685852] mpt3sas_cm0: REPORT_LUNS: handle(0x00ab), retries(0) [Mon Dec 9 06:17:31 2019][ 26.691985] mpt3sas_cm0: TEST_UNIT_READY: handle(0x00ab), lun(0) [Mon Dec 9 06:17:31 2019][ 26.701296] scsi 1:0:198:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Mon Dec 9 06:17:31 2019][ 26.709679] scsi 1:0:198:0: SSP: handle(0x00ab), sas_addr(0x5000cca25253eda9), phy(13), device_name(0x5000cca25253edab) [Mon Dec 9 06:17:31 2019][ 26.720456] scsi 1:0:198:0: enclosure logical 
id(0x5000ccab0405db00), slot(22) [Mon Dec 9 06:17:31 2019][ 26.727760] scsi 1:0:198:0: enclosure level(0x0000), connector name( C0 ) [Mon Dec 9 06:17:31 2019][ 26.734652] scsi 1:0:198:0: serial_number( 7SHH4WHG) [Mon Dec 9 06:17:31 2019][ 26.740225] scsi 1:0:198:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Mon Dec 9 06:17:31 2019][ 26.763416] mpt3sas_cm0: detecting: handle(0x00ac), sas_address(0x5000cca266d491a1), phy(14) [Mon Dec 9 06:17:31 2019][ 26.771853] mpt3sas_cm0: REPORT_LUNS: handle(0x00ac), retries(0) [Mon Dec 9 06:17:31 2019][ 26.777986] mpt3sas_cm0: TEST_UNIT_READY: handle(0x00ac), lun(0) [Mon Dec 9 06:17:31 2019][ 26.784642] scsi 1:0:199:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Mon Dec 9 06:17:32 2019][ 26.793028] scsi 1:0:199:0: SSP: handle(0x00ac), sas_addr(0x5000cca266d491a1), phy(14), device_name(0x5000cca266d491a3) [Mon Dec 9 06:17:32 2019][ 26.803805] scsi 1:0:199:0: enclosure logical id(0x5000ccab0405db00), slot(23) [Mon Dec 9 06:17:32 2019][ 26.811109] scsi 1:0:199:0: enclosure level(0x0000), connector name( C0 ) [Mon Dec 9 06:17:32 2019][ 26.818001] scsi 1:0:199:0: serial_number( 7JKSX22K) [Mon Dec 9 06:17:32 2019][ 26.823577] scsi 1:0:199:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Mon Dec 9 06:17:32 2019][ 26.843419] mpt3sas_cm0: detecting: handle(0x00ad), sas_address(0x5000cca26b9a7099), phy(15) [Mon Dec 9 06:17:32 2019][ 26.851857] mpt3sas_cm0: REPORT_LUNS: handle(0x00ad), retries(0) [Mon Dec 9 06:17:32 2019][ 26.858031] mpt3sas_cm0: TEST_UNIT_READY: handle(0x00ad), lun(0) [Mon Dec 9 06:17:32 2019][ 26.874100] scsi 1:0:200:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Mon Dec 9 06:17:32 2019][ 26.882478] scsi 1:0:200:0: SSP: handle(0x00ad), sas_addr(0x5000cca26b9a7099), phy(15), device_name(0x5000cca26b9a709b) [Mon Dec 9 06:17:32 2019][ 26.893254] scsi 1:0:200:0: enclosure logical id(0x5000ccab0405db00), slot(24) [Mon Dec 9 06:17:32 2019][ 26.900561] scsi 1:0:200:0: enclosure level(0x0000), connector name( C0 ) [Mon Dec 9 06:17:32 2019][ 26.907454] scsi 1:0:200:0: serial_number( 1SJRY0YZ) [Mon Dec 9 06:17:32 2019][ 26.913026] scsi 1:0:200:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Mon Dec 9 06:17:32 2019][ 26.935424] mpt3sas_cm0: detecting: handle(0x00ae), sas_address(0x5000cca25253f831), phy(16) [Mon Dec 9 06:17:32 2019][ 26.943866] mpt3sas_cm0: REPORT_LUNS: handle(0x00ae), retries(0) [Mon Dec 9 06:17:32 2019][ 26.957487] mpt3sas_cm0: TEST_UNIT_READY: handle(0x00ae), lun(0) [Mon Dec 9 06:17:32 2019][ 26.965975] scsi 1:0:201:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Mon Dec 9 06:17:32 2019][ 26.974368] scsi 1:0:201:0: SSP: handle(0x00ae), sas_addr(0x5000cca25253f831), phy(16), device_name(0x5000cca25253f833) [Mon Dec 9 06:17:32 2019][ 26.985141] scsi 1:0:201:0: enclosure logical id(0x5000ccab0405db00), slot(25) [Mon Dec 9 06:17:32 2019][ 26.992448] scsi 1:0:201:0: enclosure level(0x0000), connector name( C0 ) [Mon Dec 9 06:17:32 2019][ 26.999341] scsi 1:0:201:0: serial_number( 7SHH5L7G) [Mon Dec 9 06:17:32 2019][ 27.004912] scsi 1:0:201:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Mon Dec 9 06:17:32 2019][ 27.035435] mpt3sas_cm0: detecting: handle(0x00af), sas_address(0x5000cca26a2ab23d), phy(17) [Mon Dec 9 06:17:32 2019][ 27.043873] mpt3sas_cm0: REPORT_LUNS: handle(0x00af), retries(0) [Mon Dec 9 06:17:32 2019][ 27.050033] mpt3sas_cm0: TEST_UNIT_READY: handle(0x00af), lun(0) [Mon Dec 9 
06:17:32 2019][ 27.056681] scsi 1:0:202:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Mon Dec 9 06:17:32 2019][ 27.065071] scsi 1:0:202:0: SSP: handle(0x00af), sas_addr(0x5000cca26a2ab23d), phy(17), device_name(0x5000cca26a2ab23f) [Mon Dec 9 06:17:32 2019][ 27.075839] scsi 1:0:202:0: enclosure logical id(0x5000ccab0405db00), slot(26) [Mon Dec 9 06:17:32 2019][ 27.083146] scsi 1:0:202:0: enclosure level(0x0000), connector name( C0 ) [Mon Dec 9 06:17:32 2019][ 27.090037] scsi 1:0:202:0: serial_number( 2TGSGXND) [Mon Dec 9 06:17:32 2019][ 27.095615] scsi 1:0:202:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Mon Dec 9 06:17:32 2019][ 27.116475] mpt3sas_cm0: detecting: handle(0x00b0), sas_address(0x5000cca26b9b9695), phy(18) [Mon Dec 9 06:17:32 2019][ 27.124907] mpt3sas_cm0: REPORT_LUNS: handle(0x00b0), retries(0) [Mon Dec 9 06:17:32 2019][ 27.131074] mpt3sas_cm0: TEST_UNIT_READY: handle(0x00b0), lun(0) [Mon Dec 9 06:17:32 2019][ 27.137769] scsi 1:0:203:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Mon Dec 9 06:17:32 2019][ 27.146201] scsi 1:0:203:0: SSP: handle(0x00b0), sas_addr(0x5000cca26b9b9695), phy(18), device_name(0x5000cca26b9b9697) [Mon Dec 9 06:17:32 2019][ 27.156970] scsi 1:0:203:0: enclosure logical id(0x5000ccab0405db00), slot(27) [Mon Dec 9 06:17:32 2019][ 27.164278] scsi 1:0:203:0: enclosure level(0x0000), connector name( C0 ) [Mon Dec 9 06:17:32 2019][ 27.171169] scsi 1:0:203:0: serial_number( 1SJSKLWZ) [Mon Dec 9 06:17:32 2019][ 27.176743] scsi 1:0:203:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Mon Dec 9 06:17:32 2019][ 27.197430] mpt3sas_cm0: detecting: handle(0x00b1), sas_address(0x5000cca252559471), phy(19) [Mon Dec 9 06:17:32 2019][ 27.205864] mpt3sas_cm0: REPORT_LUNS: handle(0x00b1), retries(0) [Mon Dec 9 06:17:32 2019][ 27.212023] mpt3sas_cm0: TEST_UNIT_READY: handle(0x00b1), lun(0) [Mon Dec 9 06:17:32 2019][ 27.218682] scsi 1:0:204:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Mon Dec 9 06:17:32 2019][ 27.227066] scsi 1:0:204:0: SSP: handle(0x00b1), sas_addr(0x5000cca252559471), phy(19), device_name(0x5000cca252559473) [Mon Dec 9 06:17:32 2019][ 27.237843] scsi 1:0:204:0: enclosure logical id(0x5000ccab0405db00), slot(28) [Mon Dec 9 06:17:32 2019][ 27.245146] scsi 1:0:204:0: enclosure level(0x0000), connector name( C0 ) [Mon Dec 9 06:17:32 2019][ 27.252038] scsi 1:0:204:0: serial_number( 7SHJ21AG) [Mon Dec 9 06:17:32 2019][ 27.257614] scsi 1:0:204:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Mon Dec 9 06:17:32 2019][ 27.312439] mpt3sas_cm0: detecting: handle(0x00b2), sas_address(0x5000cca25253f94d), phy(20) [Mon Dec 9 06:17:32 2019][ 27.320874] mpt3sas_cm0: REPORT_LUNS: handle(0x00b2), retries(0) [Mon Dec 9 06:17:32 2019][ 27.327010] mpt3sas_cm0: TEST_UNIT_READY: handle(0x00b2), lun(0) [Mon Dec 9 06:17:32 2019][ 27.345304] scsi 1:0:205:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Mon Dec 9 06:17:32 2019][ 27.353687] scsi 1:0:205:0: SSP: handle(0x00b2), sas_addr(0x5000cca25253f94d), phy(20), device_name(0x5000cca25253f94f) [Mon Dec 9 06:17:32 2019][ 27.364456] scsi 1:0:205:0: enclosure logical id(0x5000ccab0405db00), slot(29) [Mon Dec 9 06:17:32 2019][ 27.371762] scsi 1:0:205:0: enclosure level(0x0000), connector name( C0 ) [Mon Dec 9 06:17:32 2019][ 27.378640] scsi 1:0:205:0: serial_number( 7SHH5NJG) [Mon Dec 9 06:17:32 2019][ 27.384210] scsi 1:0:205:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) 
[Mon Dec 9 06:17:32 2019][ 27.426450] mpt3sas_cm0: detecting: handle(0x00b3), sas_address(0x5000cca25253e699), phy(21) [Mon Dec 9 06:17:32 2019][ 27.434887] mpt3sas_cm0: REPORT_LUNS: handle(0x00b3), retries(0) [Mon Dec 9 06:17:32 2019][ 27.441017] mpt3sas_cm0: TEST_UNIT_READY: handle(0x00b3), lun(0) [Mon Dec 9 06:17:32 2019][ 27.464744] scsi 1:0:206:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Mon Dec 9 06:17:32 2019][ 27.473116] scsi 1:0:206:0: SSP: handle(0x00b3), sas_addr(0x5000cca25253e699), phy(21), device_name(0x5000cca25253e69b) [Mon Dec 9 06:17:32 2019][ 27.483887] scsi 1:0:206:0: enclosure logical id(0x5000ccab0405db00), slot(30) [Mon Dec 9 06:17:32 2019][ 27.491191] scsi 1:0:206:0: enclosure level(0x0000), connector name( C0 ) [Mon Dec 9 06:17:32 2019][ 27.498068] scsi 1:0:206:0: serial_number( 7SHH4DXG) [Mon Dec 9 06:17:32 2019][ 27.503641] scsi 1:0:206:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Mon Dec 9 06:17:32 2019][ 27.525026] mpt3sas_cm0: detecting: handle(0x00b4), sas_address(0x5000cca252543cc1), phy(22) [Mon Dec 9 06:17:32 2019][ 27.533466] mpt3sas_cm0: REPORT_LUNS: handle(0x00b4), retries(0) [Mon Dec 9 06:17:32 2019][ 27.539607] mpt3sas_cm0: TEST_UNIT_READY: handle(0x00b4), lun(0) [Mon Dec 9 06:17:32 2019][ 27.546220] scsi 1:0:207:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Mon Dec 9 06:17:32 2019][ 27.554616] scsi 1:0:207:0: SSP: handle(0x00b4), sas_addr(0x5000cca252543cc1), phy(22), device_name(0x5000cca252543cc3) [Mon Dec 9 06:17:32 2019][ 27.565390] scsi 1:0:207:0: enclosure logical id(0x5000ccab0405db00), slot(31) [Mon Dec 9 06:17:32 2019][ 27.572695] scsi 1:0:207:0: enclosure level(0x0000), connector name( C0 ) [Mon Dec 9 06:17:32 2019][ 27.579590] scsi 1:0:207:0: serial_number( 7SHHA4TG) [Mon Dec 9 06:17:32 2019][ 27.585163] scsi 1:0:207:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Mon Dec 9 06:17:32 2019][ 27.605437] mpt3sas_cm0: detecting: handle(0x00b5), sas_address(0x5000cca26a24fcdd), phy(23) [Mon Dec 9 06:17:32 2019][ 27.613875] mpt3sas_cm0: REPORT_LUNS: handle(0x00b5), retries(0) [Mon Dec 9 06:17:32 2019][ 27.620034] mpt3sas_cm0: TEST_UNIT_READY: handle(0x00b5), lun(0) [Mon Dec 9 06:17:32 2019][ 27.626760] scsi 1:0:208:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Mon Dec 9 06:17:32 2019][ 27.635194] scsi 1:0:208:0: SSP: handle(0x00b5), sas_addr(0x5000cca26a24fcdd), phy(23), device_name(0x5000cca26a24fcdf) [Mon Dec 9 06:17:32 2019][ 27.645965] scsi 1:0:208:0: enclosure logical id(0x5000ccab0405db00), slot(32) [Mon Dec 9 06:17:32 2019][ 27.653272] scsi 1:0:208:0: enclosure level(0x0000), connector name( C0 ) [Mon Dec 9 06:17:32 2019][ 27.660161] scsi 1:0:208:0: serial_number( 2TGNALMD) [Mon Dec 9 06:17:32 2019][ 27.665737] scsi 1:0:208:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Mon Dec 9 06:17:32 2019][ 27.688439] mpt3sas_cm0: detecting: handle(0x00b6), sas_address(0x5000cca252543bcd), phy(24) [Mon Dec 9 06:17:32 2019][ 27.696880] mpt3sas_cm0: REPORT_LUNS: handle(0x00b6), retries(0) [Mon Dec 9 06:17:32 2019][ 27.703020] mpt3sas_cm0: TEST_UNIT_READY: handle(0x00b6), lun(0) [Mon Dec 9 06:17:32 2019][ 27.709721] scsi 1:0:209:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Mon Dec 9 06:17:32 2019][ 27.718108] scsi 1:0:209:0: SSP: handle(0x00b6), sas_addr(0x5000cca252543bcd), phy(24), device_name(0x5000cca252543bcf) [Mon Dec 9 06:17:32 2019][ 27.728882] scsi 1:0:209:0: enclosure logical id(0x5000ccab0405db00), 
slot(33) [Mon Dec 9 06:17:32 2019][ 27.736186] scsi 1:0:209:0: enclosure level(0x0000), connector name( C0 ) [Mon Dec 9 06:17:32 2019][ 27.743079] scsi 1:0:209:0: serial_number( 7SHHA2UG) [Mon Dec 9 06:17:32 2019][ 27.748652] scsi 1:0:209:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Mon Dec 9 06:17:32 2019][ 27.777438] mpt3sas_cm0: detecting: handle(0x00b7), sas_address(0x5000cca252551265), phy(25) [Mon Dec 9 06:17:33 2019][ 27.785879] mpt3sas_cm0: REPORT_LUNS: handle(0x00b7), retries(0) [Mon Dec 9 06:17:33 2019][ 27.792011] mpt3sas_cm0: TEST_UNIT_READY: handle(0x00b7), lun(0) [Mon Dec 9 06:17:33 2019][ 27.798652] scsi 1:0:210:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Mon Dec 9 06:17:33 2019][ 27.807035] scsi 1:0:210:0: SSP: handle(0x00b7), sas_addr(0x5000cca252551265), phy(25), device_name(0x5000cca252551267) [Mon Dec 9 06:17:33 2019][ 27.817812] scsi 1:0:210:0: enclosure logical id(0x5000ccab0405db00), slot(34) [Mon Dec 9 06:17:33 2019][ 27.825119] scsi 1:0:210:0: enclosure level(0x0000), connector name( C0 ) [Mon Dec 9 06:17:33 2019][ 27.832007] scsi 1:0:210:0: serial_number( 7SHHTBVG) [Mon Dec 9 06:17:33 2019][ 27.837583] scsi 1:0:210:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Mon Dec 9 06:17:33 2019][ 27.857442] mpt3sas_cm0: detecting: handle(0x00b8), sas_address(0x5000cca252555fc9), phy(26) [Mon Dec 9 06:17:33 2019][ 27.865882] mpt3sas_cm0: REPORT_LUNS: handle(0x00b8), retries(0) [Mon Dec 9 06:17:33 2019][ 27.872017] mpt3sas_cm0: TEST_UNIT_READY: handle(0x00b8), lun(0) [Mon Dec 9 06:17:33 2019][ 27.878631] scsi 1:0:211:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Mon Dec 9 06:17:33 2019][ 27.887014] scsi 1:0:211:0: SSP: handle(0x00b8), sas_addr(0x5000cca252555fc9), phy(26), device_name(0x5000cca252555fcb) [Mon Dec 9 06:17:33 2019][ 27.897789] scsi 1:0:211:0: enclosure logical id(0x5000ccab0405db00), slot(35) [Mon Dec 9 06:17:33 2019][ 27.905096] scsi 1:0:211:0: enclosure level(0x0000), connector name( C0 ) [Mon Dec 9 06:17:33 2019][ 27.911988] scsi 1:0:211:0: serial_number( 7SHHYJMG) [Mon Dec 9 06:17:33 2019][ 27.917561] scsi 1:0:211:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Mon Dec 9 06:17:33 2019][ 27.937443] mpt3sas_cm0: detecting: handle(0x00b9), sas_address(0x5000cca252559f7d), phy(27) [Mon Dec 9 06:17:33 2019][ 27.945878] mpt3sas_cm0: REPORT_LUNS: handle(0x00b9), retries(0) [Mon Dec 9 06:17:33 2019][ 27.952037] mpt3sas_cm0: TEST_UNIT_READY: handle(0x00b9), lun(0) [Mon Dec 9 06:17:33 2019][ 27.966904] scsi 1:0:212:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Mon Dec 9 06:17:33 2019][ 27.975275] scsi 1:0:212:0: SSP: handle(0x00b9), sas_addr(0x5000cca252559f7d), phy(27), device_name(0x5000cca252559f7f) [Mon Dec 9 06:17:33 2019][ 27.986044] scsi 1:0:212:0: enclosure logical id(0x5000ccab0405db00), slot(36) [Mon Dec 9 06:17:33 2019][ 27.993351] scsi 1:0:212:0: enclosure level(0x0000), connector name( C0 ) [Mon Dec 9 06:17:33 2019][ 28.000229] scsi 1:0:212:0: serial_number( 7SHJ2T4G) [Mon Dec 9 06:17:33 2019][ 28.005810] scsi 1:0:212:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Mon Dec 9 06:17:33 2019][ 28.028446] mpt3sas_cm0: detecting: handle(0x00ba), sas_address(0x5000cca26c244bcd), phy(28) [Mon Dec 9 06:17:33 2019][ 28.036881] mpt3sas_cm0: REPORT_LUNS: handle(0x00ba), retries(0) [Mon Dec 9 06:17:33 2019][ 28.043022] mpt3sas_cm0: TEST_UNIT_READY: handle(0x00ba), lun(0) [Mon Dec 9 06:17:33 2019][ 28.049655] 
scsi 1:0:213:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Mon Dec 9 06:17:33 2019][ 28.058039] scsi 1:0:213:0: SSP: handle(0x00ba), sas_addr(0x5000cca26c244bcd), phy(28), device_name(0x5000cca26c244bcf) [Mon Dec 9 06:17:33 2019][ 28.068812] scsi 1:0:213:0: enclosure logical id(0x5000ccab0405db00), slot(37) [Mon Dec 9 06:17:33 2019][ 28.076120] scsi 1:0:213:0: enclosure level(0x0000), connector name( C0 ) [Mon Dec 9 06:17:33 2019][ 28.083010] scsi 1:0:213:0: serial_number( 1DGMYU2Z) [Mon Dec 9 06:17:33 2019][ 28.088586] scsi 1:0:213:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Mon Dec 9 06:17:33 2019][ 28.121449] mpt3sas_cm0: detecting: handle(0x00bb), sas_address(0x5000cca26a2aa10d), phy(29) [Mon Dec 9 06:17:33 2019][ 28.129884] mpt3sas_cm0: REPORT_LUNS: handle(0x00bb), retries(0) [Mon Dec 9 06:17:33 2019][ 28.136025] mpt3sas_cm0: TEST_UNIT_READY: handle(0x00bb), lun(0) [Mon Dec 9 06:17:33 2019][ 28.233702] scsi 1:0:214:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Mon Dec 9 06:17:33 2019][ 28.242088] scsi 1:0:214:0: SSP: handle(0x00bb), sas_addr(0x5000cca26a2aa10d), phy(29), device_name(0x5000cca26a2aa10f) [Mon Dec 9 06:17:33 2019][ 28.252863] scsi 1:0:214:0: enclosure logical id(0x5000ccab0405db00), slot(38) [Mon Dec 9 06:17:33 2019][ 28.260170] scsi 1:0:214:0: enclosure level(0x0000), connector name( C0 ) [Mon Dec 9 06:17:33 2019][ 28.267060] scsi 1:0:214:0: serial_number( 2TGSET5D) [Mon Dec 9 06:17:33 2019][ 28.272635] scsi 1:0:214:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Mon Dec 9 06:17:33 2019][ 28.298454] mpt3sas_cm0: detecting: handle(0x00bc), sas_address(0x5000cca25254e235), phy(30) [Mon Dec 9 06:17:33 2019][ 28.306889] mpt3sas_cm0: REPORT_LUNS: handle(0x00bc), retries(0) [Mon Dec 9 06:17:33 2019][ 28.313051] mpt3sas_cm0: TEST_UNIT_READY: handle(0x00bc), lun(0) [Mon Dec 9 06:17:33 2019][ 28.319670] scsi 1:0:215:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Mon Dec 9 06:17:33 2019][ 28.328055] scsi 1:0:215:0: SSP: handle(0x00bc), sas_addr(0x5000cca25254e235), phy(30), device_name(0x5000cca25254e237) [Mon Dec 9 06:17:33 2019][ 28.338830] scsi 1:0:215:0: enclosure logical id(0x5000ccab0405db00), slot(39) [Mon Dec 9 06:17:33 2019][ 28.346138] scsi 1:0:215:0: enclosure level(0x0000), connector name( C0 ) [Mon Dec 9 06:17:33 2019][ 28.353028] scsi 1:0:215:0: serial_number( 7SHHP5BG) [Mon Dec 9 06:17:33 2019][ 28.358602] scsi 1:0:215:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Mon Dec 9 06:17:33 2019][ 28.379459] mpt3sas_cm0: detecting: handle(0x00bd), sas_address(0x5000cca25254df95), phy(31) [Mon Dec 9 06:17:33 2019][ 28.387899] mpt3sas_cm0: REPORT_LUNS: handle(0x00bd), retries(0) [Mon Dec 9 06:17:33 2019][ 28.394037] mpt3sas_cm0: TEST_UNIT_READY: handle(0x00bd), lun(0) [Mon Dec 9 06:17:33 2019][ 28.439541] scsi 1:0:216:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Mon Dec 9 06:17:33 2019][ 28.447922] scsi 1:0:216:0: SSP: handle(0x00bd), sas_addr(0x5000cca25254df95), phy(31), device_name(0x5000cca25254df97) [Mon Dec 9 06:17:33 2019][ 28.458694] scsi 1:0:216:0: enclosure logical id(0x5000ccab0405db00), slot(40) [Mon Dec 9 06:17:33 2019][ 28.466001] scsi 1:0:216:0: enclosure level(0x0000), connector name( C0 ) [Mon Dec 9 06:17:33 2019][ 28.472890] scsi 1:0:216:0: serial_number( 7SHHNZYG) [Mon Dec 9 06:17:33 2019][ 28.478466] scsi 1:0:216:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Mon Dec 9 06:17:33 2019][ 
28.568112] mpt3sas_cm0: detecting: handle(0x00be), sas_address(0x5000cca25254e9d1), phy(32) [Mon Dec 9 06:17:33 2019][ 28.576557] mpt3sas_cm0: REPORT_LUNS: handle(0x00be), retries(0) [Mon Dec 9 06:17:33 2019][ 28.582697] mpt3sas_cm0: TEST_UNIT_READY: handle(0x00be), lun(0) [Mon Dec 9 06:17:33 2019][ 28.594947] scsi 1:0:217:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Mon Dec 9 06:17:33 2019][ 28.603638] scsi 1:0:217:0: SSP: handle(0x00be), sas_addr(0x5000cca25254e9d1), phy(32), device_name(0x5000cca25254e9d3) [Mon Dec 9 06:17:33 2019][ 28.614413] scsi 1:0:217:0: enclosure logical id(0x5000ccab0405db00), slot(41) [Mon Dec 9 06:17:33 2019][ 28.621717] scsi 1:0:217:0: enclosure level(0x0000), connector name( C0 ) [Mon Dec 9 06:17:33 2019][ 28.628596] scsi 1:0:217:0: serial_number( 7SHHPP2G) [Mon Dec 9 06:17:33 2019][ 28.634174] scsi 1:0:217:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Mon Dec 9 06:17:33 2019][ 28.663469] mpt3sas_cm0: detecting: handle(0x00bf), sas_address(0x5000cca26a240089), phy(33) [Mon Dec 9 06:17:33 2019][ 28.671904] mpt3sas_cm0: REPORT_LUNS: handle(0x00bf), retries(0) [Mon Dec 9 06:17:33 2019][ 28.678034] mpt3sas_cm0: TEST_UNIT_READY: handle(0x00bf), lun(0) [Mon Dec 9 06:17:33 2019][ 28.722599] scsi 1:0:218:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Mon Dec 9 06:17:33 2019][ 28.730985] scsi 1:0:218:0: SSP: handle(0x00bf), sas_addr(0x5000cca26a240089), phy(33), device_name(0x5000cca26a24008b) [Mon Dec 9 06:17:33 2019][ 28.741754] scsi 1:0:218:0: enclosure logical id(0x5000ccab0405db00), slot(42) [Mon Dec 9 06:17:33 2019][ 28.749059] scsi 1:0:218:0: enclosure level(0x0000), connector name( C0 ) [Mon Dec 9 06:17:33 2019][ 28.755937] scsi 1:0:218:0: serial_number( 2TGMTTPD) [Mon Dec 9 06:17:34 2019][ 28.761509] scsi 1:0:218:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Mon Dec 9 06:17:34 2019][ 28.787473] mpt3sas_cm0: detecting: handle(0x00c0), sas_address(0x5000cca26a24b9e9), phy(34) [Mon Dec 9 06:17:34 2019][ 28.795909] mpt3sas_cm0: REPORT_LUNS: handle(0x00c0), retries(0) [Mon Dec 9 06:17:34 2019][ 28.802042] mpt3sas_cm0: TEST_UNIT_READY: handle(0x00c0), lun(0) [Mon Dec 9 06:17:34 2019][ 28.813348] scsi 1:0:219:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Mon Dec 9 06:17:34 2019][ 28.821732] scsi 1:0:219:0: SSP: handle(0x00c0), sas_addr(0x5000cca26a24b9e9), phy(34), device_name(0x5000cca26a24b9eb) [Mon Dec 9 06:17:34 2019][ 28.832506] scsi 1:0:219:0: enclosure logical id(0x5000ccab0405db00), slot(43) [Mon Dec 9 06:17:34 2019][ 28.839810] scsi 1:0:219:0: enclosure level(0x0000), connector name( C0 ) [Mon Dec 9 06:17:34 2019][ 28.846701] scsi 1:0:219:0: serial_number( 2TGN64DD) [Mon Dec 9 06:17:34 2019][ 28.852277] scsi 1:0:219:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Mon Dec 9 06:17:34 2019][ 28.872472] mpt3sas_cm0: detecting: handle(0x00c1), sas_address(0x5000cca26a25aed5), phy(35) [Mon Dec 9 06:17:34 2019][ 28.880904] mpt3sas_cm0: REPORT_LUNS: handle(0x00c1), retries(0) [Mon Dec 9 06:17:34 2019][ 28.887037] mpt3sas_cm0: TEST_UNIT_READY: handle(0x00c1), lun(0) [Mon Dec 9 06:17:34 2019][ 28.912813] scsi 1:0:220:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Mon Dec 9 06:17:34 2019][ 28.921185] scsi 1:0:220:0: SSP: handle(0x00c1), sas_addr(0x5000cca26a25aed5), phy(35), device_name(0x5000cca26a25aed7) [Mon Dec 9 06:17:34 2019][ 28.931956] scsi 1:0:220:0: enclosure logical id(0x5000ccab0405db00), slot(44) [Mon Dec 9 06:17:34 
2019][ 28.939263] scsi 1:0:220:0: enclosure level(0x0000), connector name( C0 ) [Mon Dec 9 06:17:34 2019][ 28.946142] scsi 1:0:220:0: serial_number( 2TGNRG1D) [Mon Dec 9 06:17:34 2019][ 28.951720] scsi 1:0:220:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Mon Dec 9 06:17:34 2019][ 28.972469] mpt3sas_cm0: detecting: handle(0x00c2), sas_address(0x5000cca266d32b69), phy(36) [Mon Dec 9 06:17:34 2019][ 28.980905] mpt3sas_cm0: REPORT_LUNS: handle(0x00c2), retries(0) [Mon Dec 9 06:17:34 2019][ 28.987076] mpt3sas_cm0: TEST_UNIT_READY: handle(0x00c2), lun(0) [Mon Dec 9 06:17:34 2019][ 28.993719] scsi 1:0:221:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Mon Dec 9 06:17:34 2019][ 29.002108] scsi 1:0:221:0: SSP: handle(0x00c2), sas_addr(0x5000cca266d32b69), phy(36), device_name(0x5000cca266d32b6b) [Mon Dec 9 06:17:34 2019][ 29.012880] scsi 1:0:221:0: enclosure logical id(0x5000ccab0405db00), slot(45) [Mon Dec 9 06:17:34 2019][ 29.020187] scsi 1:0:221:0: enclosure level(0x0000), connector name( C0 ) [Mon Dec 9 06:17:34 2019][ 29.027080] scsi 1:0:221:0: serial_number( 7JKS46JK) [Mon Dec 9 06:17:34 2019][ 29.032654] scsi 1:0:221:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Mon Dec 9 06:17:34 2019][ 29.052474] mpt3sas_cm0: detecting: handle(0x00c3), sas_address(0x5000cca26b9bf885), phy(37) [Mon Dec 9 06:17:34 2019][ 29.060915] mpt3sas_cm0: REPORT_LUNS: handle(0x00c3), retries(0) [Mon Dec 9 06:17:34 2019][ 29.067055] mpt3sas_cm0: TEST_UNIT_READY: handle(0x00c3), lun(0) [Mon Dec 9 06:17:34 2019][ 29.073696] scsi 1:0:222:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Mon Dec 9 06:17:34 2019][ 29.082084] scsi 1:0:222:0: SSP: handle(0x00c3), sas_addr(0x5000cca26b9bf885), phy(37), device_name(0x5000cca26b9bf887) [Mon Dec 9 06:17:34 2019][ 29.092858] scsi 1:0:222:0: enclosure logical id(0x5000ccab0405db00), slot(46) [Mon Dec 9 06:17:34 2019][ 29.100165] scsi 1:0:222:0: enclosure level(0x0000), connector name( C0 ) [Mon Dec 9 06:17:34 2019][ 29.107056] scsi 1:0:222:0: serial_number( 1SJST42Z) [Mon Dec 9 06:17:34 2019][ 29.112630] scsi 1:0:222:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Mon Dec 9 06:17:34 2019][ 29.188509] mpt3sas_cm0: detecting: handle(0x00c4), sas_address(0x5000cca26b9b24c9), phy(38) [Mon Dec 9 06:17:34 2019][ 29.196945] mpt3sas_cm0: REPORT_LUNS: handle(0x00c4), retries(0) [Mon Dec 9 06:17:34 2019][ 29.203080] mpt3sas_cm0: TEST_UNIT_READY: handle(0x00c4), lun(0) [Mon Dec 9 06:17:34 2019][ 29.212591] scsi 1:0:223:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Mon Dec 9 06:17:34 2019][ 29.220974] scsi 1:0:223:0: SSP: handle(0x00c4), sas_addr(0x5000cca26b9b24c9), phy(38), device_name(0x5000cca26b9b24cb) [Mon Dec 9 06:17:34 2019][ 29.231746] scsi 1:0:223:0: enclosure logical id(0x5000ccab0405db00), slot(47) [Mon Dec 9 06:17:34 2019][ 29.239050] scsi 1:0:223:0: enclosure level(0x0000), connector name( C0 ) [Mon Dec 9 06:17:34 2019][ 29.245928] scsi 1:0:223:0: serial_number( 1SJSA0YZ) [Mon Dec 9 06:17:34 2019][ 29.251499] scsi 1:0:223:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Mon Dec 9 06:17:34 2019][ 29.271480] mpt3sas_cm0: detecting: handle(0x00c5), sas_address(0x5000cca26a21d741), phy(39) [Mon Dec 9 06:17:34 2019][ 29.279923] mpt3sas_cm0: REPORT_LUNS: handle(0x00c5), retries(0) [Mon Dec 9 06:17:34 2019][ 29.286086] mpt3sas_cm0: TEST_UNIT_READY: handle(0x00c5), lun(0) [Mon Dec 9 06:17:34 2019][ 29.292881] scsi 1:0:224:0: Direct-Access 
HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Mon Dec 9 06:17:34 2019][ 29.301275] scsi 1:0:224:0: SSP: handle(0x00c5), sas_addr(0x5000cca26a21d741), phy(39), device_name(0x5000cca26a21d743) [Mon Dec 9 06:17:34 2019][ 29.312043] scsi 1:0:224:0: enclosure logical id(0x5000ccab0405db00), slot(48) [Mon Dec 9 06:17:34 2019][ 29.319350] scsi 1:0:224:0: enclosure level(0x0000), connector name( C0 ) [Mon Dec 9 06:17:34 2019][ 29.326240] scsi 1:0:224:0: serial_number( 2TGLLYED) [Mon Dec 9 06:17:34 2019][ 29.331818] scsi 1:0:224:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Mon Dec 9 06:17:34 2019][ 29.352480] mpt3sas_cm0: detecting: handle(0x00c6), sas_address(0x5000cca26a27af5d), phy(40) [Mon Dec 9 06:17:34 2019][ 29.360920] mpt3sas_cm0: REPORT_LUNS: handle(0x00c6), retries(0) [Mon Dec 9 06:17:34 2019][ 29.367084] mpt3sas_cm0: TEST_UNIT_READY: handle(0x00c6), lun(0) [Mon Dec 9 06:17:34 2019][ 29.373724] scsi 1:0:225:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Mon Dec 9 06:17:34 2019][ 29.382112] scsi 1:0:225:0: SSP: handle(0x00c6), sas_addr(0x5000cca26a27af5d), phy(40), device_name(0x5000cca26a27af5f) [Mon Dec 9 06:17:34 2019][ 29.392889] scsi 1:0:225:0: enclosure logical id(0x5000ccab0405db00), slot(49) [Mon Dec 9 06:17:34 2019][ 29.400196] scsi 1:0:225:0: enclosure level(0x0000), connector name( C0 ) [Mon Dec 9 06:17:34 2019][ 29.407087] scsi 1:0:225:0: serial_number( 2TGPUL5D) [Mon Dec 9 06:17:34 2019][ 29.412660] scsi 1:0:225:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Mon Dec 9 06:17:34 2019][ 29.432479] mpt3sas_cm0: detecting: handle(0x00c7), sas_address(0x5000cca2525552e5), phy(41) [Mon Dec 9 06:17:34 2019][ 29.440917] mpt3sas_cm0: REPORT_LUNS: handle(0x00c7), retries(0) [Mon Dec 9 06:17:34 2019][ 29.447061] mpt3sas_cm0: TEST_UNIT_READY: handle(0x00c7), lun(0) [Mon Dec 9 06:17:34 2019][ 29.453709] scsi 1:0:226:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Mon Dec 9 06:17:34 2019][ 29.462105] scsi 1:0:226:0: SSP: handle(0x00c7), sas_addr(0x5000cca2525552e5), phy(41), device_name(0x5000cca2525552e7) [Mon Dec 9 06:17:34 2019][ 29.472876] scsi 1:0:226:0: enclosure logical id(0x5000ccab0405db00), slot(50) [Mon Dec 9 06:17:34 2019][ 29.480180] scsi 1:0:226:0: enclosure level(0x0000), connector name( C0 ) [Mon Dec 9 06:17:34 2019][ 29.487073] scsi 1:0:226:0: serial_number( 7SHHXP0G) [Mon Dec 9 06:17:34 2019][ 29.492648] scsi 1:0:226:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Mon Dec 9 06:17:34 2019][ 29.512486] mpt3sas_cm0: detecting: handle(0x00c8), sas_address(0x5000cca26a26dff1), phy(42) [Mon Dec 9 06:17:34 2019][ 29.520923] mpt3sas_cm0: REPORT_LUNS: handle(0x00c8), retries(0) [Mon Dec 9 06:17:34 2019][ 29.527070] mpt3sas_cm0: TEST_UNIT_READY: handle(0x00c8), lun(0) [Mon Dec 9 06:17:34 2019][ 29.533745] scsi 1:0:227:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Mon Dec 9 06:17:34 2019][ 29.542134] scsi 1:0:227:0: SSP: handle(0x00c8), sas_addr(0x5000cca26a26dff1), phy(42), device_name(0x5000cca26a26dff3) [Mon Dec 9 06:17:34 2019][ 29.552904] scsi 1:0:227:0: enclosure logical id(0x5000ccab0405db00), slot(51) [Mon Dec 9 06:17:34 2019][ 29.560211] scsi 1:0:227:0: enclosure level(0x0000), connector name( C0 ) [Mon Dec 9 06:17:34 2019][ 29.567108] scsi 1:0:227:0: serial_number( 2TGPBSYD) [Mon Dec 9 06:17:34 2019][ 29.572687] scsi 1:0:227:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Mon Dec 9 06:17:34 2019][ 29.595491] mpt3sas_cm0: 
detecting: handle(0x00c9), sas_address(0x5000cca26b9c5d51), phy(43) [Mon Dec 9 06:17:34 2019][ 29.603934] mpt3sas_cm0: REPORT_LUNS: handle(0x00c9), retries(0) [Mon Dec 9 06:17:34 2019][ 29.610741] mpt3sas_cm0: TEST_UNIT_READY: handle(0x00c9), lun(0) [Mon Dec 9 06:17:34 2019][ 29.636259] scsi 1:0:228:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Mon Dec 9 06:17:34 2019][ 29.644642] scsi 1:0:228:0: SSP: handle(0x00c9), sas_addr(0x5000cca26b9c5d51), phy(43), device_name(0x5000cca26b9c5d53) [Mon Dec 9 06:17:34 2019][ 29.655417] scsi 1:0:228:0: enclosure logical id(0x5000ccab0405db00), slot(52) [Mon Dec 9 06:17:34 2019][ 29.662724] scsi 1:0:228:0: enclosure level(0x0000), connector name( C0 ) [Mon Dec 9 06:17:34 2019][ 29.669599] scsi 1:0:228:0: serial_number( 1SJSZV5Z) [Mon Dec 9 06:17:34 2019][ 29.675172] scsi 1:0:228:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Mon Dec 9 06:17:34 2019][ 29.700086] mpt3sas_cm0: detecting: handle(0x00ca), sas_address(0x5000cca26b9602c5), phy(44) [Mon Dec 9 06:17:34 2019][ 29.708538] mpt3sas_cm0: REPORT_LUNS: handle(0x00ca), retries(0) [Mon Dec 9 06:17:34 2019][ 29.714655] mpt3sas_cm0: TEST_UNIT_READY: handle(0x00ca), lun(0) [Mon Dec 9 06:17:34 2019][ 29.726616] scsi 1:0:229:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Mon Dec 9 06:17:34 2019][ 29.736415] scsi 1:0:229:0: SSP: handle(0x00ca), sas_addr(0x5000cca26b9602c5), phy(44), device_name(0x5000cca26b9602c7) [Mon Dec 9 06:17:34 2019][ 29.747190] scsi 1:0:229:0: enclosure logical id(0x5000ccab0405db00), slot(53) [Mon Dec 9 06:17:34 2019][ 29.754497] scsi 1:0:229:0: enclosure level(0x0000), connector name( C0 ) [Mon Dec 9 06:17:34 2019][ 29.761390] scsi 1:0:229:0: serial_number( 1SJNHJ4Z) [Mon Dec 9 06:17:35 2019][ 29.766961] scsi 1:0:229:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Mon Dec 9 06:17:35 2019][ 29.810615] mpt3sas_cm0: detecting: handle(0x00cb), sas_address(0x5000cca252544a01), phy(45) [Mon Dec 9 06:17:35 2019][ 29.819053] mpt3sas_cm0: REPORT_LUNS: handle(0x00cb), retries(0) [Mon Dec 9 06:17:35 2019][ 29.825193] mpt3sas_cm0: TEST_UNIT_READY: handle(0x00cb), lun(0) [Mon Dec 9 06:17:35 2019][ 29.831829] scsi 1:0:230:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Mon Dec 9 06:17:35 2019][ 29.840220] scsi 1:0:230:0: SSP: handle(0x00cb), sas_addr(0x5000cca252544a01), phy(45), device_name(0x5000cca252544a03) [Mon Dec 9 06:17:35 2019][ 29.850994] scsi 1:0:230:0: enclosure logical id(0x5000ccab0405db00), slot(54) [Mon Dec 9 06:17:35 2019][ 29.858301] scsi 1:0:230:0: enclosure level(0x0000), connector name( C0 ) [Mon Dec 9 06:17:35 2019][ 29.865191] scsi 1:0:230:0: serial_number( 7SHHB14G) [Mon Dec 9 06:17:35 2019][ 29.870764] scsi 1:0:230:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Mon Dec 9 06:17:35 2019][ 29.893527] mpt3sas_cm0: detecting: handle(0x00cc), sas_address(0x5000cca252559f9d), phy(46) [Mon Dec 9 06:17:35 2019][ 29.901968] mpt3sas_cm0: REPORT_LUNS: handle(0x00cc), retries(0) [Mon Dec 9 06:17:35 2019][ 29.908099] mpt3sas_cm0: TEST_UNIT_READY: handle(0x00cc), lun(0) [Mon Dec 9 06:17:35 2019][ 29.980081] scsi 1:0:231:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Mon Dec 9 06:17:35 2019][ 29.988467] scsi 1:0:231:0: SSP: handle(0x00cc), sas_addr(0x5000cca252559f9d), phy(46), device_name(0x5000cca252559f9f) [Mon Dec 9 06:17:35 2019][ 29.999241] scsi 1:0:231:0: enclosure logical id(0x5000ccab0405db00), slot(55) [Mon Dec 9 06:17:35 2019][ 30.006546] scsi 
1:0:231:0: enclosure level(0x0000), connector name( C0 ) [Mon Dec 9 06:17:35 2019][ 30.013424] scsi 1:0:231:0: serial_number( 7SHJ2TDG) [Mon Dec 9 06:17:35 2019][ 30.018995] scsi 1:0:231:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Mon Dec 9 06:17:35 2019][ 30.039495] mpt3sas_cm0: detecting: handle(0x00cd), sas_address(0x5000cca25255571d), phy(47) [Mon Dec 9 06:17:35 2019][ 30.047935] mpt3sas_cm0: REPORT_LUNS: handle(0x00cd), retries(0) [Mon Dec 9 06:17:35 2019][ 30.054075] mpt3sas_cm0: TEST_UNIT_READY: handle(0x00cd), lun(0) [Mon Dec 9 06:17:35 2019][ 30.060757] scsi 1:0:232:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Mon Dec 9 06:17:35 2019][ 30.069152] scsi 1:0:232:0: SSP: handle(0x00cd), sas_addr(0x5000cca25255571d), phy(47), device_name(0x5000cca25255571f) [Mon Dec 9 06:17:35 2019][ 30.079921] scsi 1:0:232:0: enclosure logical id(0x5000ccab0405db00), slot(56) [Mon Dec 9 06:17:35 2019][ 30.087228] scsi 1:0:232:0: enclosure level(0x0000), connector name( C0 ) [Mon Dec 9 06:17:35 2019][ 30.094118] scsi 1:0:232:0: serial_number( 7SHHXYRG) [Mon Dec 9 06:17:35 2019][ 30.099694] scsi 1:0:232:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Mon Dec 9 06:17:35 2019][ 30.122499] mpt3sas_cm0: detecting: handle(0x00ce), sas_address(0x5000cca26b9bf57d), phy(48) [Mon Dec 9 06:17:35 2019][ 30.130939] mpt3sas_cm0: REPORT_LUNS: handle(0x00ce), retries(0) [Mon Dec 9 06:17:35 2019][ 30.137072] mpt3sas_cm0: TEST_UNIT_READY: handle(0x00ce), lun(0) [Mon Dec 9 06:17:35 2019][ 30.154253] scsi 1:0:233:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Mon Dec 9 06:17:35 2019][ 30.162643] scsi 1:0:233:0: SSP: handle(0x00ce), sas_addr(0x5000cca26b9bf57d), phy(48), device_name(0x5000cca26b9bf57f) [Mon Dec 9 06:17:35 2019][ 30.173419] scsi 1:0:233:0: enclosure logical id(0x5000ccab0405db00), slot(57) [Mon Dec 9 06:17:35 2019][ 30.180726] scsi 1:0:233:0: enclosure level(0x0000), connector name( C0 ) [Mon Dec 9 06:17:35 2019][ 30.187604] scsi 1:0:233:0: serial_number( 1SJSSXUZ) [Mon Dec 9 06:17:35 2019][ 30.193184] scsi 1:0:233:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Mon Dec 9 06:17:35 2019][ 30.213502] mpt3sas_cm0: detecting: handle(0x00cf), sas_address(0x5000cca252555371), phy(49) [Mon Dec 9 06:17:35 2019][ 30.221944] mpt3sas_cm0: REPORT_LUNS: handle(0x00cf), retries(0) [Mon Dec 9 06:17:35 2019][ 30.228150] mpt3sas_cm0: TEST_UNIT_READY: handle(0x00cf), lun(0) [Mon Dec 9 06:17:35 2019][ 30.235011] scsi 1:0:234:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Mon Dec 9 06:17:35 2019][ 30.243408] scsi 1:0:234:0: SSP: handle(0x00cf), sas_addr(0x5000cca252555371), phy(49), device_name(0x5000cca252555373) [Mon Dec 9 06:17:35 2019][ 30.254177] scsi 1:0:234:0: enclosure logical id(0x5000ccab0405db00), slot(58) [Mon Dec 9 06:17:35 2019][ 30.261484] scsi 1:0:234:0: enclosure level(0x0000), connector name( C0 ) [Mon Dec 9 06:17:35 2019][ 30.268376] scsi 1:0:234:0: serial_number( 7SHHXR4G) [Mon Dec 9 06:17:35 2019][ 30.273949] scsi 1:0:234:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Mon Dec 9 06:17:35 2019][ 30.296498] mpt3sas_cm0: detecting: handle(0x00d0), sas_address(0x5000cca25253eefd), phy(50) [Mon Dec 9 06:17:35 2019][ 30.304936] mpt3sas_cm0: REPORT_LUNS: handle(0x00d0), retries(0) [Mon Dec 9 06:17:35 2019][ 30.311102] mpt3sas_cm0: TEST_UNIT_READY: handle(0x00d0), lun(0) [Mon Dec 9 06:17:35 2019][ 30.317742] scsi 1:0:235:0: Direct-Access HGST HUH721008AL5200 
A38F PQ: 0 ANSI: 6 [Mon Dec 9 06:17:35 2019][ 30.326128] scsi 1:0:235:0: SSP: handle(0x00d0), sas_addr(0x5000cca25253eefd), phy(50), device_name(0x5000cca25253eeff) [Mon Dec 9 06:17:35 2019][ 30.336902] scsi 1:0:235:0: enclosure logical id(0x5000ccab0405db00), slot(59) [Mon Dec 9 06:17:35 2019][ 30.344210] scsi 1:0:235:0: enclosure level(0x0000), connector name( C0 ) [Mon Dec 9 06:17:35 2019][ 30.351101] scsi 1:0:235:0: serial_number( 7SHH4Z7G) [Mon Dec 9 06:17:35 2019][ 30.356676] scsi 1:0:235:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Mon Dec 9 06:17:35 2019][ 30.378985] mpt3sas_cm0: expander_add: handle(0x009c), parent(0x0099), sas_addr(0x5000ccab0405db7f), phys(68) [Mon Dec 9 06:17:35 2019][ 30.400920] mpt3sas_cm0: detecting: handle(0x00d1), sas_address(0x5000cca26b9cbb05), phy(42) [Mon Dec 9 06:17:35 2019][ 30.409354] mpt3sas_cm0: REPORT_LUNS: handle(0x00d1), retries(0) [Mon Dec 9 06:17:35 2019][ 30.415472] mpt3sas_cm0: TEST_UNIT_READY: handle(0x00d1), lun(0) [Mon Dec 9 06:17:35 2019][ 30.422288] scsi 1:0:236:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Mon Dec 9 06:17:35 2019][ 30.430687] scsi 1:0:236:0: SSP: handle(0x00d1), sas_addr(0x5000cca26b9cbb05), phy(42), device_name(0x5000cca26b9cbb07) [Mon Dec 9 06:17:35 2019][ 30.441460] scsi 1:0:236:0: enclosure logical id(0x5000ccab0405db00), slot(1) [Mon Dec 9 06:17:35 2019][ 30.448679] scsi 1:0:236:0: enclosure level(0x0000), connector name( C0 ) [Mon Dec 9 06:17:35 2019][ 30.455570] scsi 1:0:236:0: serial_number( 1SJT62MZ) [Mon Dec 9 06:17:35 2019][ 30.461146] scsi 1:0:236:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Mon Dec 9 06:17:35 2019][ 30.483507] mpt3sas_cm0: detecting: handle(0x00d2), sas_address(0x5000cca252544475), phy(43) [Mon Dec 9 06:17:35 2019][ 30.491940] mpt3sas_cm0: REPORT_LUNS: handle(0x00d2), retries(0) [Mon Dec 9 06:17:35 2019][ 30.498097] mpt3sas_cm0: TEST_UNIT_READY: handle(0x00d2), lun(0) [Mon Dec 9 06:17:35 2019][ 30.504891] scsi 1:0:237:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Mon Dec 9 06:17:35 2019][ 30.518314] scsi 1:0:237:0: SSP: handle(0x00d2), sas_addr(0x5000cca252544475), phy(43), device_name(0x5000cca252544477) [Mon Dec 9 06:17:35 2019][ 30.529082] scsi 1:0:237:0: enclosure logical id(0x5000ccab0405db00), slot(3) [Mon Dec 9 06:17:35 2019][ 30.536302] scsi 1:0:237:0: enclosure level(0x0000), connector name( C0 ) [Mon Dec 9 06:17:35 2019][ 30.543195] scsi 1:0:237:0: serial_number( 7SHHANPG) [Mon Dec 9 06:17:35 2019][ 30.548769] scsi 1:0:237:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Mon Dec 9 06:17:35 2019][ 30.572101] mpt3sas_cm0: detecting: handle(0x00d3), sas_address(0x5000cca26a26173d), phy(44) [Mon Dec 9 06:17:35 2019][ 30.580549] mpt3sas_cm0: REPORT_LUNS: handle(0x00d3), retries(0) [Mon Dec 9 06:17:35 2019][ 30.586692] mpt3sas_cm0: TEST_UNIT_READY: handle(0x00d3), lun(0) [Mon Dec 9 06:17:35 2019][ 30.614858] scsi 1:0:238:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Mon Dec 9 06:17:35 2019][ 30.623251] scsi 1:0:238:0: SSP: handle(0x00d3), sas_addr(0x5000cca26a26173d), phy(44), device_name(0x5000cca26a26173f) [Mon Dec 9 06:17:35 2019][ 30.634022] scsi 1:0:238:0: enclosure logical id(0x5000ccab0405db00), slot(4) [Mon Dec 9 06:17:35 2019][ 30.641240] scsi 1:0:238:0: enclosure level(0x0000), connector name( C0 ) [Mon Dec 9 06:17:35 2019][ 30.648117] scsi 1:0:238:0: serial_number( 2TGNYDLD) [Mon Dec 9 06:17:35 2019][ 30.653688] scsi 1:0:238:0: qdepth(254), 
tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Mon Dec 9 06:17:35 2019][ 30.691516] mpt3sas_cm0: detecting: handle(0x00d4), sas_address(0x5000cca252544cb5), phy(45) [Mon Dec 9 06:17:35 2019][ 30.699956] mpt3sas_cm0: REPORT_LUNS: handle(0x00d4), retries(0) [Mon Dec 9 06:17:35 2019][ 30.706109] mpt3sas_cm0: TEST_UNIT_READY: handle(0x00d4), lun(0) [Mon Dec 9 06:17:35 2019][ 30.752782] scsi 1:0:239:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Mon Dec 9 06:17:35 2019][ 30.761165] scsi 1:0:239:0: SSP: handle(0x00d4), sas_addr(0x5000cca252544cb5), phy(45), device_name(0x5000cca252544cb7) [Mon Dec 9 06:17:35 2019][ 30.771937] scsi 1:0:239:0: enclosure logical id(0x5000ccab0405db00), slot(5) [Mon Dec 9 06:17:35 2019][ 30.779158] scsi 1:0:239:0: enclosure level(0x0000), connector name( C0 ) [Mon Dec 9 06:17:35 2019][ 30.786033] scsi 1:0:239:0: serial_number( 7SHHB6RG) [Mon Dec 9 06:17:35 2019][ 30.791603] scsi 1:0:239:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Mon Dec 9 06:17:36 2019][ 30.811514] mpt3sas_cm0: detecting: handle(0x00d5), sas_address(0x5000cca26c238691), phy(46) [Mon Dec 9 06:17:36 2019][ 30.819947] mpt3sas_cm0: REPORT_LUNS: handle(0x00d5), retries(0) [Mon Dec 9 06:17:36 2019][ 30.826094] mpt3sas_cm0: TEST_UNIT_READY: handle(0x00d5), lun(0) [Mon Dec 9 06:17:36 2019][ 30.832906] scsi 1:0:240:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Mon Dec 9 06:17:36 2019][ 30.841307] scsi 1:0:240:0: SSP: handle(0x00d5), sas_addr(0x5000cca26c238691), phy(46), device_name(0x5000cca26c238693) [Mon Dec 9 06:17:36 2019][ 30.852081] scsi 1:0:240:0: enclosure logical id(0x5000ccab0405db00), slot(6) [Mon Dec 9 06:17:36 2019][ 30.859298] scsi 1:0:240:0: enclosure level(0x0000), connector name( C0 ) [Mon Dec 9 06:17:36 2019][ 30.866191] scsi 1:0:240:0: serial_number( 1DGMJNWZ) [Mon Dec 9 06:17:36 2019][ 30.871765] scsi 1:0:240:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Mon Dec 9 06:17:36 2019][ 30.894550] mpt3sas_cm0: detecting: handle(0x00d6), sas_address(0x5000cca26a2ac969), phy(47) [Mon Dec 9 06:17:36 2019][ 30.902984] mpt3sas_cm0: REPORT_LUNS: handle(0x00d6), retries(0) [Mon Dec 9 06:17:36 2019][ 30.909118] mpt3sas_cm0: TEST_UNIT_READY: handle(0x00d6), lun(0) [Mon Dec 9 06:17:36 2019][ 30.926099] scsi 1:0:241:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Mon Dec 9 06:17:36 2019][ 30.934473] scsi 1:0:241:0: SSP: handle(0x00d6), sas_addr(0x5000cca26a2ac969), phy(47), device_name(0x5000cca26a2ac96b) [Mon Dec 9 06:17:36 2019][ 30.945250] scsi 1:0:241:0: enclosure logical id(0x5000ccab0405db00), slot(7) [Mon Dec 9 06:17:36 2019][ 30.952469] scsi 1:0:241:0: enclosure level(0x0000), connector name( C0 ) [Mon Dec 9 06:17:36 2019][ 30.959344] scsi 1:0:241:0: serial_number( 2TGSJGHD) [Mon Dec 9 06:17:36 2019][ 30.964916] scsi 1:0:241:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Mon Dec 9 06:17:36 2019][ 30.985518] mpt3sas_cm0: detecting: handle(0x00d7), sas_address(0x5000cca25253e619), phy(48) [Mon Dec 9 06:17:36 2019][ 30.993952] mpt3sas_cm0: REPORT_LUNS: handle(0x00d7), retries(0) [Mon Dec 9 06:17:36 2019][ 31.000112] mpt3sas_cm0: TEST_UNIT_READY: handle(0x00d7), lun(0) [Mon Dec 9 06:17:36 2019][ 31.006862] scsi 1:0:242:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Mon Dec 9 06:17:36 2019][ 31.015252] scsi 1:0:242:0: SSP: handle(0x00d7), sas_addr(0x5000cca25253e619), phy(48), device_name(0x5000cca25253e61b) [Mon Dec 9 06:17:36 2019][ 31.026023] scsi 
1:0:242:0: enclosure logical id(0x5000ccab0405db00), slot(8) [Mon Dec 9 06:17:36 2019][ 31.033243] scsi 1:0:242:0: enclosure level(0x0000), connector name( C0 ) [Mon Dec 9 06:17:36 2019][ 31.040136] scsi 1:0:242:0: serial_number( 7SHH4BWG) [Mon Dec 9 06:17:36 2019][ 31.045709] scsi 1:0:242:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Mon Dec 9 06:17:36 2019][ 31.066518] mpt3sas_cm0: detecting: handle(0x00d8), sas_address(0x5000cca252542cfd), phy(49) [Mon Dec 9 06:17:36 2019][ 31.074953] mpt3sas_cm0: REPORT_LUNS: handle(0x00d8), retries(0) [Mon Dec 9 06:17:36 2019][ 31.081085] mpt3sas_cm0: TEST_UNIT_READY: handle(0x00d8), lun(0) [Mon Dec 9 06:17:36 2019][ 31.087744] scsi 1:0:243:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Mon Dec 9 06:17:36 2019][ 31.096134] scsi 1:0:243:0: SSP: handle(0x00d8), sas_addr(0x5000cca252542cfd), phy(49), device_name(0x5000cca252542cff) [Mon Dec 9 06:17:36 2019][ 31.106903] scsi 1:0:243:0: enclosure logical id(0x5000ccab0405db00), slot(9) [Mon Dec 9 06:17:36 2019][ 31.114123] scsi 1:0:243:0: enclosure level(0x0000), connector name( C0 ) [Mon Dec 9 06:17:36 2019][ 31.121015] scsi 1:0:243:0: serial_number( 7SHH937G) [Mon Dec 9 06:17:36 2019][ 31.126588] scsi 1:0:243:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Mon Dec 9 06:17:36 2019][ 31.146520] mpt3sas_cm0: detecting: handle(0x00d9), sas_address(0x5000cca26a3181fd), phy(50) [Mon Dec 9 06:17:36 2019][ 31.154958] mpt3sas_cm0: REPORT_LUNS: handle(0x00d9), retries(0) [Mon Dec 9 06:17:36 2019][ 31.161089] mpt3sas_cm0: TEST_UNIT_READY: handle(0x00d9), lun(0) [Mon Dec 9 06:17:36 2019][ 31.167738] scsi 1:0:244:0: Direct-Access HGST HUH721008AL5200 A38F PQ: 0 ANSI: 6 [Mon Dec 9 06:17:36 2019][ 31.176125] scsi 1:0:244:0: SSP: handle(0x00d9), sas_addr(0x5000cca26a3181fd), phy(50), device_name(0x5000cca26a3181ff) [Mon Dec 9 06:17:36 2019][ 31.186900] scsi 1:0:244:0: enclosure logical id(0x5000ccab0405db00), slot(10) [Mon Dec 9 06:17:36 2019][ 31.194204] scsi 1:0:244:0: enclosure level(0x0000), connector name( C0 ) [Mon Dec 9 06:17:36 2019][ 31.201096] scsi 1:0:244:0: serial_number( 2TGW71ND) [Mon Dec 9 06:17:36 2019][ 31.206672] scsi 1:0:244:0: qdepth(254), tagged(1), simple(0), ordered(0), scsi_level(7), cmd_que(1) [Mon Dec 9 06:17:36 2019][ 31.234333] mpt3sas_cm0: port enable: SUCCESS [Mon Dec 9 06:17:36 2019][ 31.239493] sd 1:0:2:0: [sdb] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Mon Dec 9 06:17:36 2019][ 31.247353] sd 1:0:2:0: [sdb] 4096-byte physical blocks [Mon Dec 9 06:17:36 2019][ 31.252608] sd 1:0:3:0: [sdc] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Mon Dec 9 06:17:36 2019][ 31.252628] sd 1:0:4:0: [sdd] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Mon Dec 9 06:17:36 2019][ 31.252629] sd 1:0:4:0: [sdd] 4096-byte physical blocks [Mon Dec 9 06:17:36 2019][ 31.252640] sd 1:0:5:0: [sde] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Mon Dec 9 06:17:36 2019][ 31.252642] sd 1:0:5:0: [sde] 4096-byte physical blocks [Mon Dec 9 06:17:36 2019][ 31.252793] sd 1:0:6:0: [sdf] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Mon Dec 9 06:17:36 2019][ 31.252795] sd 1:0:6:0: [sdf] 4096-byte physical blocks [Mon Dec 9 06:17:36 2019][ 31.252973] sd 1:0:8:0: [sdh] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Mon Dec 9 06:17:36 2019][ 31.252974] sd 1:0:8:0: [sdh] 4096-byte physical blocks [Mon Dec 9 06:17:36 2019][ 31.253099] sd 1:0:10:0: [sdj] 15628053168 512-byte logical blocks: (8.00 
TB/7.27 TiB) [Mon Dec 9 06:17:36 2019][ 31.253101] sd 1:0:10:0: [sdj] 4096-byte physical blocks [Mon Dec 9 06:17:36 2019][ 31.253304] sd 1:0:13:0: [sdm] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Mon Dec 9 06:17:36 2019][ 31.253311] sd 1:0:13:0: [sdm] 4096-byte physical blocks [Mon Dec 9 06:17:36 2019][ 31.253320] sd 1:0:11:0: [sdk] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Mon Dec 9 06:17:36 2019][ 31.253321] sd 1:0:11:0: [sdk] 4096-byte physical blocks [Mon Dec 9 06:17:36 2019][ 31.253557] sd 1:0:6:0: [sdf] Write Protect is off [Mon Dec 9 06:17:36 2019][ 31.253573] sd 1:0:14:0: [sdn] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Mon Dec 9 06:17:36 2019][ 31.253574] sd 1:0:14:0: [sdn] 4096-byte physical blocks [Mon Dec 9 06:17:36 2019][ 31.253654] sd 1:0:19:0: [sds] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Mon Dec 9 06:17:36 2019][ 31.253659] sd 1:0:19:0: [sds] 4096-byte physical blocks [Mon Dec 9 06:17:36 2019][ 31.253686] sd 1:0:8:0: [sdh] Write Protect is off [Mon Dec 9 06:17:36 2019][ 31.253722] sd 1:0:20:0: [sdt] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Mon Dec 9 06:17:36 2019][ 31.253723] sd 1:0:20:0: [sdt] 4096-byte physical blocks [Mon Dec 9 06:17:36 2019][ 31.253775] sd 1:0:18:0: [sdr] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Mon Dec 9 06:17:36 2019][ 31.253777] sd 1:0:18:0: [sdr] 4096-byte physical blocks [Mon Dec 9 06:17:36 2019][ 31.253793] sd 1:0:21:0: [sdu] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Mon Dec 9 06:17:36 2019][ 31.253794] sd 1:0:21:0: [sdu] 4096-byte physical blocks [Mon Dec 9 06:17:36 2019][ 31.253830] sd 1:0:10:0: [sdj] Write Protect is off [Mon Dec 9 06:17:36 2019][ 31.253851] sd 1:0:22:0: [sdv] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Mon Dec 9 06:17:36 2019][ 31.253856] sd 1:0:22:0: [sdv] 4096-byte physical blocks [Mon Dec 9 06:17:36 2019][ 31.254096] sd 1:0:13:0: [sdm] Write Protect is off [Mon Dec 9 06:17:36 2019][ 31.254136] sd 1:0:8:0: [sdh] Write cache: enabled, read cache: enabled, supports DPO and FUA [Mon Dec 9 06:17:36 2019][ 31.254142] sd 1:0:23:0: [sdw] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Mon Dec 9 06:17:36 2019][ 31.254143] sd 1:0:23:0: [sdw] 4096-byte physical blocks [Mon Dec 9 06:17:36 2019][ 31.254170] sd 1:0:6:0: [sdf] Write cache: enabled, read cache: enabled, supports DPO and FUA [Mon Dec 9 06:17:36 2019][ 31.254294] sd 1:0:10:0: [sdj] Write cache: enabled, read cache: enabled, supports DPO and FUA [Mon Dec 9 06:17:36 2019][ 31.254351] sd 1:0:26:0: [sdz] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Mon Dec 9 06:17:36 2019][ 31.254353] sd 1:0:26:0: [sdz] 4096-byte physical blocks [Mon Dec 9 06:17:36 2019][ 31.254449] sd 1:0:20:0: [sdt] Write Protect is off [Mon Dec 9 06:17:36 2019][ 31.254495] sd 1:0:18:0: [sdr] Write Protect is off [Mon Dec 9 06:17:36 2019][ 31.254504] sd 1:0:31:0: [sdae] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Mon Dec 9 06:17:36 2019][ 31.254505] sd 1:0:31:0: [sdae] 4096-byte physical blocks [Mon Dec 9 06:17:36 2019][ 31.254518] sd 1:0:21:0: [sdu] Write Protect is off [Mon Dec 9 06:17:36 2019][ 31.254563] sd 1:0:32:0: [sdaf] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Mon Dec 9 06:17:36 2019][ 31.254564] sd 1:0:32:0: [sdaf] 4096-byte physical blocks [Mon Dec 9 06:17:36 2019][ 31.254585] sd 1:0:13:0: [sdm] Write cache: enabled, read cache: enabled, supports DPO and FUA [Mon Dec 9 06:17:36 2019][ 31.254605] sd 1:0:22:0: [sdv] Write Protect is 
off [Mon Dec 9 06:17:36 2019][ 31.254632] sd 1:0:33:0: [sdag] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Mon Dec 9 06:17:36 2019][ 31.254638] sd 1:0:33:0: [sdag] 4096-byte physical blocks [Mon Dec 9 06:17:36 2019][ 31.254692] sd 1:0:19:0: [sds] Write Protect is off [Mon Dec 9 06:17:36 2019][ 31.254729] sd 1:0:34:0: [sdah] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Mon Dec 9 06:17:36 2019][ 31.254731] sd 1:0:34:0: [sdah] 4096-byte physical blocks [Mon Dec 9 06:17:36 2019][ 31.254763] sd 1:0:5:0: [sde] Write Protect is off [Mon Dec 9 06:17:36 2019][ 31.254898] sd 1:0:25:0: [sdy] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Mon Dec 9 06:17:36 2019][ 31.254900] sd 1:0:25:0: [sdy] 4096-byte physical blocks [Mon Dec 9 06:17:36 2019][ 31.254929] sd 1:0:20:0: [sdt] Write cache: enabled, read cache: enabled, supports DPO and FUA [Mon Dec 9 06:17:36 2019][ 31.254978] sd 1:0:18:0: [sdr] Write cache: enabled, read cache: enabled, supports DPO and FUA [Mon Dec 9 06:17:36 2019][ 31.254996] sd 1:0:21:0: [sdu] Write cache: enabled, read cache: enabled, supports DPO and FUA [Mon Dec 9 06:17:36 2019][ 31.255086] sd 1:0:37:0: [sdak] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Mon Dec 9 06:17:36 2019][ 31.255087] sd 1:0:37:0: [sdak] 4096-byte physical blocks [Mon Dec 9 06:17:36 2019][ 31.255096] sd 1:0:22:0: [sdv] Write cache: enabled, read cache: enabled, supports DPO and FUA [Mon Dec 9 06:17:36 2019][ 31.255105] sd 1:0:38:0: [sdal] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Mon Dec 9 06:17:36 2019][ 31.255107] sd 1:0:38:0: [sdal] 4096-byte physical blocks [Mon Dec 9 06:17:36 2019][ 31.255157] sd 1:0:19:0: [sds] Write cache: enabled, read cache: enabled, supports DPO and FUA [Mon Dec 9 06:17:36 2019][ 31.255231] sd 1:0:31:0: [sdae] Write Protect is off [Mon Dec 9 06:17:36 2019][ 31.255305] sd 1:0:32:0: [sdaf] Write Protect is off [Mon Dec 9 06:17:36 2019][ 31.255389] sd 1:0:33:0: [sdag] Write Protect is off [Mon Dec 9 06:17:36 2019][ 31.255506] sd 1:0:34:0: [sdah] Write Protect is off [Mon Dec 9 06:17:36 2019][ 31.255695] sd 1:0:31:0: [sdae] Write cache: enabled, read cache: enabled, supports DPO and FUA [Mon Dec 9 06:17:36 2019][ 31.255775] sd 1:0:32:0: [sdaf] Write cache: enabled, read cache: enabled, supports DPO and FUA [Mon Dec 9 06:17:36 2019][ 31.255806] sd 1:0:37:0: [sdak] Write Protect is off [Mon Dec 9 06:17:36 2019][ 31.255828] sd 1:0:38:0: [sdal] Write Protect is off [Mon Dec 9 06:17:36 2019][ 31.255864] sd 1:0:33:0: [sdag] Write cache: enabled, read cache: enabled, supports DPO and FUA [Mon Dec 9 06:17:36 2019][ 31.255983] sd 1:0:34:0: [sdah] Write cache: enabled, read cache: enabled, supports DPO and FUA [Mon Dec 9 06:17:36 2019][ 31.256060] sd 1:0:5:0: [sde] Write cache: enabled, read cache: enabled, supports DPO and FUA [Mon Dec 9 06:17:36 2019][ 31.256285] sd 1:0:9:0: [sdi] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Mon Dec 9 06:17:36 2019][ 31.256287] sd 1:0:9:0: [sdi] 4096-byte physical blocks [Mon Dec 9 06:17:36 2019][ 31.256288] sd 1:0:37:0: [sdak] Write cache: enabled, read cache: enabled, supports DPO and FUA [Mon Dec 9 06:17:36 2019][ 31.256309] sd 1:0:38:0: [sdal] Write cache: enabled, read cache: enabled, supports DPO and FUA [Mon Dec 9 06:17:37 2019][ 31.256727] sd 1:0:17:0: [sdq] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Mon Dec 9 06:17:37 2019][ 31.256729] sd 1:0:17:0: [sdq] 4096-byte physical blocks [Mon Dec 9 06:17:37 2019][ 31.256756] sd 1:0:42:0: [sdap] 15628053168 512-byte logical 
blocks: (8.00 TB/7.27 TiB) [Mon Dec 9 06:17:37 2019][ 31.256758] sd 1:0:42:0: [sdap] 4096-byte physical blocks [Mon Dec 9 06:17:37 2019][ 31.257860] sd 1:0:9:0: [sdi] Write Protect is off [Mon Dec 9 06:17:37 2019][ 31.257998] sd 1:0:29:0: [sdac] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Mon Dec 9 06:17:37 2019][ 31.258001] sd 1:0:29:0: [sdac] 4096-byte physical blocks [Mon Dec 9 06:17:37 2019][ 31.258316] sd 1:0:41:0: [sdao] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Mon Dec 9 06:17:37 2019][ 31.258319] sd 1:0:41:0: [sdao] 4096-byte physical blocks [Mon Dec 9 06:17:37 2019][ 31.258558] sd 1:0:42:0: [sdap] Write Protect is off [Mon Dec 9 06:17:37 2019][ 31.258664] sd 1:0:17:0: [sdq] Write Protect is off [Mon Dec 9 06:17:37 2019][ 31.258734] sd 1:0:29:0: [sdac] Write Protect is off [Mon Dec 9 06:17:37 2019][ 31.259025] sd 1:0:42:0: [sdap] Write cache: enabled, read cache: enabled, supports DPO and FUA [Mon Dec 9 06:17:37 2019][ 31.259162] sd 1:0:17:0: [sdq] Write cache: enabled, read cache: enabled, supports DPO and FUA [Mon Dec 9 06:17:37 2019][ 31.259390] sd 1:0:2:0: [sdb] Write Protect is off [Mon Dec 9 06:17:37 2019][ 31.259560] sd 1:0:29:0: [sdac] Write cache: enabled, read cache: enabled, supports DPO and FUA [Mon Dec 9 06:17:37 2019][ 31.259860] sd 1:0:2:0: [sdb] Write cache: enabled, read cache: enabled, supports DPO and FUA [Mon Dec 9 06:17:37 2019][ 31.260699] sd 1:0:47:0: [sdau] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Mon Dec 9 06:17:37 2019][ 31.260700] sd 1:0:47:0: [sdau] 4096-byte physical blocks [Mon Dec 9 06:17:37 2019][ 31.260943] sd 1:0:30:0: [sdad] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Mon Dec 9 06:17:37 2019][ 31.260945] sd 1:0:30:0: [sdad] 4096-byte physical blocks [Mon Dec 9 06:17:37 2019][ 31.261656] sd 1:0:30:0: [sdad] Write Protect is off [Mon Dec 9 06:17:37 2019][ 31.263371] sd 1:0:7:0: [sdg] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Mon Dec 9 06:17:37 2019][ 31.263373] sd 1:0:7:0: [sdg] 4096-byte physical blocks [Mon Dec 9 06:17:37 2019][ 31.263415] sd 1:0:11:0: [sdk] Write Protect is off [Mon Dec 9 06:17:37 2019][ 31.263569] sd 1:0:30:0: [sdad] Write cache: enabled, read cache: enabled, supports DPO and FUA [Mon Dec 9 06:17:37 2019][ 31.264118] sd 1:0:7:0: [sdg] Write Protect is off [Mon Dec 9 06:17:37 2019][ 31.264576] sd 1:0:7:0: [sdg] Write cache: enabled, read cache: enabled, supports DPO and FUA [Mon Dec 9 06:17:37 2019][ 31.264751] sd 1:0:11:0: [sdk] Write cache: enabled, read cache: enabled, supports DPO and FUA [Mon Dec 9 06:17:37 2019][ 31.264857] sd 1:0:26:0: [sdz] Write Protect is off [Mon Dec 9 06:17:37 2019][ 31.265450] sd 1:0:23:0: [sdw] Write Protect is off [Mon Dec 9 06:17:37 2019][ 31.265467] sd 1:0:51:0: [sday] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Mon Dec 9 06:17:37 2019][ 31.265468] sd 1:0:51:0: [sday] 4096-byte physical blocks [Mon Dec 9 06:17:37 2019][ 31.265662] sd 1:0:9:0: [sdi] Write cache: enabled, read cache: enabled, supports DPO and FUA [Mon Dec 9 06:17:37 2019][ 31.265830] sd 1:0:25:0: [sdy] Write Protect is off [Mon Dec 9 06:17:37 2019][ 31.266108] sd 1:0:14:0: [sdn] Write Protect is off [Mon Dec 9 06:17:37 2019][ 31.267582] sd 1:0:12:0: [sdl] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Mon Dec 9 06:17:37 2019][ 31.267583] sd 1:0:12:0: [sdl] 4096-byte physical blocks [Mon Dec 9 06:17:37 2019][ 31.267847] sd 1:0:53:0: [sdba] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Mon Dec 9 06:17:37 2019][ 31.267849] sd 
1:0:53:0: [sdba] 4096-byte physical blocks [Mon Dec 9 06:17:37 2019][ 31.268354] sd 1:0:54:0: [sdbb] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Mon Dec 9 06:17:37 2019][ 31.268355] sd 1:0:54:0: [sdbb] 4096-byte physical blocks [Mon Dec 9 06:17:37 2019][ 31.268859] sd 1:0:12:0: [sdl] Write Protect is off [Mon Dec 9 06:17:37 2019][ 31.268868] sd 1:0:24:0: [sdx] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Mon Dec 9 06:17:37 2019][ 31.268872] sd 1:0:24:0: [sdx] 4096-byte physical blocks [Mon Dec 9 06:17:37 2019][ 31.269299] sd 1:0:43:0: [sdaq] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Mon Dec 9 06:17:37 2019][ 31.269300] sd 1:0:43:0: [sdaq] 4096-byte physical blocks [Mon Dec 9 06:17:37 2019][ 31.270089] sd 1:0:43:0: [sdaq] Write Protect is off [Mon Dec 9 06:17:37 2019][ 31.270510] sd 1:0:44:0: [sdar] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Mon Dec 9 06:17:37 2019][ 31.270514] sd 1:0:44:0: [sdar] 4096-byte physical blocks [Mon Dec 9 06:17:37 2019][ 31.270550] sd 1:0:43:0: [sdaq] Write cache: enabled, read cache: enabled, supports DPO and FUA [Mon Dec 9 06:17:37 2019][ 31.270562] sd 1:0:53:0: [sdba] Write Protect is off [Mon Dec 9 06:17:37 2019][ 31.270919] sd 1:0:41:0: [sdao] Write Protect is off [Mon Dec 9 06:17:37 2019][ 31.271084] sd 1:0:35:0: [sdai] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Mon Dec 9 06:17:37 2019][ 31.271086] sd 1:0:35:0: [sdai] 4096-byte physical blocks [Mon Dec 9 06:17:37 2019][ 31.271181] sd 1:0:36:0: [sdaj] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Mon Dec 9 06:17:37 2019][ 31.271183] sd 1:0:36:0: [sdaj] 4096-byte physical blocks [Mon Dec 9 06:17:37 2019][ 31.271302] sd 1:0:53:0: [sdba] Write cache: enabled, read cache: enabled, supports DPO and FUA [Mon Dec 9 06:17:37 2019][ 31.271385] sd 1:0:54:0: [sdbb] Write Protect is off [Mon Dec 9 06:17:37 2019][ 31.271422] sd 1:0:41:0: [sdao] Write cache: enabled, read cache: enabled, supports DPO and FUA [Mon Dec 9 06:17:37 2019][ 31.271639] sd 1:0:44:0: [sdar] Write Protect is off [Mon Dec 9 06:17:37 2019][ 31.271669] sd 1:0:31:0: [sdae] Attached SCSI disk [Mon Dec 9 06:17:37 2019][ 31.271852] sd 1:0:54:0: [sdbb] Write cache: enabled, read cache: enabled, supports DPO and FUA [Mon Dec 9 06:17:37 2019][ 31.271968] sd 1:0:57:0: [sdbe] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Mon Dec 9 06:17:37 2019][ 31.271969] sd 1:0:57:0: [sdbe] 4096-byte physical blocks [Mon Dec 9 06:17:37 2019][ 31.272112] sd 1:0:44:0: [sdar] Write cache: enabled, read cache: enabled, supports DPO and FUA [Mon Dec 9 06:17:37 2019][ 31.272211] sd 1:0:35:0: [sdai] Write Protect is off [Mon Dec 9 06:17:37 2019][ 31.272296] sd 1:0:36:0: [sdaj] Write Protect is off [Mon Dec 9 06:17:37 2019][ 31.272602] sd 1:0:55:0: [sdbc] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Mon Dec 9 06:17:37 2019][ 31.272604] sd 1:0:55:0: [sdbc] 4096-byte physical blocks [Mon Dec 9 06:17:37 2019][ 31.272774] sd 1:0:35:0: [sdai] Write cache: enabled, read cache: enabled, supports DPO and FUA [Mon Dec 9 06:17:37 2019][ 31.272867] sd 1:0:36:0: [sdaj] Write cache: enabled, read cache: enabled, supports DPO and FUA [Mon Dec 9 06:17:37 2019][ 31.272873] sd 1:0:46:0: [sdat] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Mon Dec 9 06:17:37 2019][ 31.272874] sd 1:0:46:0: [sdat] 4096-byte physical blocks [Mon Dec 9 06:17:37 2019][ 31.273073] sd 1:0:52:0: [sdaz] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Mon Dec 9 06:17:37 2019][ 31.273074] sd 1:0:52:0: 
[sdaz] 4096-byte physical blocks [Mon Dec 9 06:17:37 2019][ 31.273145] sd 1:0:45:0: [sdas] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Mon Dec 9 06:17:37 2019][ 31.273146] sd 1:0:45:0: [sdas] 4096-byte physical blocks [Mon Dec 9 06:17:37 2019][ 31.273373] sd 1:0:6:0: [sdf] Attached SCSI disk [Mon Dec 9 06:17:37 2019][ 31.273596] sd 1:0:46:0: [sdat] Write Protect is off [Mon Dec 9 06:17:37 2019][ 31.274566] sd 1:0:46:0: [sdat] Write cache: enabled, read cache: enabled, supports DPO and FUA [Mon Dec 9 06:17:37 2019][ 31.275826] sd 1:0:49:0: [sdaw] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Mon Dec 9 06:17:37 2019][ 31.275828] sd 1:0:49:0: [sdaw] 4096-byte physical blocks [Mon Dec 9 06:17:37 2019][ 31.275829] sd 1:0:18:0: [sdr] Attached SCSI disk [Mon Dec 9 06:17:37 2019][ 31.277272] sd 1:0:52:0: [sdaz] Write Protect is off [Mon Dec 9 06:17:37 2019][ 31.277305] sd 1:0:45:0: [sdas] Write Protect is off [Mon Dec 9 06:17:37 2019][ 31.277314] sd 1:0:47:0: [sdau] Write Protect is off [Mon Dec 9 06:17:37 2019][ 31.277549] sd 1:0:59:0: [sdbg] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Mon Dec 9 06:17:37 2019][ 31.277551] sd 1:0:59:0: [sdbg] 4096-byte physical blocks [Mon Dec 9 06:17:37 2019][ 31.277846] sd 1:0:60:0: [sdbh] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Mon Dec 9 06:17:37 2019][ 31.277848] sd 1:0:60:0: [sdbh] 4096-byte physical blocks [Mon Dec 9 06:17:37 2019][ 31.277892] sd 1:0:26:0: [sdz] Write cache: enabled, read cache: enabled, supports DPO and FUA [Mon Dec 9 06:17:37 2019][ 31.278353] sd 1:0:47:0: [sdau] Write cache: enabled, read cache: enabled, supports DPO and FUA [Mon Dec 9 06:17:37 2019][ 31.281905] sd 1:0:63:0: [sdbj] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Mon Dec 9 06:17:37 2019][ 31.281906] sd 1:0:63:0: [sdbj] 4096-byte physical blocks [Mon Dec 9 06:17:37 2019][ 31.319115] sd 1:0:99:0: [sdct] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Mon Dec 9 06:17:37 2019][ 31.319117] sd 1:0:99:0: [sdct] 4096-byte physical blocks [Mon Dec 9 06:17:37 2019][ 31.319925] sd 1:0:102:0: [sdcw] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Mon Dec 9 06:17:37 2019][ 31.319927] sd 1:0:102:0: [sdcw] 4096-byte physical blocks [Mon Dec 9 06:17:37 2019][ 31.328266] sd 1:0:48:0: [sdav] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Mon Dec 9 06:17:37 2019][ 31.328268] sd 1:0:48:0: [sdav] 4096-byte physical blocks [Mon Dec 9 06:17:37 2019][ 31.328283] sd 1:0:58:0: [sdbf] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Mon Dec 9 06:17:37 2019][ 31.328285] sd 1:0:58:0: [sdbf] 4096-byte physical blocks [Mon Dec 9 06:17:37 2019][ 31.328434] sd 1:0:59:0: [sdbg] Write Protect is off [Mon Dec 9 06:17:37 2019][ 31.328456] sd 1:0:104:0: [sdcy] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Mon Dec 9 06:17:37 2019][ 31.328458] sd 1:0:104:0: [sdcy] 4096-byte physical blocks [Mon Dec 9 06:17:37 2019][ 31.328537] sd 1:0:90:0: [sdck] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Mon Dec 9 06:17:37 2019][ 31.328539] sd 1:0:90:0: [sdck] 4096-byte physical blocks [Mon Dec 9 06:17:37 2019][ 31.328578] sd 1:0:103:0: [sdcx] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Mon Dec 9 06:17:37 2019][ 31.328581] sd 1:0:103:0: [sdcx] 4096-byte physical blocks [Mon Dec 9 06:17:37 2019][ 31.328650] sd 1:0:56:0: [sdbd] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Mon Dec 9 06:17:37 2019][ 31.328653] sd 1:0:56:0: [sdbd] 4096-byte physical blocks [Mon Dec 9 06:17:37 2019][ 
31.328709] sd 1:0:27:0: [sdaa] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Mon Dec 9 06:17:37 2019][ 31.328711] sd 1:0:27:0: [sdaa] 4096-byte physical blocks [Mon Dec 9 06:17:37 2019][ 31.328716] sd 1:0:51:0: [sday] Write Protect is off [Mon Dec 9 06:17:37 2019][ 31.328755] sd 1:0:24:0: [sdx] Write Protect is off [Mon Dec 9 06:17:37 2019][ 31.328769] sd 1:0:25:0: [sdy] Write cache: enabled, read cache: enabled, supports DPO and FUA [Mon Dec 9 06:17:37 2019][ 31.328803] sd 1:0:28:0: [sdab] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Mon Dec 9 06:17:37 2019][ 31.328804] sd 1:0:28:0: [sdab] 4096-byte physical blocks [Mon Dec 9 06:17:37 2019][ 31.328819] sd 1:0:4:0: [sdd] Write Protect is off [Mon Dec 9 06:17:37 2019][ 31.328820] sd 1:0:23:0: [sdw] Write cache: enabled, read cache: enabled, supports DPO and FUA [Mon Dec 9 06:17:37 2019][ 31.328830] sd 1:0:55:0: [sdbc] Write Protect is off [Mon Dec 9 06:17:37 2019][ 31.328832] sd 1:0:57:0: [sdbe] Write Protect is off [Mon Dec 9 06:17:37 2019][ 31.328850] sd 1:0:49:0: [sdaw] Write Protect is off [Mon Dec 9 06:17:37 2019][ 31.328915] sd 1:0:94:0: [sdco] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Mon Dec 9 06:17:37 2019][ 31.328917] sd 1:0:94:0: [sdco] 4096-byte physical blocks [Mon Dec 9 06:17:37 2019][ 31.328936] sd 1:0:70:0: [sdbq] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Mon Dec 9 06:17:37 2019][ 31.328938] sd 1:0:66:0: [sdbm] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Mon Dec 9 06:17:37 2019][ 31.328940] sd 1:0:70:0: [sdbq] 4096-byte physical blocks [Mon Dec 9 06:17:37 2019][ 31.328942] sd 1:0:66:0: [sdbm] 4096-byte physical blocks [Mon Dec 9 06:17:37 2019][ 31.328951] sd 1:0:111:0: [sddf] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Mon Dec 9 06:17:37 2019][ 31.328953] sd 1:0:111:0: [sddf] 4096-byte physical blocks [Mon Dec 9 06:17:37 2019][ 31.328954] sd 1:0:95:0: [sdcp] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Mon Dec 9 06:17:37 2019][ 31.328956] sd 1:0:95:0: [sdcp] 4096-byte physical blocks [Mon Dec 9 06:17:37 2019][ 31.328963] sd 1:0:114:0: [sddi] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Mon Dec 9 06:17:37 2019][ 31.328966] sd 1:0:114:0: [sddi] 4096-byte physical blocks [Mon Dec 9 06:17:37 2019][ 31.328976] sd 1:0:98:0: [sdcs] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Mon Dec 9 06:17:37 2019][ 31.328977] sd 1:0:98:0: [sdcs] 4096-byte physical blocks [Mon Dec 9 06:17:37 2019][ 31.328987] sd 1:0:71:0: [sdbr] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Mon Dec 9 06:17:37 2019][ 31.328989] sd 1:0:71:0: [sdbr] 4096-byte physical blocks [Mon Dec 9 06:17:37 2019][ 31.329005] sd 1:0:50:0: [sdax] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Mon Dec 9 06:17:37 2019][ 31.329008] sd 1:0:117:0: [sddl] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Mon Dec 9 06:17:37 2019][ 31.329011] sd 1:0:50:0: [sdax] 4096-byte physical blocks [Mon Dec 9 06:17:37 2019][ 31.329013] sd 1:0:117:0: [sddl] 4096-byte physical blocks [Mon Dec 9 06:17:37 2019][ 31.329024] sd 1:0:82:0: [sdcc] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Mon Dec 9 06:17:37 2019][ 31.329025] sd 1:0:82:0: [sdcc] 4096-byte physical blocks [Mon Dec 9 06:17:37 2019][ 31.329047] sd 1:0:107:0: [sddb] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Mon Dec 9 06:17:37 2019][ 31.329049] sd 1:0:107:0: [sddb] 4096-byte physical blocks [Mon Dec 9 06:17:37 2019][ 31.329372] sd 1:0:91:0: [sdcl] 15628053168 512-byte logical 
blocks: (8.00 TB/7.27 TiB) [Mon Dec 9 06:17:37 2019][ 31.329374] sd 1:0:91:0: [sdcl] 4096-byte physical blocks [Mon Dec 9 06:17:37 2019][ 31.329450] sd 1:0:109:0: [sddd] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Mon Dec 9 06:17:37 2019][ 31.329453] sd 1:0:109:0: [sddd] 4096-byte physical blocks [Mon Dec 9 06:17:38 2019][ 31.329460] sd 1:0:60:0: [sdbh] Write Protect is off [Mon Dec 9 06:17:38 2019][ 31.329467] sd 1:0:78:0: [sdby] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Mon Dec 9 06:17:38 2019][ 31.329468] sd 1:0:78:0: [sdby] 4096-byte physical blocks [Mon Dec 9 06:17:38 2019][ 31.329578] sd 1:0:29:0: [sdac] Attached SCSI disk [Mon Dec 9 06:17:38 2019][ 31.329812] sd 1:0:102:0: [sdcw] Write Protect is off [Mon Dec 9 06:17:38 2019][ 31.329919] sd 1:0:63:0: [sdbj] Write Protect is off [Mon Dec 9 06:17:38 2019][ 31.329964] sd 1:0:60:0: [sdbh] Write cache: enabled, read cache: enabled, supports DPO and FUA [Mon Dec 9 06:17:38 2019][ 31.329991] sd 1:0:56:0: [sdbd] Write Protect is off [Mon Dec 9 06:17:38 2019][ 31.330112] sd 1:0:48:0: [sdav] Write Protect is off [Mon Dec 9 06:17:38 2019][ 31.330184] sd 1:0:103:0: [sdcx] Write Protect is off [Mon Dec 9 06:17:38 2019][ 31.330217] sd 1:0:111:0: [sddf] Write Protect is off [Mon Dec 9 06:17:38 2019][ 31.330242] sd 1:0:58:0: [sdbf] Write Protect is off [Mon Dec 9 06:17:38 2019][ 31.330324] sd 1:0:51:0: [sday] Write cache: enabled, read cache: enabled, supports DPO and FUA [Mon Dec 9 06:17:38 2019][ 31.330354] sd 1:0:82:0: [sdcc] Write Protect is off [Mon Dec 9 06:17:38 2019][ 31.330379] sd 1:0:94:0: [sdco] Write Protect is off [Mon Dec 9 06:17:38 2019][ 31.330390] sd 1:0:107:0: [sddb] Write Protect is off [Mon Dec 9 06:17:38 2019][ 31.330491] sd 1:0:117:0: [sddl] Write Protect is off [Mon Dec 9 06:17:38 2019][ 31.330492] sd 1:0:50:0: [sdax] Write Protect is off [Mon Dec 9 06:17:38 2019][ 31.330504] sd 1:0:70:0: [sdbq] Write Protect is off [Mon Dec 9 06:17:38 2019][ 31.330530] sd 1:0:90:0: [sdck] Write Protect is off [Mon Dec 9 06:17:38 2019][ 31.330531] sd 1:0:112:0: [sddg] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Mon Dec 9 06:17:38 2019][ 31.330533] sd 1:0:112:0: [sddg] 4096-byte physical blocks [Mon Dec 9 06:17:38 2019][ 31.330578] sd 1:0:119:0: [sddn] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Mon Dec 9 06:17:38 2019][ 31.330579] sd 1:0:119:0: [sddn] 4096-byte physical blocks [Mon Dec 9 06:17:38 2019][ 31.330603] sd 1:0:91:0: [sdcl] Write Protect is off [Mon Dec 9 06:17:38 2019][ 31.330686] sd 1:0:69:0: [sdbp] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Mon Dec 9 06:17:38 2019][ 31.330689] sd 1:0:69:0: [sdbp] 4096-byte physical blocks [Mon Dec 9 06:17:38 2019][ 31.330837] sd 1:0:102:0: [sdcw] Write cache: enabled, read cache: enabled, supports DPO and FUA [Mon Dec 9 06:17:38 2019][ 31.330894] sd 1:0:79:0: [sdbz] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Mon Dec 9 06:17:38 2019][ 31.330895] sd 1:0:79:0: [sdbz] 4096-byte physical blocks [Mon Dec 9 06:17:38 2019][ 31.330920] sd 1:0:71:0: [sdbr] Write Protect is off [Mon Dec 9 06:17:38 2019][ 31.331064] sd 1:0:63:0: [sdbj] Write cache: enabled, read cache: enabled, supports DPO and FUA [Mon Dec 9 06:17:38 2019][ 31.331090] sd 1:0:109:0: [sddd] Write Protect is off [Mon Dec 9 06:17:38 2019][ 31.331167] sd 1:0:56:0: [sdbd] Write cache: enabled, read cache: enabled, supports DPO and FUA [Mon Dec 9 06:17:38 2019][ 31.331174] sd 1:0:111:0: [sddf] Write cache: enabled, read cache: enabled, supports DPO and FUA [Mon Dec 9 
06:17:38 2019][ 31.331181] sd 1:0:66:0: [sdbm] Write Protect is off [Mon Dec 9 06:17:38 2019][ 31.331196] sd 1:0:90:0: [sdck] Write cache: enabled, read cache: enabled, supports DPO and FUA [Mon Dec 9 06:17:38 2019][ 31.331217] sd 1:0:94:0: [sdco] Write cache: enabled, read cache: enabled, supports DPO and FUA [Mon Dec 9 06:17:38 2019][ 31.331281] sd 1:0:103:0: [sdcx] Write cache: enabled, read cache: enabled, supports DPO and FUA [Mon Dec 9 06:17:38 2019][ 31.331346] sd 1:0:70:0: [sdbq] Write cache: enabled, read cache: enabled, supports DPO and FUA [Mon Dec 9 06:17:38 2019][ 31.331355] sd 1:0:82:0: [sdcc] Write cache: enabled, read cache: enabled, supports DPO and FUA [Mon Dec 9 06:17:38 2019][ 31.331401] sd 1:0:117:0: [sddl] Write cache: enabled, read cache: enabled, supports DPO and FUA [Mon Dec 9 06:17:38 2019][ 31.331401] sd 1:0:50:0: [sdax] Write cache: enabled, read cache: enabled, supports DPO and FUA [Mon Dec 9 06:17:38 2019][ 31.331452] sd 1:0:48:0: [sdav] Write cache: enabled, read cache: enabled, supports DPO and FUA [Mon Dec 9 06:17:38 2019][ 31.331489] sd 1:0:119:0: [sddn] Write Protect is off [Mon Dec 9 06:17:38 2019][ 31.331505] sd 1:0:91:0: [sdcl] Write cache: enabled, read cache: enabled, supports DPO and FUA [Mon Dec 9 06:17:38 2019][ 31.331576] sd 1:0:107:0: [sddb] Write cache: enabled, read cache: enabled, supports DPO and FUA [Mon Dec 9 06:17:38 2019][ 31.331717] sd 1:0:58:0: [sdbf] Write cache: enabled, read cache: enabled, supports DPO and FUA [Mon Dec 9 06:17:38 2019][ 31.331744] sd 1:0:69:0: [sdbp] Write Protect is off [Mon Dec 9 06:17:38 2019][ 31.331760] sd 1:0:95:0: [sdcp] Write Protect is off [Mon Dec 9 06:17:38 2019][ 31.332100] sd 1:0:79:0: [sdbz] Write Protect is off [Mon Dec 9 06:17:38 2019][ 31.332264] sd 1:0:78:0: [sdby] Write Protect is off [Mon Dec 9 06:17:38 2019][ 31.332573] sd 1:0:79:0: [sdbz] Write cache: enabled, read cache: enabled, supports DPO and FUA [Mon Dec 9 06:17:38 2019][ 31.332664] sd 1:0:34:0: [sdah] Attached SCSI disk [Mon Dec 9 06:17:38 2019][ 31.332817] sd 1:0:109:0: [sddd] Write cache: enabled, read cache: enabled, supports DPO and FUA [Mon Dec 9 06:17:38 2019][ 31.332827] sd 1:0:118:0: [sddm] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Mon Dec 9 06:17:38 2019][ 31.332828] sd 1:0:118:0: [sddm] 4096-byte physical blocks [Mon Dec 9 06:17:38 2019][ 31.332988] sd 1:0:121:0: [sddp] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Mon Dec 9 06:17:38 2019][ 31.332990] sd 1:0:121:0: [sddp] 4096-byte physical blocks [Mon Dec 9 06:17:38 2019][ 31.333073] sd 1:0:119:0: [sddn] Write cache: enabled, read cache: enabled, supports DPO and FUA [Mon Dec 9 06:17:38 2019][ 31.333113] sd 1:0:86:0: [sdcg] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Mon Dec 9 06:17:38 2019][ 31.333115] sd 1:0:86:0: [sdcg] 4096-byte physical blocks [Mon Dec 9 06:17:38 2019][ 31.333157] sd 1:0:71:0: [sdbr] Write cache: enabled, read cache: enabled, supports DPO and FUA [Mon Dec 9 06:17:38 2019][ 31.333165] sd 1:0:78:0: [sdby] Write cache: enabled, read cache: enabled, supports DPO and FUA [Mon Dec 9 06:17:38 2019][ 31.333231] sd 1:0:5:0: [sde] Attached SCSI disk [Mon Dec 9 06:17:38 2019][ 31.333386] sd 1:0:66:0: [sdbm] Write cache: enabled, read cache: enabled, supports DPO and FUA [Mon Dec 9 06:17:38 2019][ 31.333707] sd 1:0:121:0: [sddp] Write Protect is off [Mon Dec 9 06:17:38 2019][ 31.333869] sd 1:0:57:0: [sdbe] Write cache: enabled, read cache: enabled, supports DPO and FUA [Mon Dec 9 06:17:38 2019][ 31.334008] sd 1:0:84:0: [sdce] 
15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Mon Dec 9 06:17:38 2019][ 31.334009] sd 1:0:84:0: [sdce] 4096-byte physical blocks [Mon Dec 9 06:17:38 2019][ 31.334041] sd 1:0:27:0: [sdaa] Write Protect is off [Mon Dec 9 06:17:38 2019][ 31.334094] sd 1:0:110:0: [sdde] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Mon Dec 9 06:17:38 2019][ 31.334096] sd 1:0:110:0: [sdde] 4096-byte physical blocks [Mon Dec 9 06:17:38 2019][ 31.334169] sd 1:0:118:0: [sddm] Write Protect is off [Mon Dec 9 06:17:38 2019][ 31.334210] sd 1:0:88:0: [sdci] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Mon Dec 9 06:17:38 2019][ 31.334211] sd 1:0:88:0: [sdci] 4096-byte physical blocks [Mon Dec 9 06:17:38 2019][ 31.334436] sd 1:0:73:0: [sdbt] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Mon Dec 9 06:17:38 2019][ 31.334437] sd 1:0:73:0: [sdbt] 4096-byte physical blocks [Mon Dec 9 06:17:38 2019][ 31.334528] sd 1:0:106:0: [sdda] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Mon Dec 9 06:17:38 2019][ 31.334535] sd 1:0:106:0: [sdda] 4096-byte physical blocks [Mon Dec 9 06:17:38 2019][ 31.334582] sd 1:0:49:0: [sdaw] Write cache: enabled, read cache: enabled, supports DPO and FUA [Mon Dec 9 06:17:38 2019][ 31.334677] sd 1:0:86:0: [sdcg] Write Protect is off [Mon Dec 9 06:17:38 2019][ 31.334825] sd 1:0:121:0: [sddp] Write cache: enabled, read cache: enabled, supports DPO and FUA [Mon Dec 9 06:17:38 2019][ 31.335113] sd 1:0:88:0: [sdci] Write Protect is off [Mon Dec 9 06:17:38 2019][ 31.335123] sd 1:0:12:0: [sdl] Write cache: enabled, read cache: enabled, supports DPO and FUA [Mon Dec 9 06:17:38 2019][ 31.335126] sd 1:0:45:0: [sdas] Write cache: enabled, read cache: enabled, supports DPO and FUA [Mon Dec 9 06:17:38 2019][ 31.335148] sd 1:0:72:0: [sdbs] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Mon Dec 9 06:17:38 2019][ 31.335149] sd 1:0:72:0: [sdbs] 4096-byte physical blocks [Mon Dec 9 06:17:38 2019][ 31.335348] sd 1:0:27:0: [sdaa] Write cache: enabled, read cache: enabled, supports DPO and FUA [Mon Dec 9 06:17:38 2019][ 31.335426] sd 1:0:59:0: [sdbg] Write cache: enabled, read cache: enabled, supports DPO and FUA [Mon Dec 9 06:17:38 2019][ 31.335506] sd 1:0:108:0: [sddc] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Mon Dec 9 06:17:38 2019][ 31.335507] sd 1:0:108:0: [sddc] 4096-byte physical blocks [Mon Dec 9 06:17:38 2019][ 31.335730] sd 1:0:116:0: [sddk] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Mon Dec 9 06:17:38 2019][ 31.335732] sd 1:0:116:0: [sddk] 4096-byte physical blocks [Mon Dec 9 06:17:38 2019][ 31.335771] sd 1:0:42:0: [sdap] Attached SCSI disk [Mon Dec 9 06:17:38 2019][ 31.335927] sd 1:0:67:0: [sdbn] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Mon Dec 9 06:17:38 2019][ 31.335929] sd 1:0:67:0: [sdbn] 4096-byte physical blocks [Mon Dec 9 06:17:38 2019][ 31.335984] sd 1:0:30:0: [sdad] Attached SCSI disk [Mon Dec 9 06:17:38 2019][ 31.335995] sd 1:0:84:0: [sdce] Write Protect is off [Mon Dec 9 06:17:38 2019][ 31.335998] sd 1:0:106:0: [sdda] Write Protect is off [Mon Dec 9 06:17:38 2019][ 31.336012] sd 1:0:110:0: [sdde] Write Protect is off [Mon Dec 9 06:17:38 2019][ 31.336039] sd 1:0:120:0: [sddo] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Mon Dec 9 06:17:38 2019][ 31.336041] sd 1:0:120:0: [sddo] 4096-byte physical blocks [Mon Dec 9 06:17:38 2019][ 31.336053] sd 1:0:114:0: [sddi] Write Protect is off [Mon Dec 9 06:17:38 2019][ 31.336280] sd 1:0:83:0: [sdcd] 15628053168 512-byte logical blocks: (8.00 
TB/7.27 TiB) [Mon Dec 9 06:17:38 2019][ 31.336282] sd 1:0:83:0: [sdcd] 4096-byte physical blocks [Mon Dec 9 06:17:38 2019][ 31.336562] sd 1:0:55:0: [sdbc] Write cache: enabled, read cache: enabled, supports DPO and FUA [Mon Dec 9 06:17:38 2019][ 31.336664] sd 1:0:88:0: [sdci] Write cache: enabled, read cache: enabled, supports DPO and FUA [Mon Dec 9 06:17:38 2019][ 31.336742] sd 1:0:67:0: [sdbn] Write Protect is off [Mon Dec 9 06:17:38 2019][ 31.336793] sd 1:0:96:0: [sdcq] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Mon Dec 9 06:17:38 2019][ 31.336795] sd 1:0:96:0: [sdcq] 4096-byte physical blocks [Mon Dec 9 06:17:38 2019][ 31.336870] sd 1:0:116:0: [sddk] Write Protect is off [Mon Dec 9 06:17:38 2019][ 31.336899] sd 1:0:86:0: [sdcg] Write cache: enabled, read cache: enabled, supports DPO and FUA [Mon Dec 9 06:17:38 2019][ 31.337049] sd 1:0:85:0: [sdcf] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Mon Dec 9 06:17:38 2019][ 31.337050] sd 1:0:85:0: [sdcf] 4096-byte physical blocks [Mon Dec 9 06:17:38 2019][ 31.337204] sd 1:0:67:0: [sdbn] Write cache: enabled, read cache: enabled, supports DPO and FUA [Mon Dec 9 06:17:38 2019][ 31.337235] sd 1:0:21:0: [sdu] Attached SCSI disk [Mon Dec 9 06:17:38 2019][ 31.337283] sd 1:0:108:0: [sddc] Write Protect is off [Mon Dec 9 06:17:38 2019][ 31.337587] sd 1:0:114:0: [sddi] Write cache: enabled, read cache: enabled, supports DPO and FUA [Mon Dec 9 06:17:38 2019][ 31.337641] sd 1:0:106:0: [sdda] Write cache: enabled, read cache: enabled, supports DPO and FUA [Mon Dec 9 06:17:38 2019][ 31.338015] sd 1:0:2:0: [sdb] Attached SCSI disk [Mon Dec 9 06:17:38 2019][ 31.338094] sd 1:0:84:0: [sdce] Write cache: enabled, read cache: enabled, supports DPO and FUA [Mon Dec 9 06:17:38 2019][ 31.338147] sd 1:0:83:0: [sdcd] Write Protect is off [Mon Dec 9 06:17:38 2019][ 31.338262] sd 1:0:85:0: [sdcf] Write Protect is off [Mon Dec 9 06:17:38 2019][ 31.338380] sd 1:0:80:0: [sdca] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Mon Dec 9 06:17:38 2019][ 31.338382] sd 1:0:80:0: [sdca] 4096-byte physical blocks [Mon Dec 9 06:17:38 2019][ 31.338394] sd 1:0:24:0: [sdx] Write cache: enabled, read cache: enabled, supports DPO and FUA [Mon Dec 9 06:17:38 2019][ 31.339023] sd 1:0:120:0: [sddo] Write Protect is off [Mon Dec 9 06:17:38 2019][ 31.339062] sd 1:0:116:0: [sddk] Write cache: enabled, read cache: enabled, supports DPO and FUA [Mon Dec 9 06:17:38 2019][ 31.339316] sd 1:0:110:0: [sdde] Write cache: enabled, read cache: enabled, supports DPO and FUA [Mon Dec 9 06:17:38 2019][ 31.339481] sd 1:0:83:0: [sdcd] Write cache: enabled, read cache: enabled, supports DPO and FUA [Mon Dec 9 06:17:38 2019][ 31.339694] sd 1:0:104:0: [sdcy] Write Protect is off [Mon Dec 9 06:17:38 2019][ 31.339727] sd 1:0:85:0: [sdcf] Write cache: enabled, read cache: enabled, supports DPO and FUA [Mon Dec 9 06:17:38 2019][ 31.339813] sd 1:0:73:0: [sdbt] Write Protect is off [Mon Dec 9 06:17:38 2019][ 31.339963] sd 1:0:115:0: [sddj] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Mon Dec 9 06:17:38 2019][ 31.339964] sd 1:0:115:0: [sddj] 4096-byte physical blocks [Mon Dec 9 06:17:38 2019][ 31.340086] sd 1:0:95:0: [sdcp] Write cache: enabled, read cache: enabled, supports DPO and FUA [Mon Dec 9 06:17:38 2019][ 31.340157] sd 1:0:97:0: [sdcr] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Mon Dec 9 06:17:38 2019][ 31.340159] sd 1:0:97:0: [sdcr] 4096-byte physical blocks [Mon Dec 9 06:17:38 2019][ 31.340548] sd 1:0:10:0: [sdj] Attached SCSI disk [Mon Dec 9 
06:17:38 2019][ 31.340678] sd 1:0:112:0: [sddg] Write Protect is off [Mon Dec 9 06:17:38 2019][ 31.340716] sd 1:0:96:0: [sdcq] Write Protect is off [Mon Dec 9 06:17:38 2019][ 31.340977] sd 1:0:98:0: [sdcs] Write Protect is off [Mon Dec 9 06:17:38 2019][ 31.341308] sd 1:0:22:0: [sdv] Attached SCSI disk [Mon Dec 9 06:17:38 2019][ 31.341475] sd 1:0:98:0: [sdcs] Write cache: enabled, read cache: enabled, supports DPO and FUA [Mon Dec 9 06:17:38 2019][ 31.342032] sd 1:0:74:0: [sdbu] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Mon Dec 9 06:17:38 2019][ 31.342034] sd 1:0:74:0: [sdbu] 4096-byte physical blocks [Mon Dec 9 06:17:38 2019][ 31.342111] sd 1:0:73:0: [sdbt] Write cache: enabled, read cache: enabled, supports DPO and FUA [Mon Dec 9 06:17:38 2019][ 31.342541] sd 1:0:112:0: [sddg] Write cache: enabled, read cache: enabled, supports DPO and FUA [Mon Dec 9 06:17:38 2019][ 31.342556] sd 1:0:99:0: [sdct] Write Protect is off [Mon Dec 9 06:17:38 2019][ 31.342592] sd 1:0:118:0: [sddm] Write cache: enabled, read cache: enabled, supports DPO and FUA [Mon Dec 9 06:17:38 2019][ 31.343012] sd 1:0:124:0: [sddr] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Mon Dec 9 06:17:39 2019][ 31.343014] sd 1:0:124:0: [sddr] 4096-byte physical blocks [Mon Dec 9 06:17:39 2019][ 31.343673] sd 1:0:120:0: [sddo] Write cache: enabled, read cache: enabled, supports DPO and FUA [Mon Dec 9 06:17:39 2019][ 31.344546] sd 1:0:74:0: [sdbu] Write Protect is off [Mon Dec 9 06:17:39 2019][ 31.344810] sd 1:0:53:0: [sdba] Attached SCSI disk [Mon Dec 9 06:17:39 2019][ 31.345294] sd 1:0:105:0: [sdcz] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Mon Dec 9 06:17:39 2019][ 31.345296] sd 1:0:105:0: [sdcz] 4096-byte physical blocks [Mon Dec 9 06:17:39 2019][ 31.345466] sd 1:0:87:0: [sdch] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Mon Dec 9 06:17:39 2019][ 31.345467] sd 1:0:87:0: [sdch] 4096-byte physical blocks [Mon Dec 9 06:17:39 2019][ 31.345694] sd 1:0:80:0: [sdca] Write Protect is off [Mon Dec 9 06:17:39 2019][ 31.345706] sd 1:0:69:0: [sdbp] Write cache: enabled, read cache: enabled, supports DPO and FUA [Mon Dec 9 06:17:39 2019][ 31.345731] sd 1:0:108:0: [sddc] Write cache: enabled, read cache: enabled, supports DPO and FUA [Mon Dec 9 06:17:39 2019][ 31.345821] sd 1:0:9:0: [sdi] Attached SCSI disk [Mon Dec 9 06:17:39 2019][ 31.345855] sd 1:0:17:0: [sdq] Attached SCSI disk [Mon Dec 9 06:17:39 2019][ 31.346350] sd 1:0:64:0: [sdbk] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Mon Dec 9 06:17:39 2019][ 31.346352] sd 1:0:64:0: [sdbk] 4096-byte physical blocks [Mon Dec 9 06:17:39 2019][ 31.346360] sd 1:0:97:0: [sdcr] Write Protect is off [Mon Dec 9 06:17:39 2019][ 31.346751] sd 1:0:80:0: [sdca] Write cache: enabled, read cache: enabled, supports DPO and FUA [Mon Dec 9 06:17:39 2019][ 31.347549] sd 1:0:96:0: [sdcq] Write cache: enabled, read cache: enabled, supports DPO and FUA [Mon Dec 9 06:17:39 2019][ 31.347697] sd 1:0:115:0: [sddj] Write Protect is off [Mon Dec 9 06:17:39 2019][ 31.347959] sd 1:0:105:0: [sdcz] Write Protect is off [Mon Dec 9 06:17:39 2019][ 31.348027] sd 1:0:99:0: [sdct] Write cache: enabled, read cache: enabled, supports DPO and FUA [Mon Dec 9 06:17:39 2019][ 31.348617] sd 1:0:75:0: [sdbv] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Mon Dec 9 06:17:39 2019][ 31.348619] sd 1:0:75:0: [sdbv] 4096-byte physical blocks [Mon Dec 9 06:17:39 2019][ 31.349075] sd 1:0:37:0: [sdak] Attached SCSI disk [Mon Dec 9 06:17:39 2019][ 31.349144] sd 1:0:14:0: 
[sdn] Write cache: enabled, read cache: enabled, supports DPO and FUA [Mon Dec 9 06:17:39 2019][ 31.349806] sd 1:0:46:0: [sdat] Attached SCSI disk [Mon Dec 9 06:17:39 2019][ 31.350115] sd 1:0:126:0: [sddt] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Mon Dec 9 06:17:39 2019][ 31.350117] sd 1:0:126:0: [sddt] 4096-byte physical blocks [Mon Dec 9 06:17:39 2019][ 31.350432] sd 1:0:75:0: [sdbv] Write Protect is off [Mon Dec 9 06:17:39 2019][ 31.350503] sd 1:0:72:0: [sdbs] Write Protect is off [Mon Dec 9 06:17:39 2019][ 31.351771] sd 1:0:72:0: [sdbs] Write cache: enabled, read cache: enabled, supports DPO and FUA [Mon Dec 9 06:17:39 2019][ 31.351909] sd 1:0:127:0: [sddu] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Mon Dec 9 06:17:39 2019][ 31.351911] sd 1:0:127:0: [sddu] 4096-byte physical blocks [Mon Dec 9 06:17:39 2019][ 31.352074] sd 1:0:124:0: [sddr] Write Protect is off [Mon Dec 9 06:17:39 2019][ 31.352639] sd 1:0:75:0: [sdbv] Write cache: enabled, read cache: enabled, supports DPO and FUA [Mon Dec 9 06:17:39 2019][ 31.352647] sd 1:0:71:0: [sdbr] Attached SCSI disk [Mon Dec 9 06:17:39 2019][ 31.353090] sd 1:0:104:0: [sdcy] Write cache: enabled, read cache: enabled, supports DPO and FUA [Mon Dec 9 06:17:39 2019][ 31.353451] sd 1:0:128:0: [sddv] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Mon Dec 9 06:17:39 2019][ 31.353453] sd 1:0:128:0: [sddv] 4096-byte physical blocks [Mon Dec 9 06:17:39 2019][ 31.353505] sd 1:0:82:0: [sdcc] Attached SCSI disk [Mon Dec 9 06:17:39 2019][ 31.353969] sd 1:0:79:0: [sdbz] Attached SCSI disk [Mon Dec 9 06:17:39 2019][ 31.354962] sd 1:0:16:0: [sdp] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Mon Dec 9 06:17:39 2019][ 31.354964] sd 1:0:16:0: [sdp] 4096-byte physical blocks [Mon Dec 9 06:17:39 2019][ 31.355037] sd 1:0:64:0: [sdbk] Write Protect is off [Mon Dec 9 06:17:39 2019][ 31.355175] sd 1:0:38:0: [sdal] Attached SCSI disk [Mon Dec 9 06:17:39 2019][ 31.355278] sd 1:0:125:0: [sdds] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Mon Dec 9 06:17:39 2019][ 31.355280] sd 1:0:125:0: [sdds] 4096-byte physical blocks [Mon Dec 9 06:17:39 2019][ 31.355427] sd 1:0:77:0: [sdbx] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Mon Dec 9 06:17:39 2019][ 31.355435] sd 1:0:77:0: [sdbx] 4096-byte physical blocks [Mon Dec 9 06:17:39 2019][ 31.355679] sd 1:0:97:0: [sdcr] Write cache: enabled, read cache: enabled, supports DPO and FUA [Mon Dec 9 06:17:39 2019][ 31.355827] sd 1:0:126:0: [sddt] Write Protect is off [Mon Dec 9 06:17:39 2019][ 31.356458] sd 1:0:130:0: [sddx] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Mon Dec 9 06:17:39 2019][ 31.356461] sd 1:0:130:0: [sddx] 4096-byte physical blocks [Mon Dec 9 06:17:39 2019][ 31.356710] sd 1:0:40:0: [sdan] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Mon Dec 9 06:17:39 2019][ 31.356712] sd 1:0:40:0: [sdan] 4096-byte physical blocks [Mon Dec 9 06:17:39 2019][ 31.357847] sd 1:0:65:0: [sdbl] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Mon Dec 9 06:17:39 2019][ 31.357852] sd 1:0:65:0: [sdbl] 4096-byte physical blocks [Mon Dec 9 06:17:39 2019][ 31.357979] sd 1:0:101:0: [sdcv] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Mon Dec 9 06:17:39 2019][ 31.357989] sd 1:0:101:0: [sdcv] 4096-byte physical blocks [Mon Dec 9 06:17:39 2019][ 31.358131] sd 1:0:74:0: [sdbu] Write cache: enabled, read cache: enabled, supports DPO and FUA [Mon Dec 9 06:17:39 2019][ 31.358536] sd 1:0:64:0: [sdbk] Write cache: enabled, read cache: enabled, 
supports DPO and FUA [Mon Dec 9 06:17:39 2019][ 31.358558] sd 1:0:40:0: [sdan] Write Protect is off [Mon Dec 9 06:17:39 2019][ 31.358726] sd 1:0:98:0: [sdcs] Attached SCSI disk [Mon Dec 9 06:17:39 2019][ 31.359194] sd 1:0:101:0: [sdcv] Write Protect is off [Mon Dec 9 06:17:39 2019][ 31.359415] sd 1:0:40:0: [sdan] Write cache: enabled, read cache: enabled, supports DPO and FUA [Mon Dec 9 06:17:39 2019][ 31.359441] sd 1:0:115:0: [sddj] Write cache: enabled, read cache: enabled, supports DPO and FUA [Mon Dec 9 06:17:39 2019][ 31.360224] sd 1:0:125:0: [sdds] Write Protect is off [Mon Dec 9 06:17:39 2019][ 31.360701] sd 1:0:125:0: [sdds] Write cache: enabled, read cache: enabled, supports DPO and FUA [Mon Dec 9 06:17:39 2019][ 31.360804] sd 1:0:101:0: [sdcv] Write cache: enabled, read cache: enabled, supports DPO and FUA [Mon Dec 9 06:17:39 2019][ 31.361072] sd 1:0:111:0: [sddf] Attached SCSI disk [Mon Dec 9 06:17:39 2019][ 31.362743] sd 1:0:70:0: [sdbq] Attached SCSI disk [Mon Dec 9 06:17:39 2019][ 31.362757] sd 1:0:13:0: [sdm] Attached SCSI disk [Mon Dec 9 06:17:39 2019][ 31.363084] sd 1:0:126:0: [sddt] Write cache: enabled, read cache: enabled, supports DPO and FUA [Mon Dec 9 06:17:39 2019][ 31.363202] sd 1:0:127:0: [sddu] Write Protect is off [Mon Dec 9 06:17:39 2019][ 31.363983] sd 1:0:16:0: [sdp] Write Protect is off [Mon Dec 9 06:17:39 2019][ 31.364096] sd 1:0:77:0: [sdbx] Write Protect is off [Mon Dec 9 06:17:39 2019][ 31.364224] sd 1:0:50:0: [sdax] Attached SCSI disk [Mon Dec 9 06:17:39 2019][ 31.364563] sd 1:0:87:0: [sdch] Write Protect is off [Mon Dec 9 06:17:39 2019][ 31.364756] sd 1:0:135:0: [sdec] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Mon Dec 9 06:17:39 2019][ 31.364758] sd 1:0:135:0: [sdec] 4096-byte physical blocks [Mon Dec 9 06:17:39 2019][ 31.364888] sd 1:0:131:0: [sddy] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Mon Dec 9 06:17:39 2019][ 31.364890] sd 1:0:131:0: [sddy] 4096-byte physical blocks [Mon Dec 9 06:17:39 2019][ 31.364930] sd 1:0:130:0: [sddx] Write Protect is off [Mon Dec 9 06:17:39 2019][ 31.365049] sd 1:0:87:0: [sdch] Write cache: enabled, read cache: enabled, supports DPO and FUA [Mon Dec 9 06:17:39 2019][ 31.365091] sd 1:0:77:0: [sdbx] Write cache: enabled, read cache: enabled, supports DPO and FUA [Mon Dec 9 06:17:39 2019][ 31.365259] sd 1:0:114:0: [sddi] Attached SCSI disk [Mon Dec 9 06:17:39 2019][ 31.365321] sd 1:0:16:0: [sdp] Write cache: enabled, read cache: enabled, supports DPO and FUA [Mon Dec 9 06:17:39 2019][ 31.365393] sd 1:0:105:0: [sdcz] Write cache: enabled, read cache: enabled, supports DPO and FUA [Mon Dec 9 06:17:39 2019][ 31.365608] sd 1:0:131:0: [sddy] Write Protect is off [Mon Dec 9 06:17:39 2019][ 31.365762] sd 1:0:4:0: [sdd] Write cache: enabled, read cache: enabled, supports DPO and FUA [Mon Dec 9 06:17:39 2019][ 31.366074] sd 1:0:65:0: [sdbl] Write Protect is off [Mon Dec 9 06:17:39 2019][ 31.366234] sd 1:0:68:0: [sdbo] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Mon Dec 9 06:17:39 2019][ 31.366237] sd 1:0:68:0: [sdbo] 4096-byte physical blocks [Mon Dec 9 06:17:39 2019][ 31.366625] sd 1:0:129:0: [sddw] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Mon Dec 9 06:17:39 2019][ 31.366627] sd 1:0:129:0: [sddw] 4096-byte physical blocks [Mon Dec 9 06:17:39 2019][ 31.366637] sd 1:0:131:0: [sddy] Write cache: enabled, read cache: enabled, supports DPO and FUA [Mon Dec 9 06:17:39 2019][ 31.368114] sd 1:0:127:0: [sddu] Write cache: enabled, read cache: enabled, supports DPO and FUA [Mon Dec 9 
06:17:39 2019][ 31.368291] sd 1:0:65:0: [sdbl] Write cache: enabled, read cache: enabled, supports DPO and FUA [Mon Dec 9 06:17:39 2019][ 31.368849] sd 1:0:15:0: [sdo] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Mon Dec 9 06:17:39 2019][ 31.368851] sd 1:0:15:0: [sdo] 4096-byte physical blocks [Mon Dec 9 06:17:39 2019][ 31.370291] sd 1:0:14:0: [sdn] Attached SCSI disk [Mon Dec 9 06:17:39 2019][ 31.370723] sd 1:0:130:0: [sddx] Write cache: enabled, read cache: enabled, supports DPO and FUA [Mon Dec 9 06:17:39 2019][ 31.370787] sd 1:0:132:0: [sddz] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Mon Dec 9 06:17:39 2019][ 31.370789] sd 1:0:132:0: [sddz] 4096-byte physical blocks [Mon Dec 9 06:17:39 2019][ 31.371544] sd 1:0:132:0: [sddz] Write Protect is off [Mon Dec 9 06:17:39 2019][ 31.371863] sd 1:0:92:0: [sdcm] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Mon Dec 9 06:17:39 2019][ 31.371865] sd 1:0:92:0: [sdcm] 4096-byte physical blocks [Mon Dec 9 06:17:39 2019][ 31.372065] sd 1:0:138:0: [sdef] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Mon Dec 9 06:17:39 2019][ 31.372066] sd 1:0:138:0: [sdef] 4096-byte physical blocks [Mon Dec 9 06:17:39 2019][ 31.372273] sd 1:0:124:0: [sddr] Write cache: enabled, read cache: enabled, supports DPO and FUA [Mon Dec 9 06:17:39 2019][ 31.372600] sd 1:0:132:0: [sddz] Write cache: enabled, read cache: enabled, supports DPO and FUA [Mon Dec 9 06:17:39 2019][ 31.372747] sd 1:0:135:0: [sdec] Write Protect is off [Mon Dec 9 06:17:39 2019][ 31.373529] sd 1:0:41:0: [sdao] Attached SCSI disk [Mon Dec 9 06:17:39 2019][ 31.373541] sd 1:0:11:0: [sdk] Attached SCSI disk [Mon Dec 9 06:17:39 2019][ 31.374688] sd 1:0:136:0: [sded] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Mon Dec 9 06:17:39 2019][ 31.374690] sd 1:0:136:0: [sded] 4096-byte physical blocks [Mon Dec 9 06:17:39 2019][ 31.374821] sd 1:0:33:0: [sdag] Attached SCSI disk [Mon Dec 9 06:17:39 2019][ 31.374907] sd 1:0:139:0: [sdeg] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Mon Dec 9 06:17:39 2019][ 31.374909] sd 1:0:139:0: [sdeg] 4096-byte physical blocks [Mon Dec 9 06:17:39 2019][ 31.375093] sd 1:0:75:0: [sdbv] Attached SCSI disk [Mon Dec 9 06:17:39 2019][ 31.375476] sd 1:0:39:0: [sdam] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Mon Dec 9 06:17:39 2019][ 31.375477] sd 1:0:39:0: [sdam] 4096-byte physical blocks [Mon Dec 9 06:17:39 2019][ 31.375664] sd 1:0:94:0: [sdco] Attached SCSI disk [Mon Dec 9 06:17:39 2019][ 31.375960] sd 1:0:76:0: [sdbw] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Mon Dec 9 06:17:39 2019][ 31.375975] sd 1:0:76:0: [sdbw] 4096-byte physical blocks [Mon Dec 9 06:17:39 2019][ 31.376306] sd 1:0:129:0: [sddw] Write Protect is off [Mon Dec 9 06:17:39 2019][ 31.376355] sd 1:0:15:0: [sdo] Write Protect is off [Mon Dec 9 06:17:39 2019][ 31.376464] sd 1:0:61:0: [sdbi] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Mon Dec 9 06:17:39 2019][ 31.376466] sd 1:0:61:0: [sdbi] 4096-byte physical blocks [Mon Dec 9 06:17:39 2019][ 31.376602] sd 1:0:100:0: [sdcu] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Mon Dec 9 06:17:39 2019][ 31.376604] sd 1:0:100:0: [sdcu] 4096-byte physical blocks [Mon Dec 9 06:17:39 2019][ 31.376780] sd 1:0:129:0: [sddw] Write cache: enabled, read cache: enabled, supports DPO and FUA [Mon Dec 9 06:17:39 2019][ 31.377123] sd 1:0:15:0: [sdo] Write cache: enabled, read cache: enabled, supports DPO and FUA [Mon Dec 9 06:17:39 2019][ 31.377306] sd 1:0:39:0: [sdam] Write 
Protect is off [Mon Dec 9 06:17:39 2019][ 31.377591] sd 1:0:128:0: [sddv] Write Protect is off [Mon Dec 9 06:17:39 2019][ 31.377709] sd 1:0:100:0: [sdcu] Write Protect is off [Mon Dec 9 06:17:39 2019][ 31.377821] sd 1:0:102:0: [sdcw] Attached SCSI disk [Mon Dec 9 06:17:39 2019][ 31.378073] sd 1:0:128:0: [sddv] Write cache: enabled, read cache: enabled, supports DPO and FUA [Mon Dec 9 06:17:39 2019][ 31.379159] sd 1:0:72:0: [sdbs] Attached SCSI disk [Mon Dec 9 06:17:39 2019][ 31.379900] sd 1:0:137:0: [sdee] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Mon Dec 9 06:17:39 2019][ 31.379902] sd 1:0:137:0: [sdee] 4096-byte physical blocks [Mon Dec 9 06:17:39 2019][ 31.380548] sd 1:0:28:0: [sdab] Write Protect is off [Mon Dec 9 06:17:39 2019][ 31.381405] sd 1:0:140:0: [sdeh] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Mon Dec 9 06:17:39 2019][ 31.381414] sd 1:0:140:0: [sdeh] 4096-byte physical blocks [Mon Dec 9 06:17:39 2019][ 31.381769] sd 1:0:135:0: [sdec] Write cache: enabled, read cache: enabled, supports DPO and FUA [Mon Dec 9 06:17:39 2019][ 31.383578] sd 1:0:24:0: [sdx] Attached SCSI disk [Mon Dec 9 06:17:39 2019][ 31.384119] sd 1:0:99:0: [sdct] Attached SCSI disk [Mon Dec 9 06:17:39 2019][ 31.384375] sd 1:0:107:0: [sddb] Attached SCSI disk [Mon Dec 9 06:17:39 2019][ 31.384437] sd 1:0:85:0: [sdcf] Attached SCSI disk [Mon Dec 9 06:17:39 2019][ 31.384979] sd 1:0:138:0: [sdef] Write Protect is off [Mon Dec 9 06:17:39 2019][ 31.385077] sd 1:0:136:0: [sded] Write Protect is off [Mon Dec 9 06:17:39 2019][ 31.385454] sd 1:0:138:0: [sdef] Write cache: enabled, read cache: enabled, supports DPO and FUA [Mon Dec 9 06:17:39 2019][ 31.385682] sd 1:0:12:0: [sdl] Attached SCSI disk [Mon Dec 9 06:17:39 2019][ 31.386005] sd 1:0:83:0: [sdcd] Attached SCSI disk [Mon Dec 9 06:17:40 2019][ 31.387022] sd 1:0:54:0: [sdbb] Attached SCSI disk [Mon Dec 9 06:17:40 2019][ 31.387032] sd 1:0:44:0: [sdar] Attached SCSI disk [Mon Dec 9 06:17:40 2019][ 31.387611] sd 1:0:95:0: [sdcp] Attached SCSI disk [Mon Dec 9 06:17:40 2019][ 31.388033] sd 1:0:43:0: [sdaq] Attached SCSI disk [Mon Dec 9 06:17:40 2019][ 31.388229] sd 1:0:93:0: [sdcn] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Mon Dec 9 06:17:40 2019][ 31.388231] sd 1:0:93:0: [sdcn] 4096-byte physical blocks [Mon Dec 9 06:17:40 2019][ 31.388600] sd 1:0:73:0: [sdbt] Attached SCSI disk [Mon Dec 9 06:17:40 2019][ 31.389173] sd 1:0:122:0: [sddq] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Mon Dec 9 06:17:40 2019][ 31.389175] sd 1:0:122:0: [sddq] 4096-byte physical blocks [Mon Dec 9 06:17:40 2019][ 31.389581] sd 1:0:35:0: [sdai] Attached SCSI disk [Mon Dec 9 06:17:40 2019][ 31.390099] sd 1:0:52:0: [sdaz] Write cache: enabled, read cache: enabled, supports DPO and FUA [Mon Dec 9 06:17:40 2019][ 31.390197] sd 1:0:76:0: [sdbw] Write Protect is off [Mon Dec 9 06:17:40 2019][ 31.390241] sd 1:0:48:0: [sdav] Attached SCSI disk [Mon Dec 9 06:17:40 2019][ 31.390820] sd 1:0:137:0: [sdee] Write Protect is off [Mon Dec 9 06:17:40 2019][ 31.391298] sd 1:0:137:0: [sdee] Write cache: enabled, read cache: enabled, supports DPO and FUA [Mon Dec 9 06:17:40 2019][ 31.392274] sd 1:0:106:0: [sdda] Attached SCSI disk [Mon Dec 9 06:17:40 2019][ 31.392435] sd 1:0:142:0: [sdej] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Mon Dec 9 06:17:40 2019][ 31.392437] sd 1:0:142:0: [sdej] 4096-byte physical blocks [Mon Dec 9 06:17:40 2019][ 31.392865] sd 1:0:141:0: [sdei] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Mon Dec 9 06:17:40 
2019][ 31.392866] sd 1:0:141:0: [sdei] 4096-byte physical blocks [Mon Dec 9 06:17:40 2019][ 31.392949] sd 1:0:139:0: [sdeg] Write Protect is off [Mon Dec 9 06:17:40 2019][ 31.394364] sd 1:0:56:0: [sdbd] Attached SCSI disk [Mon Dec 9 06:17:40 2019][ 31.395287] sd 1:0:134:0: [sdeb] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Mon Dec 9 06:17:40 2019][ 31.395289] sd 1:0:134:0: [sdeb] 4096-byte physical blocks [Mon Dec 9 06:17:40 2019][ 31.395693] sd 1:0:45:0: [sdas] Attached SCSI disk [Mon Dec 9 06:17:40 2019][ 31.395894] sd 1:0:47:0: [sdau] Attached SCSI disk [Mon Dec 9 06:17:40 2019][ 31.396270] sd 1:0:19:0: [sds] Attached SCSI disk [Mon Dec 9 06:17:40 2019][ 31.398141] sd 1:0:133:0: [sdea] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Mon Dec 9 06:17:40 2019][ 31.398143] sd 1:0:133:0: [sdea] 4096-byte physical blocks [Mon Dec 9 06:17:40 2019][ 31.398151] sd 1:0:117:0: [sddl] Attached SCSI disk [Mon Dec 9 06:17:40 2019][ 31.398549] sd 1:0:140:0: [sdeh] Write Protect is off [Mon Dec 9 06:17:40 2019][ 31.398601] sd 1:0:139:0: [sdeg] Write cache: enabled, read cache: enabled, supports DPO and FUA [Mon Dec 9 06:17:40 2019][ 31.398737] sd 1:0:142:0: [sdej] Write Protect is off [Mon Dec 9 06:17:40 2019][ 31.399026] sd 1:0:140:0: [sdeh] Write cache: enabled, read cache: enabled, supports DPO and FUA [Mon Dec 9 06:17:40 2019][ 31.399049] sd 1:0:104:0: [sdcy] Attached SCSI disk [Mon Dec 9 06:17:40 2019][ 31.399680] sd 1:0:26:0: [sdz] Attached SCSI disk [Mon Dec 9 06:17:40 2019][ 31.399700] sd 1:0:109:0: [sddd] Attached SCSI disk [Mon Dec 9 06:17:40 2019][ 31.400122] sd 1:0:134:0: [sdeb] Write Protect is off [Mon Dec 9 06:17:40 2019][ 31.400781] sd 1:0:55:0: [sdbc] Attached SCSI disk [Mon Dec 9 06:17:40 2019][ 31.400785] sd 1:0:143:0: [sdek] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Mon Dec 9 06:17:40 2019][ 31.400787] sd 1:0:143:0: [sdek] 4096-byte physical blocks [Mon Dec 9 06:17:40 2019][ 31.401579] sd 1:0:143:0: [sdek] Write Protect is off [Mon Dec 9 06:17:40 2019][ 31.402565] sd 1:0:68:0: [sdbo] Write Protect is off [Mon Dec 9 06:17:40 2019][ 31.403091] sd 1:0:87:0: [sdch] Attached SCSI disk [Mon Dec 9 06:17:40 2019][ 31.403228] sd 1:0:136:0: [sded] Write cache: enabled, read cache: enabled, supports DPO and FUA [Mon Dec 9 06:17:40 2019][ 31.403422] sd 1:0:119:0: [sddn] Attached SCSI disk [Mon Dec 9 06:17:40 2019][ 31.403737] sd 1:0:96:0: [sdcq] Attached SCSI disk [Mon Dec 9 06:17:40 2019][ 31.404281] sd 1:0:58:0: [sdbf] Attached SCSI disk [Mon Dec 9 06:17:40 2019][ 31.404343] sd 1:0:59:0: [sdbg] Attached SCSI disk [Mon Dec 9 06:17:40 2019][ 31.404345] sd 1:0:68:0: [sdbo] Write cache: enabled, read cache: enabled, supports DPO and FUA [Mon Dec 9 06:17:40 2019][ 31.404716] sd 1:0:80:0: [sdca] Attached SCSI disk [Mon Dec 9 06:17:40 2019][ 31.405477] sd 1:0:141:0: [sdei] Write Protect is off [Mon Dec 9 06:17:40 2019][ 31.405801] sd 1:0:134:0: [sdeb] Write cache: enabled, read cache: enabled, supports DPO and FUA [Mon Dec 9 06:17:40 2019][ 31.405854] sd 1:0:74:0: [sdbu] Attached SCSI disk [Mon Dec 9 06:17:40 2019][ 31.405895] sd 1:0:89:0: [sdcj] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Mon Dec 9 06:17:40 2019][ 31.405896] sd 1:0:89:0: [sdcj] 4096-byte physical blocks [Mon Dec 9 06:17:40 2019][ 31.405932] sd 1:0:141:0: [sdei] Write cache: enabled, read cache: enabled, supports DPO and FUA [Mon Dec 9 06:17:40 2019][ 31.406792] sd 1:0:89:0: [sdcj] Write Protect is off [Mon Dec 9 06:17:40 2019][ 31.407018] sd 1:0:28:0: [sdab] Write cache: enabled, 
read cache: enabled, supports DPO and FUA [Mon Dec 9 06:17:40 2019][ 31.408360] sd 1:0:89:0: [sdcj] Write cache: enabled, read cache: enabled, supports DPO and FUA [Mon Dec 9 06:17:40 2019][ 31.408978] sd 1:0:126:0: [sddt] Attached SCSI disk [Mon Dec 9 06:17:40 2019][ 31.409127] sd 1:0:125:0: [sdds] Attached SCSI disk [Mon Dec 9 06:17:40 2019][ 31.409454] sd 1:0:49:0: [sdaw] Attached SCSI disk [Mon Dec 9 06:17:40 2019][ 31.409930] sd 1:0:142:0: [sdej] Write cache: enabled, read cache: enabled, supports DPO and FUA [Mon Dec 9 06:17:40 2019][ 31.410423] sd 1:0:115:0: [sddj] Attached SCSI disk [Mon Dec 9 06:17:40 2019][ 31.411610] sd 1:0:25:0: [sdy] Attached SCSI disk [Mon Dec 9 06:17:40 2019][ 31.412304] sd 1:0:116:0: [sddk] Attached SCSI disk [Mon Dec 9 06:17:40 2019][ 31.412471] sd 1:0:110:0: [sdde] Attached SCSI disk [Mon Dec 9 06:17:40 2019][ 31.413860] sd 1:0:113:0: [sddh] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Mon Dec 9 06:17:40 2019][ 31.413862] sd 1:0:113:0: [sddh] 4096-byte physical blocks [Mon Dec 9 06:17:40 2019][ 31.413921] sd 1:0:7:0: [sdg] Attached SCSI disk [Mon Dec 9 06:17:40 2019][ 31.415027] sd 1:0:86:0: [sdcg] Attached SCSI disk [Mon Dec 9 06:17:40 2019][ 31.415927] sd 1:0:60:0: [sdbh] Attached SCSI disk [Mon Dec 9 06:17:40 2019][ 31.418926] sd 1:0:124:0: [sddr] Attached SCSI disk [Mon Dec 9 06:17:40 2019][ 31.418940] sd 1:0:93:0: [sdcn] Write Protect is off [Mon Dec 9 06:17:40 2019][ 31.419767] sd 1:0:130:0: [sddx] Attached SCSI disk [Mon Dec 9 06:17:40 2019][ 31.419809] sd 1:0:108:0: [sddc] Attached SCSI disk [Mon Dec 9 06:17:40 2019][ 31.420474] sd 1:0:93:0: [sdcn] Write cache: enabled, read cache: enabled, supports DPO and FUA [Mon Dec 9 06:17:40 2019][ 31.422160] sd 1:0:127:0: [sddu] Attached SCSI disk [Mon Dec 9 06:17:40 2019][ 31.422775] sd 1:0:129:0: [sddw] Attached SCSI disk [Mon Dec 9 06:17:40 2019][ 31.422788] sd 1:0:36:0: [sdaj] Attached SCSI disk [Mon Dec 9 06:17:40 2019][ 31.422905] sd 1:0:145:0: [sdem] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Mon Dec 9 06:17:40 2019][ 31.422906] sd 1:0:145:0: [sdem] 4096-byte physical blocks [Mon Dec 9 06:17:40 2019][ 31.423164] sd 1:0:120:0: [sddo] Attached SCSI disk [Mon Dec 9 06:17:40 2019][ 31.423643] sd 1:0:23:0: [sdw] Attached SCSI disk [Mon Dec 9 06:17:40 2019][ 31.425946] sd 1:0:32:0: [sdaf] Attached SCSI disk [Mon Dec 9 06:17:40 2019][ 31.425973] sd 1:0:90:0: [sdck] Attached SCSI disk [Mon Dec 9 06:17:40 2019][ 31.426331] sd 1:0:136:0: [sded] Attached SCSI disk [Mon Dec 9 06:17:40 2019][ 31.426811] sd 1:0:105:0: [sdcz] Attached SCSI disk [Mon Dec 9 06:17:40 2019][ 31.426843] sd 1:0:97:0: [sdcr] Attached SCSI disk [Mon Dec 9 06:17:40 2019][ 31.427286] sd 1:0:61:0: [sdbi] Write Protect is off [Mon Dec 9 06:17:40 2019][ 31.427356] sd 1:0:143:0: [sdek] Write cache: enabled, read cache: enabled, supports DPO and FUA [Mon Dec 9 06:17:40 2019][ 31.427722] sd 1:0:39:0: [sdam] Write cache: enabled, read cache: enabled, supports DPO and FUA [Mon Dec 9 06:17:40 2019][ 31.428516] sd 1:0:100:0: [sdcu] Write cache: enabled, read cache: enabled, supports DPO and FUA [Mon Dec 9 06:17:40 2019][ 31.430554] sd 1:0:138:0: [sdef] Attached SCSI disk [Mon Dec 9 06:17:40 2019][ 31.430788] sd 1:0:145:0: [sdem] Write Protect is off [Mon Dec 9 06:17:40 2019][ 31.431092] sd 1:0:121:0: [sddp] Attached SCSI disk [Mon Dec 9 06:17:40 2019][ 31.434480] sd 1:0:84:0: [sdce] Attached SCSI disk [Mon Dec 9 06:17:40 2019][ 31.434925] sd 1:0:57:0: [sdbe] Attached SCSI disk [Mon Dec 9 06:17:40 2019][ 31.435025] sd 
1:0:113:0: [sddh] Write Protect is off [Mon Dec 9 06:17:40 2019][ 31.435048] sd 1:0:64:0: [sdbk] Attached SCSI disk [Mon Dec 9 06:17:40 2019][ 31.435629] sd 1:0:128:0: [sddv] Attached SCSI disk [Mon Dec 9 06:17:40 2019][ 31.435692] sd 1:0:135:0: [sdec] Attached SCSI disk [Mon Dec 9 06:17:40 2019][ 31.436055] sd 1:0:144:0: [sdel] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Mon Dec 9 06:17:40 2019][ 31.436056] sd 1:0:144:0: [sdel] 4096-byte physical blocks [Mon Dec 9 06:17:40 2019][ 31.436439] sd 1:0:133:0: [sdea] Write Protect is off [Mon Dec 9 06:17:40 2019][ 31.436681] sd 1:0:118:0: [sddm] Attached SCSI disk [Mon Dec 9 06:17:40 2019][ 31.437585] sd 1:0:122:0: [sddq] Write Protect is off [Mon Dec 9 06:17:40 2019][ 31.438531] sd 1:0:146:0: [sden] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Mon Dec 9 06:17:40 2019][ 31.438533] sd 1:0:146:0: [sden] 4096-byte physical blocks [Mon Dec 9 06:17:40 2019][ 31.438872] sd 1:0:92:0: [sdcm] Write Protect is off [Mon Dec 9 06:17:40 2019][ 31.440131] sd 1:0:137:0: [sdee] Attached SCSI disk [Mon Dec 9 06:17:40 2019][ 31.443487] sd 1:0:4:0: [sdd] Attached SCSI disk [Mon Dec 9 06:17:40 2019][ 31.444658] sd 1:0:145:0: [sdem] Write cache: enabled, read cache: enabled, supports DPO and FUA [Mon Dec 9 06:17:40 2019][ 31.445854] sd 1:0:81:0: [sdcb] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Mon Dec 9 06:17:40 2019][ 31.445855] sd 1:0:81:0: [sdcb] 4096-byte physical blocks [Mon Dec 9 06:17:40 2019][ 31.446167] sd 1:0:61:0: [sdbi] Write cache: enabled, read cache: enabled, supports DPO and FUA [Mon Dec 9 06:17:40 2019][ 31.446452] sd 1:0:139:0: [sdeg] Attached SCSI disk [Mon Dec 9 06:17:40 2019][ 31.446465] sd 1:0:122:0: [sddq] Write cache: enabled, read cache: enabled, supports DPO and FUA [Mon Dec 9 06:17:40 2019][ 31.447110] sd 1:0:133:0: [sdea] Write cache: enabled, read cache: enabled, supports DPO and FUA [Mon Dec 9 06:17:40 2019][ 31.447213] sd 1:0:146:0: [sden] Write Protect is off [Mon Dec 9 06:17:40 2019][ 31.447632] sd 1:0:20:0: [sdt] Attached SCSI disk [Mon Dec 9 06:17:40 2019][ 31.452168] sd 1:0:147:0: [sdeo] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Mon Dec 9 06:17:40 2019][ 31.452170] sd 1:0:147:0: [sdeo] 4096-byte physical blocks [Mon Dec 9 06:17:40 2019][ 31.452899] sd 1:0:8:0: [sdh] Attached SCSI disk [Mon Dec 9 06:17:40 2019][ 31.452901] sd 1:0:147:0: [sdeo] Write Protect is off [Mon Dec 9 06:17:40 2019][ 31.453369] sd 1:0:147:0: [sdeo] Write cache: enabled, read cache: enabled, supports DPO and FUA [Mon Dec 9 06:17:40 2019][ 31.454472] sd 1:0:141:0: [sdei] Attached SCSI disk [Mon Dec 9 06:17:40 2019][ 31.455955] sd 1:0:76:0: [sdbw] Write cache: enabled, read cache: enabled, supports DPO and FUA [Mon Dec 9 06:17:40 2019][ 31.457261] sd 1:0:113:0: [sddh] Write cache: enabled, read cache: enabled, supports DPO and FUA [Mon Dec 9 06:17:40 2019][ 31.459607] sd 1:0:146:0: [sden] Write cache: enabled, read cache: enabled, supports DPO and FUA [Mon Dec 9 06:17:40 2019][ 31.465876] sd 1:0:144:0: [sdel] Write Protect is off [Mon Dec 9 06:17:40 2019][ 31.467035] sd 1:0:148:0: [sdep] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Mon Dec 9 06:17:40 2019][ 31.467036] sd 1:0:148:0: [sdep] 4096-byte physical blocks [Mon Dec 9 06:17:40 2019][ 31.467749] sd 1:0:148:0: [sdep] Write Protect is off [Mon Dec 9 06:17:40 2019][ 31.468208] sd 1:0:148:0: [sdep] Write cache: enabled, read cache: enabled, supports DPO and FUA [Mon Dec 9 06:17:40 2019][ 31.474997] sd 1:0:140:0: [sdeh] Attached SCSI disk [Mon 
Dec 9 06:17:40 2019][ 31.477250] sd 1:0:149:0: [sdeq] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Mon Dec 9 06:17:40 2019][ 31.477252] sd 1:0:149:0: [sdeq] 4096-byte physical blocks [Mon Dec 9 06:17:40 2019][ 31.477263] sd 1:0:93:0: [sdcn] Attached SCSI disk [Mon Dec 9 06:17:40 2019][ 31.477303] sd 1:0:134:0: [sdeb] Attached SCSI disk [Mon Dec 9 06:17:40 2019][ 31.480094] sd 1:0:103:0: [sdcx] Attached SCSI disk [Mon Dec 9 06:17:40 2019][ 31.482543] sd 1:0:91:0: [sdcl] Attached SCSI disk [Mon Dec 9 06:17:40 2019][ 31.484367] sd 1:0:92:0: [sdcm] Write cache: enabled, read cache: enabled, supports DPO and FUA [Mon Dec 9 06:17:40 2019][ 31.485181] sd 1:0:150:0: [sder] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Mon Dec 9 06:17:40 2019][ 31.485183] sd 1:0:150:0: [sder] 4096-byte physical blocks [Mon Dec 9 06:17:40 2019][ 31.489257] sd 1:0:81:0: [sdcb] Write Protect is off [Mon Dec 9 06:17:40 2019][ 31.490095] sd 1:0:149:0: [sdeq] Write Protect is off [Mon Dec 9 06:17:40 2019][ 31.490858] sd 1:0:67:0: [sdbn] Attached SCSI disk [Mon Dec 9 06:17:40 2019][ 31.491291] sd 1:0:27:0: [sdaa] Attached SCSI disk [Mon Dec 9 06:17:40 2019][ 31.492160] sd 1:0:88:0: [sdci] Attached SCSI disk [Mon Dec 9 06:17:40 2019][ 31.493430] sd 1:0:28:0: [sdab] Attached SCSI disk [Mon Dec 9 06:17:40 2019][ 31.495996] sd 1:0:147:0: [sdeo] Attached SCSI disk [Mon Dec 9 06:17:40 2019][ 31.497534] sd 1:0:69:0: [sdbp] Attached SCSI disk [Mon Dec 9 06:17:40 2019][ 31.497651] sd 1:0:151:0: [sdes] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Mon Dec 9 06:17:40 2019][ 31.497652] sd 1:0:151:0: [sdes] 4096-byte physical blocks [Mon Dec 9 06:17:40 2019][ 31.498366] sd 1:0:151:0: [sdes] Write Protect is off [Mon Dec 9 06:17:40 2019][ 31.498609] sd 1:0:78:0: [sdby] Attached SCSI disk [Mon Dec 9 06:17:40 2019][ 31.498663] sd 1:0:51:0: [sday] Attached SCSI disk [Mon Dec 9 06:17:40 2019][ 31.498687] sd 1:0:150:0: [sder] Write Protect is off [Mon Dec 9 06:17:40 2019][ 31.498840] sd 1:0:151:0: [sdes] Write cache: enabled, read cache: enabled, supports DPO and FUA [Mon Dec 9 06:17:40 2019][ 31.499167] sd 1:0:150:0: [sder] Write cache: enabled, read cache: enabled, supports DPO and FUA [Mon Dec 9 06:17:40 2019][ 31.500318] sd 1:0:149:0: [sdeq] Write cache: enabled, read cache: enabled, supports DPO and FUA [Mon Dec 9 06:17:40 2019][ 31.503253] sd 1:0:89:0: [sdcj] Attached SCSI disk [Mon Dec 9 06:17:40 2019][ 31.505600] sd 1:0:144:0: [sdel] Write cache: enabled, read cache: enabled, supports DPO and FUA [Mon Dec 9 06:17:40 2019][ 31.507270] sd 1:0:63:0: [sdbj] Attached SCSI disk [Mon Dec 9 06:17:40 2019][ 31.508035] sd 1:0:133:0: [sdea] Attached SCSI disk [Mon Dec 9 06:17:40 2019][ 31.509931] sd 1:0:65:0: [sdbl] Attached SCSI disk [Mon Dec 9 06:17:41 2019][ 31.511466] sd 1:0:112:0: [sddg] Attached SCSI disk [Mon Dec 9 06:17:41 2019][ 31.511937] sd 1:0:146:0: [sden] Attached SCSI disk [Mon Dec 9 06:17:41 2019][ 31.515360] sd 1:0:39:0: [sdam] Attached SCSI disk [Mon Dec 9 06:17:41 2019][ 31.516709] sd 1:0:152:0: [sdet] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Mon Dec 9 06:17:41 2019][ 31.516711] sd 1:0:152:0: [sdet] 4096-byte physical blocks [Mon Dec 9 06:17:41 2019][ 31.516905] sd 1:0:92:0: [sdcm] Attached SCSI disk [Mon Dec 9 06:17:41 2019][ 31.517416] sd 1:0:152:0: [sdet] Write Protect is off [Mon Dec 9 06:17:41 2019][ 31.517883] sd 1:0:152:0: [sdet] Write cache: enabled, read cache: enabled, supports DPO and FUA [Mon Dec 9 06:17:41 2019][ 31.523403] sd 1:0:81:0: [sdcb] Write cache: 
enabled, read cache: enabled, supports DPO and FUA [Mon Dec 9 06:17:41 2019][ 31.523598] sd 1:0:16:0: [sdp] Attached SCSI disk [Mon Dec 9 06:17:41 2019][ 31.524561] sd 1:0:77:0: [sdbx] Attached SCSI disk [Mon Dec 9 06:17:41 2019][ 31.524895] sd 1:0:66:0: [sdbm] Attached SCSI disk [Mon Dec 9 06:17:41 2019][ 31.526000] sd 1:0:100:0: [sdcu] Attached SCSI disk [Mon Dec 9 06:17:41 2019][ 31.532011] sd 1:0:148:0: [sdep] Attached SCSI disk [Mon Dec 9 06:17:41 2019][ 31.533355] sd 1:0:122:0: [sddq] Attached SCSI disk [Mon Dec 9 06:17:41 2019][ 31.533396] sd 1:0:145:0: [sdem] Attached SCSI disk [Mon Dec 9 06:17:41 2019][ 31.533652] sd 1:0:142:0: [sdej] Attached SCSI disk [Mon Dec 9 06:17:41 2019][ 31.536070] sd 1:0:153:0: [sdeu] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Mon Dec 9 06:17:41 2019][ 31.536072] sd 1:0:153:0: [sdeu] 4096-byte physical blocks [Mon Dec 9 06:17:41 2019][ 31.536792] sd 1:0:153:0: [sdeu] Write Protect is off [Mon Dec 9 06:17:41 2019][ 31.537264] sd 1:0:153:0: [sdeu] Write cache: enabled, read cache: enabled, supports DPO and FUA [Mon Dec 9 06:17:41 2019][ 31.538383] sd 1:0:52:0: [sdaz] Attached SCSI disk [Mon Dec 9 06:17:41 2019][ 31.540923] sd 1:0:149:0: [sdeq] Attached SCSI disk [Mon Dec 9 06:17:41 2019][ 31.541726] sd 1:0:154:0: [sdev] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Mon Dec 9 06:17:41 2019][ 31.541727] sd 1:0:154:0: [sdev] 4096-byte physical blocks [Mon Dec 9 06:17:41 2019][ 31.541736] sd 1:0:113:0: [sddh] Attached SCSI disk [Mon Dec 9 06:17:41 2019][ 31.541833] sd 1:0:61:0: [sdbi] Attached SCSI disk [Mon Dec 9 06:17:41 2019][ 31.546924] sd 1:0:154:0: [sdev] Write Protect is off [Mon Dec 9 06:17:41 2019][ 31.547802] sd 1:0:155:0: [sdew] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Mon Dec 9 06:17:41 2019][ 31.547804] sd 1:0:155:0: [sdew] 4096-byte physical blocks [Mon Dec 9 06:17:41 2019][ 31.548779] sd 1:0:131:0: [sddy] Attached SCSI disk [Mon Dec 9 06:17:41 2019][ 31.550506] sd 1:0:132:0: [sddz] Attached SCSI disk [Mon Dec 9 06:17:41 2019][ 31.550631] sd 1:0:154:0: [sdev] Write cache: enabled, read cache: enabled, supports DPO and FUA [Mon Dec 9 06:17:41 2019][ 31.553812] sd 1:0:81:0: [sdcb] Attached SCSI disk [Mon Dec 9 06:17:41 2019][ 31.556737] sd 1:0:155:0: [sdew] Write Protect is off [Mon Dec 9 06:17:41 2019][ 31.556772] sd 1:0:68:0: [sdbo] Attached SCSI disk [Mon Dec 9 06:17:41 2019][ 31.563645] sd 1:0:152:0: [sdet] Attached SCSI disk [Mon Dec 9 06:17:41 2019][ 31.569059] sd 1:0:156:0: [sdex] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Mon Dec 9 06:17:41 2019][ 31.569061] sd 1:0:156:0: [sdex] 4096-byte physical blocks [Mon Dec 9 06:17:41 2019][ 31.574729] sd 1:0:158:0: [sdez] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Mon Dec 9 06:17:41 2019][ 31.574732] sd 1:0:158:0: [sdez] 4096-byte physical blocks [Mon Dec 9 06:17:41 2019][ 31.575458] sd 1:0:158:0: [sdez] Write Protect is off [Mon Dec 9 06:17:41 2019][ 31.575942] sd 1:0:158:0: [sdez] Write cache: enabled, read cache: enabled, supports DPO and FUA [Mon Dec 9 06:17:41 2019][ 31.577140] sd 1:0:151:0: [sdes] Attached SCSI disk [Mon Dec 9 06:17:41 2019][ 31.577349] sd 1:0:150:0: [sder] Attached SCSI disk [Mon Dec 9 06:17:41 2019][ 31.579122] sd 1:0:156:0: [sdex] Write Protect is off [Mon Dec 9 06:17:41 2019][ 31.581150] sd 1:0:153:0: [sdeu] Attached SCSI disk [Mon Dec 9 06:17:41 2019][ 31.582323] sd 1:0:159:0: [sdfa] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Mon Dec 9 06:17:41 2019][ 31.582325] sd 1:0:159:0: [sdfa] 
4096-byte physical blocks [Mon Dec 9 06:17:41 2019][ 31.583024] sd 1:0:159:0: [sdfa] Write Protect is off [Mon Dec 9 06:17:41 2019][ 31.583469] sd 1:0:159:0: [sdfa] Write cache: enabled, read cache: enabled, supports DPO and FUA [Mon Dec 9 06:17:41 2019][ 31.585980] sd 1:0:160:0: [sdfb] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Mon Dec 9 06:17:41 2019][ 31.585982] sd 1:0:160:0: [sdfb] 4096-byte physical blocks [Mon Dec 9 06:17:41 2019][ 31.586702] sd 1:0:160:0: [sdfb] Write Protect is off [Mon Dec 9 06:17:41 2019][ 31.586801] sd 1:0:157:0: [sdey] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Mon Dec 9 06:17:41 2019][ 31.586803] sd 1:0:157:0: [sdey] 4096-byte physical blocks [Mon Dec 9 06:17:41 2019][ 31.587184] sd 1:0:160:0: [sdfb] Write cache: enabled, read cache: enabled, supports DPO and FUA [Mon Dec 9 06:17:41 2019][ 31.587379] sd 1:0:154:0: [sdev] Attached SCSI disk [Mon Dec 9 06:17:41 2019][ 31.593130] sd 1:0:157:0: [sdey] Write Protect is off [Mon Dec 9 06:17:41 2019][ 31.593539] sd 1:0:161:0: [sdfc] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Mon Dec 9 06:17:41 2019][ 31.593541] sd 1:0:161:0: [sdfc] 4096-byte physical blocks [Mon Dec 9 06:17:41 2019][ 31.594287] sd 1:0:161:0: [sdfc] Write Protect is off [Mon Dec 9 06:17:41 2019][ 31.594762] sd 1:0:161:0: [sdfc] Write cache: enabled, read cache: enabled, supports DPO and FUA [Mon Dec 9 06:17:41 2019][ 31.598132] sd 1:0:162:0: [sdfd] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Mon Dec 9 06:17:41 2019][ 31.598133] sd 1:0:162:0: [sdfd] 4096-byte physical blocks [Mon Dec 9 06:17:41 2019][ 31.600843] sd 1:0:163:0: [sdfe] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Mon Dec 9 06:17:41 2019][ 31.600844] sd 1:0:163:0: [sdfe] 4096-byte physical blocks [Mon Dec 9 06:17:41 2019][ 31.605308] sd 1:0:164:0: [sdff] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Mon Dec 9 06:17:41 2019][ 31.605309] sd 1:0:164:0: [sdff] 4096-byte physical blocks [Mon Dec 9 06:17:41 2019][ 31.605831] sd 1:0:162:0: [sdfd] Write Protect is off [Mon Dec 9 06:17:41 2019][ 31.606040] sd 1:0:164:0: [sdff] Write Protect is off [Mon Dec 9 06:17:41 2019][ 31.606519] sd 1:0:164:0: [sdff] Write cache: enabled, read cache: enabled, supports DPO and FUA [Mon Dec 9 06:17:41 2019][ 31.608864] sd 1:0:155:0: [sdew] Write cache: enabled, read cache: enabled, supports DPO and FUA [Mon Dec 9 06:17:41 2019][ 31.609682] sd 1:0:156:0: [sdex] Write cache: enabled, read cache: enabled, supports DPO and FUA [Mon Dec 9 06:17:41 2019][ 31.611626] sd 1:0:163:0: [sdfe] Write Protect is off [Mon Dec 9 06:17:41 2019][ 31.611907] sd 1:0:166:0: [sdfh] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Mon Dec 9 06:17:41 2019][ 31.611908] sd 1:0:166:0: [sdfh] 4096-byte physical blocks [Mon Dec 9 06:17:41 2019][ 31.612458] sd 1:0:157:0: [sdey] Write cache: enabled, read cache: enabled, supports DPO and FUA [Mon Dec 9 06:17:41 2019][ 31.612622] sd 1:0:166:0: [sdfh] Write Protect is off [Mon Dec 9 06:17:41 2019][ 31.613089] sd 1:0:166:0: [sdfh] Write cache: enabled, read cache: enabled, supports DPO and FUA [Mon Dec 9 06:17:41 2019][ 31.613531] sd 1:0:101:0: [sdcv] Attached SCSI disk [Mon Dec 9 06:17:41 2019][ 31.614391] sd 1:0:40:0: [sdan] Attached SCSI disk [Mon Dec 9 06:17:41 2019][ 31.614450] sd 1:0:167:0: [sdfi] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Mon Dec 9 06:17:41 2019][ 31.614451] sd 1:0:167:0: [sdfi] 4096-byte physical blocks [Mon Dec 9 06:17:41 2019][ 31.614575] sd 1:0:163:0: [sdfe] Write cache: 
enabled, read cache: enabled, supports DPO and FUA [Mon Dec 9 06:17:41 2019][ 31.614953] sd 1:0:159:0: [sdfa] Attached SCSI disk [Mon Dec 9 06:17:41 2019][ 31.616728] sd 1:0:165:0: [sdfg] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Mon Dec 9 06:17:41 2019][ 31.616730] sd 1:0:165:0: [sdfg] 4096-byte physical blocks [Mon Dec 9 06:17:41 2019][ 31.617641] sd 1:0:169:0: [sdfk] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Mon Dec 9 06:17:41 2019][ 31.617643] sd 1:0:169:0: [sdfk] 4096-byte physical blocks [Mon Dec 9 06:17:41 2019][ 31.617721] sd 1:0:162:0: [sdfd] Write cache: enabled, read cache: enabled, supports DPO and FUA [Mon Dec 9 06:17:41 2019][ 31.620471] sd 1:0:173:0: [sdfo] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Mon Dec 9 06:17:41 2019][ 31.620472] sd 1:0:173:0: [sdfo] 4096-byte physical blocks [Mon Dec 9 06:17:41 2019][ 31.620986] sd 1:0:175:0: [sdfq] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Mon Dec 9 06:17:41 2019][ 31.620988] sd 1:0:175:0: [sdfq] 4096-byte physical blocks [Mon Dec 9 06:17:41 2019][ 31.621717] sd 1:0:175:0: [sdfq] Write Protect is off [Mon Dec 9 06:17:41 2019][ 31.622060] sd 1:0:165:0: [sdfg] Write Protect is off [Mon Dec 9 06:17:41 2019][ 31.622178] sd 1:0:175:0: [sdfq] Write cache: enabled, read cache: enabled, supports DPO and FUA [Mon Dec 9 06:17:41 2019][ 31.622535] sd 1:0:165:0: [sdfg] Write cache: enabled, read cache: enabled, supports DPO and FUA [Mon Dec 9 06:17:41 2019][ 31.622589] sd 1:0:160:0: [sdfb] Attached SCSI disk [Mon Dec 9 06:17:41 2019][ 31.623098] sd 1:0:177:0: [sdfs] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Mon Dec 9 06:17:41 2019][ 31.623099] sd 1:0:177:0: [sdfs] 4096-byte physical blocks [Mon Dec 9 06:17:41 2019][ 31.624165] sd 1:0:170:0: [sdfl] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Mon Dec 9 06:17:41 2019][ 31.624167] sd 1:0:170:0: [sdfl] 4096-byte physical blocks [Mon Dec 9 06:17:41 2019][ 31.624195] sd 1:0:178:0: [sdft] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Mon Dec 9 06:17:41 2019][ 31.624196] sd 1:0:178:0: [sdft] 4096-byte physical blocks [Mon Dec 9 06:17:41 2019][ 31.624240] sd 1:0:167:0: [sdfi] Write Protect is off [Mon Dec 9 06:17:41 2019][ 31.624636] sd 1:0:174:0: [sdfp] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Mon Dec 9 06:17:41 2019][ 31.624637] sd 1:0:174:0: [sdfp] 4096-byte physical blocks [Mon Dec 9 06:17:41 2019][ 31.624704] sd 1:0:167:0: [sdfi] Write cache: enabled, read cache: enabled, supports DPO and FUA [Mon Dec 9 06:17:41 2019][ 31.625390] sd 1:0:179:0: [sdfu] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Mon Dec 9 06:17:41 2019][ 31.625392] sd 1:0:179:0: [sdfu] 4096-byte physical blocks [Mon Dec 9 06:17:41 2019][ 31.626642] sd 1:0:180:0: [sdfv] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Mon Dec 9 06:17:41 2019][ 31.626644] sd 1:0:180:0: [sdfv] 4096-byte physical blocks [Mon Dec 9 06:17:41 2019][ 31.626743] sd 1:0:169:0: [sdfk] Write Protect is off [Mon Dec 9 06:17:41 2019][ 31.627373] sd 1:0:180:0: [sdfv] Write Protect is off [Mon Dec 9 06:17:41 2019][ 31.628271] sd 1:0:180:0: [sdfv] Write cache: enabled, read cache: enabled, supports DPO and FUA [Mon Dec 9 06:17:41 2019][ 31.628669] sd 1:0:176:0: [sdfr] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Mon Dec 9 06:17:41 2019][ 31.628671] sd 1:0:176:0: [sdfr] 4096-byte physical blocks [Mon Dec 9 06:17:41 2019][ 31.628763] sd 1:0:181:0: [sdfw] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Mon Dec 9 
06:17:41 2019][ 31.628764] sd 1:0:181:0: [sdfw] 4096-byte physical blocks [Mon Dec 9 06:17:41 2019][ 31.629083] sd 1:0:172:0: [sdfn] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Mon Dec 9 06:17:41 2019][ 31.629085] sd 1:0:172:0: [sdfn] 4096-byte physical blocks [Mon Dec 9 06:17:41 2019][ 31.630233] sd 1:0:15:0: [sdo] Attached SCSI disk [Mon Dec 9 06:17:41 2019][ 31.631173] sd 1:0:173:0: [sdfo] Write Protect is off [Mon Dec 9 06:17:41 2019][ 31.631787] sd 1:0:168:0: [sdfj] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Mon Dec 9 06:17:41 2019][ 31.631789] sd 1:0:168:0: [sdfj] 4096-byte physical blocks [Mon Dec 9 06:17:41 2019][ 31.631827] sd 1:0:171:0: [sdfm] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Mon Dec 9 06:17:41 2019][ 31.631828] sd 1:0:171:0: [sdfm] 4096-byte physical blocks [Mon Dec 9 06:17:41 2019][ 31.632986] sd 1:0:174:0: [sdfp] Write Protect is off [Mon Dec 9 06:17:41 2019][ 31.632999] sd 1:0:183:0: [sdfy] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Mon Dec 9 06:17:41 2019][ 31.633001] sd 1:0:183:0: [sdfy] 4096-byte physical blocks [Mon Dec 9 06:17:41 2019][ 31.633464] sd 1:0:174:0: [sdfp] Write cache: enabled, read cache: enabled, supports DPO and FUA [Mon Dec 9 06:17:41 2019][ 31.633714] sd 1:0:183:0: [sdfy] Write Protect is off [Mon Dec 9 06:17:41 2019][ 31.634184] sd 1:0:183:0: [sdfy] Write cache: enabled, read cache: enabled, supports DPO and FUA [Mon Dec 9 06:17:41 2019][ 31.634476] sd 1:0:179:0: [sdfu] Write Protect is off [Mon Dec 9 06:17:41 2019][ 31.635235] sd 1:0:168:0: [sdfj] Write Protect is off [Mon Dec 9 06:17:41 2019][ 31.636051] sd 1:0:185:0: [sdfz] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Mon Dec 9 06:17:41 2019][ 31.636053] sd 1:0:185:0: [sdfz] 4096-byte physical blocks [Mon Dec 9 06:17:41 2019][ 31.636443] sd 1:0:177:0: [sdfs] Write Protect is off [Mon Dec 9 06:17:41 2019][ 31.636611] sd 1:0:181:0: [sdfw] Write Protect is off [Mon Dec 9 06:17:41 2019][ 31.636759] sd 1:0:185:0: [sdfz] Write Protect is off [Mon Dec 9 06:17:41 2019][ 31.637091] sd 1:0:177:0: [sdfs] Write cache: enabled, read cache: enabled, supports DPO and FUA [Mon Dec 9 06:17:41 2019][ 31.637194] sd 1:0:170:0: [sdfl] Write Protect is off [Mon Dec 9 06:17:41 2019][ 31.637211] sd 1:0:185:0: [sdfz] Write cache: enabled, read cache: enabled, supports DPO and FUA [Mon Dec 9 06:17:41 2019][ 31.637304] sd 1:0:186:0: [sdga] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Mon Dec 9 06:17:41 2019][ 31.637306] sd 1:0:186:0: [sdga] 4096-byte physical blocks [Mon Dec 9 06:17:41 2019][ 31.637654] sd 1:0:170:0: [sdfl] Write cache: enabled, read cache: enabled, supports DPO and FUA [Mon Dec 9 06:17:41 2019][ 31.637934] sd 1:0:178:0: [sdft] Write Protect is off [Mon Dec 9 06:17:41 2019][ 31.638317] sd 1:0:182:0: [sdfx] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Mon Dec 9 06:17:41 2019][ 31.638319] sd 1:0:182:0: [sdfx] 4096-byte physical blocks [Mon Dec 9 06:17:41 2019][ 31.638354] sd 1:0:169:0: [sdfk] Write cache: enabled, read cache: enabled, supports DPO and FUA [Mon Dec 9 06:17:41 2019][ 31.638658] sd 1:0:187:0: [sdgb] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Mon Dec 9 06:17:41 2019][ 31.638660] sd 1:0:187:0: [sdgb] 4096-byte physical blocks [Mon Dec 9 06:17:41 2019][ 31.639367] sd 1:0:187:0: [sdgb] Write Protect is off [Mon Dec 9 06:17:41 2019][ 31.639602] sd 1:0:188:0: [sdgc] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Mon Dec 9 06:17:42 2019][ 31.639603] sd 1:0:188:0: [sdgc] 4096-byte 
physical blocks [Mon Dec 9 06:17:42 2019][ 31.639839] sd 1:0:187:0: [sdgb] Write cache: enabled, read cache: enabled, supports DPO and FUA [Mon Dec 9 06:17:42 2019][ 31.640288] sd 1:0:172:0: [sdfn] Write Protect is off [Mon Dec 9 06:17:42 2019][ 31.640568] sd 1:0:189:0: [sdgd] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Mon Dec 9 06:17:42 2019][ 31.640570] sd 1:0:189:0: [sdgd] 4096-byte physical blocks [Mon Dec 9 06:17:42 2019][ 31.640762] sd 1:0:172:0: [sdfn] Write cache: enabled, read cache: enabled, supports DPO and FUA [Mon Dec 9 06:17:42 2019][ 31.641279] sd 1:0:189:0: [sdgd] Write Protect is off [Mon Dec 9 06:17:42 2019][ 31.641316] sd 1:0:190:0: [sdge] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Mon Dec 9 06:17:42 2019][ 31.641318] sd 1:0:190:0: [sdge] 4096-byte physical blocks [Mon Dec 9 06:17:42 2019][ 31.641570] sd 1:0:182:0: [sdfx] Write Protect is off [Mon Dec 9 06:17:42 2019][ 31.641758] sd 1:0:189:0: [sdgd] Write cache: enabled, read cache: enabled, supports DPO and FUA [Mon Dec 9 06:17:42 2019][ 31.642041] sd 1:0:182:0: [sdfx] Write cache: enabled, read cache: enabled, supports DPO and FUA [Mon Dec 9 06:17:42 2019][ 31.642297] sd 1:0:171:0: [sdfm] Write Protect is off [Mon Dec 9 06:17:42 2019][ 31.642436] sd 1:0:164:0: [sdff] Attached SCSI disk [Mon Dec 9 06:17:42 2019][ 31.642527] sd 1:0:176:0: [sdfr] Write Protect is off [Mon Dec 9 06:17:42 2019][ 31.642775] sd 1:0:171:0: [sdfm] Write cache: enabled, read cache: enabled, supports DPO and FUA [Mon Dec 9 06:17:42 2019][ 31.642799] sd 1:0:191:0: [sdgf] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Mon Dec 9 06:17:42 2019][ 31.642800] sd 1:0:191:0: [sdgf] 4096-byte physical blocks [Mon Dec 9 06:17:42 2019][ 31.643172] sd 1:0:179:0: [sdfu] Write cache: enabled, read cache: enabled, supports DPO and FUA [Mon Dec 9 06:17:42 2019][ 31.643615] sd 1:0:166:0: [sdfh] Attached SCSI disk [Mon Dec 9 06:17:42 2019][ 31.644213] sd 1:0:173:0: [sdfo] Write cache: enabled, read cache: enabled, supports DPO and FUA [Mon Dec 9 06:17:42 2019][ 31.644364] sd 1:0:178:0: [sdft] Write cache: enabled, read cache: enabled, supports DPO and FUA [Mon Dec 9 06:17:42 2019][ 31.645335] sd 1:0:181:0: [sdfw] Write cache: enabled, read cache: enabled, supports DPO and FUA [Mon Dec 9 06:17:42 2019][ 31.648762] sd 1:0:194:0: [sdgi] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Mon Dec 9 06:17:42 2019][ 31.648764] sd 1:0:194:0: [sdgi] 4096-byte physical blocks [Mon Dec 9 06:17:42 2019][ 31.649970] sd 1:0:186:0: [sdga] Write Protect is off [Mon Dec 9 06:17:42 2019][ 31.650445] sd 1:0:186:0: [sdga] Write cache: enabled, read cache: enabled, supports DPO and FUA [Mon Dec 9 06:17:42 2019][ 31.650771] sd 1:0:191:0: [sdgf] Write Protect is off [Mon Dec 9 06:17:42 2019][ 31.651249] sd 1:0:191:0: [sdgf] Write cache: enabled, read cache: enabled, supports DPO and FUA [Mon Dec 9 06:17:42 2019][ 31.651364] sd 1:0:196:0: [sdgk] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Mon Dec 9 06:17:42 2019][ 31.651367] sd 1:0:196:0: [sdgk] 4096-byte physical blocks [Mon Dec 9 06:17:42 2019][ 31.651616] sd 1:0:176:0: [sdfr] Write cache: enabled, read cache: enabled, supports DPO and FUA [Mon Dec 9 06:17:42 2019][ 31.652092] sd 1:0:196:0: [sdgk] Write Protect is off [Mon Dec 9 06:17:42 2019][ 31.652368] sd 1:0:197:0: [sdgl] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Mon Dec 9 06:17:42 2019][ 31.652369] sd 1:0:197:0: [sdgl] 4096-byte physical blocks [Mon Dec 9 06:17:42 2019][ 31.652563] sd 1:0:196:0: [sdgk] Write 
cache: enabled, read cache: enabled, supports DPO and FUA [Mon Dec 9 06:17:42 2019][ 31.653100] sd 1:0:197:0: [sdgl] Write Protect is off [Mon Dec 9 06:17:42 2019][ 31.653568] sd 1:0:197:0: [sdgl] Write cache: enabled, read cache: enabled, supports DPO and FUA [Mon Dec 9 06:17:42 2019][ 31.654264] sd 1:0:198:0: [sdgm] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Mon Dec 9 06:17:42 2019][ 31.654265] sd 1:0:175:0: [sdfq] Attached SCSI disk [Mon Dec 9 06:17:42 2019][ 31.654266] sd 1:0:198:0: [sdgm] 4096-byte physical blocks [Mon Dec 9 06:17:42 2019][ 31.654780] sd 1:0:190:0: [sdge] Write Protect is off [Mon Dec 9 06:17:42 2019][ 31.654977] sd 1:0:198:0: [sdgm] Write Protect is off [Mon Dec 9 06:17:42 2019][ 31.655356] sd 1:0:190:0: [sdge] Write cache: enabled, read cache: enabled, supports DPO and FUA [Mon Dec 9 06:17:42 2019][ 31.655444] sd 1:0:198:0: [sdgm] Write cache: enabled, read cache: enabled, supports DPO and FUA [Mon Dec 9 06:17:42 2019][ 31.655928] sd 1:0:199:0: [sdgn] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Mon Dec 9 06:17:42 2019][ 31.655930] sd 1:0:199:0: [sdgn] 4096-byte physical blocks [Mon Dec 9 06:17:42 2019][ 31.656656] sd 1:0:199:0: [sdgn] Write Protect is off [Mon Dec 9 06:17:42 2019][ 31.657135] sd 1:0:199:0: [sdgn] Write cache: enabled, read cache: enabled, supports DPO and FUA [Mon Dec 9 06:17:42 2019][ 31.657620] sd 1:0:194:0: [sdgi] Write Protect is off [Mon Dec 9 06:17:42 2019][ 31.658685] sd 1:0:188:0: [sdgc] Write Protect is off [Mon Dec 9 06:17:42 2019][ 31.659506] sd 1:0:194:0: [sdgi] Write cache: enabled, read cache: enabled, supports DPO and FUA [Mon Dec 9 06:17:42 2019][ 31.659904] sd 1:0:201:0: [sdgp] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Mon Dec 9 06:17:42 2019][ 31.659906] sd 1:0:201:0: [sdgp] 4096-byte physical blocks [Mon Dec 9 06:17:42 2019][ 31.661091] sd 1:0:180:0: [sdfv] Attached SCSI disk [Mon Dec 9 06:17:42 2019][ 31.661873] sd 1:0:202:0: [sdgq] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Mon Dec 9 06:17:42 2019][ 31.661875] sd 1:0:202:0: [sdgq] 4096-byte physical blocks [Mon Dec 9 06:17:42 2019][ 31.662599] sd 1:0:202:0: [sdgq] Write Protect is off [Mon Dec 9 06:17:42 2019][ 31.663063] sd 1:0:202:0: [sdgq] Write cache: enabled, read cache: enabled, supports DPO and FUA [Mon Dec 9 06:17:42 2019][ 31.663449] sd 1:0:182:0: [sdfx] Attached SCSI disk [Mon Dec 9 06:17:42 2019][ 31.663610] sd 1:0:203:0: [sdgr] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Mon Dec 9 06:17:42 2019][ 31.663612] sd 1:0:203:0: [sdgr] 4096-byte physical blocks [Mon Dec 9 06:17:42 2019][ 31.664319] sd 1:0:203:0: [sdgr] Write Protect is off [Mon Dec 9 06:17:42 2019][ 31.664390] sd 1:0:191:0: [sdgf] Attached SCSI disk [Mon Dec 9 06:17:42 2019][ 31.664788] sd 1:0:203:0: [sdgr] Write cache: enabled, read cache: enabled, supports DPO and FUA [Mon Dec 9 06:17:42 2019][ 31.665250] sd 1:0:196:0: [sdgk] Attached SCSI disk [Mon Dec 9 06:17:42 2019][ 31.666033] sd 1:0:172:0: [sdfn] Attached SCSI disk [Mon Dec 9 06:17:42 2019][ 31.666333] sd 1:0:197:0: [sdgl] Attached SCSI disk [Mon Dec 9 06:17:42 2019][ 31.667641] sd 1:0:200:0: [sdgo] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Mon Dec 9 06:17:42 2019][ 31.667643] sd 1:0:200:0: [sdgo] 4096-byte physical blocks [Mon Dec 9 06:17:42 2019][ 31.668320] sd 1:0:179:0: [sdfu] Attached SCSI disk [Mon Dec 9 06:17:42 2019][ 31.668361] sd 1:0:200:0: [sdgo] Write Protect is off [Mon Dec 9 06:17:42 2019][ 31.668591] sd 1:0:157:0: [sdey] Attached SCSI disk [Mon Dec 9 
06:17:42 2019][ 31.668828] sd 1:0:200:0: [sdgo] Write cache: enabled, read cache: enabled, supports DPO and FUA [Mon Dec 9 06:17:42 2019][ 31.669704] sd 1:0:206:0: [sdgu] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Mon Dec 9 06:17:42 2019][ 31.669706] sd 1:0:206:0: [sdgu] 4096-byte physical blocks [Mon Dec 9 06:17:42 2019][ 31.670125] sd 1:0:170:0: [sdfl] Attached SCSI disk [Mon Dec 9 06:17:42 2019][ 31.670471] sd 1:0:168:0: [sdfj] Write cache: enabled, read cache: enabled, supports DPO and FUA [Mon Dec 9 06:17:42 2019][ 31.671632] sd 1:0:201:0: [sdgp] Write Protect is off [Mon Dec 9 06:17:42 2019][ 31.672092] sd 1:0:201:0: [sdgp] Write cache: enabled, read cache: enabled, supports DPO and FUA [Mon Dec 9 06:17:42 2019][ 31.672351] sd 1:0:188:0: [sdgc] Write cache: enabled, read cache: enabled, supports DPO and FUA [Mon Dec 9 06:17:42 2019][ 31.674242] sd 1:0:165:0: [sdfg] Attached SCSI disk [Mon Dec 9 06:17:42 2019][ 31.674254] sd 1:0:163:0: [sdfe] Attached SCSI disk [Mon Dec 9 06:17:42 2019][ 31.676884] sd 1:0:161:0: [sdfc] Attached SCSI disk [Mon Dec 9 06:17:42 2019][ 31.676922] sd 1:0:208:0: [sdgw] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Mon Dec 9 06:17:42 2019][ 31.676924] sd 1:0:208:0: [sdgw] 4096-byte physical blocks [Mon Dec 9 06:17:42 2019][ 31.677021] sd 1:0:186:0: [sdga] Attached SCSI disk [Mon Dec 9 06:17:42 2019][ 31.677621] sd 1:0:208:0: [sdgw] Write Protect is off [Mon Dec 9 06:17:42 2019][ 31.677744] sd 1:0:202:0: [sdgq] Attached SCSI disk [Mon Dec 9 06:17:42 2019][ 31.678065] sd 1:0:208:0: [sdgw] Write cache: enabled, read cache: enabled, supports DPO and FUA [Mon Dec 9 06:17:42 2019][ 31.678292] sd 1:0:185:0: [sdfz] Attached SCSI disk [Mon Dec 9 06:17:42 2019][ 31.678640] sd 1:0:183:0: [sdfy] Attached SCSI disk [Mon Dec 9 06:17:42 2019][ 31.678758] sd 1:0:203:0: [sdgr] Attached SCSI disk [Mon Dec 9 06:17:42 2019][ 31.679754] sd 1:0:209:0: [sdgx] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Mon Dec 9 06:17:42 2019][ 31.679756] sd 1:0:209:0: [sdgx] 4096-byte physical blocks [Mon Dec 9 06:17:42 2019][ 31.679913] sd 1:0:173:0: [sdfo] Attached SCSI disk [Mon Dec 9 06:17:42 2019][ 31.680445] sd 1:0:189:0: [sdgd] Attached SCSI disk [Mon Dec 9 06:17:42 2019][ 31.680464] sd 1:0:209:0: [sdgx] Write Protect is off [Mon Dec 9 06:17:42 2019][ 31.681840] sd 1:0:181:0: [sdfw] Attached SCSI disk [Mon Dec 9 06:17:42 2019][ 31.682232] sd 1:0:200:0: [sdgo] Attached SCSI disk [Mon Dec 9 06:17:42 2019][ 31.682533] sd 1:0:210:0: [sdgy] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Mon Dec 9 06:17:42 2019][ 31.682534] sd 1:0:210:0: [sdgy] 4096-byte physical blocks [Mon Dec 9 06:17:42 2019][ 31.682697] sd 1:0:207:0: [sdgv] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Mon Dec 9 06:17:42 2019][ 31.682698] sd 1:0:207:0: [sdgv] 4096-byte physical blocks [Mon Dec 9 06:17:42 2019][ 31.683278] sd 1:0:210:0: [sdgy] Write Protect is off [Mon Dec 9 06:17:42 2019][ 31.683389] sd 1:0:169:0: [sdfk] Attached SCSI disk [Mon Dec 9 06:17:42 2019][ 31.683758] sd 1:0:210:0: [sdgy] Write cache: enabled, read cache: enabled, supports DPO and FUA [Mon Dec 9 06:17:42 2019][ 31.684023] sd 1:0:174:0: [sdfp] Attached SCSI disk [Mon Dec 9 06:17:42 2019][ 31.684805] sd 1:0:211:0: [sdgz] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Mon Dec 9 06:17:42 2019][ 31.684808] sd 1:0:211:0: [sdgz] 4096-byte physical blocks [Mon Dec 9 06:17:42 2019][ 31.685019] sd 1:0:171:0: [sdfm] Attached SCSI disk [Mon Dec 9 06:17:42 2019][ 31.685414] sd 1:0:190:0: [sdge] 
Attached SCSI disk [Mon Dec 9 06:17:42 2019][ 31.685519] sd 1:0:211:0: [sdgz] Write Protect is off [Mon Dec 9 06:17:42 2019][ 31.685972] sd 1:0:211:0: [sdgz] Write cache: enabled, read cache: enabled, supports DPO and FUA [Mon Dec 9 06:17:42 2019][ 31.687622] sd 1:0:212:0: [sdha] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Mon Dec 9 06:17:42 2019][ 31.687624] sd 1:0:212:0: [sdha] 4096-byte physical blocks [Mon Dec 9 06:17:42 2019][ 31.687625] sd 1:0:193:0: [sdgh] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Mon Dec 9 06:17:42 2019][ 31.687627] sd 1:0:193:0: [sdgh] 4096-byte physical blocks [Mon Dec 9 06:17:42 2019][ 31.687666] sd 1:0:195:0: [sdgj] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Mon Dec 9 06:17:42 2019][ 31.687668] sd 1:0:195:0: [sdgj] 4096-byte physical blocks [Mon Dec 9 06:17:42 2019][ 31.688337] sd 1:0:212:0: [sdha] Write Protect is off [Mon Dec 9 06:17:42 2019][ 31.688392] sd 1:0:195:0: [sdgj] Write Protect is off [Mon Dec 9 06:17:42 2019][ 31.688809] sd 1:0:212:0: [sdha] Write cache: enabled, read cache: enabled, supports DPO and FUA [Mon Dec 9 06:17:42 2019][ 31.688872] sd 1:0:195:0: [sdgj] Write cache: enabled, read cache: enabled, supports DPO and FUA [Mon Dec 9 06:17:42 2019][ 31.688892] sd 1:0:206:0: [sdgu] Write Protect is off [Mon Dec 9 06:17:42 2019][ 31.688962] sd 1:0:207:0: [sdgv] Write Protect is off [Mon Dec 9 06:17:42 2019][ 31.689347] sd 1:0:176:0: [sdfr] Attached SCSI disk [Mon Dec 9 06:17:42 2019][ 31.689715] sd 1:0:213:0: [sdhb] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Mon Dec 9 06:17:42 2019][ 31.689716] sd 1:0:213:0: [sdhb] 4096-byte physical blocks [Mon Dec 9 06:17:42 2019][ 31.690447] sd 1:0:213:0: [sdhb] Write Protect is off [Mon Dec 9 06:17:42 2019][ 31.690509] sd 1:0:192:0: [sdgg] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Mon Dec 9 06:17:42 2019][ 31.690511] sd 1:0:192:0: [sdgg] 4096-byte physical blocks [Mon Dec 9 06:17:42 2019][ 31.690920] sd 1:0:213:0: [sdhb] Write cache: enabled, read cache: enabled, supports DPO and FUA [Mon Dec 9 06:17:42 2019][ 31.691189] sd 1:0:162:0: [sdfd] Attached SCSI disk [Mon Dec 9 06:17:42 2019][ 31.691722] sd 1:0:208:0: [sdgw] Attached SCSI disk [Mon Dec 9 06:17:42 2019][ 31.691898] sd 1:0:214:0: [sdhc] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Mon Dec 9 06:17:42 2019][ 31.691899] sd 1:0:214:0: [sdhc] 4096-byte physical blocks [Mon Dec 9 06:17:42 2019][ 31.692591] sd 1:0:214:0: [sdhc] Write Protect is off [Mon Dec 9 06:17:42 2019][ 31.693060] sd 1:0:214:0: [sdhc] Write cache: enabled, read cache: enabled, supports DPO and FUA [Mon Dec 9 06:17:42 2019][ 31.693666] sd 1:0:206:0: [sdgu] Write cache: enabled, read cache: enabled, supports DPO and FUA [Mon Dec 9 06:17:42 2019][ 31.695225] sd 1:0:215:0: [sdhd] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Mon Dec 9 06:17:42 2019][ 31.695227] sd 1:0:215:0: [sdhd] 4096-byte physical blocks [Mon Dec 9 06:17:42 2019][ 31.695917] sd 1:0:215:0: [sdhd] Write Protect is off [Mon Dec 9 06:17:42 2019][ 31.696213] sd 1:0:158:0: [sdez] Attached SCSI disk [Mon Dec 9 06:17:42 2019][ 31.696360] sd 1:0:215:0: [sdhd] Write cache: enabled, read cache: enabled, supports DPO and FUA [Mon Dec 9 06:17:42 2019][ 31.696600] sd 1:0:198:0: [sdgm] Attached SCSI disk [Mon Dec 9 06:17:42 2019][ 31.702201] sd 1:0:188:0: [sdgc] Attached SCSI disk [Mon Dec 9 06:17:42 2019][ 31.702303] sd 1:0:207:0: [sdgv] Write cache: enabled, read cache: enabled, supports DPO and FUA [Mon Dec 9 06:17:42 2019][ 31.702375] sd 
1:0:209:0: [sdgx] Write cache: enabled, read cache: enabled, supports DPO and FUA [Mon Dec 9 06:17:42 2019][ 31.703051] sd 1:0:187:0: [sdgb] Attached SCSI disk [Mon Dec 9 06:17:42 2019][ 31.706269] sd 1:0:201:0: [sdgp] Attached SCSI disk [Mon Dec 9 06:17:42 2019][ 31.707388] sd 1:0:218:0: [sdhg] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Mon Dec 9 06:17:42 2019][ 31.707390] sd 1:0:218:0: [sdhg] 4096-byte physical blocks [Mon Dec 9 06:17:42 2019][ 31.708096] sd 1:0:218:0: [sdhg] Write Protect is off [Mon Dec 9 06:17:42 2019][ 31.708199] sd 1:0:205:0: [sdgt] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Mon Dec 9 06:17:43 2019][ 31.708201] sd 1:0:205:0: [sdgt] 4096-byte physical blocks [Mon Dec 9 06:17:43 2019][ 31.708286] sd 1:0:195:0: [sdgj] Attached SCSI disk [Mon Dec 9 06:17:43 2019][ 31.708567] sd 1:0:218:0: [sdhg] Write cache: enabled, read cache: enabled, supports DPO and FUA [Mon Dec 9 06:17:43 2019][ 31.708715] sd 1:0:214:0: [sdhc] Attached SCSI disk [Mon Dec 9 06:17:43 2019][ 31.710282] sd 1:0:215:0: [sdhd] Attached SCSI disk [Mon Dec 9 06:17:43 2019][ 31.711578] sd 1:0:194:0: [sdgi] Attached SCSI disk [Mon Dec 9 06:17:43 2019][ 31.711657] sd 1:0:219:0: [sdhh] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Mon Dec 9 06:17:43 2019][ 31.711658] sd 1:0:219:0: [sdhh] 4096-byte physical blocks [Mon Dec 9 06:17:43 2019][ 31.712364] sd 1:0:219:0: [sdhh] Write Protect is off [Mon Dec 9 06:17:43 2019][ 31.712828] sd 1:0:219:0: [sdhh] Write cache: enabled, read cache: enabled, supports DPO and FUA [Mon Dec 9 06:17:43 2019][ 31.714037] sd 1:0:220:0: [sdhi] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Mon Dec 9 06:17:43 2019][ 31.714038] sd 1:0:220:0: [sdhi] 4096-byte physical blocks [Mon Dec 9 06:17:43 2019][ 31.714755] sd 1:0:220:0: [sdhi] Write Protect is off [Mon Dec 9 06:17:43 2019][ 31.715199] sd 1:0:220:0: [sdhi] Write cache: enabled, read cache: enabled, supports DPO and FUA [Mon Dec 9 06:17:43 2019][ 31.716309] sd 1:0:209:0: [sdgx] Attached SCSI disk [Mon Dec 9 06:17:43 2019][ 31.716949] sd 1:0:221:0: [sdhj] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Mon Dec 9 06:17:43 2019][ 31.716951] sd 1:0:221:0: [sdhj] 4096-byte physical blocks [Mon Dec 9 06:17:43 2019][ 31.717668] sd 1:0:221:0: [sdhj] Write Protect is off [Mon Dec 9 06:17:43 2019][ 31.718141] sd 1:0:221:0: [sdhj] Write cache: enabled, read cache: enabled, supports DPO and FUA [Mon Dec 9 06:17:43 2019][ 31.719137] sd 1:0:204:0: [sdgs] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Mon Dec 9 06:17:43 2019][ 31.719139] sd 1:0:204:0: [sdgs] 4096-byte physical blocks [Mon Dec 9 06:17:43 2019][ 31.720644] sd 1:0:222:0: [sdhk] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Mon Dec 9 06:17:43 2019][ 31.720646] sd 1:0:222:0: [sdhk] 4096-byte physical blocks [Mon Dec 9 06:17:43 2019][ 31.721349] sd 1:0:222:0: [sdhk] Write Protect is off [Mon Dec 9 06:17:43 2019][ 31.721821] sd 1:0:222:0: [sdhk] Write cache: enabled, read cache: enabled, supports DPO and FUA [Mon Dec 9 06:17:43 2019][ 31.721928] sd 1:0:211:0: [sdgz] Attached SCSI disk [Mon Dec 9 06:17:43 2019][ 31.723019] sd 1:0:193:0: [sdgh] Write Protect is off [Mon Dec 9 06:17:43 2019][ 31.723198] sd 1:0:192:0: [sdgg] Write Protect is off [Mon Dec 9 06:17:43 2019][ 31.723441] sd 1:0:207:0: [sdgv] Attached SCSI disk [Mon Dec 9 06:17:43 2019][ 31.725122] sd 1:0:223:0: [sdhl] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Mon Dec 9 06:17:43 2019][ 31.725124] sd 1:0:223:0: [sdhl] 4096-byte physical 
blocks [Mon Dec 9 06:17:43 2019][ 31.725320] sd 1:0:199:0: [sdgn] Attached SCSI disk [Mon Dec 9 06:17:43 2019][ 31.725816] sd 1:0:205:0: [sdgt] Write Protect is off [Mon Dec 9 06:17:43 2019][ 31.725862] sd 1:0:223:0: [sdhl] Write Protect is off [Mon Dec 9 06:17:43 2019][ 31.726354] sd 1:0:223:0: [sdhl] Write cache: enabled, read cache: enabled, supports DPO and FUA [Mon Dec 9 06:17:43 2019][ 31.728607] sd 1:0:224:0: [sdhm] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Mon Dec 9 06:17:43 2019][ 31.728610] sd 1:0:224:0: [sdhm] 4096-byte physical blocks [Mon Dec 9 06:17:43 2019][ 31.729316] sd 1:0:224:0: [sdhm] Write Protect is off [Mon Dec 9 06:17:43 2019][ 31.729785] sd 1:0:224:0: [sdhm] Write cache: enabled, read cache: enabled, supports DPO and FUA [Mon Dec 9 06:17:43 2019][ 31.730816] sd 1:0:220:0: [sdhi] Attached SCSI disk [Mon Dec 9 06:17:43 2019][ 31.733715] sd 1:0:225:0: [sdhn] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Mon Dec 9 06:17:43 2019][ 31.733716] sd 1:0:225:0: [sdhn] 4096-byte physical blocks [Mon Dec 9 06:17:43 2019][ 31.734096] sd 1:0:205:0: [sdgt] Write cache: enabled, read cache: enabled, supports DPO and FUA [Mon Dec 9 06:17:43 2019][ 31.734458] sd 1:0:225:0: [sdhn] Write Protect is off [Mon Dec 9 06:17:43 2019][ 31.734923] sd 1:0:225:0: [sdhn] Write cache: enabled, read cache: enabled, supports DPO and FUA [Mon Dec 9 06:17:43 2019][ 31.735305] sd 1:0:210:0: [sdgy] Attached SCSI disk [Mon Dec 9 06:17:43 2019][ 31.735749] sd 1:0:206:0: [sdgu] Attached SCSI disk [Mon Dec 9 06:17:43 2019][ 31.736066] sd 1:0:226:0: [sdho] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Mon Dec 9 06:17:43 2019][ 31.736069] sd 1:0:226:0: [sdho] 4096-byte physical blocks [Mon Dec 9 06:17:43 2019][ 31.736826] sd 1:0:226:0: [sdho] Write Protect is off [Mon Dec 9 06:17:43 2019][ 31.737288] sd 1:0:226:0: [sdho] Write cache: enabled, read cache: enabled, supports DPO and FUA [Mon Dec 9 06:17:43 2019][ 31.737746] sd 1:0:222:0: [sdhk] Attached SCSI disk [Mon Dec 9 06:17:43 2019][ 31.739470] sd 1:0:218:0: [sdhg] Attached SCSI disk [Mon Dec 9 06:17:43 2019][ 31.739810] sd 1:0:227:0: [sdhp] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Mon Dec 9 06:17:43 2019][ 31.739811] sd 1:0:227:0: [sdhp] 4096-byte physical blocks [Mon Dec 9 06:17:43 2019][ 31.739909] sd 1:0:204:0: [sdgs] Write Protect is off [Mon Dec 9 06:17:43 2019][ 31.740366] sd 1:0:223:0: [sdhl] Attached SCSI disk [Mon Dec 9 06:17:43 2019][ 31.740532] sd 1:0:227:0: [sdhp] Write Protect is off [Mon Dec 9 06:17:43 2019][ 31.740978] sd 1:0:227:0: [sdhp] Write cache: enabled, read cache: enabled, supports DPO and FUA [Mon Dec 9 06:17:43 2019][ 31.742090] sd 1:0:212:0: [sdha] Attached SCSI disk [Mon Dec 9 06:17:43 2019][ 31.742092] sd 1:0:221:0: [sdhj] Attached SCSI disk [Mon Dec 9 06:17:43 2019][ 31.743788] sd 1:0:204:0: [sdgs] Write cache: enabled, read cache: enabled, supports DPO and FUA [Mon Dec 9 06:17:43 2019][ 31.743932] sd 1:0:224:0: [sdhm] Attached SCSI disk [Mon Dec 9 06:17:43 2019][ 31.749052] sd 1:0:225:0: [sdhn] Attached SCSI disk [Mon Dec 9 06:17:43 2019][ 31.749252] sd 1:0:213:0: [sdhb] Attached SCSI disk [Mon Dec 9 06:17:43 2019][ 31.750510] sd 1:0:192:0: [sdgg] Write cache: enabled, read cache: enabled, supports DPO and FUA [Mon Dec 9 06:17:43 2019][ 31.750683] sd 1:0:143:0: [sdek] Attached SCSI disk [Mon Dec 9 06:17:43 2019][ 31.751003] sd 1:0:216:0: [sdhe] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Mon Dec 9 06:17:43 2019][ 31.751005] sd 1:0:216:0: [sdhe] 4096-byte 
physical blocks [Mon Dec 9 06:17:43 2019][ 31.751826] sd 1:0:226:0: [sdho] Attached SCSI disk [Mon Dec 9 06:17:43 2019][ 31.754532] sd 1:0:219:0: [sdhh] Attached SCSI disk [Mon Dec 9 06:17:43 2019][ 31.754892] sd 1:0:227:0: [sdhp] Attached SCSI disk [Mon Dec 9 06:17:43 2019][ 31.755256] sd 1:0:76:0: [sdbw] Attached SCSI disk [Mon Dec 9 06:17:43 2019][ 31.756126] sd 1:0:231:0: [sdht] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Mon Dec 9 06:17:43 2019][ 31.756127] sd 1:0:231:0: [sdht] 4096-byte physical blocks [Mon Dec 9 06:17:43 2019][ 31.756637] sd 1:0:193:0: [sdgh] Write cache: enabled, read cache: enabled, supports DPO and FUA [Mon Dec 9 06:17:43 2019][ 31.756837] sd 1:0:231:0: [sdht] Write Protect is off [Mon Dec 9 06:17:43 2019][ 31.757308] sd 1:0:231:0: [sdht] Write cache: enabled, read cache: enabled, supports DPO and FUA [Mon Dec 9 06:17:43 2019][ 31.760958] sd 1:0:232:0: [sdhu] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Mon Dec 9 06:17:43 2019][ 31.760960] sd 1:0:232:0: [sdhu] 4096-byte physical blocks [Mon Dec 9 06:17:43 2019][ 31.761687] sd 1:0:232:0: [sdhu] Write Protect is off [Mon Dec 9 06:17:43 2019][ 31.762173] sd 1:0:232:0: [sdhu] Write cache: enabled, read cache: enabled, supports DPO and FUA [Mon Dec 9 06:17:43 2019][ 31.763937] sd 1:0:233:0: [sdhv] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Mon Dec 9 06:17:43 2019][ 31.763939] sd 1:0:233:0: [sdhv] 4096-byte physical blocks [Mon Dec 9 06:17:43 2019][ 31.764160] sd 1:0:230:0: [sdhs] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Mon Dec 9 06:17:43 2019][ 31.764162] sd 1:0:230:0: [sdhs] 4096-byte physical blocks [Mon Dec 9 06:17:43 2019][ 31.764656] sd 1:0:233:0: [sdhv] Write Protect is off [Mon Dec 9 06:17:43 2019][ 31.764764] sd 1:0:204:0: [sdgs] Attached SCSI disk [Mon Dec 9 06:17:43 2019][ 31.764894] sd 1:0:230:0: [sdhs] Write Protect is off [Mon Dec 9 06:17:43 2019][ 31.765111] sd 1:0:233:0: [sdhv] Write cache: enabled, read cache: enabled, supports DPO and FUA [Mon Dec 9 06:17:43 2019][ 31.765421] sd 1:0:230:0: [sdhs] Write cache: enabled, read cache: enabled, supports DPO and FUA [Mon Dec 9 06:17:43 2019][ 31.767089] sd 1:0:234:0: [sdhw] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Mon Dec 9 06:17:43 2019][ 31.767091] sd 1:0:234:0: [sdhw] 4096-byte physical blocks [Mon Dec 9 06:17:43 2019][ 31.767359] sd 1:0:228:0: [sdhq] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Mon Dec 9 06:17:43 2019][ 31.767361] sd 1:0:228:0: [sdhq] 4096-byte physical blocks [Mon Dec 9 06:17:43 2019][ 31.767812] sd 1:0:234:0: [sdhw] Write Protect is off [Mon Dec 9 06:17:43 2019][ 31.768847] sd 1:0:234:0: [sdhw] Write cache: enabled, read cache: enabled, supports DPO and FUA [Mon Dec 9 06:17:43 2019][ 31.768909] sd 1:0:217:0: [sdhf] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Mon Dec 9 06:17:43 2019][ 31.768914] sd 1:0:217:0: [sdhf] 4096-byte physical blocks [Mon Dec 9 06:17:43 2019][ 31.769023] sd 1:0:228:0: [sdhq] Write Protect is off [Mon Dec 9 06:17:43 2019][ 31.770648] sd 1:0:193:0: [sdgh] Attached SCSI disk [Mon Dec 9 06:17:43 2019][ 31.770690] sd 1:0:231:0: [sdht] Attached SCSI disk [Mon Dec 9 06:17:43 2019][ 31.774144] sd 1:0:236:0: [sdhy] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Mon Dec 9 06:17:43 2019][ 31.774146] sd 1:0:236:0: [sdhy] 4096-byte physical blocks [Mon Dec 9 06:17:43 2019][ 31.776016] sd 1:0:237:0: [sdhz] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Mon Dec 9 06:17:43 2019][ 31.776018] sd 1:0:237:0: [sdhz] 
4096-byte physical blocks [Mon Dec 9 06:17:43 2019][ 31.776762] sd 1:0:237:0: [sdhz] Write Protect is off [Mon Dec 9 06:17:43 2019][ 31.777227] sd 1:0:237:0: [sdhz] Write cache: enabled, read cache: enabled, supports DPO and FUA [Mon Dec 9 06:17:43 2019][ 31.777838] sd 1:0:144:0: [sdel] Attached SCSI disk [Mon Dec 9 06:17:43 2019][ 31.777963] sd 1:0:228:0: [sdhq] Write cache: enabled, read cache: enabled, supports DPO and FUA [Mon Dec 9 06:17:43 2019][ 31.778636] sd 1:0:230:0: [sdhs] Attached SCSI disk [Mon Dec 9 06:17:43 2019][ 31.782302] sd 1:0:229:0: [sdhr] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Mon Dec 9 06:17:43 2019][ 31.782304] sd 1:0:229:0: [sdhr] 4096-byte physical blocks [Mon Dec 9 06:17:43 2019][ 31.783920] sd 1:0:240:0: [sdic] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Mon Dec 9 06:17:43 2019][ 31.783922] sd 1:0:240:0: [sdic] 4096-byte physical blocks [Mon Dec 9 06:17:43 2019][ 31.784449] sd 1:0:236:0: [sdhy] Write Protect is off [Mon Dec 9 06:17:43 2019][ 31.784695] sd 1:0:240:0: [sdic] Write Protect is off [Mon Dec 9 06:17:43 2019][ 31.784699] sd 1:0:216:0: [sdhe] Write Protect is off [Mon Dec 9 06:17:43 2019][ 31.784908] sd 1:0:236:0: [sdhy] Write cache: enabled, read cache: enabled, supports DPO and FUA [Mon Dec 9 06:17:43 2019][ 31.785185] sd 1:0:240:0: [sdic] Write cache: enabled, read cache: enabled, supports DPO and FUA [Mon Dec 9 06:17:43 2019][ 31.786479] sd 1:0:241:0: [sdid] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Mon Dec 9 06:17:43 2019][ 31.786481] sd 1:0:241:0: [sdid] 4096-byte physical blocks [Mon Dec 9 06:17:43 2019][ 31.787211] sd 1:0:241:0: [sdid] Write Protect is off [Mon Dec 9 06:17:43 2019][ 31.787691] sd 1:0:241:0: [sdid] Write cache: enabled, read cache: enabled, supports DPO and FUA [Mon Dec 9 06:17:43 2019][ 31.788313] sd 1:0:235:0: [sdhx] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Mon Dec 9 06:17:43 2019][ 31.788315] sd 1:0:235:0: [sdhx] 4096-byte physical blocks [Mon Dec 9 06:17:43 2019][ 31.788373] sd 1:0:242:0: [sdie] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Mon Dec 9 06:17:43 2019][ 31.788375] sd 1:0:242:0: [sdie] 4096-byte physical blocks [Mon Dec 9 06:17:43 2019][ 31.789143] sd 1:0:242:0: [sdie] Write Protect is off [Mon Dec 9 06:17:43 2019][ 31.789166] sd 1:0:217:0: [sdhf] Write Protect is off [Mon Dec 9 06:17:43 2019][ 31.789611] sd 1:0:242:0: [sdie] Write cache: enabled, read cache: enabled, supports DPO and FUA [Mon Dec 9 06:17:43 2019][ 31.789860] sd 1:0:243:0: [sdif] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Mon Dec 9 06:17:43 2019][ 31.789862] sd 1:0:243:0: [sdif] 4096-byte physical blocks [Mon Dec 9 06:17:43 2019][ 31.790582] sd 1:0:243:0: [sdif] Write Protect is off [Mon Dec 9 06:17:43 2019][ 31.790793] sd 1:0:237:0: [sdhz] Attached SCSI disk [Mon Dec 9 06:17:43 2019][ 31.791055] sd 1:0:243:0: [sdif] Write cache: enabled, read cache: enabled, supports DPO and FUA [Mon Dec 9 06:17:43 2019][ 31.800363] sd 1:0:244:0: [sdig] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Mon Dec 9 06:17:43 2019][ 31.800364] sd 1:0:244:0: [sdig] 4096-byte physical blocks [Mon Dec 9 06:17:43 2019][ 31.801019] sd 1:0:192:0: [sdgg] Attached SCSI disk [Mon Dec 9 06:17:43 2019][ 31.801076] sd 1:0:244:0: [sdig] Write Protect is off [Mon Dec 9 06:17:43 2019][ 31.801534] sd 1:0:244:0: [sdig] Write cache: enabled, read cache: enabled, supports DPO and FUA [Mon Dec 9 06:17:43 2019][ 31.802027] sd 1:0:229:0: [sdhr] Write Protect is off [Mon Dec 9 06:17:43 2019][ 
31.802915] sd 1:0:235:0: [sdhx] Write Protect is off [Mon Dec 9 06:17:43 2019][ 31.806075] sd 1:0:235:0: [sdhx] Write cache: enabled, read cache: enabled, supports DPO and FUA [Mon Dec 9 06:17:43 2019][ 31.812896] sd 1:0:217:0: [sdhf] Write cache: enabled, read cache: enabled, supports DPO and FUA [Mon Dec 9 06:17:43 2019][ 31.813506] sd 1:0:241:0: [sdid] Attached SCSI disk [Mon Dec 9 06:17:43 2019][ 31.813541] sd 1:0:244:0: [sdig] Attached SCSI disk [Mon Dec 9 06:17:43 2019][ 31.813603] sd 1:0:232:0: [sdhu] Attached SCSI disk [Mon Dec 9 06:17:43 2019][ 31.814205] sd 1:0:242:0: [sdie] Attached SCSI disk [Mon Dec 9 06:17:43 2019][ 31.814940] sd 1:0:240:0: [sdic] Attached SCSI disk [Mon Dec 9 06:17:43 2019][ 31.817604] sd 1:0:229:0: [sdhr] Write cache: enabled, read cache: enabled, supports DPO and FUA [Mon Dec 9 06:17:43 2019][ 31.817730] sd 1:0:233:0: [sdhv] Attached SCSI disk [Mon Dec 9 06:17:43 2019][ 31.819102] sd 1:0:234:0: [sdhw] Attached SCSI disk [Mon Dec 9 06:17:43 2019][ 31.820073] sd 1:0:216:0: [sdhe] Write cache: enabled, read cache: enabled, supports DPO and FUA [Mon Dec 9 06:17:43 2019][ 31.822026] sd 1:0:243:0: [sdif] Attached SCSI disk [Mon Dec 9 06:17:43 2019][ 31.827862] sd 1:0:236:0: [sdhy] Attached SCSI disk [Mon Dec 9 06:17:43 2019][ 31.831404] sd 1:0:235:0: [sdhx] Attached SCSI disk [Mon Dec 9 06:17:44 2019][ 31.869136] sd 1:0:238:0: [sdia] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Mon Dec 9 06:17:44 2019][ 31.869138] sd 1:0:238:0: [sdia] 4096-byte physical blocks [Mon Dec 9 06:17:44 2019][ 31.873268] sd 1:0:167:0: [sdfi] Attached SCSI disk [Mon Dec 9 06:17:44 2019][ 31.881364] sd 1:0:239:0: [sdib] 15628053168 512-byte logical blocks: (8.00 TB/7.27 TiB) [Mon Dec 9 06:17:44 2019][ 31.881365] sd 1:0:239:0: [sdib] 4096-byte physical blocks [Mon Dec 9 06:17:44 2019][ 31.886900] sd 1:0:239:0: [sdib] Write Protect is off [Mon Dec 9 06:17:44 2019][ 31.891023] sd 1:0:228:0: [sdhq] Attached SCSI disk [Mon Dec 9 06:17:44 2019][ 31.894258] sd 1:0:205:0: [sdgt] Attached SCSI disk [Mon Dec 9 06:17:44 2019][ 31.899945] sd 1:0:168:0: [sdfj] Attached SCSI disk [Mon Dec 9 06:17:44 2019][ 31.909223] sd 1:0:239:0: [sdib] Write cache: enabled, read cache: enabled, supports DPO and FUA [Mon Dec 9 06:17:44 2019][ 31.937424] sd 1:0:238:0: [sdia] Write Protect is off [Mon Dec 9 06:17:44 2019][ 31.938245] sd 1:0:155:0: [sdew] Attached SCSI disk [Mon Dec 9 06:17:44 2019][ 31.983172] sd 1:0:216:0: [sdhe] Attached SCSI disk [Mon Dec 9 06:17:44 2019][ 32.045746] sd 1:0:238:0: [sdia] Write cache: enabled, read cache: enabled, supports DPO and FUA [Mon Dec 9 06:17:44 2019][ 32.061830] sd 1:0:229:0: [sdhr] Attached SCSI disk [Mon Dec 9 06:17:44 2019][ 32.120234] sd 1:0:156:0: [sdex] Attached SCSI disk [Mon Dec 9 06:17:44 2019][ 32.170339] sd 1:0:178:0: [sdft] Attached SCSI disk [Mon Dec 9 06:17:44 2019][ 32.459480] sd 1:0:217:0: [sdhf] Attached SCSI disk [Mon Dec 9 06:17:44 2019][ 32.538296] sd 1:0:239:0: [sdib] Attached SCSI disk [Mon Dec 9 06:17:44 2019][ 32.625166] sd 1:0:177:0: [sdfs] Attached SCSI disk [Mon Dec 9 06:17:44 2019][ 32.879708] sd 1:0:238:0: [sdia] Attached SCSI disk [Mon Dec 9 06:17:44 2019][ 38.910007] sd 1:0:3:0: [sdc] 4096-byte physical blocks [Mon Dec 9 06:17:44 2019][ 38.915954] sd 1:0:3:0: [sdc] Write Protect is off [Mon Dec 9 06:17:44 2019][ 38.921199] sd 1:0:3:0: [sdc] Write cache: enabled, read cache: enabled, supports DPO and FUA [Mon Dec 9 06:17:44 2019][ 39.042872] sd 1:0:3:0: [sdc] Attached SCSI disk [Mon Dec 9 06:17:44 2019][ OK ] Found device 
PERC_H330_Mini os. [Mon Dec 9 06:17:44 2019][ OK ] Started dracut initqueue hook. [Mon Dec 9 06:17:44 2019][ OK ] Reached target Remote File Systems (Pre). [Mon Dec 9 06:17:44 2019][ OK ] Reached target Remote File Systems. [Mon Dec 9 06:17:44 2019] Starting File System Check on /dev/...4-e7db-49b7-baed-d6c7905c5cdc... [Mon Dec 9 06:17:44 2019][ OK ] Started File System Check on /dev/d...4c4-e7db-49b7-baed-d6c7905c5cdc. [Mon Dec 9 06:17:44 2019] Mounting /sysroot... [Mon Dec 9 06:17:44 2019][ 39.131528] EXT4-fs (sda2): mounted filesystem with ordered data mode. Opts: (null) [Mon Dec 9 06:17:44 2019][ OK ] Mounted /sysroot. [Mon Dec 9 06:17:44 2019][ OK ] Reached target Initrd Root File System. [Mon Dec 9 06:17:44 2019] Starting Reload Configuration from the Real Root... [Mon Dec 9 06:17:44 2019][ OK ] Started Reload Configuration from the Real Root. [Mon Dec 9 06:17:44 2019][ OK ] Reached target Initrd File Systems. [Mon Dec 9 06:17:44 2019][ OK ] Reached target Initrd Default Target. [Mon Dec 9 06:17:44 2019] Starting dracut pre-pivot and cleanup hook... [Mon Dec 9 06:17:44 2019][ OK ] Started dracut pre-pivot and cleanup hook. [Mon Dec 9 06:17:44 2019] Starting Cleaning Up and Shutting Down Daemons... [Mon Dec 9 06:17:44 2019][ OK ] Stopped target Timers. [Mon Dec 9 06:17:44 2019] Starting Plymouth switch root service... [Mon Dec 9 06:17:44 2019][ OK ] Stopped Cleaning Up and Shutting Down Daemons. [Mon Dec 9 06:17:44 2019][ OK ] Stopped dracut pre-pivot and cleanup hook. [Mon Dec 9 06:17:44 2019][ OK ] Stopped target Remote File Systems. [Mon Dec 9 06:17:44 2019][ OK ] Stopped target Remote File Systems (Pre). [Mon Dec 9 06:17:44 2019][ OK ] Stopped dracut initqueue hook. [Mon Dec 9 06:17:44 2019][ OK ] Stopped target Initrd Default Target. [Mon Dec 9 06:17:44 2019][ OK ] Stopped target Basic System. [Mon Dec 9 06:17:44 2019][ OK ] Stopped target System Initialization. [Mon Dec 9 06:17:44 2019][ OK ] Stopped target Local File Systems. [Mon Dec 9 06:17:44 2019] Stopping udev Kernel Device Manager... [Mon Dec 9 06:17:44 2019][ 39.438534] systemd-journald[365]: Received SIGTERM from PID 1 (systemd). [Mon Dec 9 06:17:44 2019][ OK ] Stopped Apply Kernel Variables. [Mon Dec 9 06:17:44 2019][ OK ] Stopped target Swap. [Mon Dec 9 06:17:44 2019][ OK ] Stopped udev Coldplug all Devices. [Mon Dec 9 06:17:44 2019][ OK ] Stopped target Slices. [Mon Dec 9 06:17:44 2019][ OK ] Stopped target Paths. [Mon Dec 9 06:17:44 2019][ OK ] Stopped target Sockets. [Mon Dec 9 06:17:44 2019][ OK ] Stopped udev Kernel Device Manager. [Mon Dec 9 06:17:44 2019][ 39.475859] SELinux: Disabled at runtime. [Mon Dec 9 06:17:44 2019][ OK ] Stopped Create Static Device Nodes in /dev. [Mon Dec 9 06:17:44 2019][ OK ] Stopped Create list of required sta...ce nodes for the current kernel. [Mon Dec 9 06:17:44 2019][ OK ] Stopped dracut pre-udev hook. [Mon Dec 9 06:17:44 2019][ OK ] Stopped dracut cmdline hook. [Mon Dec 9 06:17:44 2019][ OK ] Closed udev Kernel Socket. [Mon Dec 9 06:17:44 2019][ OK ] Closed udev Control Socket. [Mon Dec 9 06:17:44 2019] Starting Cleanup udevd DB... [Mon Dec 9 06:17:44 2019][ OK ] Started Plymouth switch root service. [Mon Dec 9 06:17:44 2019][ OK ] Started Cleanup udevd DB. [Mon Dec 9 06:17:44 2019][ 39.521533] type=1404 audit(1575901064.012:2): selinux=0 auid=4294967295 ses=4294967295 [Mon Dec 9 06:17:44 2019][ OK ] Reached target Switch Root. [Mon Dec 9 06:17:44 2019] Starting Switch Root...
[Mon Dec 9 06:17:44 2019][ 39.552759] ip_tables: (C) 2000-2006 Netfilter Core Team
[Mon Dec 9 06:17:44 2019][ 39.558776] systemd[1]: Inserted module 'ip_tables'
[Mon Dec 9 06:17:44 2019]
[Mon Dec 9 06:17:44 2019]Welcome to CentOS Linux 7 (Core)!
[Mon Dec 9 06:17:44 2019]
[Mon Dec 9 06:17:44 2019][ OK ] Stopped Switch Root.
[Mon Dec 9 06:17:44 2019][ OK ] Stopped Journal Service.
[Mon Dec 9 06:17:44 2019] Starting Journal Service...
[Mon Dec 9 06:17:44 2019] Starting Create list of required st... nodes for the current kernel...
[Mon Dec 9 06:17:44 2019][ 39.666045] EXT4-fs (sda2): re-mounted. Opts: (null)
[Mon Dec 9 06:17:44 2019][ OK ] Started Forward Password Requests to Wall Directory Watch.
[Mon Dec 9 06:17:44 2019][ OK ] Set up automount Arbitrary Executab...ats File System Automount Point.
[Mon Dec 9 06:17:44 2019][ 39.684129] systemd-journald[5778]: Received request to flush runtime journal from PID 1
[Mon Dec 9 06:17:44 2019][ OK ] Created slice User and Session Slice.
[Mon Dec 9 06:17:44 2019][ OK ] Reached target Local Encrypted Volumes.
[Mon Dec 9 06:17:44 2019] Mounting Huge Pages File System...
[Mon Dec 9 06:17:44 2019] Mounting POSIX Message Queue File System...
[Mon Dec 9 06:17:44 2019][ OK ] Listening on Delayed Shutdown Socket.
[Mon Dec 9 06:17:44 2019][ OK ] Reached target Paths.
[Mon Dec 9 06:17:44 2019] Starting Collect Read-Ahead Data...
[Mon Dec 9 06:17:44 2019][ OK ] Reached target Slices.
[Mon Dec 9 06:17:44 2019] Mounting Debug File System...
[Mon Dec 9 06:17:44 2019][ OK ] Reached target RPC Port Mapper.
[Mon Dec 9 06:17:44 2019] Starting Read and set NIS domainname from /etc/sysconfig/network...
[Mon Dec 9 06:17:44 2019][ OK ] Listening on udev Kernel Socket.
[Mon Dec 9 06:17:44 2019] Starting Availability of block devices...
[Mon Dec 9 06:17:44 2019][ OK ] Created slice system-serial\x2dgetty.slice.
[Mon Dec 9 06:17:44 2019][ OK ] Stopped target Switch Root.
[Mon Dec 9 06:17:44 2019][ OK ] Stopped target Initrd File Systems.
[Mon Dec 9 06:17:44 2019][ OK ] Stopped target Initrd Root File System.
[Mon Dec 9 06:17:44 2019][ OK ] Listening on /dev/initctl Compatibility Named Pipe.
[Mon Dec 9 06:17:44 2019][ 39.774789] piix4_smbus 0000:00:14.0: SMBus Host Controller at 0xb00, revision 0
[Mon Dec 9 06:17:45 2019][ 39.782438] piix4_smbus 0000:00:14.0: Using register 0x2e for SMBus port selection
[Mon Dec 9 06:17:45 2019] Starting Replay Read-Ahead Data...
[Mon Dec 9 06:17:45 2019][ OK ] Created slice system-selinux\x2dpol...grate\x2dlocal\x2dchanges.slice.
[Mon Dec 9 06:17:45 2019][ OK ] Created slice system-getty.slice.
[Mon Dec 9 06:17:45 2019][ 39.807059] ACPI Error: No handler for Region [SYSI] (ffff8913a9e7da68) [IPMI] (20130517/evregion-162)
[Mon Dec 9 06:17:45 2019][ 39.820149] ACPI Error: Region IPMI (ID=7) has no handler (20130517/exfldio-305)
[Mon Dec 9 06:17:45 2019][ OK ] Listening on udev Control Socket.
[Mon Dec 9 06:17:45 2019][ 39.829034] ACPI Error: Method parse/execution failed [\_SB_.PMI0._GHL] (Node ffff8913a9e7a5a0), AE_NOT_EXIST (20130517/psparse-536)
[Mon Dec 9 06:17:45 2019][ OK ] Started Collect Read-Ahead Data.
[Mon Dec 9 06:17:45 2019][ 39.847923] ACPI Error: Method parse/execution failed [\_SB_.PMI0._PMC] (Node ffff8913a9e7a500), AE_NOT_EXIST (20130517/psparse-536)
[Mon Dec 9 06:17:45 2019][ OK ] Started Create list of required sta...ce nodes for the current kernel.
[Mon Dec 9 06:17:45 2019][ 39.866864] ACPI Exception: AE_NOT_EXIST, Evaluating _PMC (20130517/power_meter-753)
[Mon Dec 9 06:17:45 2019][ OK ] Started Availability of block devices.
[Mon Dec 9 06:17:45 2019][ 39.880616] ipmi message handler version 39.2
[Mon Dec 9 06:17:45 2019][ OK ] Started Replay Read-Ahead Data.
[Mon Dec 9 06:17:45 2019][ 39.880745] ccp 0000:02:00.2: 3 command queues available
[Mon Dec 9 06:17:45 2019][ 39.880831] ccp 0000:02:00.2: Queue 2 can access 4 LSB regions
[Mon Dec 9 06:17:45 2019][ 39.880833] ccp 0000:02:00.2: Queue 3 can access 4 LSB regions
[Mon Dec 9 06:17:45 2019][ 39.880835] ccp 0000:02:00.2: Queue 4 can access 4 LSB regions
[Mon Dec 9 06:17:45 2019][ 39.880836] ccp 0000:02:00.2: Queue 0 gets LSB 4
[Mon Dec 9 06:17:45 2019] Starting Remount Root and Kernel File Systems...
[Mon Dec 9 06:17:45 2019][ 39.880837] ccp 0000:02:00.2: Queue 1 gets LSB 5
[Mon Dec 9 06:17:45 2019][ 39.880838] ccp 0000:02:00.2: Queue 2 gets LSB 6
[Mon Dec 9 06:17:45 2019][ 39.881173] ccp 0000:02:00.2: enabled
[Mon Dec 9 06:17:45 2019][ 39.881289] ccp 0000:03:00.1: 5 command queues available
[Mon Dec 9 06:17:45 2019] Starting Create Static Device Nodes in /dev...
[Mon Dec 9 06:17:45 2019][ 39.881348] ccp 0000:03:00.1: Queue 0 can access 7 LSB regions
[Mon Dec 9 06:17:45 2019][ 39.881349] ccp 0000:03:00.1: Queue 1 can access 7 LSB regions
[Mon Dec 9 06:17:45 2019][ 39.881351] ccp 0000:03:00.1: Queue 2 can access 7 LSB regions
[Mon Dec 9 06:17:45 2019][ 39.881353] ccp 0000:03:00.1: Queue 3 can access 7 LSB regions
[Mon Dec 9 06:17:45 2019] Starting Apply Kernel Variables...
[Mon Dec 9 06:17:45 2019][ 39.881354] ccp 0000:03:00.1: Queue 4 can access 7 LSB regions
[Mon Dec 9 06:17:45 2019][ 39.881356] ccp 0000:03:00.1: Queue 0 gets LSB 1
[Mon Dec 9 06:17:45 2019][ OK ] Mounted Huge Pages File System.
[Mon Dec 9 06:17:45 2019][ 39.881357] ccp 0000:03:00.1: Queue 1 gets LSB 2
[Mon Dec 9 06:17:45 2019][ 39.881358] ccp 0000:03:00.1: Queue 2 gets LSB 3
[Mon Dec 9 06:17:45 2019][ 39.881359] ccp 0000:03:00.1: Queue 3 gets LSB 4
[Mon Dec 9 06:17:45 2019][ 39.881360] ccp 0000:03:00.1: Queue 4 gets LSB 5
[Mon Dec 9 06:17:45 2019][ 39.881791] ccp 0000:03:00.1: enabled
[Mon Dec 9 06:17:45 2019][ OK ] Mounted Debug File System.
[Mon Dec 9 06:17:45 2019][ 39.881985] ccp 0000:41:00.2: 3 command queues available
[Mon Dec 9 06:17:45 2019][ 39.882090] ccp 0000:41:00.2: Queue 2 can access 4 LSB regions
[Mon Dec 9 06:17:45 2019][ OK ] Mounted POSIX Message Queue File System.
[Mon Dec 9 06:17:45 2019][ 39.882092] ccp 0000:41:00.2: Queue 3 can access 4 LSB regions
[Mon Dec 9 06:17:45 2019][ 39.882094] ccp 0000:41:00.2: Queue 4 can access 4 LSB regions
[Mon Dec 9 06:17:45 2019][ 39.882096] ccp 0000:41:00.2: Queue 0 gets LSB 4
[Mon Dec 9 06:17:45 2019][ 39.882097] ccp 0000:41:00.2: Queue 1 gets LSB 5
[Mon Dec 9 06:17:45 2019][ OK ] Started Journal Service.
[Mon Dec 9 06:17:45 2019][ 39.882099] ccp 0000:41:00.2: Queue 2 gets LSB 6
[Mon Dec 9 06:17:45 2019][ 39.882415] ccp 0000:41:00.2: enabled
[Mon Dec 9 06:17:45 2019][ 39.882557] ccp 0000:42:00.1: 5 command queues available
[Mon Dec 9 06:17:45 2019][ OK ] Started Read and set NIS domainname from /etc/sysconfig/network.
[Mon Dec 9 06:17:45 2019][ 39.882622] ccp 0000:42:00.1: Queue 0 can access 7 LSB regions
[Mon Dec 9 06:17:45 2019][ 39.882624] ccp 0000:42:00.1: Queue 1 can access 7 LSB regions
[Mon Dec 9 06:17:45 2019][ 39.882627] ccp 0000:42:00.1: Queue 2 can access 7 LSB regions
[Mon Dec 9 06:17:45 2019][ 39.882629] ccp 0000:42:00.1: Queue 3 can access 7 LSB regions
[Mon Dec 9 06:17:45 2019][ 39.882632] ccp 0000:42:00.1: Queue 4 can access 7 LSB regions
[Mon Dec 9 06:17:45 2019][ 39.882633] ccp 0000:42:00.1: Queue 0 gets LSB 1
[Mon Dec 9 06:17:45 2019][ 39.882634] ccp 0000:42:00.1: Queue 1 gets LSB 2
[Mon Dec 9 06:17:45 2019][ 39.882636] ccp 0000:42:00.1: Queue 2 gets LSB 3
[Mon Dec 9 06:17:45 2019][ 39.882637] ccp 0000:42:00.1: Queue 3 gets LSB 4
[Mon Dec 9 06:17:45 2019][ OK ] Started Remount Root and Kernel File Systems.
[Mon Dec 9 06:17:45 2019][ 39.882638] ccp 0000:42:00.1: Queue 4 gets LSB 5
[Mon Dec 9 06:17:45 2019][ 39.883043] ccp 0000:42:00.1: enabled
[Mon Dec 9 06:17:45 2019][ 39.883203] ccp 0000:85:00.2: 3 command queues available
[Mon Dec 9 06:17:45 2019][ 39.883291] ccp 0000:85:00.2: Queue 2 can access 4 LSB regions
[Mon Dec 9 06:17:45 2019][ 39.883293] ccp 0000:85:00.2: Queue 3 can access 4 LSB regions
[Mon Dec 9 06:17:45 2019][ 39.883295] ccp 0000:85:00.2: Queue 4 can access 4 LSB regions
[Mon Dec 9 06:17:45 2019][ 39.883297] ccp 0000:85:00.2: Queue 0 gets LSB 4
[Mon Dec 9 06:17:45 2019][ 39.883298] ccp 0000:85:00.2: Queue 1 gets LSB 5
[Mon Dec 9 06:17:45 2019][ 39.883299] ccp 0000:85:00.2: Queue 2 gets LSB 6
[Mon Dec 9 06:17:45 2019] Starting udev Coldplug all Devices...
[Mon Dec 9 06:17:45 2019][ 39.883691] ccp 0000:85:00.2: enabled
[Mon Dec 9 06:17:45 2019][ 39.883796] ccp 0000:86:00.1: 5 command queues available
[Mon Dec 9 06:17:45 2019][ 39.883860] ccp 0000:86:00.1: Queue 0 can access 7 LSB regions
[Mon Dec 9 06:17:45 2019][ 39.883862] ccp 0000:86:00.1: Queue 1 can access 7 LSB regions
[Mon Dec 9 06:17:45 2019][ 39.883865] ccp 0000:86:00.1: Queue 2 can access 7 LSB regions
[Mon Dec 9 06:17:45 2019][ 39.883867] ccp 0000:86:00.1: Queue 3 can access 7 LSB regions
[Mon Dec 9 06:17:45 2019] Starting Configure read-only root support...
[Mon Dec 9 06:17:45 2019][ 39.883869] ccp 0000:86:00.1: Queue 4 can access 7 LSB regions
[Mon Dec 9 06:17:45 2019][ 39.883871] ccp 0000:86:00.1: Queue 0 gets LSB 1
[Mon Dec 9 06:17:45 2019][ 39.883872] ccp 0000:86:00.1: Queue 1 gets LSB 2
[Mon Dec 9 06:17:45 2019][ 39.883874] ccp 0000:86:00.1: Queue 2 gets LSB 3
[Mon Dec 9 06:17:45 2019][ 39.883875] ccp 0000:86:00.1: Queue 3 gets LSB 4
[Mon Dec 9 06:17:45 2019][ 39.883876] ccp 0000:86:00.1: Queue 4 gets LSB 5
[Mon Dec 9 06:17:45 2019][ 39.884336] ccp 0000:86:00.1: enabled
[Mon Dec 9 06:17:45 2019][ 39.884514] ccp 0000:c2:00.2: 3 command queues available
[Mon Dec 9 06:17:45 2019] Starting Flush Journal to Persistent Storage...
[Mon Dec 9 06:17:45 2019][ 39.884609] ccp 0000:c2:00.2: Queue 2 can access 4 LSB regions
[Mon Dec 9 06:17:45 2019][ 39.884611] ccp 0000:c2:00.2: Queue 3 can access 4 LSB regions
[Mon Dec 9 06:17:45 2019][ 39.884613] ccp 0000:c2:00.2: Queue 4 can access 4 LSB regions
[Mon Dec 9 06:17:45 2019][ 39.884614] ccp 0000:c2:00.2: Queue 0 gets LSB 4
[Mon Dec 9 06:17:45 2019][ 39.884616] ccp 0000:c2:00.2: Queue 1 gets LSB 5
[Mon Dec 9 06:17:45 2019][ 39.884617] ccp 0000:c2:00.2: Queue 2 gets LSB 6
[Mon Dec 9 06:17:45 2019][ OK ] Started Apply Kernel Variables.
[Mon Dec 9 06:17:45 2019][ 39.884927] ccp 0000:c2:00.2: enabled
[Mon Dec 9 06:17:45 2019][ 39.885035] ccp 0000:c3:00.1: 5 command queues available
[Mon Dec 9 06:17:45 2019][ 39.885087] ccp 0000:c3:00.1: Queue 0 can access 7 LSB regions
[Mon Dec 9 06:17:45 2019][ 39.885089] ccp 0000:c3:00.1: Queue 1 can access 7 LSB regions
[Mon Dec 9 06:17:45 2019][ 39.885091] ccp 0000:c3:00.1: Queue 2 can access 7 LSB regions
[Mon Dec 9 06:17:45 2019][ 39.885092] ccp 0000:c3:00.1: Queue 3 can access 7 LSB regions
[Mon Dec 9 06:17:45 2019][ OK ] Started Create Static Device Nodes in /dev.
[Mon Dec 9 06:17:45 2019][ 39.885094] ccp 0000:c3:00.1: Queue 4 can access 7 LSB regions
[Mon Dec 9 06:17:45 2019][ 39.885095] ccp 0000:c3:00.1: Queue 0 gets LSB 1
[Mon Dec 9 06:17:45 2019][ 39.885096] ccp 0000:c3:00.1: Queue 1 gets LSB 2
[Mon Dec 9 06:17:45 2019][ 39.885097] ccp 0000:c3:00.1: Queue 2 gets LSB 3
[Mon Dec 9 06:17:45 2019][ 39.885098] ccp 0000:c3:00.1: Queue 3 gets LSB 4
[Mon Dec 9 06:17:45 2019][ 39.885099] ccp 0000:c3:00.1: Queue 4 gets LSB 5
[Mon Dec 9 06:17:45 2019][ 39.885462] ccp 0000:c3:00.1: enabled
[Mon Dec 9 06:17:45 2019][ 39.978579] sd 0:2:0:0: Attached scsi generic sg0 type 0
[Mon Dec 9 06:17:45 2019][ OK ] Reached target Local File Systems (Pre).
[Mon Dec 9 06:17:45 2019][ 39.978782] scsi 1:0:0:0: Attached scsi generic sg1 type 13
[Mon Dec 9 06:17:45 2019][ 39.979237] scsi 1:0:1:0: Attached scsi generic sg2 type 13
[Mon Dec 9 06:17:45 2019][ 39.979772] sd 1:0:2:0: Attached scsi generic sg3 type 0
[Mon Dec 9 06:17:45 2019][ 39.980475] sd 1:0:3:0: Attached scsi generic sg4 type 0
[Mon Dec 9 06:17:45 2019][ 39.981364] sd 1:0:4:0: Attached scsi generic sg5 type 0
[Mon Dec 9 06:17:45 2019][ 39.981781] sd 1:0:5:0: Attached scsi generic sg6 type 0
[Mon Dec 9 06:17:45 2019][ 39.982219] sd 1:0:6:0: Attached scsi generic sg7 type 0
[Mon Dec 9 06:17:45 2019][ 39.982748] sd 1:0:7:0: Attached scsi generic sg8 type 0
[Mon Dec 9 06:17:45 2019] Starting udev Kernel Device Manager...
[Mon Dec 9 06:17:45 2019][ 39.983085] sd 1:0:8:0: Attached scsi generic sg9 type 0
[Mon Dec 9 06:17:45 2019][ 39.983451] sd 1:0:9:0: Attached scsi generic sg10 type 0
[Mon Dec 9 06:17:45 2019][ 39.983767] sd 1:0:10:0: Attached scsi generic sg11 type 0
[Mon Dec 9 06:17:45 2019][ 39.984304] sd 1:0:11:0: Attached scsi generic sg12 type 0
[Mon Dec 9 06:17:45 2019][ 39.984869] sd 1:0:12:0: Attached scsi generic sg13 type 0
[Mon Dec 9 06:17:45 2019][ OK ] Started Configure read-only root support.
[Mon Dec 9 06:17:45 2019][ 39.985272] sd 1:0:13:0: Attached scsi generic sg14 type 0
[Mon Dec 9 06:17:45 2019][ 39.985828] sd 1:0:14:0: Attached scsi generic sg15 type 0
[Mon Dec 9 06:17:45 2019][ 39.986197] sd 1:0:15:0: Attached scsi generic sg16 type 0
[Mon Dec 9 06:17:45 2019][ 39.986459] sd 1:0:16:0: Attached scsi generic sg17 type 0
[Mon Dec 9 06:17:45 2019][ 39.986698] sd 1:0:17:0: Attached scsi generic sg18 type 0
[Mon Dec 9 06:17:45 2019][ 39.987545] sd 1:0:18:0: Attached scsi generic sg19 type 0
[Mon Dec 9 06:17:45 2019][ 39.988001] sd 1:0:19:0: Attached scsi generic sg20 type 0
[Mon Dec 9 06:17:45 2019][ 39.988360] sd 1:0:20:0: Attached scsi generic sg21 type 0
[Mon Dec 9 06:17:45 2019] Starting Load/Save Random Seed...
[Mon Dec 9 06:17:45 2019][ 39.988620] sd 1:0:21:0: Attached scsi generic sg22 type 0
[Mon Dec 9 06:17:45 2019][ 39.988875] sd 1:0:22:0: Attached scsi generic sg23 type 0
[Mon Dec 9 06:17:45 2019][ 39.989676] sd 1:0:23:0: Attached scsi generic sg24 type 0
[Mon Dec 9 06:17:45 2019][ 39.990382] sd 1:0:24:0: Attached scsi generic sg25 type 0
[Mon Dec 9 06:17:45 2019][ 39.991164] sd 1:0:25:0: Attached scsi generic sg26 type 0
[Mon Dec 9 06:17:45 2019][ 39.991680] sd 1:0:26:0: Attached scsi generic sg27 type 0
[Mon Dec 9 06:17:45 2019][ OK ] Started Flush Journal to Persistent Storage.
[Mon Dec 9 06:17:45 2019][ 39.992131] sd 1:0:27:0: Attached scsi generic sg28 type 0
[Mon Dec 9 06:17:45 2019][ 39.992596] sd 1:0:28:0: Attached scsi generic sg29 type 0
[Mon Dec 9 06:17:45 2019][ 39.992887] sd 1:0:29:0: Attached scsi generic sg30 type 0
[Mon Dec 9 06:17:45 2019][ 39.993941] sd 1:0:30:0: Attached scsi generic sg31 type 0
[Mon Dec 9 06:17:45 2019][ 39.994451] sd 1:0:31:0: Attached scsi generic sg32 type 0
[Mon Dec 9 06:17:45 2019][ 39.995036] sd 1:0:32:0: Attached scsi generic sg33 type 0
[Mon Dec 9 06:17:45 2019][ 39.995597] sd 1:0:33:0: Attached scsi generic sg34 type 0
[Mon Dec 9 06:17:45 2019][ 39.995859] sd 1:0:34:0: Attached scsi generic sg35 type 0
[Mon Dec 9 06:17:45 2019][ OK ] Started Load/Save Random Seed.
[Mon Dec 9 06:17:45 2019][ 39.997725] sd 1:0:35:0: Attached scsi generic sg36 type 0
[Mon Dec 9 06:17:45 2019][ 39.999311] sd 1:0:36:0: Attached scsi generic sg37 type 0
[Mon Dec 9 06:17:45 2019][ 40.000600] sd 1:0:37:0: Attached scsi generic sg38 type 0
[Mon Dec 9 06:17:45 2019][ 40.000952] sd 1:0:38:0: Attached scsi generic sg39 type 0
[Mon Dec 9 06:17:45 2019][ 40.001234] sd 1:0:39:0: Attached scsi generic sg40 type 0
[Mon Dec 9 06:17:45 2019][ OK ] Started udev Kernel Device Manager.
[Mon Dec 9 06:17:45 2019][ 40.001409] sd 1:0:40:0: Attached scsi generic sg41 type 0
[Mon Dec 9 06:17:45 2019][ 40.003122] sd 1:0:41:0: Attached scsi generic sg42 type 0
[Mon Dec 9 06:17:45 2019][ 40.003664] sd 1:0:42:0: Attached scsi generic sg43 type 0
[Mon Dec 9 06:17:45 2019][ 40.004289] sd 1:0:43:0: Attached scsi generic sg44 type 0
[Mon Dec 9 06:17:45 2019][ 40.008943] sd 1:0:44:0: Attached scsi generic sg45 type 0
[Mon Dec 9 06:17:45 2019][ 40.009505] sd 1:0:45:0: Attached scsi generic sg46 type 0
[Mon Dec 9 06:17:45 2019][ 40.009952] sd 1:0:46:0: Attached scsi generic sg47 type 0 [Mon Dec 9 06:17:45 2019][ 40.012087] sd 1:0:47:0: Attached scsi generic sg48 type 0 [Mon Dec 9 06:17:45 2019][ 40.013766] sd 1:0:48:0: Attached scsi generic sg49 type 0 [Mon Dec 9 06:17:45 2019][ 40.014150] sd 1:0:49:0: Attached scsi generic sg50 type 0 [Mon Dec 9 06:17:45 2019][ 40.016746] sd 1:0:50:0: Attached scsi generic sg51 type 0 [Mon Dec 9 06:17:45 2019][ 40.017210] sd 1:0:51:0: Attached scsi generic sg52 type 0 [Mon Dec 9 06:17:45 2019][ 40.017674] sd 1:0:52:0: Attached scsi generic sg53 type 0 [Mon Dec 9 06:17:45 2019][ 40.018220] sd 1:0:53:0: Attached scsi generic sg54 type 0 [Mon Dec 9 06:17:45 2019][ 40.018968] sd 1:0:54:0: Attached scsi generic sg55 type 0 [Mon Dec 9 06:17:45 2019][ 40.019521] sd 1:0:55:0: Attached scsi generic sg56 type 0 [Mon Dec 9 06:17:45 2019][ 40.021127] sd 1:0:56:0: Attached scsi generic sg57 type 0 [Mon Dec 9 06:17:45 2019][ 40.021541] sd 1:0:57:0: Attached scsi generic sg58 type 0 [Mon Dec 9 06:17:45 2019][ 40.023123] sd 1:0:58:0: Attached scsi generic sg59 type 0 [Mon Dec 9 06:17:45 2019][ 40.023550] sd 1:0:59:0: Attached scsi generic sg60 type 0 [Mon Dec 9 06:17:45 2019][ 40.025883] sd 1:0:60:0: Attached scsi generic sg61 type 0 [Mon Dec 9 06:17:45 2019][ 40.026256] sd 1:0:61:0: Attached scsi generic sg62 type 0 [Mon Dec 9 06:17:45 2019][ 40.026485] scsi 1:0:62:0: Attached scsi generic sg63 type 13 [Mon Dec 9 06:17:45 2019][ 40.026678] sd 1:0:63:0: Attached scsi generic sg64 type 0 [Mon Dec 9 06:17:45 2019][ 40.026809] sd 1:0:64:0: Attached scsi generic sg65 type 0 [Mon Dec 9 06:17:45 2019][ 40.026878] sd 1:0:65:0: Attached scsi generic sg66 type 0 [Mon Dec 9 06:17:45 2019][ 40.026919] sd 1:0:66:0: Attached scsi generic sg67 type 0 [Mon Dec 9 06:17:45 2019][ 40.026961] sd 1:0:67:0: Attached scsi generic sg68 type 0 [Mon Dec 9 06:17:46 2019][ 40.027083] sd 1:0:68:0: Attached scsi generic sg69 type 0 [Mon Dec 9 06:17:46 2019][ 40.027433] sd 1:0:69:0: Attached scsi generic sg70 type 0 [Mon Dec 9 06:17:46 2019][ 40.029594] sd 1:0:70:0: Attached scsi generic sg71 type 0 [Mon Dec 9 06:17:46 2019][ 40.029931] sd 1:0:71:0: Attached scsi generic sg72 type 0 [Mon Dec 9 06:17:46 2019][ 40.030284] sd 1:0:72:0: Attached scsi generic sg73 type 0 [Mon Dec 9 06:17:46 2019][ 40.031785] sd 1:0:73:0: Attached scsi generic sg74 type 0 [Mon Dec 9 06:17:46 2019][ 40.033119] sd 1:0:74:0: Attached scsi generic sg75 type 0 [Mon Dec 9 06:17:46 2019][ 40.035467] sd 1:0:75:0: Attached scsi generic sg76 type 0 [Mon Dec 9 06:17:46 2019][ 40.036846] sd 1:0:76:0: Attached scsi generic sg77 type 0 [Mon Dec 9 06:17:46 2019][ 40.037137] sd 1:0:77:0: Attached scsi generic sg78 type 0 [Mon Dec 9 06:17:46 2019][ 40.037345] sd 1:0:78:0: Attached scsi generic sg79 type 0 [Mon Dec 9 06:17:46 2019][ 40.037592] sd 1:0:79:0: Attached scsi generic sg80 type 0 [Mon Dec 9 06:17:46 2019][ 40.038648] sd 1:0:80:0: Attached scsi generic sg81 type 0 [Mon Dec 9 06:17:46 2019][ 40.039831] sd 1:0:81:0: Attached scsi generic sg82 type 0 [Mon Dec 9 06:17:46 2019][ 40.040675] sd 1:0:82:0: Attached scsi generic sg83 type 0 [Mon Dec 9 06:17:46 2019][ 40.041209] sd 1:0:83:0: Attached scsi generic sg84 type 0 [Mon Dec 9 06:17:46 2019][ 40.041610] sd 1:0:84:0: Attached scsi generic sg85 type 0 [Mon Dec 9 06:17:46 2019][ 40.044140] sd 1:0:85:0: Attached scsi generic sg86 type 0 [Mon Dec 9 06:17:46 2019][ 40.044524] sd 1:0:86:0: Attached scsi generic sg87 type 0 [Mon Dec 9 06:17:46 2019][ 40.046308] sd 1:0:87:0: Attached scsi 
generic sg88 type 0
[Mon Dec 9 06:17:46 2019][ 40.046660] sd 1:0:88:0: Attached scsi generic sg89 type 0
[Mon Dec 9 06:17:46 2019][ 40.047483] sd 1:0:89:0: Attached scsi generic sg90 type 0
[Mon Dec 9 06:17:46 2019][ 40.048100] sd 1:0:90:0: Attached scsi generic sg91 type 0
[Mon Dec 9 06:17:46 2019][ 40.048653] sd 1:0:91:0: Attached scsi generic sg92 type 0
[Mon Dec 9 06:17:46 2019][ 40.049326] sd 1:0:92:0: Attached scsi generic sg93 type 0
[Mon Dec 9 06:17:46 2019][ 40.050057] sd 1:0:93:0: Attached scsi generic sg94 type 0
[Mon Dec 9 06:17:46 2019][ 40.050610] sd 1:0:94:0: Attached scsi generic sg95 type 0
[Mon Dec 9 06:17:46 2019][ 40.051299] sd 1:0:95:0: Attached scsi generic sg96 type 0
[Mon Dec 9 06:17:46 2019][ 40.052011] sd 1:0:96:0: Attached scsi generic sg97 type 0
[Mon Dec 9 06:17:46 2019][ 40.052751] sd 1:0:97:0: Attached scsi generic sg98 type 0
[Mon Dec 9 06:17:46 2019][ 40.053355] sd 1:0:98:0: Attached scsi generic sg99 type 0
[Mon Dec 9 06:17:46 2019][ 40.054131] sd 1:0:99:0: Attached scsi generic sg100 type 0
[Mon Dec 9 06:17:46 2019][ 40.056040] sd 1:0:100:0: Attached scsi generic sg101 type 0
[Mon Dec 9 06:17:46 2019][ 40.058756] sd 1:0:101:0: Attached scsi generic sg102 type 0
[Mon Dec 9 06:17:46 2019][ 40.063950] sd 1:0:102:0: Attached scsi generic sg103 type 0
[Mon Dec 9 06:17:46 2019][ 40.064363] sd 1:0:103:0: Attached scsi generic sg104 type 0
[Mon Dec 9 06:17:46 2019][ 40.064663] sd 1:0:104:0: Attached scsi generic sg105 type 0
[Mon Dec 9 06:17:46 2019][ 40.066695] sd 1:0:105:0: Attached scsi generic sg106 type 0
[Mon Dec 9 06:17:46 2019][ 40.067289] sd 1:0:106:0: Attached scsi generic sg107 type 0
[Mon Dec 9 06:17:46 2019][ 40.070381] sd 1:0:107:0: Attached scsi generic sg108 type 0
[Mon Dec 9 06:17:46 2019][ 40.071065] sd 1:0:108:0: Attached scsi generic sg109 type 0
[Mon Dec 9 06:17:46 2019][ 40.071658] sd 1:0:109:0: Attached scsi generic sg110 type 0
[Mon Dec 9 06:17:46 2019][ 40.072264] sd 1:0:110:0: Attached scsi generic sg111 type 0
[Mon Dec 9 06:17:46 2019][ OK ] Started udev Coldplug all Devices.
[Mon Dec 9 06:17:46 2019][ 41.016318] sd 1:0:111:0: Attached scsi generic sg112 type 0
[Mon Dec 9 06:17:46 2019][ 41.018549] ipmi device interface
[Mon Dec 9 06:17:46 2019] Starting Device-Mapper Multipath Device Controller...
[Mon Dec 9 06:17:46 2019][ 41.030678] sd 1:0:112:0: Attached scsi generic sg113 type 0
[Mon Dec 9 06:17:46 2019][ 41.038979] sd 1:0:113:0: Attached scsi generic sg114 type 0
[Mon Dec 9 06:17:46 2019][ 41.041079] IPMI System Interface driver
[Mon Dec 9 06:17:46 2019][ 41.041125] ipmi_si dmi-ipmi-si.0: ipmi_platform: probing via SMBIOS
[Mon Dec 9 06:17:46 2019][ 41.041128] ipmi_si: SMBIOS: io 0xca8 regsize 1 spacing 4 irq 10
[Mon Dec 9 06:17:46 2019][ 41.041129] ipmi_si: Adding SMBIOS-specified kcs state machine
[Mon Dec 9 06:17:46 2019][ 41.041177] ipmi_si IPI0001:00: ipmi_platform: probing via ACPI
[Mon Dec 9 06:17:46 2019][ 41.041202] ipmi_si IPI0001:00: [io 0x0ca8] regsize 1 spacing 4 irq 10
[Mon Dec 9 06:17:46 2019][ 41.041204] ipmi_si dmi-ipmi-si.0: Removing SMBIOS-specified kcs state machine in favor of ACPI
[Mon Dec 9 06:17:46 2019][ 41.041204] ipmi_si: Adding ACPI-specified kcs state machine
[Mon Dec 9 06:17:46 2019][ 41.041564] ipmi_si: Trying ACPI-specified kcs state machine at i/o address 0xca8, slave address 0x20, irq 10
[Mon Dec 9 06:17:46 2019][ 41.066626] ipmi_si IPI0001:00: The BMC does not support setting the recv irq bit, compensating, but the BMC needs to be fixed.
[Mon Dec 9 06:17:46 2019][ 41.074679] ipmi_si IPI0001:00: Using irq 10 [Mon Dec 9 06:17:46 2019][ 41.100010] ipmi_si IPI0001:00: Found new BMC (man_id: 0x0002a2, prod_id: 0x0100, dev_id: 0x20) [Mon Dec 9 06:17:46 2019][ 41.130277] sd 1:0:114:0: Attached scsi generic sg115 type 0 [Mon Dec 9 06:17:46 2019][ 41.134196] device-mapper: uevent: version 1.0.3 [Mon Dec 9 06:17:46 2019][ 41.138049] device-mapper: ioctl: 4.37.1-ioctl (2018-04-03) initialised: dm-devel@redhat.com [Mon Dec 9 06:17:46 2019][ 41.153114] sd 1:0:115:0: Attached scsi generic sg116 type 0 [Mon Dec 9 06:17:46 2019][ 41.159138] sd 1:0:116:0: Attached scsi generic sg117 type 0 [Mon Dec 9 06:17:46 2019][ 41.166824] sd 1:0:117:0: Attached scsi generic sg118 type 0 [Mon Dec 9 06:17:46 2019][ 41.173200] sd 1:0:118:0: Attached scsi generic sg119 type 0 [Mon Dec 9 06:17:46 2019][ 41.179262] sd 1:0:119:0: Attached scsi generic sg120 type 0 [Mon Dec 9 06:17:46 2019][ 41.184534] ipmi_si IPI0001:00: IPMI kcs interface initialized [Mon Dec 9 06:17:46 2019][ OK ] Started Device-Mapper Multipath Device Controller. [Mon Dec 9 06:17:46 2019][ 41.197001] sd 1:0:120:0: Attached scsi generic sg121 type 0 [Mon Dec 9 06:17:46 2019][ 41.203602] sd 1:0:121:0: Attached scsi generic sg122 type 0 [Mon Dec 9 06:17:46 2019][ 41.210854] sd 1:0:122:0: Attached scsi generic sg123 type 0 [Mon Dec 9 06:17:46 2019][ 41.218846] scsi 1:0:123:0: Attached scsi generic sg124 type 13 [Mon Dec 9 06:17:46 2019][ 41.226222] sd 1:0:124:0: Attached scsi generic sg125 type 0 [Mon Dec 9 06:17:46 2019][ 41.232499] sd 1:0:125:0: Attached scsi generic sg126 type 0 [Mon Dec 9 06:17:46 2019][ 41.238880] sd 1:0:126:0: Attached scsi generic sg127 type 0 [Mon Dec 9 06:17:46 2019][ 41.247689] sd 1:0:127:0: Attached scsi generic sg128 type 0 [Mon Dec 9 06:17:46 2019][ 41.254079] sd 1:0:128:0: Attached scsi generic sg129 type 0 [Mon Dec 9 06:17:46 2019][ 41.260439] sd 1:0:129:0: Attached scsi generic sg130 type 0 [Mon Dec 9 06:17:46 2019][ 41.267986] sd 1:0:130:0: Attached scsi generic sg131 type 0 [Mon Dec 9 06:17:46 2019][ 41.274038] sd 1:0:131:0: Attached scsi generic sg132 type 0 [Mon Dec 9 06:17:46 2019][ 41.280543] sd 1:0:132:0: Attached scsi generic sg133 type 0 [Mon Dec 9 06:17:46 2019][ 41.286763] sd 1:0:133:0: Attached scsi generic sg134 type 0 [Mon Dec 9 06:17:46 2019][ 41.293040] sd 1:0:134:0: Attached scsi generic sg135 type 0 [Mon Dec 9 06:17:46 2019][ 41.301028] sd 1:0:135:0: Attached scsi generic sg136 type 0 [Mon Dec 9 06:17:46 2019][ 41.307132] sd 1:0:136:0: Attached scsi generic sg137 type 0 [Mon Dec 9 06:17:46 2019][ 41.314248] sd 1:0:137:0: Attached scsi generic sg138 type 0 [Mon Dec 9 06:17:46 2019][ 41.320309] sd 1:0:138:0: Attached scsi generic sg139 type 0 [Mon Dec 9 06:17:46 2019][ 41.326443] sd 1:0:139:0: Attached scsi generic sg140 type 0 [Mon Dec 9 06:17:46 2019][ 41.332765] sd 1:0:140:0: Attached scsi generic sg141 type 0 [Mon Dec 9 06:17:46 2019][ 41.338768] sd 1:0:141:0: Attached scsi generic sg142 type 0 [Mon Dec 9 06:17:46 2019][ 41.344830] sd 1:0:142:0: Attached scsi generic sg143 type 0 [Mon Dec 9 06:17:46 2019][ 41.351071] sd 1:0:143:0: Attached scsi generic sg144 type 0 [Mon Dec 9 06:17:46 2019][ 41.357222] sd 1:0:144:0: Attached scsi generic sg145 type 0 [Mon Dec 9 06:17:46 2019][ 41.363205] sd 1:0:145:0: Attached scsi generic sg146 type 0 [Mon Dec 9 06:17:46 2019][ 41.369107] sd 1:0:146:0: Attached scsi generic sg147 type 0 [Mon Dec 9 06:17:46 2019][ 41.375532] sd 1:0:147:0: Attached scsi generic sg148 type 0 [Mon Dec 9 06:17:46 2019][ 
41.381258] sd 1:0:148:0: Attached scsi generic sg149 type 0 [Mon Dec 9 06:17:46 2019][ 41.387214] sd 1:0:149:0: Attached scsi generic sg150 type 0 [Mon Dec 9 06:17:46 2019][ 41.393498] sd 1:0:150:0: Attached scsi generic sg151 type 0 [Mon Dec 9 06:17:46 2019][ 41.399730] sd 1:0:151:0: Attached scsi generic sg152 type 0 [Mon Dec 9 06:17:46 2019][ 41.405958] sd 1:0:152:0: Attached scsi generic sg153 type 0 [Mon Dec 9 06:17:46 2019][ 41.412226] sd 1:0:153:0: Attached scsi generic sg154 type 0 [Mon Dec 9 06:17:46 2019][ 41.420016] sd 1:0:154:0: Attached scsi generic sg155 type 0 [Mon Dec 9 06:17:46 2019][ 41.426076] sd 1:0:155:0: Attached scsi generic sg156 type 0 [Mon Dec 9 06:17:46 2019][ 41.432318] sd 1:0:156:0: Attached scsi generic sg157 type 0 [Mon Dec 9 06:17:46 2019][ 41.439160] sd 1:0:157:0: Attached scsi generic sg158 type 0 [Mon Dec 9 06:17:46 2019][ 41.445400] sd 1:0:158:0: Attached scsi generic sg159 type 0 [Mon Dec 9 06:17:46 2019][ 41.451484] sd 1:0:159:0: Attached scsi generic sg160 type 0 [Mon Dec 9 06:17:46 2019][ 41.458614] sd 1:0:160:0: Attached scsi generic sg161 type 0 [Mon Dec 9 06:17:46 2019][ 41.466218] sd 1:0:161:0: Attached scsi generic sg162 type 0 [Mon Dec 9 06:17:46 2019][ 41.473285] sd 1:0:162:0: Attached scsi generic sg163 type 0 [Mon Dec 9 06:17:46 2019][ 41.479548] sd 1:0:163:0: Attached scsi generic sg164 type 0 [Mon Dec 9 06:17:46 2019][ 41.485777] sd 1:0:164:0: Attached scsi generic sg165 type 0 [Mon Dec 9 06:17:46 2019][ 41.492726] sd 1:0:165:0: Attached scsi generic sg166 type 0 [Mon Dec 9 06:17:46 2019][ 41.499066] sd 1:0:166:0: Attached scsi generic sg167 type 0 [Mon Dec 9 06:17:46 2019][ 41.505354] sd 1:0:167:0: Attached scsi generic sg168 type 0 [Mon Dec 9 06:17:46 2019][ 41.511522] sd 1:0:168:0: Attached scsi generic sg169 type 0 [Mon Dec 9 06:17:46 2019][ 41.517710] sd 1:0:169:0: Attached scsi generic sg170 type 0 [Mon Dec 9 06:17:46 2019][ 41.523860] sd 1:0:170:0: Attached scsi generic sg171 type 0 [Mon Dec 9 06:17:46 2019][ 41.530058] sd 1:0:171:0: Attached scsi generic sg172 type 0 [Mon Dec 9 06:17:46 2019][ 41.536124] sd 1:0:172:0: Attached scsi generic sg173 type 0 [Mon Dec 9 06:17:46 2019][ 41.542479] sd 1:0:173:0: Attached scsi generic sg174 type 0 [Mon Dec 9 06:17:46 2019][ 41.549159] sd 1:0:174:0: Attached scsi generic sg175 type 0 [Mon Dec 9 06:17:46 2019][ 41.555247] sd 1:0:175:0: Attached scsi generic sg176 type 0 [Mon Dec 9 06:17:46 2019][ 41.561114] sd 1:0:176:0: Attached scsi generic sg177 type 0 [Mon Dec 9 06:17:46 2019][ 41.567475] sd 1:0:177:0: Attached scsi generic sg178 type 0 [Mon Dec 9 06:17:46 2019][ 41.573641] sd 1:0:178:0: Attached scsi generic sg179 type 0 [Mon Dec 9 06:17:46 2019][ 41.580125] sd 1:0:179:0: Attached scsi generic sg180 type 0 [Mon Dec 9 06:17:46 2019][ 41.586444] sd 1:0:180:0: Attached scsi generic sg181 type 0 [Mon Dec 9 06:17:46 2019][ 41.592620] sd 1:0:181:0: Attached scsi generic sg182 type 0 [Mon Dec 9 06:17:46 2019][ 41.599135] sd 1:0:182:0: Attached scsi generic sg183 type 0 [Mon Dec 9 06:17:46 2019][ 41.605200] sd 1:0:183:0: Attached scsi generic sg184 type 0 [Mon Dec 9 06:17:46 2019][ 41.611259] scsi 1:0:184:0: Attached scsi generic sg185 type 13 [Mon Dec 9 06:17:46 2019][ 41.617786] sd 1:0:185:0: Attached scsi generic sg186 type 0 [Mon Dec 9 06:17:46 2019][ 41.625144] sd 1:0:186:0: Attached scsi generic sg187 type 0 [Mon Dec 9 06:17:46 2019][ 41.631113] sd 1:0:187:0: Attached scsi generic sg188 type 0 [Mon Dec 9 06:17:46 2019][ 41.637306] sd 1:0:188:0: Attached scsi generic sg189 type 0 [Mon Dec 9 
06:17:46 2019][ 41.643163] sd 1:0:189:0: Attached scsi generic sg190 type 0 [Mon Dec 9 06:17:46 2019][ 41.649153] sd 1:0:190:0: Attached scsi generic sg191 type 0 [Mon Dec 9 06:17:46 2019][ 41.655794] sd 1:0:191:0: Attached scsi generic sg192 type 0 [Mon Dec 9 06:17:46 2019][ 41.661669] sd 1:0:192:0: Attached scsi generic sg193 type 0 [Mon Dec 9 06:17:46 2019][ 41.667527] sd 1:0:193:0: Attached scsi generic sg194 type 0 [Mon Dec 9 06:17:46 2019][ 41.673356] sd 1:0:194:0: Attached scsi generic sg195 type 0 [Mon Dec 9 06:17:46 2019][ 41.679092] sd 1:0:195:0: Attached scsi generic sg196 type 0 [Mon Dec 9 06:17:46 2019][ 41.684847] sd 1:0:196:0: Attached scsi generic sg197 type 0 [Mon Dec 9 06:17:46 2019][ 41.690598] sd 1:0:197:0: Attached scsi generic sg198 type 0 [Mon Dec 9 06:17:46 2019][ 41.696312] sd 1:0:198:0: Attached scsi generic sg199 type 0 [Mon Dec 9 06:17:46 2019][ 41.702041] sd 1:0:199:0: Attached scsi generic sg200 type 0 [Mon Dec 9 06:17:46 2019][ 41.707748] sd 1:0:200:0: Attached scsi generic sg201 type 0 [Mon Dec 9 06:17:46 2019][ 41.713454] sd 1:0:201:0: Attached scsi generic sg202 type 0 [Mon Dec 9 06:17:46 2019][ 41.719176] sd 1:0:202:0: Attached scsi generic sg203 type 0 [Mon Dec 9 06:17:46 2019][ 41.724884] sd 1:0:203:0: Attached scsi generic sg204 type 0 [Mon Dec 9 06:17:46 2019][ 41.730599] sd 1:0:204:0: Attached scsi generic sg205 type 0 [Mon Dec 9 06:17:46 2019][ 41.736299] sd 1:0:205:0: Attached scsi generic sg206 type 0 [Mon Dec 9 06:17:46 2019][ 41.742022] sd 1:0:206:0: Attached scsi generic sg207 type 0 [Mon Dec 9 06:17:46 2019][ 41.747725] sd 1:0:207:0: Attached scsi generic sg208 type 0 [Mon Dec 9 06:17:46 2019][ 41.753444] sd 1:0:208:0: Attached scsi generic sg209 type 0 [Mon Dec 9 06:17:46 2019][ 41.759144] sd 1:0:209:0: Attached scsi generic sg210 type 0 [Mon Dec 9 06:17:46 2019][ 41.764853] sd 1:0:210:0: Attached scsi generic sg211 type 0 [Mon Dec 9 06:17:46 2019][ 41.770549] sd 1:0:211:0: Attached scsi generic sg212 type 0 [Mon Dec 9 06:17:46 2019][ 41.776269] sd 1:0:212:0: Attached scsi generic sg213 type 0 [Mon Dec 9 06:17:47 2019][ 41.781975] sd 1:0:213:0: Attached scsi generic sg214 type 0 [Mon Dec 9 06:17:47 2019][ 41.787692] sd 1:0:214:0: Attached scsi generic sg215 type 0 [Mon Dec 9 06:17:47 2019][ 41.793400] sd 1:0:215:0: Attached scsi generic sg216 type 0 [Mon Dec 9 06:17:47 2019][ 41.799104] sd 1:0:216:0: Attached scsi generic sg217 type 0 [Mon Dec 9 06:17:47 2019][ 41.804818] sd 1:0:217:0: Attached scsi generic sg218 type 0 [Mon Dec 9 06:17:47 2019][ 41.810527] sd 1:0:218:0: Attached scsi generic sg219 type 0 [Mon Dec 9 06:17:47 2019][ 41.816225] sd 1:0:219:0: Attached scsi generic sg220 type 0 [Mon Dec 9 06:17:47 2019][ 41.821930] sd 1:0:220:0: Attached scsi generic sg221 type 0 [Mon Dec 9 06:17:47 2019][ 41.827631] sd 1:0:221:0: Attached scsi generic sg222 type 0 [Mon Dec 9 06:17:47 2019][ 41.833351] sd 1:0:222:0: Attached scsi generic sg223 type 0 [Mon Dec 9 06:17:47 2019][ 41.839059] sd 1:0:223:0: Attached scsi generic sg224 type 0 [Mon Dec 9 06:17:47 2019][ 41.844775] sd 1:0:224:0: Attached scsi generic sg225 type 0 [Mon Dec 9 06:17:47 2019][ 41.850480] sd 1:0:225:0: Attached scsi generic sg226 type 0 [Mon Dec 9 06:17:47 2019][ 41.856185] sd 1:0:226:0: Attached scsi generic sg227 type 0 [Mon Dec 9 06:17:47 2019][ 41.861897] sd 1:0:227:0: Attached scsi generic sg228 type 0 [Mon Dec 9 06:17:47 2019][ 41.867613] sd 1:0:228:0: Attached scsi generic sg229 type 0 [Mon Dec 9 06:17:47 2019][ 41.873314] sd 1:0:229:0: Attached scsi generic sg230 type 
0 [Mon Dec 9 06:17:47 2019][ 41.879034] sd 1:0:230:0: Attached scsi generic sg231 type 0 [Mon Dec 9 06:17:47 2019][ 41.884735] sd 1:0:231:0: Attached scsi generic sg232 type 0 [Mon Dec 9 06:17:47 2019][ 41.890443] sd 1:0:232:0: Attached scsi generic sg233 type 0 [Mon Dec 9 06:17:47 2019][ 41.896143] sd 1:0:233:0: Attached scsi generic sg234 type 0 [Mon Dec 9 06:17:47 2019][ 41.901849] sd 1:0:234:0: Attached scsi generic sg235 type 0 [Mon Dec 9 06:17:47 2019][ 41.907551] sd 1:0:235:0: Attached scsi generic sg236 type 0 [Mon Dec 9 06:17:47 2019][ 41.913257] sd 1:0:236:0: Attached scsi generic sg237 type 0 [Mon Dec 9 06:17:47 2019][ 41.918971] sd 1:0:237:0: Attached scsi generic sg238 type 0 [Mon Dec 9 06:17:47 2019][ 41.924675] sd 1:0:238:0: Attached scsi generic sg239 type 0 [Mon Dec 9 06:17:47 2019][ 41.930382] sd 1:0:239:0: Attached scsi generic sg240 type 0 [Mon Dec 9 06:17:47 2019][ 41.936084] sd 1:0:240:0: Attached scsi generic sg241 type 0 [Mon Dec 9 06:17:47 2019][ 41.941798] sd 1:0:241:0: Attached scsi generic sg242 type 0 [Mon Dec 9 06:17:47 2019][ 41.947507] sd 1:0:242:0: Attached scsi generic sg243 type 0 [Mon Dec 9 06:17:47 2019][ 41.953204] sd 1:0:243:0: Attached scsi generic sg244 type 0 [Mon Dec 9 06:17:47 2019][ 41.958911] sd 1:0:244:0: Attached scsi generic sg245 type 0 [Mon Dec 9 06:17:49 2019][ 44.574939] device-mapper: multipath service-time: version 0.3.0 loaded [Mon Dec 9 06:17:54 2019][ 49.035616] ses 1:0:0:0: Attached Enclosure device [Mon Dec 9 06:17:54 2019][ 49.047420] ses 1:0:1:0: Attached Enclosure device [Mon Dec 9 06:17:54 2019][ 49.052325] ses 1:0:62:0: Attached Enclosure device [Mon Dec 9 06:17:54 2019][ 49.057324] ses 1:0:123:0: Attached Enclosure device [Mon Dec 9 06:17:54 2019][ 49.062417] ses 1:0:184:0: Attached Enclosure device [Mon Dec 9 06:17:54 2019][ 49.555777] input: PC Speaker as /devices/platform/pcspkr/input/input2 [Mon Dec 9 06:17:54 2019][ OK ] Found device /dev/ttyS0. [Mon Dec 9 06:17:54 2019][ 49.668905] cryptd: max_cpu_qlen set to 1000 [Mon Dec 9 06:17:54 2019][ 49.707892] AVX2 version of gcm_enc/dec engaged. [Mon Dec 9 06:17:54 2019][ 49.712536] AES CTR mode by8 optimization enabled [Mon Dec 9 06:17:54 2019][ 49.720922] alg: No test for __gcm-aes-aesni (__driver-gcm-aes-aesni) [Mon Dec 9 06:17:54 2019][ 49.727517] alg: No test for __generic-gcm-aes-aesni (__driver-generic-gcm-aes-aesni) [Mon Dec 9 06:17:55 2019][ 49.819024] kvm: Nested Paging enabled [Mon Dec 9 06:17:55 2019][ 49.827753] MCE: In-kernel MCE decoding enabled. [Mon Dec 9 06:17:55 2019][ 49.836456] AMD64 EDAC driver v3.4.0 [Mon Dec 9 06:17:55 2019][ 49.840068] EDAC amd64: DRAM ECC enabled. [Mon Dec 9 06:17:55 2019][ 49.844102] EDAC amd64: F17h detected (node 0). [Mon Dec 9 06:17:55 2019][ 49.848707] EDAC amd64: MC: 0: 0MB 1: 0MB [Mon Dec 9 06:17:55 2019][ 49.853424] EDAC amd64: MC: 2: 16383MB 3: 16383MB [Mon Dec 9 06:17:55 2019][ 49.858146] EDAC amd64: MC: 4: 0MB 5: 0MB [Mon Dec 9 06:17:55 2019][ 49.862864] EDAC amd64: MC: 6: 0MB 7: 0MB [Mon Dec 9 06:17:55 2019][ 49.867586] EDAC amd64: MC: 0: 0MB 1: 0MB [Mon Dec 9 06:17:55 2019][ 49.872292] EDAC amd64: MC: 2: 16383MB 3: 16383MB [Mon Dec 9 06:17:55 2019][ 49.877011] EDAC amd64: MC: 4: 0MB 5: 0MB [Mon Dec 9 06:17:55 2019][ 49.881722] EDAC amd64: MC: 6: 0MB 7: 0MB [Mon Dec 9 06:17:55 2019][ 49.886438] EDAC amd64: using x8 syndromes. 
[Mon Dec 9 06:17:55 2019][ 49.890635] EDAC amd64: MCT channel count: 2
[Mon Dec 9 06:17:55 2019][ 49.901276] EDAC MC0: Giving out device to 'amd64_edac' 'F17h': DEV 0000:00:18.3
[Mon Dec 9 06:17:55 2019][ 49.908683] EDAC amd64: DRAM ECC enabled.
[Mon Dec 9 06:17:55 2019][ 49.912700] EDAC amd64: F17h detected (node 1).
[Mon Dec 9 06:17:55 2019][ 49.917287] EDAC amd64: MC: 0: 0MB 1: 0MB
[Mon Dec 9 06:17:55 2019][ 49.921999] EDAC amd64: MC: 2: 16383MB 3: 16383MB
[Mon Dec 9 06:17:55 2019][ 49.926714] EDAC amd64: MC: 4: 0MB 5: 0MB
[Mon Dec 9 06:17:55 2019][ 49.931431] EDAC amd64: MC: 6: 0MB 7: 0MB
[Mon Dec 9 06:17:55 2019][ 49.936151] EDAC amd64: MC: 0: 0MB 1: 0MB
[Mon Dec 9 06:17:55 2019][ 49.940867] EDAC amd64: MC: 2: 16383MB 3: 16383MB
[Mon Dec 9 06:17:55 2019][ 49.945581] EDAC amd64: MC: 4: 0MB 5: 0MB
[Mon Dec 9 06:17:55 2019][ 49.950294] EDAC amd64: MC: 6: 0MB 7: 0MB
[Mon Dec 9 06:17:55 2019][ 49.955007] EDAC amd64: using x8 syndromes.
[Mon Dec 9 06:17:55 2019][ 49.959199] EDAC amd64: MCT channel count: 2
[Mon Dec 9 06:17:55 2019][ 49.970360] EDAC MC1: Giving out device to 'amd64_edac' 'F17h': DEV 0000:00:19.3
[Mon Dec 9 06:17:55 2019][ 49.978033] EDAC amd64: DRAM ECC enabled.
[Mon Dec 9 06:17:55 2019][ 49.982056] EDAC amd64: F17h detected (node 2).
[Mon Dec 9 06:17:55 2019][ 49.986652] EDAC amd64: MC: 0: 0MB 1: 0MB
[Mon Dec 9 06:17:55 2019][ 49.991446] EDAC amd64: MC: 2: 16383MB 3: 16383MB
[Mon Dec 9 06:17:55 2019][ 49.996186] EDAC amd64: MC: 4: 0MB 5: 0MB
[Mon Dec 9 06:17:55 2019][ 50.000908] EDAC amd64: MC: 6: 0MB 7: 0MB
[Mon Dec 9 06:17:55 2019][ 50.005640] EDAC amd64: MC: 0: 0MB 1: 0MB
[Mon Dec 9 06:17:55 2019][ 50.010370] EDAC amd64: MC: 2: 16383MB 3: 16383MB
[Mon Dec 9 06:17:55 2019][ 50.015080] EDAC amd64: MC: 4: 0MB 5: 0MB
[Mon Dec 9 06:17:55 2019][ 50.019795] EDAC amd64: MC: 6: 0MB 7: 0MB
[Mon Dec 9 06:17:55 2019][ 50.024508] EDAC amd64: using x8 syndromes.
[Mon Dec 9 06:17:55 2019][ 50.028706] EDAC amd64: MCT channel count: 2
[Mon Dec 9 06:17:55 2019][ 50.036602] EDAC MC2: Giving out device to 'amd64_edac' 'F17h': DEV 0000:00:1a.3
[Mon Dec 9 06:17:55 2019][ 50.044009] EDAC amd64: DRAM ECC enabled.
[Mon Dec 9 06:17:55 2019][ 50.048034] EDAC amd64: F17h detected (node 3).
[Mon Dec 9 06:17:55 2019][ 50.052633] EDAC amd64: MC: 0: 0MB 1: 0MB
[Mon Dec 9 06:17:55 2019][ 50.057352] EDAC amd64: MC: 2: 16383MB 3: 16383MB
[Mon Dec 9 06:17:55 2019][ 50.062065] EDAC amd64: MC: 4: 0MB 5: 0MB
[Mon Dec 9 06:17:55 2019][ 50.066778] EDAC amd64: MC: 6: 0MB 7: 0MB
[Mon Dec 9 06:17:55 2019][ 50.071498] EDAC amd64: MC: 0: 0MB 1: 0MB
[Mon Dec 9 06:17:55 2019][ 50.076207] EDAC amd64: MC: 2: 16383MB 3: 16383MB
[Mon Dec 9 06:17:55 2019][ 50.080926] EDAC amd64: MC: 4: 0MB 5: 0MB
[Mon Dec 9 06:17:55 2019][ 50.085638] EDAC amd64: MC: 6: 0MB 7: 0MB
[Mon Dec 9 06:17:55 2019][ 50.090351] EDAC amd64: using x8 syndromes.
[Mon Dec 9 06:17:55 2019][ 50.094550] EDAC amd64: MCT channel count: 2
[Mon Dec 9 06:17:55 2019][ 50.105189] EDAC MC3: Giving out device to 'amd64_edac' 'F17h': DEV 0000:00:1b.3
[Mon Dec 9 06:17:55 2019][ 50.112934] EDAC PCI0: Giving out device to module 'amd64_edac' controller 'EDAC PCI controller': DEV '0000:00:18.0' (POLLED)
[Mon Dec 9 06:17:56 2019][ 50.843845] dcdbas dcdbas: Dell Systems Management Base Driver (version 5.6.0-3.3)
[Mon Dec 9 06:17:56 2019][ OK ] Found device PERC_H330_Mini EFI\x20System\x20Partition.
[Mon Dec 9 06:18:20 2019] [ 75.482153] Adding 4194300k swap on /dev/sda3. Priority:-2 extents:1 across:4194300k FS
[Mon Dec 9 06:18:20 2019] Mounting /boot/efi...
[Mon Dec 9 06:18:20 2019][ OK ] Found device PERC_H330_Mini 3.
[Mon Dec 9 06:18:20 2019] Activating swap /dev/disk/by-uuid/4...7-253b-4b35-98bd-0ebd94f347e5...
[Mon Dec 9 06:18:20 2019][ OK ] Activated swap /dev/disk/by-uuid/401ce0e7-253b-4b35-98bd-0ebd94f347e5.
[Mon Dec 9 06:18:20 2019][ OK ] Reached target Swap.
[Mon Dec 9 06:18:20 2019][ OK ] Mounted /boot/efi.
[Mon Dec 9 06:18:20 2019][ OK ] Reached target Local File Systems.
[Mon Dec 9 06:18:20 2019][ 75.523442] type=1305 audit(1575901100.013:3): audit_pid=49307 old=0 auid=4294967295 ses=4294967295 res=1
[Mon Dec 9 06:18:20 2019] Starting Preprocess NFS configuration...
[Mon Dec 9 06:18:20 2019] Starting Tell Plymouth To Write Out Runtime Data...
[Mon Dec 9 06:18:20 2019] Starting Import network configuration from initramfs...
[Mon Dec 9 06:18:20 2019][ 75.545883] RPC: Registered named UNIX socket transport module.
[Mon Dec 9 06:18:20 2019][ 75.552220] RPC: Registered udp transport module.
[Mon Dec 9 06:18:20 2019][ 75.558315] RPC: Registered tcp transport module.
[Mon Dec 9 06:18:20 2019][ 75.564406] RPC: Registered tcp NFSv4.1 backchannel transport module.
[Mon Dec 9 06:18:20 2019][ OK ] Started Preprocess NFS configuration.
[Mon Dec 9 06:18:20 2019][ OK ] Started Tell Plymouth To Write Out Runtime Data.
[Mon Dec 9 06:18:20 2019][ OK ] Started Import network configuration from initramfs.
[Mon Dec 9 06:18:20 2019] Starting Create Volatile Files and Directories...
[Mon Dec 9 06:18:20 2019][ OK ] Started Create Volatile Files and Directories.
[Mon Dec 9 06:18:20 2019] Starting Security Auditing Service...
[Mon Dec 9 06:18:20 2019] Mounting RPC Pipe File System...
[Mon Dec 9 06:18:20 2019][ OK ] Mounted RPC Pipe File System.
[Mon Dec 9 06:18:20 2019][ OK ] Reached target rpc_pipefs.target.
[Mon Dec 9 06:18:20 2019][ OK ] Started Security Auditing Service.
[Mon Dec 9 06:18:20 2019] Starting Update UTMP about System Boot/Shutdown...
[Mon Dec 9 06:18:20 2019][ OK ] Started Update UTMP about System Boot/Shutdown.
[Mon Dec 9 06:18:20 2019][ OK ] Reached target System Initialization.
[Mon Dec 9 06:18:20 2019] Starting openibd - configure Mellanox devices...
[Mon Dec 9 06:18:20 2019][ OK ] Listening on D-Bus System Message Bus Socket.
[Mon Dec 9 06:18:20 2019][ OK ] Started Daily Cleanup of Temporary Directories.
[Mon Dec 9 06:18:20 2019][ OK ] Reached target Timers.
[Mon Dec 9 06:18:20 2019][ OK ] Listening on RPCbind Server Activation Socket.
[Mon Dec 9 06:18:20 2019][ OK ] Reached target Sockets.
[Mon Dec 9 06:18:20 2019][ OK ] Reached target Basic System.
[Mon Dec 9 06:18:20 2019] Starting Load CPU microcode update...
[Mon Dec 9 06:18:20 2019] Starting Systems Management Event Management...
[Mon Dec 9 06:18:20 2019] Starting Resets System Activity Logs...
[Mon Dec 9 06:18:20 2019][ OK ] Started D-Bus System Message Bus.
[Mon Dec 9 06:18:20 2019] Starting Authorization Manager...
[Mon Dec 9 06:18:20 2019] Starting Software RAID monitoring and management...
[Mon Dec 9 06:18:20 2019] Starting System Security Services Daemon...
[Mon Dec 9 06:18:20 2019] Starting NTP client/server...
[Mon Dec 9 06:18:20 2019][ OK ] Started Self Monitoring and Reporting Technology (SMART) Daemon.
[Mon Dec 9 06:18:20 2019] Starting Systems Management Device Drivers...
[Mon Dec 9 06:18:20 2019] Starting GSSAPI Proxy Daemon...
[Mon Dec 9 06:18:20 2019][ OK ] Started irqbalance daemon.
[Mon Dec 9 06:18:20 2019] Starting Dump dmesg to /var/log/dmesg...
[Mon Dec 9 06:18:20 2019] Starting RPC bind service...
[Mon Dec 9 06:18:20 2019][ OK ] Started Systems Management Event Management.
[Mon Dec 9 06:18:20 2019][ OK ] Started Software RAID monitoring and management.
[Mon Dec 9 06:18:20 2019][ OK ] Started Resets System Activity Logs.
[Mon Dec 9 06:18:20 2019][ OK ] Started GSSAPI Proxy Daemon.
[Mon Dec 9 06:18:21 2019][ OK ] Reached target NFS client services.
[Mon Dec 9 06:18:21 2019][ OK ] Started Authorization Manager.
[Mon Dec 9 06:18:21 2019][ OK ] Started RPC bind service.
[Mon Dec 9 06:18:21 2019][ OK ] Started Load CPU microcode update.
[Mon Dec 9 06:18:21 2019][ OK ] Started Dump dmesg to /var/log/dmesg.
[Mon Dec 9 06:18:21 2019][ OK ] Started NTP client/server.
[Mon Dec 9 06:18:21 2019][ OK ] Started System Security Services Daemon.
[Mon Dec 9 06:18:21 2019][ OK ] Reached target User and Group Name Lookups.
[Mon Dec 9 06:18:21 2019] Starting Login Service...
[Mon Dec 9 06:18:21 2019][ OK ] Started Login Service.
[Mon Dec 9 06:18:21 2019][ 76.205183] mlx5_core 0000:01:00.0: slow_pci_heuristic:5575:(pid 49586): Max link speed = 100000, PCI BW = 126016
[Mon Dec 9 06:18:21 2019][ 76.215508] mlx5_core 0000:01:00.0: MLX5E: StrdRq(0) RqSz(1024) StrdSz(256) RxCqeCmprss(0)
[Mon Dec 9 06:18:21 2019][ 76.223801] mlx5_core 0000:01:00.0: MLX5E: StrdRq(0) RqSz(1024) StrdSz(256) RxCqeCmprss(0)
[Mon Dec 9 06:18:21 2019][ OK ] Created slice system-mlnx_interface_mgr.slice.
[Mon Dec 9 06:18:21 2019][ OK ] Started mlnx_interface_mgr - configure ib0.
[Mon Dec 9 06:18:21 2019][ OK ] Started openibd - configure Mellanox devices.
[Mon Dec 9 06:18:21 2019][ OK ] Reached target Remote File Systems (Pre).
[Mon Dec 9 06:18:21 2019] Starting LSB: Bring up/down networking...
[Mon Dec 9 06:18:29 2019][ OK ] Started Systems Management Device Drivers.
[Mon Dec 9 06:18:29 2019] Starting Systems Management Data Engine...
[Mon Dec 9 06:18:29 2019][ 76.810728] IPv6: ADDRCONF(NETDEV_UP): em1: link is not ready
[Mon Dec 9 06:18:29 2019][ 80.343797] tg3 0000:81:00.0 em1: Link is up at 1000 Mbps, full duplex
[Mon Dec 9 06:18:29 2019][ 80.350338] tg3 0000:81:00.0 em1: Flow control is on for TX and on for RX
[Mon Dec 9 06:18:29 2019][ 80.357183] tg3 0000:81:00.0 em1: EEE is enabled
[Mon Dec 9 06:18:29 2019][ 80.361816] IPv6: ADDRCONF(NETDEV_CHANGE): em1: link becomes ready
[Mon Dec 9 06:18:29 2019][ 81.170910] IPv6: ADDRCONF(NETDEV_UP): ib0: link is not ready
[Mon Dec 9 06:18:29 2019][ 81.444140] IPv6: ADDRCONF(NETDEV_CHANGE): ib0: link becomes ready
[Mon Dec 9 06:18:30 2019][ OK ] Started LSB: Bring up/down networking.
[Mon Dec 9 06:18:30 2019][ OK ] Reached target Network.
[Mon Dec 9 06:18:30 2019][ OK ] Reached target Network is Online.
[Mon Dec 9 06:18:30 2019] Starting Notify NFS peers of a restart...
[Mon Dec 9 06:18:30 2019][ 85.577696] FS-Cache: Loaded
[Mon Dec 9 06:18:30 2019] Starting Collectd statistics daemon...
[Mon Dec 9 06:18:30 2019] Starting Postfix Mail Transport Agent...
[Mon Dec 9 06:18:30 2019] Starting Dynamic System Tuning Daemon...
[Mon Dec 9 06:18:30 2019] Starting System Logging Service...
[Mon Dec 9 06:18:30 2019] Mounting /share...
[Mon Dec 9 06:18:30 2019] Starting OpenSSH server daemon...
[Mon Dec 9 06:18:30 2019][ 85.608522] FS-Cache: Netfs 'nfs' registered for caching
[Mon Dec 9 06:18:30 2019][ OK ] Started Notify NFS peers of a restart.
[Mon Dec 9 06:18:30 2019][ 85.618255] Key type dns_resolver registered
[Mon Dec 9 06:18:30 2019][ OK ] Started Collectd statistics daemon.
[Mon Dec 9 06:18:30 2019][ OK ] Started System Logging Service.
[Mon Dec 9 06:18:30 2019] Starting xcat service on compute no...script and update node status...
[Mon Dec 9 06:18:30 2019][ OK ] Started OpenSSH server daemon.
[Mon Dec 9 06:18:30 2019][ OK ] Started xcat service on compute nod...otscript and update node status.
[Mon Dec 9 06:18:30 2019][ 85.646616] NFS: Registering the id_resolver key type
[Mon Dec 9 06:18:30 2019][ 85.651939] Key type id_resolver registered
[Mon Dec 9 06:18:30 2019][ 85.657486] Key type id_legacy registered
[Mon Dec 9 06:18:30 2019][ OK ] Mounted /share.
[Mon Dec 9 06:18:30 2019][ OK ] Reached target Remote File Systems.
[Mon Dec 9 06:18:30 2019] Starting Crash recovery kernel arming...
[Mon Dec 9 06:18:30 2019] Starting Permit User Sessions...
[Mon Dec 9 06:18:30 2019][ OK ] Started Permit User Sessions.
[Mon Dec 9 06:18:30 2019][ OK ] Started Lookout metrics collector.
[Mon Dec 9 06:18:30 2019] Starting Terminate Plymouth Boot Screen...
[Mon Dec 9 06:18:30 2019][ OK ] Started Command Scheduler.
[Mon Dec 9 06:18:30 2019] Starting Wait for Plymouth Boot Screen to Quit...
[Mon Dec 9 06:18:36 2019]
[Mon Dec 9 06:18:36 2019]CentOS Linux 7 (Core)
[Mon Dec 9 06:18:36 2019]Kernel 3.10.0-957.27.2.el7_lustre.pl2.x86_64 on an x86_64
[Mon Dec 9 06:18:36 2019]
[Mon Dec 9 06:18:36 2019]fir-io8-s1 login: [ 191.261132] mpt3sas_cm0: log_info(0x31200205): originator(PL), code(0x20), sub_code(0x0205)
[Mon Dec 9 06:22:08 2019][ 303.048704] LNet: HW NUMA nodes: 4, HW CPU cores: 48, npartitions: 4
[Mon Dec 9 06:22:08 2019][ 303.056201] alg: No test for adler32 (adler32-zlib)
[Mon Dec 9 06:22:09 2019][ 303.856304] Lustre: Lustre: Build Version: 2.12.3_4_g142b4d4
[Mon Dec 9 06:22:09 2019][ 303.961403] LNet: 63766:0:(config.c:1627:lnet_inet_enumerate()) lnet: Ignoring interface em2: it's down
[Mon Dec 9 06:22:09 2019][ 303.971183] LNet: Using FastReg for registration
[Mon Dec 9 06:22:09 2019][ 303.988081] LNet: Added LNI 10.0.10.115@o2ib7 [8/256/0/180]
[Mon Dec 9 06:52:17 2019][ 2112.595584] md: md6 stopped.
[Mon Dec 9 06:52:17 2019][ 2112.606902] async_tx: api initialized (async) [Mon Dec 9 06:52:17 2019][ 2112.613260] xor: automatically using best checksumming function: [Mon Dec 9 06:52:17 2019][ 2112.628302] avx : 9596.000 MB/sec [Mon Dec 9 06:52:17 2019][ 2112.661307] raid6: sse2x1 gen() 6082 MB/s [Mon Dec 9 06:52:17 2019][ 2112.682304] raid6: sse2x2 gen() 11304 MB/s [Mon Dec 9 06:52:17 2019][ 2112.703304] raid6: sse2x4 gen() 12933 MB/s [Mon Dec 9 06:52:17 2019][ 2112.724303] raid6: avx2x1 gen() 14250 MB/s [Mon Dec 9 06:52:17 2019][ 2112.745309] raid6: avx2x2 gen() 18863 MB/s [Mon Dec 9 06:52:17 2019][ 2112.766304] raid6: avx2x4 gen() 18812 MB/s [Mon Dec 9 06:52:17 2019][ 2112.770579] raid6: using algorithm avx2x2 gen() (18863 MB/s) [Mon Dec 9 06:52:17 2019][ 2112.776241] raid6: using avx2x2 recovery algorithm [Mon Dec 9 06:52:17 2019][ 2112.797874] md/raid:md6: device dm-1 operational as raid disk 0 [Mon Dec 9 06:52:17 2019][ 2112.803814] md/raid:md6: device dm-34 operational as raid disk 9 [Mon Dec 9 06:52:17 2019][ 2112.809838] md/raid:md6: device dm-54 operational as raid disk 8 [Mon Dec 9 06:52:17 2019][ 2112.815854] md/raid:md6: device dm-16 operational as raid disk 7 [Mon Dec 9 06:52:17 2019][ 2112.821866] md/raid:md6: device dm-6 operational as raid disk 6 [Mon Dec 9 06:52:18 2019][ 2112.827795] md/raid:md6: device dm-42 operational as raid disk 5 [Mon Dec 9 06:52:18 2019][ 2112.833812] md/raid:md6: device dm-26 operational as raid disk 4 [Mon Dec 9 06:52:18 2019][ 2112.839823] md/raid:md6: device dm-17 operational as raid disk 3 [Mon Dec 9 06:52:18 2019][ 2112.845844] md/raid:md6: device dm-5 operational as raid disk 2 [Mon Dec 9 06:52:18 2019][ 2112.851766] md/raid:md6: device dm-43 operational as raid disk 1 [Mon Dec 9 06:52:18 2019][ 2112.858918] md/raid:md6: raid level 6 active with 10 out of 10 devices, algorithm 2 [Mon Dec 9 06:52:18 2019][ 2112.898947] md6: detected capacity change from 0 to 64011422924800 [Mon Dec 9 06:52:18 2019][ 2112.916672] md: md8 stopped. [Mon Dec 9 06:52:18 2019][ 2112.928560] md/raid:md8: device dm-44 operational as raid disk 0 [Mon Dec 9 06:52:18 2019][ 2112.934581] md/raid:md8: device dm-46 operational as raid disk 9 [Mon Dec 9 06:52:18 2019][ 2112.940592] md/raid:md8: device dm-23 operational as raid disk 8 [Mon Dec 9 06:52:18 2019][ 2112.946613] md/raid:md8: device dm-33 operational as raid disk 7 [Mon Dec 9 06:52:18 2019][ 2112.952647] md/raid:md8: device dm-45 operational as raid disk 6 [Mon Dec 9 06:52:18 2019][ 2112.958684] md/raid:md8: device dm-4 operational as raid disk 5 [Mon Dec 9 06:52:18 2019][ 2112.964624] md/raid:md8: device dm-8 operational as raid disk 4 [Mon Dec 9 06:52:18 2019][ 2112.970570] md/raid:md8: device dm-25 operational as raid disk 3 [Mon Dec 9 06:52:18 2019][ 2112.976615] md/raid:md8: device dm-21 operational as raid disk 2 [Mon Dec 9 06:52:18 2019][ 2112.982625] md/raid:md8: device dm-53 operational as raid disk 1 [Mon Dec 9 06:52:18 2019][ 2112.989407] md/raid:md8: raid level 6 active with 10 out of 10 devices, algorithm 2 [Mon Dec 9 06:52:18 2019][ 2113.025969] md8: detected capacity change from 0 to 64011422924800 [Mon Dec 9 06:52:18 2019][ 2113.055680] md: md4 stopped. 
[Mon Dec 9 06:52:18 2019][ 2113.067145] md/raid:md4: device dm-116 operational as raid disk 0 [Mon Dec 9 06:52:18 2019][ 2113.073272] md/raid:md4: device dm-100 operational as raid disk 9 [Mon Dec 9 06:52:18 2019][ 2113.079372] md/raid:md4: device dm-107 operational as raid disk 8 [Mon Dec 9 06:52:18 2019][ 2113.085471] md/raid:md4: device dm-94 operational as raid disk 7 [Mon Dec 9 06:52:18 2019][ 2113.091486] md/raid:md4: device dm-84 operational as raid disk 6 [Mon Dec 9 06:52:18 2019][ 2113.097499] md/raid:md4: device dm-76 operational as raid disk 5 [Mon Dec 9 06:52:18 2019][ 2113.103513] md/raid:md4: device dm-83 operational as raid disk 4 [Mon Dec 9 06:52:18 2019][ 2113.109531] md/raid:md4: device dm-66 operational as raid disk 3 [Mon Dec 9 06:52:18 2019][ 2113.115547] md/raid:md4: device dm-69 operational as raid disk 2 [Mon Dec 9 06:52:18 2019][ 2113.121564] md/raid:md4: device dm-117 operational as raid disk 1 [Mon Dec 9 06:52:18 2019][ 2113.128560] md/raid:md4: raid level 6 active with 10 out of 10 devices, algorithm 2 [Mon Dec 9 06:52:18 2019][ 2113.169475] md4: detected capacity change from 0 to 64011422924800 [Mon Dec 9 06:52:18 2019][ 2113.196823] md: md0 stopped. [Mon Dec 9 06:52:18 2019][ 2113.210826] md/raid:md0: device dm-60 operational as raid disk 0 [Mon Dec 9 06:52:18 2019][ 2113.216844] md/raid:md0: device dm-95 operational as raid disk 9 [Mon Dec 9 06:52:18 2019][ 2113.222857] md/raid:md0: device dm-91 operational as raid disk 8 [Mon Dec 9 06:52:18 2019][ 2113.228876] md/raid:md0: device dm-80 operational as raid disk 7 [Mon Dec 9 06:52:18 2019][ 2113.234891] md/raid:md0: device dm-88 operational as raid disk 6 [Mon Dec 9 06:52:18 2019][ 2113.240903] md/raid:md0: device dm-65 operational as raid disk 5 [Mon Dec 9 06:52:18 2019][ 2113.246915] md/raid:md0: device dm-64 operational as raid disk 4 [Mon Dec 9 06:52:18 2019][ 2113.252934] md/raid:md0: device dm-89 operational as raid disk 3 [Mon Dec 9 06:52:18 2019][ 2113.258949] md/raid:md0: device dm-74 operational as raid disk 2 [Mon Dec 9 06:52:18 2019][ 2113.264965] md/raid:md0: device dm-104 operational as raid disk 1 [Mon Dec 9 06:52:18 2019][ 2113.271903] md/raid:md0: raid level 6 active with 10 out of 10 devices, algorithm 2 [Mon Dec 9 06:52:18 2019][ 2113.308972] md0: detected capacity change from 0 to 64011422924800 [Mon Dec 9 06:52:18 2019][ 2113.341960] md: md10 stopped. 
[Mon Dec 9 06:52:18 2019][ 2113.354658] md/raid:md10: device dm-58 operational as raid disk 0 [Mon Dec 9 06:52:18 2019][ 2113.360759] md/raid:md10: device dm-18 operational as raid disk 9 [Mon Dec 9 06:52:18 2019][ 2113.366857] md/raid:md10: device dm-57 operational as raid disk 8 [Mon Dec 9 06:52:18 2019][ 2113.372959] md/raid:md10: device dm-15 operational as raid disk 7 [Mon Dec 9 06:52:18 2019][ 2113.379058] md/raid:md10: device dm-7 operational as raid disk 6 [Mon Dec 9 06:52:18 2019][ 2113.385073] md/raid:md10: device dm-27 operational as raid disk 5 [Mon Dec 9 06:52:18 2019][ 2113.391173] md/raid:md10: device dm-40 operational as raid disk 4 [Mon Dec 9 06:52:18 2019][ 2113.397273] md/raid:md10: device dm-28 operational as raid disk 3 [Mon Dec 9 06:52:18 2019][ 2113.403369] md/raid:md10: device dm-3 operational as raid disk 2 [Mon Dec 9 06:52:18 2019][ 2113.409384] md/raid:md10: device dm-56 operational as raid disk 1 [Mon Dec 9 06:52:18 2019][ 2113.416142] md/raid:md10: raid level 6 active with 10 out of 10 devices, algorithm 2 [Mon Dec 9 06:52:18 2019][ 2113.466385] md10: detected capacity change from 0 to 64011422924800 [Mon Dec 9 06:52:18 2019][ 2113.495745] md: md2 stopped. [Mon Dec 9 06:52:18 2019][ 2113.508936] md/raid:md2: device dm-119 operational as raid disk 0 [Mon Dec 9 06:52:18 2019][ 2113.515039] md/raid:md2: device dm-99 operational as raid disk 9 [Mon Dec 9 06:52:18 2019][ 2113.521056] md/raid:md2: device dm-114 operational as raid disk 8 [Mon Dec 9 06:52:18 2019][ 2113.527163] md/raid:md2: device dm-79 operational as raid disk 7 [Mon Dec 9 06:52:18 2019][ 2113.533173] md/raid:md2: device dm-86 operational as raid disk 6 [Mon Dec 9 06:52:18 2019][ 2113.539190] md/raid:md2: device dm-77 operational as raid disk 5 [Mon Dec 9 06:52:18 2019][ 2113.545206] md/raid:md2: device dm-73 operational as raid disk 4 [Mon Dec 9 06:52:18 2019][ 2113.551224] md/raid:md2: device dm-101 operational as raid disk 3 [Mon Dec 9 06:52:18 2019][ 2113.557329] md/raid:md2: device dm-105 operational as raid disk 2 [Mon Dec 9 06:52:18 2019][ 2113.563436] md/raid:md2: device dm-106 operational as raid disk 1 [Mon Dec 9 06:52:18 2019][ 2113.570345] md/raid:md2: raid level 6 active with 10 out of 10 devices, algorithm 2 [Mon Dec 9 06:52:18 2019][ 2113.592487] md2: detected capacity change from 0 to 64011422924800 [Mon Dec 9 06:52:18 2019][ 2113.817580] LDISKFS-fs (md6): file extents enabled, maximum tree depth=5 [Mon Dec 9 06:52:19 2019][ 2113.946595] LDISKFS-fs (md8): file extents enabled, maximum tree depth=5 [Mon Dec 9 06:52:19 2019][ 2114.147645] LDISKFS-fs (md6): mounted filesystem with ordered data mode. Opts: errors=remount-ro,no_mbcache,nodelalloc [Mon Dec 9 06:52:19 2019][ 2114.202582] LDISKFS-fs (md4): file extents enabled, maximum tree depth=5 [Mon Dec 9 06:52:19 2019][ 2114.328693] LDISKFS-fs (md0): file extents enabled, maximum tree depth=5 [Mon Dec 9 06:52:19 2019][ 2114.342963] LDISKFS-fs (md8): mounted filesystem with ordered data mode. Opts: errors=remount-ro,no_mbcache,nodelalloc [Mon Dec 9 06:52:19 2019][ 2114.534044] LDISKFS-fs (md4): mounted filesystem with ordered data mode. Opts: errors=remount-ro,no_mbcache,nodelalloc [Mon Dec 9 06:52:19 2019][ 2114.659742] LDISKFS-fs (md0): mounted filesystem with ordered data mode. 
Opts: errors=remount-ro,no_mbcache,nodelalloc [Mon Dec 9 06:52:19 2019][ 2114.683575] LDISKFS-fs (md10): file extents enabled, maximum tree depth=5 [Mon Dec 9 06:52:19 2019][ 2114.694623] LDISKFS-fs (md2): file extents enabled, maximum tree depth=5 [Mon Dec 9 06:52:19 2019][ 2114.774385] LustreError: 137-5: fir-OST0054_UUID: not available for connect from 10.8.27.17@o2ib6 (no target). If you are running an HA pair check that the target is mounted on the other server. [Mon Dec 9 06:52:19 2019][ 2114.791669] LustreError: Skipped 17 previous similar messages [Mon Dec 9 06:52:20 2019][ 2114.921315] Lustre: fir-OST005a: Not available for connect from 10.9.117.8@o2ib4 (not set up) [Mon Dec 9 06:52:20 2019][ 2114.929849] Lustre: Skipped 1 previous similar message [Mon Dec 9 06:52:20 2019][ 2115.042213] LDISKFS-fs (md10): mounted filesystem with ordered data mode. Opts: errors=remount-ro,no_mbcache,nodelalloc [Mon Dec 9 06:52:20 2019][ 2115.282969] LustreError: 137-5: fir-OST005c_UUID: not available for connect from 10.9.117.27@o2ib4 (no target). If you are running an HA pair check that the target is mounted on the other server. [Mon Dec 9 06:52:20 2019][ 2115.300335] LustreError: Skipped 220 previous similar messages [Mon Dec 9 06:52:20 2019][ 2115.453675] Lustre: fir-OST005a: Not available for connect from 10.9.102.25@o2ib4 (not set up) [Mon Dec 9 06:52:20 2019][ 2115.462300] Lustre: Skipped 39 previous similar messages [Mon Dec 9 06:52:21 2019][ 2116.011575] LDISKFS-fs (md2): mounted filesystem with ordered data mode. Opts: errors=remount-ro,no_mbcache,nodelalloc [Mon Dec 9 06:52:21 2019][ 2116.086764] Lustre: fir-OST005a: Imperative Recovery enabled, recovery window shrunk from 300-900 down to 150-900 [Mon Dec 9 06:52:21 2019][ 2116.097595] Lustre: fir-OST005a: in recovery but waiting for the first client to connect [Mon Dec 9 06:52:21 2019][ 2116.097864] Lustre: fir-OST005a: Will be in recovery for at least 2:30, or until 1291 clients reconnect [Mon Dec 9 06:52:21 2019][ 2116.098139] Lustre: fir-OST005a: Connection restored to (at 10.8.24.17@o2ib6) [Mon Dec 9 06:52:21 2019][ 2116.301499] LustreError: 137-5: fir-OST0054_UUID: not available for connect from 10.9.105.45@o2ib4 (no target). If you are running an HA pair check that the target is mounted on the other server. 
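The 137-5 "not available for connect ... (no target)" errors above are expected at this stage: clients are already retrying their connections while this OSS is still mounting its targets, so OSTs that are not yet set up (or that live on the HA partner) refuse them. A minimal sketch of how to see which targets this server is actually serving at a given moment, using standard Lustre commands and nothing specific to this log:

    # Lustre targets currently mounted on this OSS
    mount -t lustre
    # Local OBD devices and their setup state
    lctl dl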
[Mon Dec 9 06:52:21 2019][ 2116.318868] LustreError: Skipped 415 previous similar messages [Mon Dec 9 06:52:21 2019][ 2116.607374] Lustre: fir-OST005c: Connection restored to 7f6916f2-c589-3558-df52-0f5294f8fa05 (at 10.9.102.19@o2ib4) [Mon Dec 9 06:52:21 2019][ 2116.617640] Lustre: fir-OST0058: Not available for connect from 10.9.102.60@o2ib4 (not set up) [Mon Dec 9 06:52:21 2019][ 2116.617642] Lustre: Skipped 52 previous similar messages [Mon Dec 9 06:52:21 2019][ 2116.631749] Lustre: Skipped 46 previous similar messages [Mon Dec 9 06:52:21 2019][ 2116.753977] Lustre: fir-OST0058: Imperative Recovery enabled, recovery window shrunk from 300-900 down to 150-900 [Mon Dec 9 06:52:21 2019][ 2116.764243] Lustre: Skipped 1 previous similar message [Mon Dec 9 06:52:21 2019][ 2116.769973] Lustre: fir-OST0058: in recovery but waiting for the first client to connect [Mon Dec 9 06:52:21 2019][ 2116.772765] Lustre: fir-OST0058: Will be in recovery for at least 2:30, or until 1291 clients reconnect [Mon Dec 9 06:52:21 2019][ 2116.772766] Lustre: Skipped 1 previous similar message [Mon Dec 9 06:52:21 2019][ 2116.792617] Lustre: Skipped 1 previous similar message [Mon Dec 9 06:52:22 2019][ 2117.610633] Lustre: fir-OST005e: Connection restored to (at 10.8.24.1@o2ib6) [Mon Dec 9 06:52:22 2019][ 2117.617780] Lustre: Skipped 209 previous similar messages [Mon Dec 9 06:52:22 2019][ 2117.757906] Lustre: fir-OST0056: Imperative Recovery enabled, recovery window shrunk from 300-900 down to 150-900 [Mon Dec 9 06:52:22 2019][ 2117.768170] Lustre: Skipped 2 previous similar messages [Mon Dec 9 06:52:22 2019][ 2117.773910] Lustre: fir-OST0056: in recovery but waiting for the first client to connect [Mon Dec 9 06:52:22 2019][ 2117.782003] Lustre: Skipped 2 previous similar messages [Mon Dec 9 06:52:22 2019][ 2117.788887] Lustre: fir-OST0056: Will be in recovery for at least 2:30, or until 1291 clients reconnect [Mon Dec 9 06:52:22 2019][ 2117.798284] Lustre: Skipped 2 previous similar messages [Mon Dec 9 06:52:24 2019][ 2119.614365] Lustre: fir-OST005e: Connection restored to f5313c6f-3647-048f-259c-ceddb6cbc1d1 (at 10.9.103.43@o2ib4) [Mon Dec 9 06:52:24 2019][ 2119.614367] Lustre: fir-OST005c: Connection restored to f5313c6f-3647-048f-259c-ceddb6cbc1d1 (at 10.9.103.43@o2ib4) [Mon Dec 9 06:52:24 2019][ 2119.614369] Lustre: fir-OST0056: Connection restored to f5313c6f-3647-048f-259c-ceddb6cbc1d1 (at 10.9.103.43@o2ib4) [Mon Dec 9 06:52:24 2019][ 2119.614370] Lustre: fir-OST005a: Connection restored to f5313c6f-3647-048f-259c-ceddb6cbc1d1 (at 10.9.103.43@o2ib4) [Mon Dec 9 06:52:24 2019][ 2119.614371] Lustre: fir-OST0058: Connection restored to f5313c6f-3647-048f-259c-ceddb6cbc1d1 (at 10.9.103.43@o2ib4) [Mon Dec 9 06:52:24 2019][ 2119.614373] Lustre: Skipped 784 previous similar messages [Mon Dec 9 06:52:24 2019][ 2119.614374] Lustre: Skipped 784 previous similar messages [Mon Dec 9 06:52:24 2019][ 2119.614375] Lustre: Skipped 784 previous similar messages [Mon Dec 9 06:52:24 2019][ 2119.614379] Lustre: Skipped 784 previous similar messages [Mon Dec 9 06:52:24 2019][ 2119.688119] Lustre: Skipped 17 previous similar messages [Mon Dec 9 06:52:27 2019][ 2122.534723] Lustre: fir-OST005e: Denying connection for new client 5f11dd29-1211-44a2-2612-f8309cf085b3 (at 10.8.21.18@o2ib6), waiting for 1290 known clients (528 recovered, 17 in progress, and 0 evicted) to recover in 2:24 [Mon Dec 9 06:52:27 2019][ 2122.554537] Lustre: Skipped 2 previous similar messages [Mon Dec 9 06:52:32 2019][ 2127.578715] Lustre: fir-OST0058: 
Recovery over after 0:11, of 1291 clients 1291 recovered and 0 were evicted. [Mon Dec 9 06:52:32 2019][ 2127.604937] Lustre: fir-OST0054: deleting orphan objects from 0x1800000401:11587727 to 0x1800000401:11587777 [Mon Dec 9 06:52:32 2019][ 2127.636081] Lustre: fir-OST005e: deleting orphan objects from 0x1a80000402:3010908 to 0x1a80000402:3010945 [Mon Dec 9 06:52:32 2019][ 2127.646210] Lustre: fir-OST005a: deleting orphan objects from 0x1980000401:11643012 to 0x1980000401:11643041 [Mon Dec 9 06:52:32 2019][ 2127.651249] Lustre: fir-OST0058: deleting orphan objects from 0x1900000402:3015076 to 0x1900000402:3015105 [Mon Dec 9 06:52:32 2019][ 2127.666810] Lustre: fir-OST005e: deleting orphan objects from 0x1a80000400:11601326 to 0x1a80000400:11601409 [Mon Dec 9 06:52:32 2019][ 2127.676743] Lustre: fir-OST005e: deleting orphan objects from 0x0:27453436 to 0x0:27453473 [Mon Dec 9 06:52:32 2019][ 2127.677004] Lustre: fir-OST005c: deleting orphan objects from 0x1a00000400:11457723 to 0x1a00000400:11457761 [Mon Dec 9 06:52:32 2019][ 2127.700276] Lustre: fir-OST005c: deleting orphan objects from 0x1a00000401:2978880 to 0x1a00000401:2978945 [Mon Dec 9 06:52:32 2019][ 2127.702764] Lustre: fir-OST005a: deleting orphan objects from 0x1980000400:792278 to 0x1980000400:792321 [Mon Dec 9 06:52:32 2019][ 2127.703020] Lustre: fir-OST005a: deleting orphan objects from 0x0:27548344 to 0x0:27548385 [Mon Dec 9 06:52:32 2019][ 2127.704825] Lustre: fir-OST0056: deleting orphan objects from 0x1880000402:3009048 to 0x1880000402:3009121 [Mon Dec 9 06:52:32 2019][ 2127.707244] Lustre: fir-OST005e: deleting orphan objects from 0x1a80000401:788482 to 0x1a80000401:788513 [Mon Dec 9 06:52:32 2019][ 2127.712646] Lustre: fir-OST005a: deleting orphan objects from 0x1980000402:3018544 to 0x1980000402:3018625 [Mon Dec 9 06:52:32 2019][ 2127.737264] Lustre: fir-OST0056: deleting orphan objects from 0x1880000400:11579442 to 0x1880000400:11579521 [Mon Dec 9 06:52:32 2019][ 2127.757860] Lustre: fir-OST0058: deleting orphan objects from 0x1900000400:789629 to 0x1900000400:789665 [Mon Dec 9 06:52:32 2019][ 2127.758850] Lustre: fir-OST0058: deleting orphan objects from 0x1900000401:11597256 to 0x1900000401:11597281 [Mon Dec 9 06:52:32 2019][ 2127.782821] Lustre: fir-OST0058: deleting orphan objects from 0x0:27492955 to 0x0:27492993 [Mon Dec 9 06:52:32 2019][ 2127.784564] Lustre: fir-OST0056: deleting orphan objects from 0x1880000401:789050 to 0x1880000401:789089 [Mon Dec 9 06:52:32 2019][ 2127.801547] Lustre: fir-OST0054: deleting orphan objects from 0x1800000400:3000247 to 0x1800000400:3000289 [Mon Dec 9 06:52:32 2019][ 2127.814050] Lustre: fir-OST005c: deleting orphan objects from 0x0:27178781 to 0x0:27178817 [Mon Dec 9 06:52:32 2019][ 2127.831910] Lustre: fir-OST005c: deleting orphan objects from 0x1a00000402:777103 to 0x1a00000402:777121 [Mon Dec 9 06:52:33 2019][ 2127.849043] Lustre: fir-OST0054: deleting orphan objects from 0x0:27444185 to 0x0:27444225 [Mon Dec 9 06:52:33 2019][ 2127.852518] Lustre: fir-OST0054: deleting orphan objects from 0x1800000402:786910 to 0x1800000402:786945 [Mon Dec 9 06:52:33 2019][ 2127.914689] Lustre: fir-OST0056: deleting orphan objects from 0x0:27466201 to 0x0:27466241 [Mon Dec 9 06:52:52 2019][ 2147.537429] Lustre: fir-OST005e: Connection restored to 5f11dd29-1211-44a2-2612-f8309cf085b3 (at 10.8.21.18@o2ib6) [Mon Dec 9 06:52:52 2019][ 2147.547785] Lustre: Skipped 6485 previous similar messages [Mon Dec 9 08:53:21 2019][ 9376.001426] Lustre: fir-OST0054: haven't heard from client 
798dc93c-11ba-328e-acec-b07846966ea5 (at 10.8.0.67@o2ib6) in 227 seconds. I think it's dead, and I am evicting it. exp ffff892252ab7800, cur 1575910401 expire 1575910251 last 1575910174 [Mon Dec 9 09:49:21 2019][12736.930259] Lustre: fir-OST0054: Connection restored to 798dc93c-11ba-328e-acec-b07846966ea5 (at 10.8.0.67@o2ib6) [Mon Dec 9 09:49:21 2019][12736.940519] Lustre: Skipped 5 previous similar messages [Mon Dec 9 09:51:55 2019][12890.468581] Lustre: fir-OST0054: Connection restored to 9a70df35-6de0-4 (at 10.8.19.7@o2ib6) [Mon Dec 9 09:51:55 2019][12890.477027] Lustre: Skipped 5 previous similar messages [Mon Dec 9 09:52:03 2019][12898.060656] Lustre: fir-OST0058: haven't heard from client 1d08460b-716a-03a1-30aa-d26bf61d87fe (at 10.8.0.65@o2ib6) in 227 seconds. I think it's dead, and I am evicting it. exp ffff8922804ed000, cur 1575913923 expire 1575913773 last 1575913696 [Mon Dec 9 09:52:03 2019][12898.082275] Lustre: Skipped 5 previous similar messages [Mon Dec 9 10:01:44 2019][13479.118042] Lustre: fir-OST0054: Connection restored to 8171b0fd-9423-4 (at 10.9.109.27@o2ib4) [Mon Dec 9 10:01:44 2019][13479.126662] Lustre: Skipped 4 previous similar messages [Mon Dec 9 10:28:31 2019][15086.261148] Lustre: fir-OST0054: Connection restored to 1d08460b-716a-03a1-30aa-d26bf61d87fe (at 10.8.0.65@o2ib6) [Mon Dec 9 10:28:31 2019][15086.271416] Lustre: Skipped 5 previous similar messages [Mon Dec 9 11:39:58 2019][19374.041469] LustreError: 67930:0:(tgt_grant.c:758:tgt_grant_check()) fir-OST005e: cli f9f503f0-6ff6-698f-9a8d-14bd128a6d42 claims 16801792 GRANT, real grant 16752640 [Mon Dec 9 12:35:20 2019][22695.260464] Lustre: fir-OST0054: haven't heard from client 83281a6e-8cdd-af0c-d930-afb3d26c7eba (at 10.8.23.14@o2ib6) in 227 seconds. I think it's dead, and I am evicting it. exp ffff892252b8a000, cur 1575923720 expire 1575923570 last 1575923493 [Mon Dec 9 12:35:20 2019][22695.282202] Lustre: Skipped 5 previous similar messages [Mon Dec 9 12:35:35 2019][22710.880614] Lustre: fir-OST0054: Connection restored to 83281a6e-8cdd-af0c-d930-afb3d26c7eba (at 10.8.23.14@o2ib6) [Mon Dec 9 12:35:35 2019][22710.890963] Lustre: Skipped 5 previous similar messages [Mon Dec 9 12:43:09 2019][23164.274873] Lustre: fir-OST0056: haven't heard from client 50bb3322-2186-2682-e22f-d2e40908bd0d (at 10.8.23.14@o2ib6) in 227 seconds. I think it's dead, and I am evicting it. exp ffff890d9fd03800, cur 1575924189 expire 1575924039 last 1575923962 [Mon Dec 9 12:43:09 2019][23164.296576] Lustre: Skipped 5 previous similar messages [Mon Dec 9 12:43:32 2019][23187.511794] Lustre: fir-OST0054: Connection restored to 83281a6e-8cdd-af0c-d930-afb3d26c7eba (at 10.8.23.14@o2ib6) [Mon Dec 9 12:43:32 2019][23187.522142] Lustre: Skipped 5 previous similar messages [Mon Dec 9 12:53:36 2019][23791.966216] Lustre: fir-OST0054: Connection restored to 83281a6e-8cdd-af0c-d930-afb3d26c7eba (at 10.8.23.14@o2ib6) [Mon Dec 9 12:53:36 2019][23791.976564] Lustre: Skipped 5 previous similar messages [Mon Dec 9 12:54:01 2019][23816.286759] Lustre: fir-OST0054: haven't heard from client 75aebac8-89c1-69e4-9dfa-1727b2d47fae (at 10.8.23.14@o2ib6) in 227 seconds. I think it's dead, and I am evicting it. exp ffff890bba3a7c00, cur 1575924841 expire 1575924691 last 1575924614 [Mon Dec 9 12:54:01 2019][23816.308467] Lustre: Skipped 5 previous similar messages [Mon Dec 9 13:09:05 2019][24720.298181] Lustre: fir-OST0058: haven't heard from client 2867fefc-6124-ed47-3fcc-acf48d637860 (at 10.8.18.35@o2ib6) in 227 seconds. 
I think it's dead, and I am evicting it. exp ffff892252ce6000, cur 1575925745 expire 1575925595 last 1575925518 [Mon Dec 9 13:09:05 2019][24720.319902] Lustre: Skipped 5 previous similar messages [Mon Dec 9 16:48:39 2019][37894.763171] perf: interrupt took too long (2503 > 2500), lowering kernel.perf_event_max_sample_rate to 79000 [Mon Dec 9 21:34:55 2019][55071.320836] Lustre: 67791:0:(client.c:2133:ptlrpc_expire_one_request()) @@@ Request sent has timed out for slow reply: [sent 1575956084/real 1575956084] req@ffff88eded431680 x1652452367426464/t0(0) o106->fir-OST0054@10.9.106.54@o2ib4:15/16 lens 296/280 e 0 to 1 dl 1575956095 ref 1 fl Rpc:X/0/ffffffff rc 0/-1 [Mon Dec 9 21:34:55 2019][55071.320838] Lustre: 67848:0:(client.c:2133:ptlrpc_expire_one_request()) @@@ Request sent has timed out for slow reply: [sent 1575956084/real 1575956084] req@ffff88e821f9c800 x1652452367426448/t0(0) o106->fir-OST0054@10.9.106.54@o2ib4:15/16 lens 296/280 e 0 to 1 dl 1575956095 ref 1 fl Rpc:X/0/ffffffff rc 0/-1 [Mon Dec 9 21:35:06 2019][55082.321056] Lustre: 67848:0:(client.c:2133:ptlrpc_expire_one_request()) @@@ Request sent has timed out for slow reply: [sent 1575956095/real 1575956095] req@ffff88e821f9c800 x1652452367426448/t0(0) o106->fir-OST0054@10.9.106.54@o2ib4:15/16 lens 296/280 e 0 to 1 dl 1575956106 ref 1 fl Rpc:X/2/ffffffff rc 0/-1 [Mon Dec 9 21:35:17 2019][55093.348286] Lustre: 67848:0:(client.c:2133:ptlrpc_expire_one_request()) @@@ Request sent has timed out for slow reply: [sent 1575956106/real 1575956106] req@ffff88e821f9c800 x1652452367426448/t0(0) o106->fir-OST0054@10.9.106.54@o2ib4:15/16 lens 296/280 e 0 to 1 dl 1575956117 ref 1 fl Rpc:X/2/ffffffff rc 0/-1 [Mon Dec 9 21:35:17 2019][55093.375649] Lustre: 67848:0:(client.c:2133:ptlrpc_expire_one_request()) Skipped 2 previous similar messages [Mon Dec 9 21:35:28 2019][55104.375502] Lustre: 67791:0:(client.c:2133:ptlrpc_expire_one_request()) @@@ Request sent has timed out for slow reply: [sent 1575956117/real 1575956117] req@ffff88eded431680 x1652452367426464/t0(0) o106->fir-OST0054@10.9.106.54@o2ib4:15/16 lens 296/280 e 0 to 1 dl 1575956128 ref 1 fl Rpc:X/2/ffffffff rc 0/-1 [Mon Dec 9 21:35:28 2019][55104.402874] Lustre: 67791:0:(client.c:2133:ptlrpc_expire_one_request()) Skipped 1 previous similar message [Mon Dec 9 21:35:39 2019][55115.385726] Lustre: 67848:0:(client.c:2133:ptlrpc_expire_one_request()) @@@ Request sent has timed out for slow reply: [sent 1575956128/real 1575956128] req@ffff88e821f9c800 x1652452367426448/t0(0) o106->fir-OST0054@10.9.106.54@o2ib4:15/16 lens 296/280 e 0 to 1 dl 1575956139 ref 1 fl Rpc:X/2/ffffffff rc 0/-1 [Mon Dec 9 21:35:39 2019][55115.413060] Lustre: 67848:0:(client.c:2133:ptlrpc_expire_one_request()) Skipped 1 previous similar message [Mon Dec 9 21:35:50 2019][55126.412953] Lustre: 67791:0:(client.c:2133:ptlrpc_expire_one_request()) @@@ Request sent has timed out for slow reply: [sent 1575956139/real 1575956139] req@ffff88eded431680 x1652452367426464/t0(0) o106->fir-OST0054@10.9.106.54@o2ib4:15/16 lens 296/280 e 0 to 1 dl 1575956150 ref 1 fl Rpc:X/2/ffffffff rc 0/-1 [Mon Dec 9 21:35:50 2019][55126.440294] Lustre: 67791:0:(client.c:2133:ptlrpc_expire_one_request()) Skipped 1 previous similar message [Mon Dec 9 21:36:12 2019][55148.424402] Lustre: 67848:0:(client.c:2133:ptlrpc_expire_one_request()) @@@ Request sent has timed out for slow reply: [sent 1575956161/real 1575956161] req@ffff88e821f9c800 x1652452367426448/t0(0) o106->fir-OST0054@10.9.106.54@o2ib4:15/16 lens 296/280 e 0 to 1 dl 1575956172 ref 1 fl 
Rpc:X/2/ffffffff rc 0/-1 [Mon Dec 9 21:36:12 2019][55148.451748] Lustre: 67848:0:(client.c:2133:ptlrpc_expire_one_request()) Skipped 2 previous similar messages [Mon Dec 9 21:36:45 2019][55181.452062] Lustre: 67791:0:(client.c:2133:ptlrpc_expire_one_request()) @@@ Request sent has timed out for slow reply: [sent 1575956194/real 1575956194] req@ffff88eded431680 x1652452367426464/t0(0) o106->fir-OST0054@10.9.106.54@o2ib4:15/16 lens 296/280 e 0 to 1 dl 1575956205 ref 1 fl Rpc:X/2/ffffffff rc 0/-1 [Mon Dec 9 21:36:45 2019][55181.479405] Lustre: 67791:0:(client.c:2133:ptlrpc_expire_one_request()) Skipped 5 previous similar messages [Mon Dec 9 21:37:06 2019][55201.974017] Lustre: fir-OST0056: haven't heard from client 1316ac10-17f9-20d9-6734-8b32fc11fac2 (at 10.9.106.54@o2ib4) in 227 seconds. I think it's dead, and I am evicting it. exp ffff892251566800, cur 1575956226 expire 1575956076 last 1575955999 [Mon Dec 9 21:37:06 2019][55201.995833] Lustre: Skipped 5 previous similar messages [Mon Dec 9 21:37:06 2019][55202.001208] LustreError: 67848:0:(ldlm_lockd.c:681:ldlm_handle_ast_error()) ### client (nid 10.9.106.54@o2ib4) failed to reply to glimpse AST (req@ffff88e821f9c800 x1652452367426448 status 0 rc -5), evict it ns: filter-fir-OST0054_UUID lock: ffff88fef56bc380/0x7066c9c1891f377c lrc: 3/0,0 mode: PW/PW res: [0x1800000401:0xa16b7b:0x0].0x0 rrc: 3 type: EXT [0->18446744073709551615] (req 67108864->68719476735) flags: 0x40000080000000 nid: 10.9.106.54@o2ib4 remote: 0xa907bc36138dc384 expref: 7 pid: 67663 timeout: 0 lvb_type: 0 [Mon Dec 9 21:37:06 2019][55202.001216] LustreError: 138-a: fir-OST0054: A client on nid 10.9.106.54@o2ib4 was evicted due to a lock glimpse callback time out: rc -5 [Mon Dec 9 21:37:06 2019][55202.001246] LustreError: 66071:0:(ldlm_lockd.c:256:expired_lock_main()) ### lock callback timer expired after 1575956226s: evicting client at 10.9.106.54@o2ib4 ns: filter-fir-OST0054_UUID lock: ffff88f5e8f93600/0x7066c9c1891f78a5 lrc: 3/0,0 mode: PW/PW res: [0x1800000401:0xb38953:0x0].0x0 rrc: 2 type: EXT [0->18446744073709551615] (req 34359738368->18446744073709551615) flags: 0x40000000000000 nid: 10.9.106.54@o2ib4 remote: 0xa907bc36138dc513 expref: 5 pid: 67900 timeout: 0 lvb_type: 0 [Mon Dec 9 21:37:06 2019][55202.102654] LustreError: 67848:0:(ldlm_lockd.c:681:ldlm_handle_ast_error()) Skipped 1 previous similar message [Mon Dec 9 21:37:16 2019][55211.915449] Lustre: fir-OST005e: haven't heard from client 1316ac10-17f9-20d9-6734-8b32fc11fac2 (at 10.9.106.54@o2ib4) in 227 seconds. I think it's dead, and I am evicting it. exp ffff892250a28800, cur 1575956236 expire 1575956086 last 1575956009 [Mon Dec 9 21:37:16 2019][55211.937252] Lustre: Skipped 3 previous similar messages [Mon Dec 9 21:37:21 2019][55216.926307] Lustre: fir-OST005a: haven't heard from client 1316ac10-17f9-20d9-6734-8b32fc11fac2 (at 10.9.106.54@o2ib4) in 227 seconds. I think it's dead, and I am evicting it. exp ffff88f2aecda800, cur 1575956241 expire 1575956091 last 1575956014 [Mon Dec 9 21:39:41 2019][55357.690391] Lustre: fir-OST0054: Connection restored to 1316ac10-17f9-20d9-6734-8b32fc11fac2 (at 10.9.106.54@o2ib4) [Mon Dec 9 21:39:41 2019][55357.700832] Lustre: Skipped 5 previous similar messages [Tue Dec 10 02:48:10 2019][73866.289223] Lustre: fir-OST005a: haven't heard from client 0af2ee10-72ea-97a8-65e7-44544fdbc0b9 (at 10.9.108.39@o2ib4) in 227 seconds. I think it's dead, and I am evicting it. 
exp ffff88f2aec88800, cur 1575974890 expire 1575974740 last 1575974663 [Tue Dec 10 05:18:04 2019][82860.494528] Lustre: fir-OST0058: Client 67942120-f44f-42ea-60c3-96f62fccea78 (at 10.9.109.39@o2ib4) reconnecting [Tue Dec 10 05:18:04 2019][82860.504721] Lustre: fir-OST0058: Connection restored to 67942120-f44f-42ea-60c3-96f62fccea78 (at 10.9.109.39@o2ib4) [Tue Dec 10 05:18:12 2019][82868.720930] Lustre: fir-OST0056: Client c9911b4c-e55e-f4aa-416a-b652019239f7 (at 10.9.117.40@o2ib4) reconnecting [Tue Dec 10 05:18:12 2019][82868.731127] Lustre: fir-OST0056: Connection restored to c9911b4c-e55e-f4aa-416a-b652019239f7 (at 10.9.117.40@o2ib4) [Tue Dec 10 05:18:16 2019][82872.652414] Lustre: fir-OST0058: Client e72387a4-2bab-d686-07ea-8e45160d2e1d (at 10.9.117.23@o2ib4) reconnecting [Tue Dec 10 05:18:16 2019][82872.662616] Lustre: fir-OST0058: Connection restored to e72387a4-2bab-d686-07ea-8e45160d2e1d (at 10.9.117.23@o2ib4) [Tue Dec 10 05:18:18 2019][82874.824840] Lustre: fir-OST005a: Client d873db05-7c48-65ad-d97d-599447705616 (at 10.9.106.5@o2ib4) reconnecting [Tue Dec 10 05:18:18 2019][82874.834931] Lustre: Skipped 5 previous similar messages [Tue Dec 10 05:18:18 2019][82874.840194] Lustre: fir-OST005a: Connection restored to d873db05-7c48-65ad-d97d-599447705616 (at 10.9.106.5@o2ib4) [Tue Dec 10 05:18:18 2019][82874.850558] Lustre: Skipped 5 previous similar messages [Tue Dec 10 05:18:22 2019][82878.869944] Lustre: fir-OST005a: Client acd3ae51-2d23-93df-d1b3-33ff6a3945ef (at 10.9.114.5@o2ib4) reconnecting [Tue Dec 10 05:18:22 2019][82878.880035] Lustre: Skipped 64 previous similar messages [Tue Dec 10 05:18:22 2019][82878.885392] Lustre: fir-OST005a: Connection restored to acd3ae51-2d23-93df-d1b3-33ff6a3945ef (at 10.9.114.5@o2ib4) [Tue Dec 10 05:18:22 2019][82878.895765] Lustre: Skipped 64 previous similar messages [Tue Dec 10 05:18:25 2019][82881.458358] LNetError: 63837:0:(peer.c:3451:lnet_peer_ni_add_to_recoveryq_locked()) lpni 10.0.10.209@o2ib7 added to recovery queue. Health = 900 [Tue Dec 10 05:18:25 2019][82881.471341] LustreError: 67718:0:(ldlm_lib.c:3256:target_bulk_io()) @@@ Reconnect on bulk WRITE req@ffff8922ca374050 x1649309827682656/t0(0) o4->c93954af-761b-f1eb-f651-9881322a7a72@10.9.108.51@o2ib4:698/0 lens 488/448 e 1 to 0 dl 1575983923 ref 1 fl Interpret:/0/0 rc 0/0 [Tue Dec 10 05:18:25 2019][82881.496115] Lustre: fir-OST0058: Bulk IO write error with c93954af-761b-f1eb-f651-9881322a7a72 (at 10.9.108.51@o2ib4), client will retry: rc = -110 [Tue Dec 10 05:18:29 2019][82885.471409] LNetError: 63837:0:(peer.c:3451:lnet_peer_ni_add_to_recoveryq_locked()) lpni 10.0.10.210@o2ib7 added to recovery queue. 
Health = 900 [Tue Dec 10 05:18:29 2019][82885.484361] LNetError: 63837:0:(peer.c:3451:lnet_peer_ni_add_to_recoveryq_locked()) Skipped 1 previous similar message [Tue Dec 10 05:18:29 2019][82885.495092] LustreError: 67822:0:(sec.c:2485:sptlrpc_svc_unwrap_bulk()) @@@ truncated bulk GET 14680064(16777216) req@ffff88f2fbc1e050 x1652122491273216/t0(0) o4->eb7e3af2-d117-4@10.9.101.1@o2ib4:702/0 lens 488/448 e 1 to 0 dl 1575983927 ref 1 fl Interpret:/0/0 rc 0/0 [Tue Dec 10 05:18:29 2019][82885.519616] Lustre: fir-OST0054: Bulk IO write error with eb7e3af2-d117-4 (at 10.9.101.1@o2ib4), client will retry: rc = -110 [Tue Dec 10 05:18:30 2019][82886.919156] Lustre: fir-OST0054: Client 32524583-8f43-9f54-827c-15b3a46fedcc (at 10.8.30.10@o2ib6) reconnecting [Tue Dec 10 05:18:30 2019][82886.919272] Lustre: fir-OST005c: Connection restored to d69fcdf7-730b-8cda-70aa-8ec0410da18f (at 10.8.29.1@o2ib6) [Tue Dec 10 05:18:30 2019][82886.919275] Lustre: Skipped 104 previous similar messages [Tue Dec 10 05:18:30 2019][82886.944900] Lustre: Skipped 105 previous similar messages [Tue Dec 10 05:18:39 2019][82895.494630] LNetError: 63837:0:(peer.c:3451:lnet_peer_ni_add_to_recoveryq_locked()) lpni 10.0.10.210@o2ib7 added to recovery queue. Health = 900 [Tue Dec 10 05:18:39 2019][82895.507604] LustreError: 67989:0:(sec.c:2485:sptlrpc_svc_unwrap_bulk()) @@@ truncated bulk GET 9437184(12582912) req@ffff89224e0e5050 x1649309827682656/t0(0) o4->c93954af-761b-f1eb-f651-9881322a7a72@10.9.108.51@o2ib4:710/0 lens 488/448 e 0 to 0 dl 1575983935 ref 1 fl Interpret:/2/0 rc 0/0 [Tue Dec 10 05:18:39 2019][82895.533682] Lustre: fir-OST0058: Bulk IO write error with c93954af-761b-f1eb-f651-9881322a7a72 (at 10.9.108.51@o2ib4), client will retry: rc = -110 [Tue Dec 10 05:18:44 2019][82900.507721] LNetError: 63837:0:(peer.c:3451:lnet_peer_ni_add_to_recoveryq_locked()) lpni 10.0.10.211@o2ib7 added to recovery queue. Health = 900 [Tue Dec 10 05:18:44 2019][82900.520690] LNetError: 63837:0:(peer.c:3451:lnet_peer_ni_add_to_recoveryq_locked()) Skipped 2 previous similar messages [Tue Dec 10 05:18:44 2019][82900.531503] LustreError: 67722:0:(sec.c:2485:sptlrpc_svc_unwrap_bulk()) @@@ truncated bulk GET 13631488(16777216) req@ffff8902f6709050 x1652122491471360/t0(0) o4->eb7e3af2-d117-4@10.9.101.1@o2ib4:718/0 lens 488/448 e 1 to 0 dl 1575983943 ref 1 fl Interpret:/0/0 rc 0/0 [Tue Dec 10 05:18:44 2019][82900.556080] Lustre: fir-OST0054: Bulk IO write error with eb7e3af2-d117-4 (at 10.9.101.1@o2ib4), client will retry: rc = -110 [Tue Dec 10 05:18:44 2019][82901.259219] LustreError: 137-5: fir-OST0059_UUID: not available for connect from 10.9.116.11@o2ib4 (no target). If you are running an HA pair check that the target is mounted on the other server. 
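Two things are interleaved above: LNet health dropping peer NIs (10.0.10.209, .210, .211@o2ib7) onto the recovery queue with Health = 900, and the resulting truncated bulk GETs that surface to clients as "Bulk IO write error ... client will retry: rc = -110". The writes are simply retried, but repeated occurrences point at a flaky o2ib path. A minimal sketch of how the LNet side could be inspected, assuming the lnetctl utility shipped with this Lustre release; the -v verbosity flag is an assumption about where the health counters appear:

    # Peers, their NIs and per-NI health / recovery statistics
    lnetctl peer show -v
    # Global LNet counters, including drops and retries
    lnetctl stats show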
[Tue Dec 10 05:18:44 2019][82901.276627] LustreError: Skipped 204 previous similar messages [Tue Dec 10 05:18:44 2019][82901.347731] Lustre: 67873:0:(client.c:2133:ptlrpc_expire_one_request()) @@@ Request sent has timed out for slow reply: [sent 1575983917/real 1575983917] req@ffff8912f703f080 x1652452421351632/t0(0) o105->fir-OST0054@10.9.101.29@o2ib4:15/16 lens 360/224 e 0 to 1 dl 1575983924 ref 1 fl Rpc:X/0/ffffffff rc 0/-1 [Tue Dec 10 05:18:44 2019][82901.375070] Lustre: 67873:0:(client.c:2133:ptlrpc_expire_one_request()) Skipped 2 previous similar messages [Tue Dec 10 05:18:46 2019][82902.982452] Lustre: fir-OST005e: Client 91d6d3f9-54bc-fd90-b16e-873e3af76326 (at 10.9.106.44@o2ib4) reconnecting [Tue Dec 10 05:18:46 2019][82902.983396] Lustre: fir-OST0054: Connection restored to 91d6d3f9-54bc-fd90-b16e-873e3af76326 (at 10.9.106.44@o2ib4) [Tue Dec 10 05:18:46 2019][82902.983398] Lustre: Skipped 187 previous similar messages [Tue Dec 10 05:18:46 2019][82903.008483] Lustre: Skipped 187 previous similar messages [Tue Dec 10 05:18:50 2019][82906.576837] LustreError: 67929:0:(ldlm_lib.c:3256:target_bulk_io()) @@@ Reconnect on bulk WRITE req@ffff8907c0ee1850 x1649049588956192/t0(0) o4->20463417-fb32-2f92-5aae-59bfa8e287e3@10.9.101.29@o2ib4:725/0 lens 488/448 e 1 to 0 dl 1575983950 ref 1 fl Interpret:/0/0 rc 0/0 [Tue Dec 10 05:18:50 2019][82906.601645] Lustre: fir-OST0054: Bulk IO write error with 20463417-fb32-2f92-5aae-59bfa8e287e3 (at 10.9.101.29@o2ib4), client will retry: rc = -110 [Tue Dec 10 05:18:52 2019][82908.822798] LustreError: 137-5: fir-OST0059_UUID: not available for connect from 10.8.31.2@o2ib6 (no target). If you are running an HA pair check that the target is mounted on the other server. [Tue Dec 10 05:18:52 2019][82908.839992] LustreError: Skipped 1 previous similar message [Tue Dec 10 05:18:54 2019][82910.607090] LustreError: 137-5: fir-OST0059_UUID: not available for connect from 10.9.101.28@o2ib4 (no target). If you are running an HA pair check that the target is mounted on the other server. [Tue Dec 10 05:18:54 2019][82910.624484] LustreError: Skipped 2 previous similar messages [Tue Dec 10 05:18:59 2019][82915.531020] LNetError: 63837:0:(peer.c:3451:lnet_peer_ni_add_to_recoveryq_locked()) lpni 10.0.10.211@o2ib7 added to recovery queue. Health = 900 [Tue Dec 10 05:18:59 2019][82915.531026] LustreError: 67718:0:(sec.c:2485:sptlrpc_svc_unwrap_bulk()) @@@ truncated bulk GET 0(205619) req@ffff89224e0e3850 x1652123335105408/t0(0) o4->4cec062a-e1ff-4@10.9.101.3@o2ib4:732/0 lens 488/448 e 1 to 0 dl 1575983957 ref 1 fl Interpret:/0/0 rc 0/0 [Tue Dec 10 05:18:59 2019][82915.531049] Lustre: fir-OST0054: Bulk IO write error with 4cec062a-e1ff-4 (at 10.9.101.3@o2ib4), client will retry: rc = -110 [Tue Dec 10 05:18:59 2019][82915.578309] LNetError: 63837:0:(peer.c:3451:lnet_peer_ni_add_to_recoveryq_locked()) Skipped 1 previous similar message [Tue Dec 10 05:19:00 2019][82916.566038] LustreError: 67989:0:(ldlm_lib.c:3256:target_bulk_io()) @@@ Reconnect on bulk WRITE req@ffff892251cea050 x1650930158014448/t0(0) o4->fe46e801-2d86-9439-0b24-b78514ed5486@10.9.109.8@o2ib4:739/0 lens 488/448 e 1 to 0 dl 1575983964 ref 1 fl Interpret:/0/0 rc 0/0 [Tue Dec 10 05:19:04 2019][82921.314277] LustreError: 137-5: fir-OST005f_UUID: not available for connect from 10.8.17.19@o2ib6 (no target). If you are running an HA pair check that the target is mounted on the other server. 
[Tue Dec 10 05:19:04 2019][82921.331565] LustreError: Skipped 2 previous similar messages [Tue Dec 10 05:19:09 2019][82926.121728] LustreError: 137-5: fir-OST0059_UUID: not available for connect from 10.9.105.11@o2ib4 (no target). If you are running an HA pair check that the target is mounted on the other server. [Tue Dec 10 05:19:09 2019][82926.139115] LustreError: Skipped 11 previous similar messages [Tue Dec 10 05:19:17 2019][82933.838387] LustreError: 67722:0:(ldlm_lib.c:3256:target_bulk_io()) @@@ Reconnect on bulk WRITE req@ffff88f3a9515050 x1649292592204048/t0(0) o4->d269b7b3-c7ee-1895-0bbf-8293c505cff2@10.9.110.44@o2ib4:1/0 lens 488/448 e 1 to 0 dl 1575983981 ref 1 fl Interpret:/0/0 rc 0/0 [Tue Dec 10 05:19:17 2019][82933.862916] Lustre: fir-OST0058: Bulk IO write error with d269b7b3-c7ee-1895-0bbf-8293c505cff2 (at 10.9.110.44@o2ib4), client will retry: rc = -110 [Tue Dec 10 05:19:17 2019][82933.876140] Lustre: Skipped 1 previous similar message [Tue Dec 10 05:19:18 2019][82934.748407] LustreError: 137-5: fir-OST0055_UUID: not available for connect from 10.9.107.48@o2ib4 (no target). If you are running an HA pair check that the target is mounted on the other server. [Tue Dec 10 05:19:18 2019][82934.765772] LustreError: Skipped 14 previous similar messages [Tue Dec 10 05:19:18 2019][82934.985866] Lustre: fir-OST005c: Client f5acbb80-2671-675b-21f5-81352b190567 (at 10.9.110.49@o2ib4) reconnecting [Tue Dec 10 05:19:18 2019][82934.996050] Lustre: Skipped 1086 previous similar messages [Tue Dec 10 05:19:18 2019][82935.001544] Lustre: fir-OST005c: Connection restored to (at 10.9.110.49@o2ib4) [Tue Dec 10 05:19:18 2019][82935.008881] Lustre: Skipped 1086 previous similar messages [Tue Dec 10 05:19:34 2019][82950.477728] Lustre: 89234:0:(client.c:2133:ptlrpc_expire_one_request()) @@@ Request sent has timed out for slow reply: [sent 1575983967/real 1575983967] req@ffff88fddaa5d580 x1652452421353168/t0(0) o105->fir-OST005a@10.9.110.16@o2ib4:15/16 lens 360/224 e 0 to 1 dl 1575983974 ref 1 fl Rpc:X/0/ffffffff rc 0/-1 [Tue Dec 10 05:19:35 2019][82952.145857] LustreError: 137-5: fir-OST0057_UUID: not available for connect from 10.8.23.19@o2ib6 (no target). If you are running an HA pair check that the target is mounted on the other server. [Tue Dec 10 05:19:35 2019][82952.163151] LustreError: Skipped 40 previous similar messages [Tue Dec 10 05:19:39 2019][82955.589834] LNetError: 63837:0:(peer.c:3451:lnet_peer_ni_add_to_recoveryq_locked()) lpni 10.0.10.211@o2ib7 added to recovery queue. Health = 900 [Tue Dec 10 05:19:39 2019][82955.602809] LustreError: 67980:0:(sec.c:2485:sptlrpc_svc_unwrap_bulk()) @@@ truncated bulk GET 7864320(16252928) req@ffff88f2b024d850 x1649292592204048/t0(0) o4->d269b7b3-c7ee-1895-0bbf-8293c505cff2@10.9.110.44@o2ib4:15/0 lens 488/448 e 0 to 0 dl 1575983995 ref 1 fl Interpret:/2/0 rc 0/0 [Tue Dec 10 05:20:08 2019][82984.912818] LustreError: 137-5: fir-OST0055_UUID: not available for connect from 10.8.27.15@o2ib6 (no target). If you are running an HA pair check that the target is mounted on the other server. 
[Tue Dec 10 05:20:08 2019][82984.930100] LustreError: Skipped 77 previous similar messages [Tue Dec 10 05:20:22 2019][82999.030286] Lustre: fir-OST005c: Client 9622ebd9-08dd-84f5-187b-b07758b1dd55 (at 10.9.103.48@o2ib4) reconnecting [Tue Dec 10 05:20:22 2019][82999.030363] Lustre: fir-OST005a: Connection restored to 7fc3ef05-0495-25a3-7cdb-c6f981dcc2b9 (at 10.9.102.68@o2ib4) [Tue Dec 10 05:20:22 2019][82999.030365] Lustre: Skipped 1252 previous similar messages [Tue Dec 10 05:20:22 2019][82999.056380] Lustre: Skipped 1255 previous similar messages [Tue Dec 10 05:20:38 2019][83014.761030] Lustre: 67693:0:(client.c:2133:ptlrpc_expire_one_request()) @@@ Request sent has timed out for slow reply: [sent 1575984031/real 1575984031] req@ffff8922f461d100 x1652452421355920/t0(0) o104->fir-OST0054@10.9.108.37@o2ib4:15/16 lens 296/224 e 0 to 1 dl 1575984038 ref 1 fl Rpc:X/0/ffffffff rc 0/-1 [Tue Dec 10 05:20:41 2019][83017.628089] LustreError: 68013:0:(ldlm_lib.c:3256:target_bulk_io()) @@@ Reconnect on bulk WRITE req@ffff88f2ab53a850 x1649049588956192/t0(0) o4->20463417-fb32-2f92-5aae-59bfa8e287e3@10.9.101.29@o2ib4:91/0 lens 488/448 e 0 to 0 dl 1575984071 ref 1 fl Interpret:/2/0 rc 0/0 [Tue Dec 10 05:20:41 2019][83017.652133] LustreError: 68013:0:(ldlm_lib.c:3256:target_bulk_io()) Skipped 1 previous similar message [Tue Dec 10 05:20:41 2019][83017.661969] Lustre: fir-OST0054: Bulk IO write error with 20463417-fb32-2f92-5aae-59bfa8e287e3 (at 10.9.101.29@o2ib4), client will retry: rc = -110 [Tue Dec 10 05:20:41 2019][83017.675188] Lustre: Skipped 2 previous similar messages [Tue Dec 10 05:20:52 2019][83028.770318] LustreError: 67718:0:(ldlm_lib.c:3256:target_bulk_io()) @@@ Reconnect on bulk WRITE req@ffff89224f960050 x1649292592204048/t0(0) o4->d269b7b3-c7ee-1895-0bbf-8293c505cff2@10.9.110.44@o2ib4:102/0 lens 488/448 e 0 to 0 dl 1575984082 ref 1 fl Interpret:/2/0 rc 0/0 [Tue Dec 10 05:21:12 2019][83048.937549] LustreError: 137-5: fir-OST005d_UUID: not available for connect from 10.9.104.8@o2ib4 (no target). If you are running an HA pair check that the target is mounted on the other server. [Tue Dec 10 05:21:12 2019][83048.954829] LustreError: Skipped 6990 previous similar messages [Tue Dec 10 05:21:31 2019][83067.484865] Lustre: fir-OST005a: haven't heard from client 2bacacc9-821b-1013-eb06-dd3bdbe6bf12 (at 10.9.104.8@o2ib4) in 162 seconds. I think it's dead, and I am evicting it. exp ffff8902cd64e400, cur 1575984091 expire 1575983941 last 1575983929 [Tue Dec 10 05:21:31 2019][83067.506600] Lustre: Skipped 5 previous similar messages [Tue Dec 10 05:21:32 2019][83068.476206] Lustre: fir-OST005e: haven't heard from client 2bacacc9-821b-1013-eb06-dd3bdbe6bf12 (at 10.9.104.8@o2ib4) in 163 seconds. I think it's dead, and I am evicting it. exp ffff8922509fb000, cur 1575984092 expire 1575983942 last 1575983929 [Tue Dec 10 05:25:20 2019][83296.692149] LustreError: 137-5: fir-OST0057_UUID: not available for connect from 10.9.109.11@o2ib4 (no target). If you are running an HA pair check that the target is mounted on the other server. 
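The "haven't heard from client ... I think it's dead, and I am evicting it" entries are the server-side ping evictor: an export that has sent neither a request nor a ping within the eviction window (derived from the OBD timeout) is dropped, and the later "Connection restored" lines show the same clients coming back. A minimal sketch of the related knobs, assuming generic Lustre parameter names rather than anything read from this system:

    # Global OBD timeout the eviction window is derived from
    lctl get_param timeout
    # Exports (one entry per connected client NID) for the target doing the evicting
    lctl get_param -N obdfilter.fir-OST0054.exports.*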
[Tue Dec 10 05:25:20 2019][83296.709516] LustreError: Skipped 111 previous similar messages [Tue Dec 10 05:25:34 2019][83311.284314] Lustre: fir-OST0058: Client d758ce23-488a-e6d5-8c6f-41cbf6d78ec4 (at 10.9.105.21@o2ib4) reconnecting [Tue Dec 10 05:25:34 2019][83311.293300] Lustre: fir-OST005e: Connection restored to d758ce23-488a-e6d5-8c6f-41cbf6d78ec4 (at 10.9.105.21@o2ib4) [Tue Dec 10 05:25:34 2019][83311.293302] Lustre: Skipped 11935 previous similar messages [Tue Dec 10 05:25:34 2019][83311.310497] Lustre: Skipped 11936 previous similar messages [Tue Dec 10 05:25:49 2019][83325.611334] LNetError: 63837:0:(peer.c:3451:lnet_peer_ni_add_to_recoveryq_locked()) lpni 10.0.10.211@o2ib7 added to recovery queue. Health = 900 [Tue Dec 10 05:25:49 2019][83325.624301] LNetError: 63837:0:(peer.c:3451:lnet_peer_ni_add_to_recoveryq_locked()) Skipped 2 previous similar messages [Tue Dec 10 05:25:49 2019][83325.635126] LustreError: 68003:0:(sec.c:2485:sptlrpc_svc_unwrap_bulk()) @@@ truncated bulk GET 12582912(16777216) req@ffff89025a28b850 x1652122504801216/t0(0) o4->eb7e3af2-d117-4@10.9.101.1@o2ib4:398/0 lens 488/448 e 0 to 0 dl 1575984378 ref 1 fl Interpret:/0/0 rc 0/0 [Tue Dec 10 05:25:49 2019][83325.659776] Lustre: fir-OST0054: Bulk IO write error with eb7e3af2-d117-4 (at 10.9.101.1@o2ib4), client will retry: rc = -110 [Tue Dec 10 05:25:49 2019][83325.671083] Lustre: Skipped 3 previous similar messages [Tue Dec 10 05:26:22 2019][83358.938007] LustreError: 67808:0:(ldlm_lib.c:3256:target_bulk_io()) @@@ Reconnect on bulk WRITE req@ffff8922ce6f2850 x1648661001299536/t0(0) o4->226a4739-0dcd-665f-3f64-f361283e71b8@10.9.105.16@o2ib4:433/0 lens 488/448 e 0 to 0 dl 1575984413 ref 1 fl Interpret:/0/0 rc 0/0 [Tue Dec 10 05:26:22 2019][83358.962179] LustreError: 67808:0:(ldlm_lib.c:3256:target_bulk_io()) Skipped 2 previous similar messages [Tue Dec 10 05:26:22 2019][83359.223675] md: md3 stopped. [Tue Dec 10 05:26:22 2019][83359.236553] md/raid:md3: device dm-108 operational as raid disk 0 [Tue Dec 10 05:26:22 2019][83359.242672] md/raid:md3: device dm-85 operational as raid disk 9 [Tue Dec 10 05:26:22 2019][83359.248686] md/raid:md3: device dm-97 operational as raid disk 8 [Tue Dec 10 05:26:22 2019][83359.254715] md/raid:md3: device dm-82 operational as raid disk 7 [Tue Dec 10 05:26:22 2019][83359.260729] md/raid:md3: device dm-98 operational as raid disk 6 [Tue Dec 10 05:26:22 2019][83359.266737] md/raid:md3: device dm-72 operational as raid disk 5 [Tue Dec 10 05:26:22 2019][83359.272751] md/raid:md3: device dm-81 operational as raid disk 4 [Tue Dec 10 05:26:22 2019][83359.278766] md/raid:md3: device dm-61 operational as raid disk 3 [Tue Dec 10 05:26:22 2019][83359.284780] md/raid:md3: device dm-103 operational as raid disk 2 [Tue Dec 10 05:26:22 2019][83359.290873] md/raid:md3: device dm-109 operational as raid disk 1 [Tue Dec 10 05:26:22 2019][83359.298897] md/raid:md3: raid level 6 active with 10 out of 10 devices, algorithm 2 [Tue Dec 10 05:26:22 2019][83359.337097] md3: detected capacity change from 0 to 64011422924800 [Tue Dec 10 05:26:22 2019][83359.367126] md: md1 stopped. 
[Tue Dec 10 05:26:22 2019][83359.377897] md/raid:md1: device dm-63 operational as raid disk 0 [Tue Dec 10 05:26:22 2019][83359.383913] md/raid:md1: device dm-102 operational as raid disk 9 [Tue Dec 10 05:26:22 2019][83359.390016] md/raid:md1: device dm-113 operational as raid disk 8 [Tue Dec 10 05:26:22 2019][83359.396120] md/raid:md1: device dm-96 operational as raid disk 7 [Tue Dec 10 05:26:22 2019][83359.402134] md/raid:md1: device dm-92 operational as raid disk 6 [Tue Dec 10 05:26:22 2019][83359.408144] md/raid:md1: device dm-67 operational as raid disk 5 [Tue Dec 10 05:26:22 2019][83359.414153] md/raid:md1: device dm-71 operational as raid disk 4 [Tue Dec 10 05:26:22 2019][83359.420166] md/raid:md1: device dm-112 operational as raid disk 3 [Tue Dec 10 05:26:22 2019][83359.426268] md/raid:md1: device dm-115 operational as raid disk 2 [Tue Dec 10 05:26:22 2019][83359.432370] md/raid:md1: device dm-118 operational as raid disk 1 [Tue Dec 10 05:26:22 2019][83359.439214] md/raid:md1: raid level 6 active with 10 out of 10 devices, algorithm 2 [Tue Dec 10 05:26:22 2019][83359.460496] md1: detected capacity change from 0 to 64011422924800 [Tue Dec 10 05:26:23 2019][83359.491278] md: md11 stopped. [Tue Dec 10 05:26:23 2019][83359.504426] md/raid:md11: device dm-51 operational as raid disk 0 [Tue Dec 10 05:26:23 2019][83359.510538] md/raid:md11: device dm-47 operational as raid disk 9 [Tue Dec 10 05:26:23 2019][83359.516646] md/raid:md11: device dm-50 operational as raid disk 8 [Tue Dec 10 05:26:23 2019][83359.522757] md/raid:md11: device dm-49 operational as raid disk 7 [Tue Dec 10 05:26:23 2019][83359.528866] md/raid:md11: device dm-20 operational as raid disk 6 [Tue Dec 10 05:26:23 2019][83359.534979] md/raid:md11: device dm-31 operational as raid disk 5 [Tue Dec 10 05:26:23 2019][83359.541088] md/raid:md11: device dm-32 operational as raid disk 4 [Tue Dec 10 05:26:23 2019][83359.547195] md/raid:md11: device dm-22 operational as raid disk 3 [Tue Dec 10 05:26:23 2019][83359.553301] md/raid:md11: device dm-11 operational as raid disk 2 [Tue Dec 10 05:26:23 2019][83359.559404] md/raid:md11: device dm-9 operational as raid disk 1 [Tue Dec 10 05:26:23 2019][83359.569083] md/raid:md11: raid level 6 active with 10 out of 10 devices, algorithm 2 [Tue Dec 10 05:26:23 2019][83359.591851] md11: detected capacity change from 0 to 64011422924800 [Tue Dec 10 05:26:23 2019][83359.619668] md: md7 stopped. 
[Tue Dec 10 05:26:23 2019][83359.638795] md/raid:md7: device dm-0 operational as raid disk 0 [Tue Dec 10 05:26:23 2019][83359.644738] md/raid:md7: device dm-55 operational as raid disk 9 [Tue Dec 10 05:26:23 2019][83359.650792] md/raid:md7: device dm-14 operational as raid disk 8 [Tue Dec 10 05:26:23 2019][83359.656817] md/raid:md7: device dm-13 operational as raid disk 7 [Tue Dec 10 05:26:23 2019][83359.662855] md/raid:md7: device dm-41 operational as raid disk 6 [Tue Dec 10 05:26:23 2019][83359.668899] md/raid:md7: device dm-29 operational as raid disk 5 [Tue Dec 10 05:26:23 2019][83359.674964] md/raid:md7: device dm-24 operational as raid disk 4 [Tue Dec 10 05:26:23 2019][83359.681029] md/raid:md7: device dm-35 operational as raid disk 3 [Tue Dec 10 05:26:23 2019][83359.687052] md/raid:md7: device dm-52 operational as raid disk 2 [Tue Dec 10 05:26:23 2019][83359.693078] md/raid:md7: device dm-30 operational as raid disk 1 [Tue Dec 10 05:26:23 2019][83359.700039] md/raid:md7: raid level 6 active with 10 out of 10 devices, algorithm 2 [Tue Dec 10 05:26:23 2019][83359.721696] md7: detected capacity change from 0 to 64011422924800 [Tue Dec 10 05:26:23 2019][83359.747849] md: md5 stopped. [Tue Dec 10 05:26:23 2019][83359.767611] md/raid:md5: device dm-110 operational as raid disk 0 [Tue Dec 10 05:26:23 2019][83359.773734] md/raid:md5: device dm-93 operational as raid disk 9 [Tue Dec 10 05:26:23 2019][83359.779766] md/raid:md5: device dm-111 operational as raid disk 8 [Tue Dec 10 05:26:23 2019][83359.785881] md/raid:md5: device dm-87 operational as raid disk 7 [Tue Dec 10 05:26:23 2019][83359.791911] md/raid:md5: device dm-90 operational as raid disk 6 [Tue Dec 10 05:26:23 2019][83359.797933] md/raid:md5: device dm-75 operational as raid disk 5 [Tue Dec 10 05:26:23 2019][83359.803959] md/raid:md5: device dm-78 operational as raid disk 4 [Tue Dec 10 05:26:23 2019][83359.809981] md/raid:md5: device dm-70 operational as raid disk 3 [Tue Dec 10 05:26:23 2019][83359.816010] md/raid:md5: device dm-62 operational as raid disk 2 [Tue Dec 10 05:26:23 2019][83359.822035] md/raid:md5: device dm-68 operational as raid disk 1 [Tue Dec 10 05:26:23 2019][83359.829120] md/raid:md5: raid level 6 active with 10 out of 10 devices, algorithm 2 [Tue Dec 10 05:26:23 2019][83359.863208] md5: detected capacity change from 0 to 64011422924800 [Tue Dec 10 05:26:23 2019][83359.900980] md: md9 stopped. 
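This second round of array assembly (md3, md1, md11, md7, md5, and then md9 just below) is this OSS also taking over the arrays behind its HA partner's targets; the fir-OST0055 through fir-OST005f recovery messages that follow are those targets replaying client state after being mounted here. A minimal sketch of how an ldiskfs-backed OST on one of these arrays is typically started; which array backs which OST is not stated in the log, so the pairing and the mount point below are purely illustrative:

    # Example only: start the OST stored on md3 under an arbitrary mount point
    mkdir -p /mnt/lustre/fir-OST0055
    mount -t lustre /dev/md3 /mnt/lustre/fir-OST0055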
[Tue Dec 10 05:26:23 2019][83359.924250] md/raid:md9: device dm-59 operational as raid disk 0 [Tue Dec 10 05:26:23 2019][83359.930270] md/raid:md9: device dm-19 operational as raid disk 9 [Tue Dec 10 05:26:23 2019][83359.936303] md/raid:md9: device dm-48 operational as raid disk 8 [Tue Dec 10 05:26:23 2019][83359.942419] md/raid:md9: device dm-39 operational as raid disk 7 [Tue Dec 10 05:26:23 2019][83359.948471] md/raid:md9: device dm-37 operational as raid disk 6 [Tue Dec 10 05:26:23 2019][83359.954508] md/raid:md9: device dm-10 operational as raid disk 5 [Tue Dec 10 05:26:23 2019][83359.960562] md/raid:md9: device dm-38 operational as raid disk 4 [Tue Dec 10 05:26:23 2019][83359.966609] md/raid:md9: device dm-2 operational as raid disk 3 [Tue Dec 10 05:26:23 2019][83359.972651] md/raid:md9: device dm-12 operational as raid disk 2 [Tue Dec 10 05:26:23 2019][83359.978740] md/raid:md9: device dm-36 operational as raid disk 1 [Tue Dec 10 05:26:23 2019][83359.988671] md/raid:md9: raid level 6 active with 10 out of 10 devices, algorithm 2 [Tue Dec 10 05:26:23 2019][83360.017458] md9: detected capacity change from 0 to 64011422924800 [Tue Dec 10 05:26:23 2019][83360.466294] LDISKFS-fs (md3): file extents enabled, maximum tree depth=5 [Tue Dec 10 05:26:24 2019][83360.515278] LDISKFS-fs (md1): file extents enabled, maximum tree depth=5 [Tue Dec 10 05:26:24 2019][83360.800303] LDISKFS-fs (md11): file extents enabled, maximum tree depth=5 [Tue Dec 10 05:26:24 2019][83360.805067] LDISKFS-fs (md3): mounted filesystem with ordered data mode. Opts: errors=remount-ro,no_mbcache,nodelalloc [Tue Dec 10 05:26:24 2019][83360.861175] LDISKFS-fs (md1): mounted filesystem with ordered data mode. Opts: errors=remount-ro,no_mbcache,nodelalloc [Tue Dec 10 05:26:24 2019][83361.154663] LDISKFS-fs (md11): mounted filesystem with ordered data mode. Opts: errors=remount-ro,no_mbcache,nodelalloc [Tue Dec 10 05:26:24 2019][83361.224318] LDISKFS-fs (md7): file extents enabled, maximum tree depth=5 [Tue Dec 10 05:26:25 2019][83361.456315] LDISKFS-fs (md5): file extents enabled, maximum tree depth=5 [Tue Dec 10 05:26:25 2019][83361.481350] Lustre: fir-OST0057: Not available for connect from 10.8.30.26@o2ib6 (not set up) [Tue Dec 10 05:26:25 2019][83361.489945] Lustre: Skipped 29 previous similar messages [Tue Dec 10 05:26:25 2019][83361.525126] LDISKFS-fs (md9): file extents enabled, maximum tree depth=5 [Tue Dec 10 05:26:25 2019][83361.602428] Lustre: fir-OST0057: Imperative Recovery enabled, recovery window shrunk from 300-900 down to 150-900 [Tue Dec 10 05:26:25 2019][83361.613606] Lustre: fir-OST0057: in recovery but waiting for the first client to connect [Tue Dec 10 05:26:25 2019][83361.637346] LDISKFS-fs (md7): mounted filesystem with ordered data mode. Opts: errors=remount-ro,no_mbcache,nodelalloc [Tue Dec 10 05:26:25 2019][83361.653835] Lustre: fir-OST0057: Will be in recovery for at least 2:30, or until 1291 clients reconnect [Tue Dec 10 05:26:25 2019][83361.815987] LDISKFS-fs (md5): mounted filesystem with ordered data mode. Opts: errors=remount-ro,no_mbcache,nodelalloc [Tue Dec 10 05:26:25 2019][83362.152906] LDISKFS-fs (md9): mounted filesystem with ordered data mode. 
Opts: errors=remount-ro,no_mbcache,nodelalloc [Tue Dec 10 05:26:25 2019][83362.162098] Lustre: fir-OST005f: Not available for connect from 10.9.108.14@o2ib4 (not set up) [Tue Dec 10 05:26:25 2019][83362.162101] Lustre: Skipped 1 previous similar message [Tue Dec 10 05:26:25 2019][83362.282047] Lustre: fir-OST005f: Imperative Recovery enabled, recovery window shrunk from 300-900 down to 150-900 [Tue Dec 10 05:26:25 2019][83362.292310] Lustre: Skipped 1 previous similar message [Tue Dec 10 05:26:25 2019][83362.298236] Lustre: fir-OST005f: in recovery but waiting for the first client to connect [Tue Dec 10 05:26:25 2019][83362.306355] Lustre: Skipped 1 previous similar message [Tue Dec 10 05:26:25 2019][83362.321596] Lustre: fir-OST005f: Will be in recovery for at least 2:30, or until 1291 clients reconnect [Tue Dec 10 05:26:25 2019][83362.330995] Lustre: Skipped 1 previous similar message [Tue Dec 10 05:26:26 2019][83363.239466] Lustre: fir-OST005d: Not available for connect from 10.8.20.17@o2ib6 (not set up) [Tue Dec 10 05:26:26 2019][83363.247993] Lustre: Skipped 1 previous similar message [Tue Dec 10 05:26:26 2019][83363.306303] Lustre: fir-OST005d: Imperative Recovery enabled, recovery window shrunk from 300-900 down to 150-900 [Tue Dec 10 05:26:26 2019][83363.316565] Lustre: Skipped 2 previous similar messages [Tue Dec 10 05:26:26 2019][83363.322667] Lustre: fir-OST005d: in recovery but waiting for the first client to connect [Tue Dec 10 05:26:26 2019][83363.330787] Lustre: Skipped 2 previous similar messages [Tue Dec 10 05:26:26 2019][83363.395627] Lustre: fir-OST005d: Will be in recovery for at least 2:30, or until 1290 clients reconnect [Tue Dec 10 05:26:26 2019][83363.405029] Lustre: Skipped 2 previous similar messages [Tue Dec 10 05:26:33 2019][83369.610785] Lustre: fir-OST0055: Client fc841094-f1fd-2756-1968-f74105b220e6 (at 10.8.8.30@o2ib6) reconnected, waiting for 1291 clients in recovery for 2:22 [Tue Dec 10 05:26:34 2019][83370.635246] LNetError: 63837:0:(peer.c:3451:lnet_peer_ni_add_to_recoveryq_locked()) lpni 10.0.10.211@o2ib7 added to recovery queue. 
Health = 900 [Tue Dec 10 05:26:34 2019][83370.635254] LustreError: 68000:0:(sec.c:2485:sptlrpc_svc_unwrap_bulk()) @@@ truncated bulk GET 2097152(2445312) req@ffff88f2fdf04050 x1648661001299536/t0(0) o4->226a4739-0dcd-665f-3f64-f361283e71b8@10.9.105.16@o2ib4:444/0 lens 488/448 e 0 to 0 dl 1575984424 ref 1 fl Interpret:/2/0 rc 0/0 [Tue Dec 10 05:26:34 2019][83370.673701] LNetError: 63837:0:(peer.c:3451:lnet_peer_ni_add_to_recoveryq_locked()) Skipped 2 previous similar messages [Tue Dec 10 05:26:34 2019][83370.813831] Lustre: fir-OST005d: Client df232092-858e-a632-396d-0cfff0b9daea (at 10.9.110.47@o2ib4) reconnected, waiting for 1290 clients in recovery for 2:22 [Tue Dec 10 05:26:34 2019][83370.828004] Lustre: Skipped 1 previous similar message [Tue Dec 10 05:26:35 2019][83372.045287] Lustre: fir-OST005d: Client 3c222422-1505-df45-a734-88e013dbd97d (at 10.9.102.41@o2ib4) reconnected, waiting for 1290 clients in recovery for 2:21 [Tue Dec 10 05:26:35 2019][83372.059447] Lustre: Skipped 3 previous similar messages [Tue Dec 10 05:26:39 2019][83375.472761] Lustre: fir-OST005d: Client 0cf25d46-002a-85b1-4e67-848f0710e2b1 (at 10.9.109.64@o2ib4) reconnected, waiting for 1290 clients in recovery for 2:17 [Tue Dec 10 05:26:39 2019][83375.486940] Lustre: Skipped 1 previous similar message [Tue Dec 10 05:26:43 2019][83379.572113] Lustre: fir-OST005d: Client 1c30f97b-7d47-8c9d-c1e8-e8bf522ea702 (at 10.8.24.11@o2ib6) reconnected, waiting for 1290 clients in recovery for 2:13 [Tue Dec 10 05:26:43 2019][83379.586191] Lustre: Skipped 13 previous similar messages [Tue Dec 10 05:26:44 2019][83380.684456] LustreError: 67723:0:(ldlm_lib.c:3271:target_bulk_io()) @@@ truncated bulk READ 3145728(4194304) req@ffff88f2fbc53050 x1650576591366400/t0(0) o3->5499f23a-1ea6-ba5d-b45d-cc3f43f05d7e@10.9.109.20@o2ib4:443/0 lens 488/440 e 1 to 0 dl 1575984423 ref 1 fl Interpret:/0/0 rc 0/0 [Tue Dec 10 05:26:44 2019][83380.684688] Lustre: fir-OST0056: Bulk IO read error with 5499f23a-1ea6-ba5d-b45d-cc3f43f05d7e (at 10.9.109.20@o2ib4), client will retry: rc -110 [Tue Dec 10 05:26:44 2019][83380.722686] LustreError: 67723:0:(ldlm_lib.c:3271:target_bulk_io()) Skipped 1 previous similar message [Tue Dec 10 05:26:51 2019][83387.587269] Lustre: fir-OST005b: Client 7077f577-10fa-a102-c9d8-a4ca3b92f52f (at 10.9.110.25@o2ib4) reconnected, waiting for 1291 clients in recovery for 2:05 [Tue Dec 10 05:26:51 2019][83387.601436] Lustre: Skipped 56 previous similar messages [Tue Dec 10 05:27:07 2019][83403.620422] Lustre: fir-OST0057: Client 9c41e276-bb54-ccfd-4d34-4092e6989764 (at 10.9.103.70@o2ib4) reconnected, waiting for 1291 clients in recovery for 1:48 [Tue Dec 10 05:27:07 2019][83403.634588] Lustre: Skipped 143 previous similar messages [Tue Dec 10 05:27:38 2019][83434.930767] Lustre: fir-OST0055: Client 5f11dd29-1211-44a2-2612-f8309cf085b3 (at 10.8.21.18@o2ib6) refused connection, still busy with 6 references [Tue Dec 10 05:27:39 2019][83436.163818] Lustre: fir-OST005d: Client c104d961-ddd0-a5eb-3382-4ecbd88b591c (at 10.8.18.16@o2ib6) reconnected, waiting for 1290 clients in recovery for 1:17 [Tue Dec 10 05:27:39 2019][83436.177946] Lustre: Skipped 143 previous similar messages [Tue Dec 10 05:27:40 2019][83436.930456] Lustre: fir-OST005b: Client a507eb44-8ff1-13e2-fab8-30d1823663f8 (at 10.8.22.24@o2ib6) refused connection, still busy with 6 references [Tue Dec 10 05:27:44 2019][83440.685672] LNetError: 63837:0:(peer.c:3451:lnet_peer_ni_add_to_recoveryq_locked()) lpni 10.0.10.211@o2ib7 added to recovery queue. 
Health = 900 [Tue Dec 10 05:27:44 2019][83440.685678] LustreError: 67978:0:(sec.c:2485:sptlrpc_svc_unwrap_bulk()) @@@ truncated bulk GET 0(871544) req@ffff89224fd74050 x1652166659120640/t0(0) o4->da9f6e55-12b4-4@10.9.112.5@o2ib4:517/0 lens 488/448 e 0 to 0 dl 1575984497 ref 1 fl Interpret:/0/0 rc 0/0 [Tue Dec 10 05:27:44 2019][83440.685681] LustreError: 67978:0:(sec.c:2485:sptlrpc_svc_unwrap_bulk()) Skipped 5 previous similar messages [Tue Dec 10 05:27:44 2019][83440.731356] LNetError: 63837:0:(peer.c:3451:lnet_peer_ni_add_to_recoveryq_locked()) Skipped 13 previous similar messages [Tue Dec 10 05:27:47 2019][83444.232488] Lustre: fir-OST0055: Client 3db7ac8a-faba-9fd6-d84d-1b8e92435cfb (at 10.8.26.18@o2ib6) refused connection, still busy with 6 references [Tue Dec 10 05:27:47 2019][83444.245695] Lustre: Skipped 2 previous similar messages [Tue Dec 10 05:27:50 2019][83446.930688] Lustre: fir-OST0057: Client d30d2da1-5a39-6a53-6def-eb7c150e8cb6 (at 10.8.31.1@o2ib6) refused connection, still busy with 6 references [Tue Dec 10 05:27:50 2019][83446.943815] Lustre: Skipped 1 previous similar message [Tue Dec 10 05:27:57 2019][83454.163673] Lustre: fir-OST005d: Client 72b66a84-eb6d-8862-b24a-97d6ffec93b7 (at 10.8.24.22@o2ib6) refused connection, still busy with 6 references [Tue Dec 10 05:27:57 2019][83454.176886] Lustre: Skipped 1 previous similar message [Tue Dec 10 05:28:07 2019][83464.155643] Lustre: fir-OST005d: Client ca09bd61-a4b3-111c-b997-9c7823236764 (at 10.8.22.17@o2ib6) refused connection, still busy with 6 references [Tue Dec 10 05:28:07 2019][83464.168857] Lustre: Skipped 158 previous similar messages [Tue Dec 10 05:28:23 2019][83480.159355] Lustre: fir-OST0059: Client 5028c448-3432-783a-f116-6a44a16b46a7 (at 10.8.29.8@o2ib6) refused connection, still busy with 6 references [Tue Dec 10 05:28:23 2019][83480.172483] Lustre: Skipped 35 previous similar messages [Tue Dec 10 05:28:43 2019][83500.442869] Lustre: fir-OST005d: Client b34be8aa-32d9-4 (at 10.9.113.13@o2ib4) reconnected, waiting for 1290 clients in recovery for 0:12 [Tue Dec 10 05:28:43 2019][83500.455221] Lustre: Skipped 1934 previous similar messages [Tue Dec 10 05:28:54 2019][83511.176440] Lustre: fir-OST0057: Recovery already passed deadline 0:00. If you do not want to wait more, you may force taget eviction via 'lctl --device fir-OST0057 abort_recovery. [Tue Dec 10 05:28:55 2019][83511.656879] Lustre: fir-OST0057: recovery is timed out, evict stale exports [Tue Dec 10 05:28:55 2019][83511.664127] Lustre: fir-OST0057: disconnecting 10 stale clients [Tue Dec 10 05:28:55 2019][83511.806156] Lustre: fir-OST0057: Recovery over after 2:30, of 1291 clients 1281 recovered and 10 were evicted. [Tue Dec 10 05:28:55 2019][83511.816161] Lustre: Skipped 5 previous similar messages [Tue Dec 10 05:28:55 2019][83511.860836] Lustre: fir-OST0057: deleting orphan objects from 0x18c0000401:11791589 to 0x18c0000401:11792961 [Tue Dec 10 05:28:55 2019][83511.896502] Lustre: fir-OST0057: deleting orphan objects from 0x18c0000400:3036937 to 0x18c0000400:3037633 [Tue Dec 10 05:28:55 2019][83511.932613] Lustre: fir-OST0057: deleting orphan objects from 0x0:27483952 to 0x0:27483969 [Tue Dec 10 05:28:55 2019][83512.043223] Lustre: fir-OST0057: deleting orphan objects from 0x18c0000402:922876 to 0x18c0000402:922913 [Tue Dec 10 05:28:55 2019][83512.155024] Lustre: fir-OST005b: Recovery already passed deadline 0:00. If you do not want to wait more, you may force taget eviction via 'lctl --device fir-OST005b abort_recovery. 
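A target that reports "Recovery already passed deadline" has waited out its recovery window with some clients still missing; the message itself names the escape hatch, and the quoted command (whose closing quote the console message drops, and which misspells "target" as "taget") evicts the stragglers so the recovered clients can proceed. A minimal sketch of checking recovery progress first and only then aborting, using the target named in the message; aborting evicts the unreconnected clients, so it is shown commented out:

    # Recovery progress, connected/evicted counts and time remaining for this target
    lctl get_param obdfilter.fir-OST005b.recovery_status
    # Only if you really do not want to wait for the remaining clients:
    # lctl --device fir-OST005b abort_recovery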
[Tue Dec 10 05:28:55 2019][83512.171098] Lustre: Skipped 1 previous similar message [Tue Dec 10 05:28:55 2019][83512.324646] Lustre: fir-OST005f: recovery is timed out, evict stale exports [Tue Dec 10 05:28:55 2019][83512.331617] Lustre: Skipped 1 previous similar message [Tue Dec 10 05:28:55 2019][83512.336947] Lustre: fir-OST005f: disconnecting 2 stale clients [Tue Dec 10 05:28:55 2019][83512.342815] Lustre: Skipped 1 previous similar message [Tue Dec 10 05:28:56 2019][83513.398676] Lustre: fir-OST005d: recovery is timed out, evict stale exports [Tue Dec 10 05:28:56 2019][83513.405644] Lustre: Skipped 2 previous similar messages [Tue Dec 10 05:28:56 2019][83513.411046] Lustre: fir-OST005d: disconnecting 1 stale clients [Tue Dec 10 05:28:56 2019][83513.416915] Lustre: Skipped 2 previous similar messages [Tue Dec 10 05:28:59 2019][83516.311131] Lustre: fir-OST005b: Denying connection for new client 3a18a690-f6fb-7d4d-c179-697da5c59619 (at 10.9.116.10@o2ib4), waiting for 1291 known clients (1188 recovered, 100 in progress, and 3 evicted) to recover in 0:56 [Tue Dec 10 05:29:01 2019][83518.150125] Lustre: fir-OST005f: Denying connection for new client 3c020cd0-089d-acb1-e879-86429192cebf (at 10.8.27.2@o2ib6), waiting for 1291 known clients (1193 recovered, 96 in progress, and 2 evicted) to recover in 0:53 [Tue Dec 10 05:29:07 2019][83524.142848] Lustre: fir-OST0059: Denying connection for new client 5a3d40f3-7440-8bab-3ed3-c953b35f5db5 (at 10.9.104.11@o2ib4), waiting for 1291 known clients (1199 recovered, 91 in progress, and 1 evicted) to recover in 0:48 [Tue Dec 10 05:29:09 2019][83526.157270] Lustre: fir-OST005f: Recovery over after 2:44, of 1291 clients 1289 recovered and 2 were evicted. [Tue Dec 10 05:29:09 2019][83526.227237] Lustre: fir-OST005f: deleting orphan objects from 0x1ac0000400:3045562 to 0x1ac0000400:3045857 [Tue Dec 10 05:29:09 2019][83526.463633] Lustre: fir-OST005f: deleting orphan objects from 0x0:27483501 to 0x0:27483521 [Tue Dec 10 05:29:10 2019][83526.484622] Lustre: fir-OST005f: deleting orphan objects from 0x1ac0000402:920820 to 0x1ac0000402:920897 [Tue Dec 10 05:29:10 2019][83526.494815] Lustre: fir-OST005f: deleting orphan objects from 0x1ac0000401:11753307 to 0x1ac0000401:11753953 [Tue Dec 10 05:29:13 2019][83529.508304] Lustre: fir-OST0055: Denying connection for new client 3dc3e4b3-1daf-f260-3956-f8f68e141bca (at 10.9.117.42@o2ib4), waiting for 1291 known clients (1183 recovered, 107 in progress, and 1 evicted) to recover in 0:41 [Tue Dec 10 05:29:13 2019][83529.528358] Lustre: Skipped 1 previous similar message [Tue Dec 10 05:29:14 2019][83530.457223] Lustre: fir-OST0059: Recovery over after 2:47, of 1291 clients 1290 recovered and 1 was evicted. 
[Tue Dec 10 05:29:14 2019][83530.548753] Lustre: fir-OST0059: deleting orphan objects from 0x1940000401:2999880 to 0x1940000401:3000289 [Tue Dec 10 05:29:14 2019][83530.672559] Lustre: fir-OST0059: deleting orphan objects from 0x1940000400:906085 to 0x1940000400:906145 [Tue Dec 10 05:29:14 2019][83530.685548] Lustre: fir-OST0059: deleting orphan objects from 0x0:27234514 to 0x0:27234529 [Tue Dec 10 05:29:14 2019][83530.709523] Lustre: fir-OST0059: deleting orphan objects from 0x1940000402:11643021 to 0x1940000402:11644417 [Tue Dec 10 05:29:21 2019][83537.618300] Lustre: 111966:0:(ldlm_lib.c:1765:extend_recovery_timer()) fir-OST005b: extended recovery timer reaching hard limit: 900, extend: 1 [Tue Dec 10 05:29:21 2019][83537.760812] Lustre: fir-OST005b: Recovery over after 2:55, of 1291 clients 1288 recovered and 3 were evicted. [Tue Dec 10 05:29:21 2019][83537.842315] Lustre: fir-OST005b: deleting orphan objects from 0x19c0000401:3019027 to 0x19c0000401:3020481 [Tue Dec 10 05:29:21 2019][83538.024106] Lustre: fir-OST005b: deleting orphan objects from 0x19c0000400:916141 to 0x19c0000400:916193 [Tue Dec 10 05:29:21 2019][83538.084457] Lustre: fir-OST005b: deleting orphan objects from 0x19c0000402:11725588 to 0x19c0000402:11726529 [Tue Dec 10 05:29:21 2019][83538.084461] Lustre: fir-OST005b: deleting orphan objects from 0x0:27420356 to 0x0:27420385 [Tue Dec 10 05:29:38 2019][83554.597085] Lustre: fir-OST0055: Denying connection for new client 3dc3e4b3-1daf-f260-3956-f8f68e141bca (at 10.9.117.42@o2ib4), waiting for 1291 known clients (1184 recovered, 106 in progress, and 1 evicted) to recover in 0:16 [Tue Dec 10 05:29:38 2019][83554.617160] Lustre: Skipped 1 previous similar message [Tue Dec 10 05:30:03 2019][83579.685583] Lustre: fir-OST0055: Denying connection for new client 3dc3e4b3-1daf-f260-3956-f8f68e141bca (at 10.9.117.42@o2ib4), waiting for 1291 known clients (1184 recovered, 106 in progress, and 1 evicted) already passed deadline 0:08 [Tue Dec 10 05:30:04 2019][83580.462432] Lustre: fir-OST0058: Client 964f90b2-201f-0e40-0c9b-d52b03dcf753 (at 10.9.105.61@o2ib4) reconnecting [Tue Dec 10 05:30:04 2019][83580.472632] Lustre: Skipped 7273 previous similar messages [Tue Dec 10 05:30:04 2019][83580.478156] Lustre: fir-OST0058: Connection restored to 964f90b2-201f-0e40-0c9b-d52b03dcf753 (at 10.9.105.61@o2ib4) [Tue Dec 10 05:30:04 2019][83580.488588] Lustre: Skipped 17265 previous similar messages [Tue Dec 10 05:30:17 2019][83593.534006] Lustre: fir-OST0055: Recovery already passed deadline 0:22. If you do not want to wait more, you may force taget eviction via 'lctl --device fir-OST0055 abort_recovery. [Tue Dec 10 05:30:17 2019][83593.535058] Lustre: 112020:0:(ldlm_lib.c:1765:extend_recovery_timer()) fir-OST005d: extended recovery timer reaching hard limit: 900, extend: 1 [Tue Dec 10 05:30:17 2019][83593.535061] Lustre: 112020:0:(ldlm_lib.c:1765:extend_recovery_timer()) Skipped 154 previous similar messages [Tue Dec 10 05:30:17 2019][83593.572770] Lustre: Skipped 3 previous similar messages [Tue Dec 10 05:30:17 2019][83593.735590] Lustre: fir-OST005d: Recovery over after 3:51, of 1290 clients 1289 recovered and 1 was evicted. 
[Tue Dec 10 05:30:17 2019][83593.787970] Lustre: fir-OST005d: deleting orphan objects from 0x1a40000402:3041004 to 0x1a40000402:3041729 [Tue Dec 10 05:30:17 2019][83593.958249] Lustre: fir-OST005d: deleting orphan objects from 0x0:27502209 to 0x0:27502241 [Tue Dec 10 05:30:17 2019][83593.958251] Lustre: fir-OST005d: deleting orphan objects from 0x1a40000400:11793335 to 0x1a40000400:11794593 [Tue Dec 10 05:30:17 2019][83594.116751] Lustre: fir-OST005d: deleting orphan objects from 0x1a40000401:922080 to 0x1a40000401:922113 [Tue Dec 10 05:30:24 2019][83600.533414] Lustre: fir-OST0055: Recovery already passed deadline 0:29. If you do not want to wait more, you may force taget eviction via 'lctl --device fir-OST0055 abort_recovery. [Tue Dec 10 05:30:28 2019][83604.774454] Lustre: fir-OST0055: Denying connection for new client 3dc3e4b3-1daf-f260-3956-f8f68e141bca (at 10.9.117.42@o2ib4), waiting for 1291 known clients (1184 recovered, 106 in progress, and 1 evicted) already passed deadline 0:33 [Tue Dec 10 05:30:35 2019][83611.533963] Lustre: fir-OST0055: Recovery already passed deadline 0:40. If you do not want to wait more, you may force taget eviction via 'lctl --device fir-OST0055 abort_recovery. [Tue Dec 10 05:30:35 2019][83611.550895] Lustre: 111654:0:(ldlm_lib.c:1765:extend_recovery_timer()) fir-OST0055: extended recovery timer reaching hard limit: 900, extend: 1 [Tue Dec 10 05:30:35 2019][83611.563857] Lustre: 111654:0:(ldlm_lib.c:1765:extend_recovery_timer()) Skipped 101 previous similar messages [Tue Dec 10 05:30:35 2019][83611.717356] Lustre: fir-OST0055: Recovery over after 4:10, of 1291 clients 1290 recovered and 1 was evicted. [Tue Dec 10 05:30:35 2019][83611.786226] Lustre: fir-OST0055: deleting orphan objects from 0x1840000400:11790357 to 0x1840000400:11790881 [Tue Dec 10 05:30:35 2019][83611.925706] Lustre: fir-OST0055: deleting orphan objects from 0x1840000402:3041849 to 0x1840000402:3042849 [Tue Dec 10 05:30:35 2019][83612.026730] Lustre: fir-OST0055: deleting orphan objects from 0x0:27493454 to 0x0:27493473 [Tue Dec 10 05:30:35 2019][83612.056601] Lustre: fir-OST0055: deleting orphan objects from 0x1840000401:923366 to 0x1840000401:923457 [Tue Dec 10 05:33:40 2019][83796.700915] Lustre: 63864:0:(client.c:2133:ptlrpc_expire_one_request()) @@@ Request sent has timed out for slow reply: [sent 1575984813/real 1575984813] req@ffff89077f17b600 x1652452421759024/t0(0) o400->MGC10.0.10.51@o2ib7@10.0.10.51@o2ib7:26/25 lens 224/224 e 0 to 1 dl 1575984820 ref 1 fl Rpc:XN/0/ffffffff rc 0/-1 [Tue Dec 10 05:33:40 2019][83796.728970] Lustre: 63864:0:(client.c:2133:ptlrpc_expire_one_request()) Skipped 1 previous similar message [Tue Dec 10 05:33:40 2019][83796.738637] LustreError: 166-1: MGC10.0.10.51@o2ib7: Connection to MGS (at 10.0.10.51@o2ib7) was lost; in progress operations using this service will fail [Tue Dec 10 05:34:36 2019][83852.646060] LNetError: 63820:0:(o2iblnd_cb.c:3350:kiblnd_check_txs_locked()) Timed out tx: active_txs, 0 seconds [Tue Dec 10 05:34:36 2019][83852.656233] LNetError: 63820:0:(o2iblnd_cb.c:3425:kiblnd_check_conns()) Timed out RDMA with 10.0.10.51@o2ib7 (56): c: 5, oc: 0, rc: 8 [Tue Dec 10 05:34:36 2019][83852.668738] LNetError: 63832:0:(peer.c:3451:lnet_peer_ni_add_to_recoveryq_locked()) lpni 10.0.10.51@o2ib7 added to recovery queue. 
Health = 900 [Tue Dec 10 05:34:36 2019][83852.681607] LNetError: 63832:0:(peer.c:3451:lnet_peer_ni_add_to_recoveryq_locked()) Skipped 4 previous similar messages [Tue Dec 10 05:34:36 2019][83852.751315] LNetError: 108379:0:(lib-msg.c:485:lnet_handle_local_failure()) ni 10.0.10.115@o2ib7 added to recovery queue. Health = 900 [Tue Dec 10 05:34:37 2019][83853.751219] LNetError: 108379:0:(lib-msg.c:485:lnet_handle_local_failure()) ni 10.0.10.115@o2ib7 added to recovery queue. Health = 900 [Tue Dec 10 05:34:38 2019][83854.751398] LNetError: 108379:0:(lib-msg.c:485:lnet_handle_local_failure()) ni 10.0.10.115@o2ib7 added to recovery queue. Health = 900 [Tue Dec 10 05:34:49 2019][83865.751890] LNetError: 108379:0:(lib-msg.c:485:lnet_handle_local_failure()) ni 10.0.10.115@o2ib7 added to recovery queue. Health = 900 [Tue Dec 10 05:34:59 2019][83875.751981] LNetError: 108379:0:(lib-msg.c:485:lnet_handle_local_failure()) ni 10.0.10.115@o2ib7 added to recovery queue. Health = 900 [Tue Dec 10 05:35:14 2019][83890.752457] LNetError: 108379:0:(lib-msg.c:485:lnet_handle_local_failure()) ni 10.0.10.115@o2ib7 added to recovery queue. Health = 900 [Tue Dec 10 05:35:44 2019][83920.753034] LNetError: 108379:0:(lib-msg.c:485:lnet_handle_local_failure()) ni 10.0.10.115@o2ib7 added to recovery queue. Health = 900 [Tue Dec 10 05:35:44 2019][83920.765121] LNetError: 108379:0:(lib-msg.c:485:lnet_handle_local_failure()) Skipped 5 previous similar messages [Tue Dec 10 05:35:45 2019][83922.183649] Lustre: Evicted from MGS (at MGC10.0.10.51@o2ib7_1) after server handle changed from 0xdff031726fbff0e1 to 0xbba64b52f329a2a4 [Tue Dec 10 05:36:05 2019][83941.647915] LNet: 63820:0:(o2iblnd_cb.c:3396:kiblnd_check_conns()) Timed out tx for 10.0.10.51@o2ib7: 0 seconds [Tue Dec 10 05:36:20 2019][83956.648211] LNet: 63820:0:(o2iblnd_cb.c:3396:kiblnd_check_conns()) Timed out tx for 10.0.10.51@o2ib7: 0 seconds [Tue Dec 10 05:36:20 2019][83956.658316] LNetError: 63820:0:(lib-msg.c:485:lnet_handle_local_failure()) ni 10.0.10.115@o2ib7 added to recovery queue. Health = 900 [Tue Dec 10 05:36:20 2019][83956.670341] LNetError: 63820:0:(lib-msg.c:485:lnet_handle_local_failure()) Skipped 1 previous similar message [Tue Dec 10 05:36:35 2019][83971.648522] LNet: 63820:0:(o2iblnd_cb.c:3396:kiblnd_check_conns()) Timed out tx for 10.0.10.51@o2ib7: 0 seconds [Tue Dec 10 05:36:50 2019][83986.648829] LNet: 63820:0:(o2iblnd_cb.c:3396:kiblnd_check_conns()) Timed out tx for 10.0.10.51@o2ib7: 0 seconds [Tue Dec 10 05:37:05 2019][84001.649136] LNet: 63820:0:(o2iblnd_cb.c:3396:kiblnd_check_conns()) Timed out tx for 10.0.10.51@o2ib7: 0 seconds [Tue Dec 10 05:37:20 2019][84016.649449] LNet: 63820:0:(o2iblnd_cb.c:3396:kiblnd_check_conns()) Timed out tx for 10.0.10.51@o2ib7: 0 seconds [Tue Dec 10 05:37:35 2019][84031.649779] LNetError: 63820:0:(lib-msg.c:485:lnet_handle_local_failure()) ni 10.0.10.115@o2ib7 added to recovery queue. 
Health = 900 [Tue Dec 10 05:37:35 2019][84031.661773] LNetError: 63820:0:(lib-msg.c:485:lnet_handle_local_failure()) Skipped 4 previous similar messages [Tue Dec 10 05:37:51 2019][84047.650090] LNet: 63820:0:(o2iblnd_cb.c:3396:kiblnd_check_conns()) Timed out tx for 10.0.10.51@o2ib7: 1 seconds [Tue Dec 10 05:37:51 2019][84047.660177] LNet: 63820:0:(o2iblnd_cb.c:3396:kiblnd_check_conns()) Skipped 1 previous similar message [Tue Dec 10 05:38:16 2019][84072.714709] Lustre: fir-MDT0002-lwp-OST005a: Connection to fir-MDT0002 (at 10.0.10.53@o2ib7) was lost; in progress operations using this service will wait for recovery to complete [Tue Dec 10 05:38:16 2019][84072.730692] Lustre: Skipped 10 previous similar messages [Tue Dec 10 05:38:35 2019][84091.650987] LNet: 63820:0:(o2iblnd_cb.c:3396:kiblnd_check_conns()) Timed out tx for 10.0.10.51@o2ib7: 0 seconds [Tue Dec 10 05:38:35 2019][84091.661067] LNet: 63820:0:(o2iblnd_cb.c:3396:kiblnd_check_conns()) Skipped 2 previous similar messages [Tue Dec 10 05:38:38 2019][84094.908236] Lustre: fir-OST005f: Connection restored to 19d091c7-bad9-3fc5-d8c7-1acb2d646997 (at 10.9.114.9@o2ib4) [Tue Dec 10 05:38:38 2019][84094.918592] Lustre: Skipped 11 previous similar messages [Tue Dec 10 05:38:53 2019][84109.460494] LustreError: 68040:0:(tgt_grant.c:758:tgt_grant_check()) fir-OST0059: cli 52008b8a-1aae-c71d-80d5-aeea34862c6c claims 16801792 GRANT, real grant 16752640 [Tue Dec 10 05:38:53 2019][84109.475266] LustreError: 68040:0:(tgt_grant.c:758:tgt_grant_check()) Skipped 3 previous similar messages [Tue Dec 10 05:39:01 2019][84117.681837] LustreError: 67972:0:(tgt_grant.c:758:tgt_grant_check()) fir-OST0059: cli 52008b8a-1aae-c71d-80d5-aeea34862c6c claims 16801792 GRANT, real grant 0 [Tue Dec 10 05:39:03 2019][84120.010758] LustreError: 67824:0:(tgt_grant.c:758:tgt_grant_check()) fir-OST0059: cli 52008b8a-1aae-c71d-80d5-aeea34862c6c claims 4218880 GRANT, real grant 0 [Tue Dec 10 05:39:16 2019][84133.064019] LustreError: 67828:0:(tgt_grant.c:758:tgt_grant_check()) fir-OST0059: cli 52008b8a-1aae-c71d-80d5-aeea34862c6c claims 16801792 GRANT, real grant 0 [Tue Dec 10 05:39:24 2019][84141.024594] LustreError: 68046:0:(tgt_grant.c:758:tgt_grant_check()) fir-OST0059: cli 52008b8a-1aae-c71d-80d5-aeea34862c6c claims 16801792 GRANT, real grant 0 [Tue Dec 10 05:39:45 2019][84161.652430] LNet: 63820:0:(o2iblnd_cb.c:3396:kiblnd_check_conns()) Timed out tx for 10.0.10.51@o2ib7: 0 seconds [Tue Dec 10 05:39:45 2019][84161.662513] LNet: 63820:0:(o2iblnd_cb.c:3396:kiblnd_check_conns()) Skipped 52 previous similar messages [Tue Dec 10 05:39:45 2019][84161.671917] LNetError: 63820:0:(lib-msg.c:485:lnet_handle_local_failure()) ni 10.0.10.115@o2ib7 added to recovery queue. 
Health = 900 [Tue Dec 10 05:39:45 2019][84161.683960] LNetError: 63820:0:(lib-msg.c:485:lnet_handle_local_failure()) Skipped 12 previous similar messages [Tue Dec 10 05:41:06 2019][84243.112384] LustreError: 68023:0:(tgt_grant.c:758:tgt_grant_check()) fir-OST0059: cli 52008b8a-1aae-c71d-80d5-aeea34862c6c claims 16801792 GRANT, real grant 13017088 [Tue Dec 10 05:41:06 2019][84243.127158] LustreError: 68023:0:(tgt_grant.c:758:tgt_grant_check()) Skipped 1 previous similar message [Tue Dec 10 05:41:58 2019][84295.432583] Lustre: fir-OST0055: deleting orphan objects from 0x1840000402:3042880 to 0x1840000402:3042913 [Tue Dec 10 05:41:58 2019][84295.432586] Lustre: fir-OST0054: deleting orphan objects from 0x1800000400:3030047 to 0x1800000400:3030113 [Tue Dec 10 05:41:58 2019][84295.432610] Lustre: fir-OST0059: deleting orphan objects from 0x1940000401:3000337 to 0x1940000401:3000353 [Tue Dec 10 05:41:58 2019][84295.432624] Lustre: fir-OST005b: deleting orphan objects from 0x19c0000401:3020523 to 0x19c0000401:3020545 [Tue Dec 10 05:41:58 2019][84295.432627] Lustre: fir-OST0057: deleting orphan objects from 0x18c0000400:3037677 to 0x18c0000400:3037697 [Tue Dec 10 05:41:58 2019][84295.432628] Lustre: fir-OST0056: deleting orphan objects from 0x1880000402:3038989 to 0x1880000402:3039073 [Tue Dec 10 05:41:59 2019][84295.432630] Lustre: fir-OST0058: deleting orphan objects from 0x1900000402:3045330 to 0x1900000402:3045345 [Tue Dec 10 05:41:59 2019][84295.432631] Lustre: fir-OST005e: deleting orphan objects from 0x1a80000402:3041125 to 0x1a80000402:3041217 [Tue Dec 10 05:41:59 2019][84295.434960] Lustre: fir-OST005a: deleting orphan objects from 0x1980000402:3048955 to 0x1980000402:3049025 [Tue Dec 10 05:41:59 2019][84295.434965] Lustre: fir-OST005d: deleting orphan objects from 0x1a40000402:3041777 to 0x1a40000402:3041793 [Tue Dec 10 05:41:59 2019][84295.434975] Lustre: fir-OST005f: deleting orphan objects from 0x1ac0000400:3045911 to 0x1ac0000400:3045953 [Tue Dec 10 05:41:59 2019][84295.434977] Lustre: fir-OST005c: deleting orphan objects from 0x1a00000401:3008777 to 0x1a00000401:3008865 [Tue Dec 10 05:42:01 2019][84297.655221] LNet: 63820:0:(o2iblnd_cb.c:3396:kiblnd_check_conns()) Timed out tx for 10.0.10.51@o2ib7: 1 seconds [Tue Dec 10 05:42:01 2019][84297.665308] LNet: 63820:0:(o2iblnd_cb.c:3396:kiblnd_check_conns()) Skipped 56 previous similar messages [Tue Dec 10 05:42:02 2019][84298.511476] LustreError: 167-0: fir-MDT0002-lwp-OST0054: This client was evicted by fir-MDT0002; in progress operations using this service will fail. [Tue Dec 10 05:42:08 2019][84304.867327] LustreError: 67723:0:(tgt_grant.c:758:tgt_grant_check()) fir-OST0059: cli 52008b8a-1aae-c71d-80d5-aeea34862c6c claims 16801792 GRANT, real grant 7770112 [Tue Dec 10 05:44:04 2019][84420.763267] LNetError: 113073:0:(lib-msg.c:485:lnet_handle_local_failure()) ni 10.0.10.115@o2ib7 added to recovery queue. Health = 900 [Tue Dec 10 05:44:04 2019][84420.775355] LNetError: 113073:0:(lib-msg.c:485:lnet_handle_local_failure()) Skipped 22 previous similar messages [Tue Dec 10 05:46:26 2019][84562.660654] LNet: 63820:0:(o2iblnd_cb.c:3396:kiblnd_check_conns()) Timed out tx for 10.0.10.51@o2ib7: 1 seconds [Tue Dec 10 05:46:26 2019][84562.670737] LNet: 63820:0:(o2iblnd_cb.c:3396:kiblnd_check_conns()) Skipped 9 previous similar messages [Tue Dec 10 05:52:51 2019][84947.668501] LNetError: 63820:0:(lib-msg.c:485:lnet_handle_local_failure()) ni 10.0.10.115@o2ib7 added to recovery queue. 
Health = 900 [Tue Dec 10 05:52:51 2019][84947.680497] LNetError: 63820:0:(lib-msg.c:485:lnet_handle_local_failure()) Skipped 34 previous similar messages [Tue Dec 10 05:55:05 2019][85081.671234] LNet: 63820:0:(o2iblnd_cb.c:3396:kiblnd_check_conns()) Timed out tx for 10.0.10.51@o2ib7: 0 seconds [Tue Dec 10 05:55:05 2019][85081.681315] LNet: 63820:0:(o2iblnd_cb.c:3396:kiblnd_check_conns()) Skipped 34 previous similar messages [Tue Dec 10 06:03:01 2019][85557.680771] LNetError: 63820:0:(lib-msg.c:485:lnet_handle_local_failure()) ni 10.0.10.115@o2ib7 added to recovery queue. Health = 900 [Tue Dec 10 06:03:01 2019][85557.692760] LNetError: 63820:0:(lib-msg.c:485:lnet_handle_local_failure()) Skipped 40 previous similar messages [Tue Dec 10 06:05:11 2019][85687.683334] LNet: 63820:0:(o2iblnd_cb.c:3396:kiblnd_check_conns()) Timed out tx for 10.0.10.51@o2ib7: 1 seconds [Tue Dec 10 06:05:11 2019][85687.693418] LNet: 63820:0:(o2iblnd_cb.c:3396:kiblnd_check_conns()) Skipped 40 previous similar messages [Tue Dec 10 06:13:10 2019][86166.692984] LNetError: 63820:0:(lib-msg.c:485:lnet_handle_local_failure()) ni 10.0.10.115@o2ib7 added to recovery queue. Health = 900 [Tue Dec 10 06:13:10 2019][86166.704978] LNetError: 63820:0:(lib-msg.c:485:lnet_handle_local_failure()) Skipped 42 previous similar messages [Tue Dec 10 06:13:23 2019][86180.149321] Lustre: fir-MDT0002-lwp-OST005a: Connection to fir-MDT0002 (at 10.0.10.54@o2ib7) was lost; in progress operations using this service will wait for recovery to complete [Tue Dec 10 06:13:23 2019][86180.165349] Lustre: Skipped 11 previous similar messages [Tue Dec 10 06:14:41 2019][86257.533861] Lustre: fir-OST0054: Connection restored to fir-MDT0002-mdtlov_UUID (at 10.0.10.53@o2ib7) [Tue Dec 10 06:14:41 2019][86257.543088] Lustre: Skipped 28 previous similar messages [Tue Dec 10 06:15:04 2019][86280.503488] LustreError: 167-0: fir-MDT0002-lwp-OST005e: This client was evicted by fir-MDT0002; in progress operations using this service will fail. 
[Tue Dec 10 06:15:04 2019][86280.516908] LustreError: Skipped 11 previous similar messages [Tue Dec 10 06:15:18 2019][86295.454034] Lustre: fir-OST005d: deleting orphan objects from 0x1a40000402:3041953 to 0x1a40000402:3041985 [Tue Dec 10 06:15:18 2019][86295.454036] Lustre: fir-OST0055: deleting orphan objects from 0x1840000402:3043079 to 0x1840000402:3043137 [Tue Dec 10 06:15:18 2019][86295.454037] Lustre: fir-OST005f: deleting orphan objects from 0x1ac0000400:3046117 to 0x1ac0000400:3046177 [Tue Dec 10 06:15:18 2019][86295.454040] Lustre: fir-OST0059: deleting orphan objects from 0x1940000401:3000495 to 0x1940000401:3000577 [Tue Dec 10 06:15:18 2019][86295.454056] Lustre: fir-OST0058: deleting orphan objects from 0x1900000402:3045508 to 0x1900000402:3045537 [Tue Dec 10 06:15:18 2019][86295.454063] Lustre: fir-OST0054: deleting orphan objects from 0x1800000400:3030279 to 0x1800000400:3030305 [Tue Dec 10 06:15:18 2019][86295.454081] Lustre: fir-OST0056: deleting orphan objects from 0x1880000402:3039249 to 0x1880000402:3039265 [Tue Dec 10 06:15:18 2019][86295.454108] Lustre: fir-OST005a: deleting orphan objects from 0x1980000402:3049184 to 0x1980000402:3049217 [Tue Dec 10 06:15:18 2019][86295.454148] Lustre: fir-OST005e: deleting orphan objects from 0x1a80000402:3041372 to 0x1a80000402:3041409 [Tue Dec 10 06:15:19 2019][86295.454217] Lustre: fir-OST0057: deleting orphan objects from 0x18c0000400:3037845 to 0x18c0000400:3037889 [Tue Dec 10 06:15:19 2019][86295.454219] Lustre: fir-OST005b: deleting orphan objects from 0x19c0000401:3020727 to 0x19c0000401:3020769 [Tue Dec 10 06:15:19 2019][86295.454224] Lustre: fir-OST005c: deleting orphan objects from 0x1a00000401:3009028 to 0x1a00000401:3009057 [Tue Dec 10 06:15:25 2019][86301.695719] LNet: 63820:0:(o2iblnd_cb.c:3396:kiblnd_check_conns()) Timed out tx for 10.0.10.51@o2ib7: 0 seconds [Tue Dec 10 06:15:25 2019][86301.705801] LNet: 63820:0:(o2iblnd_cb.c:3396:kiblnd_check_conns()) Skipped 42 previous similar messages [Tue Dec 10 06:16:51 2019][86387.857445] Lustre: 63875:0:(client.c:2133:ptlrpc_expire_one_request()) @@@ Request sent has timed out for slow reply: [sent 1575987404/real 1575987404] req@ffff8922f905ad00 x1652452422920864/t0(0) o400->fir-MDT0003-lwp-OST005f@10.0.10.54@o2ib7:12/10 lens 224/224 e 0 to 1 dl 1575987411 ref 1 fl Rpc:XN/0/ffffffff rc 0/-1 [Tue Dec 10 06:16:51 2019][86387.857447] Lustre: 63866:0:(client.c:2133:ptlrpc_expire_one_request()) @@@ Request sent has timed out for slow reply: [sent 1575987404/real 1575987404] req@ffff8922f9059680 x1652452422920912/t0(0) o400->fir-MDT0003-lwp-OST005b@10.0.10.54@o2ib7:12/10 lens 224/224 e 0 to 1 dl 1575987411 ref 1 fl Rpc:XN/0/ffffffff rc 0/-1 [Tue Dec 10 06:16:51 2019][86387.857451] Lustre: 63866:0:(client.c:2133:ptlrpc_expire_one_request()) Skipped 2 previous similar messages [Tue Dec 10 06:16:51 2019][86387.857453] Lustre: fir-MDT0003-lwp-OST0059: Connection to fir-MDT0003 (at 10.0.10.54@o2ib7) was lost; in progress operations using this service will wait for recovery to complete [Tue Dec 10 06:16:51 2019][86387.857455] Lustre: Skipped 2 previous similar messages [Tue Dec 10 06:16:51 2019][86387.945142] Lustre: 63875:0:(client.c:2133:ptlrpc_expire_one_request()) Skipped 6 previous similar messages [Tue Dec 10 06:18:20 2019][86476.977328] Lustre: fir-OST0054: Connection restored to fir-MDT0002-mdtlov_UUID (at 10.0.10.53@o2ib7) [Tue Dec 10 06:18:20 2019][86476.986563] Lustre: Skipped 20 previous similar messages [Tue Dec 10 06:18:56 2019][86513.340197] LustreError: 167-0: 
fir-MDT0003-lwp-OST0054: This client was evicted by fir-MDT0003; in progress operations using this service will fail. [Tue Dec 10 06:18:56 2019][86513.353585] LustreError: Skipped 11 previous similar messages [Tue Dec 10 06:19:04 2019][86520.840041] Lustre: fir-OST0056: deleting orphan objects from 0x1880000400:11762168 to 0x1880000400:11762241 [Tue Dec 10 06:19:04 2019][86520.840046] Lustre: fir-OST005e: deleting orphan objects from 0x1a80000400:11785079 to 0x1a80000400:11785153 [Tue Dec 10 06:19:04 2019][86520.840059] Lustre: fir-OST0058: deleting orphan objects from 0x1900000401:11781867 to 0x1900000401:11782049 [Tue Dec 10 06:19:04 2019][86520.840069] Lustre: fir-OST0055: deleting orphan objects from 0x1840000400:11791176 to 0x1840000400:11791265 [Tue Dec 10 06:19:04 2019][86520.840073] Lustre: fir-OST0059: deleting orphan objects from 0x1940000402:11644728 to 0x1940000402:11644769 [Tue Dec 10 06:19:04 2019][86520.840127] Lustre: fir-OST0054: deleting orphan objects from 0x1800000401:11769931 to 0x1800000401:11769985 [Tue Dec 10 06:19:04 2019][86520.840129] Lustre: fir-OST005b: deleting orphan objects from 0x19c0000402:11726859 to 0x19c0000402:11726881 [Tue Dec 10 06:19:04 2019][86520.840131] Lustre: fir-OST005a: deleting orphan objects from 0x1980000401:11826483 to 0x1980000401:11826657 [Tue Dec 10 06:19:04 2019][86520.840132] Lustre: fir-OST005c: deleting orphan objects from 0x1a00000400:11638479 to 0x1a00000400:11638529 [Tue Dec 10 06:19:04 2019][86520.840179] Lustre: fir-OST0057: deleting orphan objects from 0x18c0000401:11793269 to 0x18c0000401:11793313 [Tue Dec 10 06:19:04 2019][86520.840181] Lustre: fir-OST005d: deleting orphan objects from 0x1a40000400:11794873 to 0x1a40000400:11794913 [Tue Dec 10 06:19:04 2019][86520.840190] Lustre: fir-OST005f: deleting orphan objects from 0x1ac0000401:11754239 to 0x1ac0000401:11754273 [Tue Dec 10 06:20:12 2019][86588.605617] Lustre: fir-MDT0001-lwp-OST005a: Connection to fir-MDT0001 (at 10.0.10.52@o2ib7) was lost; in progress operations using this service will wait for recovery to complete [Tue Dec 10 06:20:12 2019][86588.621603] Lustre: Skipped 19 previous similar messages [Tue Dec 10 06:22:12 2019][86708.846358] Lustre: fir-OST0054: Connection restored to fir-MDT0001-mdtlov_UUID (at 10.0.10.51@o2ib7) [Tue Dec 10 06:22:12 2019][86708.855584] Lustre: Skipped 21 previous similar messages [Tue Dec 10 06:22:37 2019][86734.084538] Lustre: fir-OST0055: deleting orphan objects from 0x1840000401:923481 to 0x1840000401:923521 [Tue Dec 10 06:22:37 2019][86734.084540] Lustre: fir-OST0054: deleting orphan objects from 0x1800000402:918591 to 0x1800000402:918625 [Tue Dec 10 06:22:37 2019][86734.084543] Lustre: fir-OST0058: deleting orphan objects from 0x1900000400:922723 to 0x1900000400:922753 [Tue Dec 10 06:22:37 2019][86734.084545] Lustre: fir-OST0057: deleting orphan objects from 0x18c0000402:922950 to 0x18c0000402:922977 [Tue Dec 10 06:22:37 2019][86734.084549] Lustre: fir-OST005c: deleting orphan objects from 0x1a00000402:907512 to 0x1a00000402:907553 [Tue Dec 10 06:22:37 2019][86734.084551] Lustre: fir-OST0059: deleting orphan objects from 0x1940000400:906186 to 0x1940000400:906209 [Tue Dec 10 06:22:37 2019][86734.084553] Lustre: fir-OST0056: deleting orphan objects from 0x1880000401:920974 to 0x1880000401:920993 [Tue Dec 10 06:22:37 2019][86734.084571] Lustre: fir-OST005f: deleting orphan objects from 0x1ac0000402:920947 to 0x1ac0000402:920993 [Tue Dec 10 06:22:37 2019][86734.084576] Lustre: fir-OST005e: deleting orphan objects from 
0x1a80000401:920757 to 0x1a80000401:920801 [Tue Dec 10 06:22:37 2019][86734.084583] Lustre: fir-OST005b: deleting orphan objects from 0x19c0000400:916221 to 0x19c0000400:916257 [Tue Dec 10 06:22:37 2019][86734.084584] Lustre: fir-OST005d: deleting orphan objects from 0x1a40000401:922148 to 0x1a40000401:922177 [Tue Dec 10 06:22:37 2019][86734.085560] Lustre: fir-OST005a: deleting orphan objects from 0x1980000400:924075 to 0x1980000400:924097 [Tue Dec 10 06:22:42 2019][86739.136744] LustreError: 167-0: fir-MDT0001-lwp-OST005e: This client was evicted by fir-MDT0001; in progress operations using this service will fail. [Tue Dec 10 06:22:42 2019][86739.150146] LustreError: Skipped 11 previous similar messages [Tue Dec 10 06:41:10 2019][87846.814365] Lustre: Failing over fir-OST005d [Tue Dec 10 06:41:10 2019][87846.861997] Lustre: fir-OST0059: Not available for connect from 10.8.30.2@o2ib6 (stopping) [Tue Dec 10 06:41:10 2019][87846.870267] Lustre: Skipped 1 previous similar message [Tue Dec 10 06:41:10 2019][87846.932238] LustreError: 114752:0:(ldlm_resource.c:1147:ldlm_resource_complain()) filter-fir-OST0059_UUID: namespace resource [0x1940000401:0x2dc03d:0x0].0x0 (ffff88ed360c12c0) refcount nonzero (2) after lock cleanup; forcing cleanup. [Tue Dec 10 06:41:10 2019][87847.376418] Lustre: fir-OST0055: Not available for connect from 10.9.113.11@o2ib4 (stopping) [Tue Dec 10 06:41:10 2019][87847.384864] Lustre: Skipped 343 previous similar messages [Tue Dec 10 06:41:11 2019][87848.380594] Lustre: fir-OST0057: Not available for connect from 10.9.105.65@o2ib4 (stopping) [Tue Dec 10 06:41:11 2019][87848.389032] Lustre: Skipped 628 previous similar messages [Tue Dec 10 06:41:12 2019][87849.162769] LustreError: 137-5: fir-OST005d_UUID: not available for connect from 10.8.23.24@o2ib6 (no target). If you are running an HA pair check that the target is mounted on the other server. [Tue Dec 10 06:41:12 2019][87849.180054] LustreError: Skipped 8474 previous similar messages [Tue Dec 10 06:41:13 2019][87849.962903] Lustre: server umount fir-OST0059 complete [Tue Dec 10 06:41:13 2019][87849.968053] Lustre: Skipped 2 previous similar messages [Tue Dec 10 06:41:14 2019][87850.574398] LustreError: 87509:0:(ldlm_lockd.c:2324:ldlm_cancel_handler()) ldlm_cancel from 10.9.108.71@o2ib4 arrived at 1575988874 with bad export cookie 8099382812963126271 [Tue Dec 10 06:41:14 2019][87850.589954] LustreError: 87509:0:(ldlm_lockd.c:2324:ldlm_cancel_handler()) Skipped 1 previous similar message [Tue Dec 10 06:41:14 2019][87851.284712] LustreError: 66057:0:(ldlm_lockd.c:2324:ldlm_cancel_handler()) ldlm_cancel from 10.9.116.1@o2ib4 arrived at 1575988874 with bad export cookie 8099382812963119894 [Tue Dec 10 06:41:17 2019][87853.979725] md7: detected capacity change from 64011422924800 to 0 [Tue Dec 10 06:41:17 2019][87853.985921] md: md7 stopped. [Tue Dec 10 06:41:17 2019][87854.043783] md1: detected capacity change from 64011422924800 to 0 [Tue Dec 10 06:41:17 2019][87854.049978] md: md1 stopped. [Tue Dec 10 06:41:17 2019][87854.050954] md3: detected capacity change from 64011422924800 to 0 [Tue Dec 10 06:41:17 2019][87854.050958] md: md3 stopped. [Tue Dec 10 06:41:17 2019][87854.053291] md11: detected capacity change from 64011422924800 to 0 [Tue Dec 10 06:41:17 2019][87854.053302] md: md11 stopped. [Tue Dec 10 06:41:17 2019][87854.140475] md9: detected capacity change from 64011422924800 to 0 [Tue Dec 10 06:41:17 2019][87854.146665] md: md9 stopped. 
[Tue Dec 10 06:41:19 2019][87855.711959] LustreError: 87509:0:(ldlm_lockd.c:2324:ldlm_cancel_handler()) ldlm_cancel from 10.9.101.49@o2ib4 arrived at 1575988879 with bad export cookie 8099382812963135658 [Tue Dec 10 06:41:19 2019][87855.727511] LustreError: 87509:0:(ldlm_lockd.c:2324:ldlm_cancel_handler()) Skipped 3 previous similar messages [Tue Dec 10 06:41:19 2019][87856.355456] md5: detected capacity change from 64011422924800 to 0 [Tue Dec 10 06:41:19 2019][87856.361657] md: md5 stopped. [Tue Dec 10 06:41:20 2019][87857.071054] md: md7 stopped. [Tue Dec 10 06:41:20 2019][87857.072014] md: md3 stopped. [Tue Dec 10 06:41:21 2019][87858.074149] md: md3 stopped. [Tue Dec 10 06:41:25 2019][87861.751375] LustreError: 68056:0:(ldlm_lockd.c:2324:ldlm_cancel_handler()) ldlm_cancel from 10.9.101.13@o2ib4 arrived at 1575988885 with bad export cookie 8099382812963145024 [Tue Dec 10 06:41:32 2019][87869.272333] LustreError: 87509:0:(ldlm_lockd.c:2324:ldlm_cancel_handler()) ldlm_cancel from 10.9.101.47@o2ib4 arrived at 1575988892 with bad export cookie 8099382812963143897 [Tue Dec 10 06:41:32 2019][87869.287880] LustreError: 87509:0:(ldlm_lockd.c:2324:ldlm_cancel_handler()) Skipped 3 previous similar messages [Tue Dec 10 06:41:46 2019][87883.295171] LustreError: 137-5: fir-OST005b_UUID: not available for connect from 10.9.117.22@o2ib4 (no target). If you are running an HA pair check that the target is mounted on the other server. [Tue Dec 10 06:41:46 2019][87883.312550] LustreError: Skipped 6249 previous similar messages [Tue Dec 10 06:42:50 2019][87947.296381] LustreError: 137-5: fir-OST0055_UUID: not available for connect from 10.8.27.2@o2ib6 (no target). If you are running an HA pair check that the target is mounted on the other server. [Tue Dec 10 06:42:50 2019][87947.313575] LustreError: Skipped 7797 previous similar messages [Tue Dec 10 06:44:58 2019][88075.492651] LustreError: 137-5: fir-OST005f_UUID: not available for connect from 10.9.101.18@o2ib4 (no target). If you are running an HA pair check that the target is mounted on the other server. [Tue Dec 10 06:44:58 2019][88075.510050] LustreError: Skipped 13956 previous similar messages [Tue Dec 10 07:01:05 2019][89042.543427] LustreError: 11-0: fir-MDT0000-lwp-OST005e: operation ldlm_enqueue to node 10.0.10.52@o2ib7 failed: rc = -107 [Tue Dec 10 07:01:05 2019][89042.543431] Lustre: fir-MDT0000-lwp-OST0056: Connection to fir-MDT0000 (at 10.0.10.52@o2ib7) was lost; in progress operations using this service will wait for recovery to complete [Tue Dec 10 07:01:06 2019][89042.543433] Lustre: Skipped 2 previous similar messages [Tue Dec 10 07:01:06 2019][89042.575605] LustreError: Skipped 30 previous similar messages [Tue Dec 10 07:02:27 2019][89123.752807] LNet: 63820:0:(o2iblnd_cb.c:3396:kiblnd_check_conns()) Timed out tx for 10.0.10.54@o2ib7: 0 seconds [Tue Dec 10 07:02:27 2019][89123.762893] LNet: 63820:0:(o2iblnd_cb.c:3396:kiblnd_check_conns()) Skipped 112 previous similar messages [Tue Dec 10 07:02:27 2019][89123.772391] LNetError: 63820:0:(lib-msg.c:485:lnet_handle_local_failure()) ni 10.0.10.115@o2ib7 added to recovery queue. 
Health = 900 [Tue Dec 10 07:02:27 2019][89123.784389] LNetError: 63820:0:(lib-msg.c:485:lnet_handle_local_failure()) Skipped 34 previous similar messages [Tue Dec 10 07:03:35 2019][89191.907564] Lustre: fir-OST0054: Connection restored to fir-MDT0001-mdtlov_UUID (at 10.0.10.51@o2ib7) [Tue Dec 10 07:03:35 2019][89191.916813] Lustre: Skipped 23 previous similar messages [Tue Dec 10 07:04:06 2019][89222.899028] LustreError: 167-0: fir-MDT0000-lwp-OST005c: This client was evicted by fir-MDT0000; in progress operations using this service will fail. [Tue Dec 10 07:04:06 2019][89222.912416] LustreError: Skipped 11 previous similar messages [Tue Dec 10 07:04:06 2019][89222.919007] LustreError: 63873:0:(client.c:1197:ptlrpc_import_delay_req()) @@@ invalidate in flight req@ffff890de48d3f00 x1652452423870288/t0(0) o103->fir-MDT0000-lwp-OST005e@10.0.10.51@o2ib7:17/18 lens 328/224 e 0 to 0 dl 0 ref 1 fl Rpc:W/0/ffffffff rc 0/-1 [Tue Dec 10 07:04:06 2019][89223.159901] LustreError: 11-0: fir-MDT0000-lwp-OST005c: operation quota_acquire to node 10.0.10.51@o2ib7 failed: rc = -11 [Tue Dec 10 07:04:06 2019][89223.170875] LustreError: Skipped 3 previous similar messages [Tue Dec 10 07:04:08 2019][89224.874759] LustreError: 11-0: fir-MDT0000-lwp-OST005a: operation quota_acquire to node 10.0.10.51@o2ib7 failed: rc = -11 [Tue Dec 10 07:04:08 2019][89224.885718] LustreError: Skipped 16 previous similar messages [Tue Dec 10 07:04:14 2019][89230.696477] Lustre: fir-OST0056: deleting orphan objects from 0x0:27467647 to 0x0:27467681 [Tue Dec 10 07:04:14 2019][89230.696513] Lustre: fir-OST005c: deleting orphan objects from 0x0:27180265 to 0x0:27180289 [Tue Dec 10 07:04:14 2019][89230.696575] Lustre: fir-OST005a: deleting orphan objects from 0x0:27549741 to 0x0:27549761 [Tue Dec 10 07:04:14 2019][89230.696608] Lustre: fir-OST0054: deleting orphan objects from 0x0:27445580 to 0x0:27445601 [Tue Dec 10 07:04:14 2019][89230.696647] Lustre: fir-OST005e: deleting orphan objects from 0x0:27454900 to 0x0:27454945 [Tue Dec 10 07:04:14 2019][89230.697022] Lustre: fir-OST0058: deleting orphan objects from 0x0:27494442 to 0x0:27494465 [Tue Dec 10 07:04:57 2019][89273.755955] LNetError: 114061:0:(lib-msg.c:485:lnet_handle_local_failure()) ni 10.0.10.115@o2ib7 added to recovery queue. Health = 900 [Tue Dec 10 07:04:57 2019][89273.768058] LNetError: 114061:0:(lib-msg.c:485:lnet_handle_local_failure()) Skipped 6 previous similar messages [Tue Dec 10 07:04:59 2019][89275.873243] Lustre: 63870:0:(client.c:2133:ptlrpc_expire_one_request()) @@@ Request sent has failed due to network error: [sent 1575990296/real 1575990299] req@ffff89129e81da00 x1652452423996192/t0(0) o400->MGC10.0.10.51@o2ib7@10.0.10.52@o2ib7:26/25 lens 224/224 e 0 to 1 dl 1575990303 ref 1 fl Rpc:eXN/0/ffffffff rc 0/-1 [Tue Dec 10 07:04:59 2019][89275.901644] Lustre: 63870:0:(client.c:2133:ptlrpc_expire_one_request()) Skipped 2 previous similar messages [Tue Dec 10 07:04:59 2019][89275.911396] LustreError: 166-1: MGC10.0.10.51@o2ib7: Connection to MGS (at 10.0.10.52@o2ib7) was lost; in progress operations using this service will fail [Tue Dec 10 07:07:35 2019][89431.759049] LNet: 63820:0:(o2iblnd_cb.c:3396:kiblnd_check_conns()) Timed out tx for 10.0.10.52@o2ib7: 0 seconds [Tue Dec 10 07:07:35 2019][89431.769136] LNet: 63820:0:(o2iblnd_cb.c:3396:kiblnd_check_conns()) Skipped 23 previous similar messages [Tue Dec 10 07:07:35 2019][89431.778562] LNetError: 63820:0:(lib-msg.c:485:lnet_handle_local_failure()) ni 10.0.10.115@o2ib7 added to recovery queue. 
Health = 900 [Tue Dec 10 07:07:35 2019][89431.790592] LNetError: 63820:0:(lib-msg.c:485:lnet_handle_local_failure()) Skipped 13 previous similar messages [Tue Dec 10 07:10:07 2019][89583.762119] LNet: 63820:0:(o2iblnd_cb.c:3396:kiblnd_check_conns()) Timed out tx for 10.0.10.52@o2ib7: 1 seconds [Tue Dec 10 07:10:07 2019][89583.772201] LNet: 63820:0:(o2iblnd_cb.c:3396:kiblnd_check_conns()) Skipped 7 previous similar messages [Tue Dec 10 07:13:01 2019][89757.765638] LNetError: 63820:0:(lib-msg.c:485:lnet_handle_local_failure()) ni 10.0.10.115@o2ib7 added to recovery queue. Health = 900 [Tue Dec 10 07:13:01 2019][89757.777660] LNetError: 63820:0:(lib-msg.c:485:lnet_handle_local_failure()) Skipped 15 previous similar messages [Tue Dec 10 07:17:12 2019][90008.770606] LNet: 63820:0:(o2iblnd_cb.c:3396:kiblnd_check_conns()) Timed out tx for 10.0.10.52@o2ib7: 0 seconds [Tue Dec 10 07:17:12 2019][90008.780691] LNet: 63820:0:(o2iblnd_cb.c:3396:kiblnd_check_conns()) Skipped 11 previous similar messages [Tue Dec 10 07:23:54 2019][90410.778568] LNetError: 63820:0:(lib-msg.c:485:lnet_handle_local_failure()) ni 10.0.10.115@o2ib7 added to recovery queue. Health = 900 [Tue Dec 10 07:23:54 2019][90410.790559] LNetError: 63820:0:(lib-msg.c:485:lnet_handle_local_failure()) Skipped 26 previous similar messages [Tue Dec 10 07:31:00 2019][90836.787083] LNet: 63820:0:(o2iblnd_cb.c:3396:kiblnd_check_conns()) Timed out tx for 10.0.10.52@o2ib7: 0 seconds [Tue Dec 10 07:31:00 2019][90836.797165] LNet: 63820:0:(o2iblnd_cb.c:3396:kiblnd_check_conns()) Skipped 23 previous similar messages [Tue Dec 10 07:33:00 2019][90956.821732] Lustre: fir-MDT0003-lwp-OST005c: Connection to fir-MDT0003 (at 10.0.10.53@o2ib7) was lost; in progress operations using this service will wait for recovery to complete [Tue Dec 10 07:33:00 2019][90956.837717] Lustre: Skipped 9 previous similar messages [Tue Dec 10 07:34:25 2019][91042.163769] Lustre: fir-OST0054: Connection restored to fir-MDT0003-mdtlov_UUID (at 10.0.10.54@o2ib7) [Tue Dec 10 07:34:25 2019][91042.173003] Lustre: Skipped 8 previous similar messages [Tue Dec 10 07:35:05 2019][91082.264270] LustreError: 167-0: fir-MDT0003-lwp-OST005c: This client was evicted by fir-MDT0003; in progress operations using this service will fail. 
[Tue Dec 10 07:35:05 2019][91082.277659] LustreError: Skipped 5 previous similar messages [Tue Dec 10 07:35:05 2019][91082.285228] Lustre: fir-MDT0003-lwp-OST005c: Connection restored to 10.0.10.54@o2ib7 (at 10.0.10.54@o2ib7) [Tue Dec 10 07:35:05 2019][91082.294896] Lustre: Skipped 6 previous similar messages [Tue Dec 10 07:35:33 2019][91110.101659] Lustre: fir-OST0056: deleting orphan objects from 0x1880000400:11762647 to 0x1880000400:11762689 [Tue Dec 10 07:35:33 2019][91110.101679] Lustre: fir-OST005c: deleting orphan objects from 0x1a00000400:11638916 to 0x1a00000400:11638945 [Tue Dec 10 07:35:33 2019][91110.101680] Lustre: fir-OST0058: deleting orphan objects from 0x1900000401:11782453 to 0x1900000401:11782497 [Tue Dec 10 07:35:33 2019][91110.101803] Lustre: fir-OST005e: deleting orphan objects from 0x1a80000400:11785582 to 0x1a80000400:11785601 [Tue Dec 10 07:35:33 2019][91110.101804] Lustre: fir-OST0054: deleting orphan objects from 0x1800000401:11770384 to 0x1800000401:11770401 [Tue Dec 10 07:35:33 2019][91110.101805] Lustre: fir-OST005a: deleting orphan objects from 0x1980000401:11827087 to 0x1980000401:11827105 [Tue Dec 10 07:38:01 2019][91257.883712] Lustre: fir-MDT0001-lwp-OST005a: Connection to fir-MDT0001 (at 10.0.10.51@o2ib7) was lost; in progress operations using this service will wait for recovery to complete [Tue Dec 10 07:38:01 2019][91257.899712] Lustre: Skipped 5 previous similar messages [Tue Dec 10 07:39:02 2019][91318.892140] Lustre: fir-OST0054: Connection restored to fir-MDT0001-mdtlov_UUID (at 10.0.10.52@o2ib7) [Tue Dec 10 07:39:02 2019][91318.901362] Lustre: Skipped 3 previous similar messages [Tue Dec 10 07:40:06 2019][91383.326212] LustreError: 167-0: fir-MDT0001-lwp-OST005a: This client was evicted by fir-MDT0001; in progress operations using this service will fail. [Tue Dec 10 07:40:06 2019][91383.339597] LustreError: Skipped 5 previous similar messages [Tue Dec 10 07:40:06 2019][91383.347176] Lustre: fir-MDT0001-lwp-OST005a: Connection restored to 10.0.10.52@o2ib7 (at 10.0.10.52@o2ib7) [Tue Dec 10 07:40:06 2019][91383.356830] Lustre: Skipped 6 previous similar messages [Tue Dec 10 07:40:19 2019][91396.114129] Lustre: fir-OST005c: deleting orphan objects from 0x1a00000402:907594 to 0x1a00000402:907617 [Tue Dec 10 07:40:19 2019][91396.114130] Lustre: fir-OST0056: deleting orphan objects from 0x1880000401:921038 to 0x1880000401:921057 [Tue Dec 10 07:40:19 2019][91396.114134] Lustre: fir-OST0054: deleting orphan objects from 0x1800000402:918672 to 0x1800000402:918689 [Tue Dec 10 07:40:19 2019][91396.114203] Lustre: fir-OST005a: deleting orphan objects from 0x1980000400:924139 to 0x1980000400:924161 [Tue Dec 10 07:40:19 2019][91396.114266] Lustre: fir-OST005e: deleting orphan objects from 0x1a80000401:920847 to 0x1a80000401:920865 [Tue Dec 10 07:40:19 2019][91396.114268] Lustre: fir-OST0058: deleting orphan objects from 0x1900000400:922805 to 0x1900000400:922849 [Tue Dec 10 07:41:21 2019][91458.591755] Lustre: Evicted from MGS (at 10.0.10.51@o2ib7) after server handle changed from 0xbba64b52f329a2a4 to 0xc3c20c0652556a2a [Tue Dec 10 07:41:21 2019][91458.603833] Lustre: MGC10.0.10.51@o2ib7: Connection restored to 10.0.10.51@o2ib7 (at 10.0.10.51@o2ib7) [Tue Dec 10 07:53:33 2019][92189.693251] Lustre: fir-OST0058: haven't heard from client 82b9ac9e-bd42-fb9c-cb3e-f327857b510c (at 10.9.0.62@o2ib4) in 227 seconds. I think it's dead, and I am evicting it. 
exp ffff892272a71400, cur 1575993213 expire 1575993063 last 1575992986 [Tue Dec 10 07:53:33 2019][92189.714880] Lustre: Skipped 5 previous similar messages [Tue Dec 10 08:34:30 2019][94647.453315] Lustre: fir-OST0054: Connection restored to 82b9ac9e-bd42-fb9c-cb3e-f327857b510c (at 10.9.0.62@o2ib4) [Tue Dec 10 08:34:30 2019][94647.463585] Lustre: Skipped 5 previous similar messages [Tue Dec 10 08:34:31 2019][94647.725707] Lustre: fir-OST0058: haven't heard from client cec884d3-ca4b-8127-2f6b-7762665aa5f8 (at 10.9.0.64@o2ib4) in 227 seconds. I think it's dead, and I am evicting it. exp ffff892266f68800, cur 1575995671 expire 1575995521 last 1575995444 [Tue Dec 10 08:34:31 2019][94647.747347] Lustre: Skipped 5 previous similar messages [Tue Dec 10 09:11:00 2019][96836.831164] Lustre: fir-OST0054: Connection restored to cec884d3-ca4b-8127-2f6b-7762665aa5f8 (at 10.9.0.64@o2ib4) [Tue Dec 10 09:11:00 2019][96836.841425] Lustre: Skipped 5 previous similar messages [Tue Dec 10 09:18:12 2019][97268.763743] Lustre: fir-OST005a: haven't heard from client fb9a2d5e-e9b3-4fb9-b988-9954fcfb0920 (at 10.8.0.66@o2ib6) in 227 seconds. I think it's dead, and I am evicting it. exp ffff8912f8ae5000, cur 1575998292 expire 1575998142 last 1575998065 [Tue Dec 10 09:18:12 2019][97268.785369] Lustre: Skipped 5 previous similar messages [Tue Dec 10 09:52:25 2019][99322.533302] Lustre: fir-OST0054: Connection restored to fb9a2d5e-e9b3-4fb9-b988-9954fcfb0920 (at 10.8.0.66@o2ib6) [Tue Dec 10 09:52:25 2019][99322.543567] Lustre: Skipped 5 previous similar messages [Tue Dec 10 10:17:14 2019][100810.839701] Lustre: fir-OST0058: haven't heard from client 40a204f8-61bd-7bf5-8e8b-66a640362528 (at 10.8.21.28@o2ib6) in 227 seconds. I think it's dead, and I am evicting it. exp ffff8922569b5400, cur 1576001834 expire 1576001684 last 1576001607 [Tue Dec 10 10:17:14 2019][100810.861510] Lustre: Skipped 5 previous similar messages [Tue Dec 10 10:46:11 2019][102548.503191] Lustre: fir-OST0054: Connection restored to 0af2ee10-72ea-97a8-65e7-44544fdbc0b9 (at 10.9.108.39@o2ib4) [Tue Dec 10 10:46:11 2019][102548.513721] Lustre: Skipped 5 previous similar messages [Tue Dec 10 10:51:50 2019][102887.676821] Lustre: fir-OST0054: Connection restored to 6943a6ac-ba36-d287-3012-a3d9ab556566 (at 10.8.21.14@o2ib6) [Tue Dec 10 10:51:50 2019][102887.687255] Lustre: Skipped 5 previous similar messages [Tue Dec 10 10:52:05 2019][102902.279630] Lustre: fir-OST0054: Connection restored to 98c710cf-a183-35fe-d60d-8494e153f1c3 (at 10.8.21.13@o2ib6) [Tue Dec 10 10:52:05 2019][102902.290069] Lustre: Skipped 5 previous similar messages [Tue Dec 10 10:52:07 2019][102904.746400] Lustre: fir-OST0054: Connection restored to (at 10.8.21.8@o2ib6) [Tue Dec 10 10:52:07 2019][102904.746401] Lustre: fir-OST0056: Connection restored to (at 10.8.21.8@o2ib6) [Tue Dec 10 10:52:07 2019][102904.746404] Lustre: Skipped 6 previous similar messages [Tue Dec 10 10:52:07 2019][102904.766189] Lustre: Skipped 4 previous similar messages [Tue Dec 10 10:52:17 2019][102914.701723] Lustre: fir-OST0054: Connection restored to 40a204f8-61bd-7bf5-8e8b-66a640362528 (at 10.8.21.28@o2ib6) [Tue Dec 10 10:52:17 2019][102914.712164] Lustre: Skipped 11 previous similar messages [Tue Dec 10 10:52:36 2019][102933.490929] Lustre: fir-OST0054: Connection restored to 07312e22-36ea-cbe1-f5a7-b2f2d00651b0 (at 10.8.20.22@o2ib6) [Tue Dec 10 10:52:36 2019][102933.501368] Lustre: Skipped 23 previous similar messages [Tue Dec 10 10:53:10 2019][102967.732545] Lustre: fir-OST0054: Connection restored to 
b5be2f5f-0f09-196f-7061-da3a3aa7cecb (at 10.8.20.31@o2ib6) [Tue Dec 10 10:53:10 2019][102967.742984] Lustre: Skipped 83 previous similar messages [Tue Dec 10 10:54:19 2019][103036.782165] Lustre: fir-OST0054: Connection restored to a77d579b-bc84-7eca-11a1-85e2fd56cb4e (at 10.8.20.23@o2ib6) [Tue Dec 10 10:54:19 2019][103036.792604] Lustre: Skipped 106 previous similar messages [Tue Dec 10 10:55:36 2019][103113.335081] LustreError: 66125:0:(tgt_grant.c:758:tgt_grant_check()) fir-OST005e: cli 20463417-fb32-2f92-5aae-59bfa8e287e3 claims 14893056 GRANT, real grant 0 [Tue Dec 10 10:56:02 2019][103139.403106] LustreError: 67963:0:(tgt_grant.c:758:tgt_grant_check()) fir-OST0054: cli aa2a15c8-736e-708a-65eb-dfabc669a063 claims 28672 GRANT, real grant 0 [Tue Dec 10 11:08:59 2019][103916.737798] Lustre: fir-OST0054: Connection restored to 8003aaab-bcab-ef28-dd2b-704b0d862745 (at 10.8.22.12@o2ib6) [Tue Dec 10 11:08:59 2019][103916.748231] Lustre: Skipped 11 previous similar messages [Tue Dec 10 11:09:22 2019][103939.599954] Lustre: fir-OST0058: Connection restored to 92c08489-d99f-9692-0d8e-5d862ef77698 (at 10.8.22.5@o2ib6) [Tue Dec 10 11:09:22 2019][103939.610312] Lustre: Skipped 4 previous similar messages [Tue Dec 10 11:10:36 2019][104013.821679] Lustre: fir-OST0054: Connection restored to 37454ba9-0898-97b6-5a68-4a0682e739f8 (at 10.8.20.8@o2ib6) [Tue Dec 10 11:10:36 2019][104013.832031] Lustre: Skipped 18 previous similar messages [Tue Dec 10 11:43:24 2019][105981.538288] Lustre: fir-OST0054: Connection restored to b11f5302-9207-4a63-91bc-6141fa0b09e3 (at 10.8.22.4@o2ib6) [Tue Dec 10 11:43:24 2019][105981.548641] Lustre: Skipped 5 previous similar messages [Tue Dec 10 11:44:48 2019][106064.964622] LustreError: 68035:0:(tgt_grant.c:758:tgt_grant_check()) fir-OST0058: cli 5d5036fa-60c3-4 claims 16752640 GRANT, real grant 0 [Tue Dec 10 12:01:59 2019][107096.674363] LustreError: 67973:0:(tgt_grant.c:758:tgt_grant_check()) fir-OST0054: cli 295209bb-0224-d868-bd7c-cd75c3b19a1c claims 200704 GRANT, real grant 0 [Tue Dec 10 12:05:18 2019][107295.378991] LustreError: 68000:0:(tgt_grant.c:758:tgt_grant_check()) fir-OST005c: cli 31722a42-53ae-b678-363c-dc0a8c0b6d11 claims 28672 GRANT, real grant 0 [Tue Dec 10 12:14:47 2019][107864.528049] LustreError: 67965:0:(tgt_grant.c:758:tgt_grant_check()) fir-OST005c: cli 5d5036fa-60c3-4 claims 2076672 GRANT, real grant 0 [Tue Dec 10 12:33:44 2019][109001.246063] perf: interrupt took too long (3130 > 3128), lowering kernel.perf_event_max_sample_rate to 63000 [Tue Dec 10 13:08:00 2019][111057.203872] Lustre: fir-OST0054: Connection restored to a5709ca0-bfe0-cc30-835f-99ba0583ca05 (at 10.8.20.27@o2ib6) [Tue Dec 10 13:08:00 2019][111057.214309] Lustre: Skipped 5 previous similar messages [Tue Dec 10 13:27:28 2019][112225.415351] LustreError: 67989:0:(tgt_grant.c:758:tgt_grant_check()) fir-OST005a: cli a83208a9-361d-4 claims 1597440 GRANT, real grant 0 [Tue Dec 10 13:44:48 2019][113265.085246] Lustre: fir-OST005e: haven't heard from client 8fbd1a16-d09d-1ef7-e10d-4e68dc0a9f97 (at 10.8.23.32@o2ib6) in 227 seconds. I think it's dead, and I am evicting it. exp ffff892250f16c00, cur 1576014288 expire 1576014138 last 1576014061 [Tue Dec 10 13:44:48 2019][113265.107060] Lustre: Skipped 61 previous similar messages [Tue Dec 10 13:44:50 2019][113267.080987] Lustre: fir-OST005a: haven't heard from client 8fbd1a16-d09d-1ef7-e10d-4e68dc0a9f97 (at 10.8.23.32@o2ib6) in 227 seconds. I think it's dead, and I am evicting it. 
exp ffff88e322a4a800, cur 1576014290 expire 1576014140 last 1576014063 [Tue Dec 10 13:44:50 2019][113267.102809] Lustre: Skipped 1 previous similar message [Tue Dec 10 14:12:28 2019][114926.010694] LustreError: 67711:0:(tgt_grant.c:758:tgt_grant_check()) fir-OST0054: cli 57b26761-b79f-628f-0ec2-0a10fd7ac3bd claims 28672 GRANT, real grant 0 [Tue Dec 10 14:21:04 2019][115441.676815] Lustre: fir-OST0054: Connection restored to (at 10.8.23.32@o2ib6) [Tue Dec 10 14:21:04 2019][115441.684128] Lustre: Skipped 5 previous similar messages [Tue Dec 10 14:47:57 2019][117054.836037] Lustre: fir-OST0054: Connection restored to 0bbd53e2-6989-83e6-f126-86a473496205 (at 10.8.21.36@o2ib6) [Tue Dec 10 14:47:57 2019][117054.846482] Lustre: Skipped 5 previous similar messages [Tue Dec 10 15:28:53 2019][119510.220122] Lustre: fir-OST0058: haven't heard from client ee4590b6-1057-e690-5db0-89b0af3963cd (at 10.8.22.30@o2ib6) in 227 seconds. I think it's dead, and I am evicting it. exp ffff8922bfba1000, cur 1576020533 expire 1576020383 last 1576020306 [Tue Dec 10 15:28:53 2019][119510.241935] Lustre: Skipped 3 previous similar messages [Tue Dec 10 15:29:02 2019][119519.215315] Lustre: fir-OST0056: haven't heard from client ee4590b6-1057-e690-5db0-89b0af3963cd (at 10.8.22.30@o2ib6) in 227 seconds. I think it's dead, and I am evicting it. exp ffff892251931400, cur 1576020542 expire 1576020392 last 1576020315 [Tue Dec 10 15:29:02 2019][119519.237123] Lustre: Skipped 3 previous similar messages [Tue Dec 10 15:35:10 2019][119887.411759] Lustre: fir-OST0054: Connection restored to c20915b7-72a8-8f0f-a961-7c81095a2283 (at 10.8.23.29@o2ib6) [Tue Dec 10 15:35:10 2019][119887.422195] Lustre: Skipped 5 previous similar messages [Tue Dec 10 16:03:38 2019][121596.087729] Lustre: fir-OST0054: Connection restored to ee4590b6-1057-e690-5db0-89b0af3963cd (at 10.8.22.30@o2ib6) [Tue Dec 10 16:03:38 2019][121596.098177] Lustre: Skipped 5 previous similar messages [Tue Dec 10 17:01:22 2019][125060.118327] Lustre: fir-OST0054: Connection restored to 7898fb8f-92c0-6a6b-8c01-20f1dcd2c072 (at 10.8.23.20@o2ib6) [Tue Dec 10 17:01:22 2019][125060.128767] Lustre: Skipped 5 previous similar messages [Tue Dec 10 17:09:05 2019][125522.869119] LustreError: 67911:0:(tgt_grant.c:758:tgt_grant_check()) fir-OST0054: cli da9f6e55-12b4-4 claims 1597440 GRANT, real grant 0 [Tue Dec 10 17:57:50 2019][128448.096826] LustreError: 67954:0:(tgt_grant.c:758:tgt_grant_check()) fir-OST005a: cli da9f6e55-12b4-4 claims 36864 GRANT, real grant 0 [Tue Dec 10 18:18:56 2019][129713.385351] Lustre: fir-OST0054: Connection restored to 43d748a2-b8c5-e7f9-8b00-d16d4390ff4d (at 10.8.22.6@o2ib6) [Tue Dec 10 18:18:56 2019][129713.395703] Lustre: Skipped 3 previous similar messages [Tue Dec 10 18:30:39 2019][130416.428824] Lustre: fir-OST005c: haven't heard from client b6bab463-5f5c-8f5c-f09a-8f0ce0f6e1cd (at 10.8.21.31@o2ib6) in 227 seconds. I think it's dead, and I am evicting it. exp ffff8922f2259800, cur 1576031439 expire 1576031289 last 1576031212 [Tue Dec 10 18:30:39 2019][130416.450651] Lustre: Skipped 1 previous similar message [Tue Dec 10 18:31:55 2019][130492.437526] Lustre: fir-OST005e: haven't heard from client 7515dbe4-f1c8-844a-9186-76f9c6288c34 (at 10.9.104.2@o2ib4) in 222 seconds. I think it's dead, and I am evicting it. 
exp ffff892250bf5400, cur 1576031515 expire 1576031365 last 1576031293 [Tue Dec 10 18:31:55 2019][130492.459325] Lustre: Skipped 29 previous similar messages [Tue Dec 10 18:51:08 2019][131645.782803] Lustre: fir-OST0054: Connection restored to (at 10.9.114.14@o2ib4) [Tue Dec 10 18:51:08 2019][131645.790234] Lustre: Skipped 5 previous similar messages [Tue Dec 10 18:51:55 2019][131693.053219] Lustre: fir-OST0056: Connection restored to d66c3860-7975-f3f1-3866-4386eb6742ed (at 10.8.19.6@o2ib6) [Tue Dec 10 18:51:55 2019][131693.063577] Lustre: Skipped 5 previous similar messages [Tue Dec 10 18:55:04 2019][131881.923365] Lustre: fir-OST0054: Connection restored to 2295c161-47a8-c199-f8c4-4e53eff1b957 (at 10.9.110.71@o2ib4) [Tue Dec 10 18:55:04 2019][131881.923366] Lustre: fir-OST0058: Connection restored to 2295c161-47a8-c199-f8c4-4e53eff1b957 (at 10.9.110.71@o2ib4) [Tue Dec 10 18:55:04 2019][131881.944410] Lustre: Skipped 4 previous similar messages [Tue Dec 10 18:55:43 2019][131921.259280] Lustre: fir-OST0054: Connection restored to 67fb2ec4-7a5a-f103-4386-bc08c967f193 (at 10.9.107.9@o2ib4) [Tue Dec 10 18:55:43 2019][131921.269718] Lustre: Skipped 5 previous similar messages [Tue Dec 10 18:57:08 2019][132005.721784] Lustre: fir-OST0054: Connection restored to e8872901-9e69-2d9a-e57a-55077a64186b (at 10.9.109.25@o2ib4) [Tue Dec 10 18:57:08 2019][132005.721785] Lustre: fir-OST0058: Connection restored to e8872901-9e69-2d9a-e57a-55077a64186b (at 10.9.109.25@o2ib4) [Tue Dec 10 18:57:08 2019][132005.742828] Lustre: Skipped 4 previous similar messages [Tue Dec 10 18:59:18 2019][132136.020350] Lustre: fir-OST0054: Connection restored to 2ad8ff13-d978-9373-7245-882c6479cc4c (at 10.9.110.63@o2ib4) [Tue Dec 10 18:59:18 2019][132136.030875] Lustre: Skipped 5 previous similar messages [Tue Dec 10 19:03:19 2019][132377.338818] Lustre: fir-OST0054: Connection restored to b5acf087-1850-f5e1-236a-4cc1bab1a9f0 (at 10.9.104.34@o2ib4) [Tue Dec 10 19:03:19 2019][132377.349350] Lustre: Skipped 34 previous similar messages [Tue Dec 10 19:05:07 2019][132484.938830] Lustre: fir-OST0054: Connection restored to b6bab463-5f5c-8f5c-f09a-8f0ce0f6e1cd (at 10.8.21.31@o2ib6) [Tue Dec 10 19:05:07 2019][132484.938830] Lustre: fir-OST0056: Connection restored to b6bab463-5f5c-8f5c-f09a-8f0ce0f6e1cd (at 10.8.21.31@o2ib6) [Tue Dec 10 19:05:07 2019][132484.938833] Lustre: Skipped 18 previous similar messages [Tue Dec 10 19:05:07 2019][132484.965098] Lustre: Skipped 4 previous similar messages [Tue Dec 10 19:08:44 2019][132702.203103] Lustre: fir-OST0054: Connection restored to (at 10.8.28.9@o2ib6) [Tue Dec 10 19:08:44 2019][132702.210335] Lustre: Skipped 29 previous similar messages [Tue Dec 10 19:42:31 2019][134728.568717] Lustre: fir-OST0056: haven't heard from client aadbd140-afe6-3cc5-5efa-1bf64465f6e7 (at 10.8.20.34@o2ib6) in 227 seconds. I think it's dead, and I am evicting it. exp ffff8922515e2800, cur 1576035751 expire 1576035601 last 1576035524 [Tue Dec 10 19:42:31 2019][134728.590521] Lustre: Skipped 77 previous similar messages [Tue Dec 10 19:42:37 2019][134734.512090] Lustre: fir-OST0054: haven't heard from client aadbd140-afe6-3cc5-5efa-1bf64465f6e7 (at 10.8.20.34@o2ib6) in 227 seconds. I think it's dead, and I am evicting it. exp ffff892252a14c00, cur 1576035757 expire 1576035607 last 1576035530 [Tue Dec 10 19:42:41 2019][134738.545470] Lustre: fir-OST005c: haven't heard from client aadbd140-afe6-3cc5-5efa-1bf64465f6e7 (at 10.8.20.34@o2ib6) in 227 seconds. I think it's dead, and I am evicting it. 
exp ffff8922f23b4000, cur 1576035761 expire 1576035611 last 1576035534 [Tue Dec 10 19:42:41 2019][134738.567278] Lustre: Skipped 3 previous similar messages [Tue Dec 10 20:17:49 2019][136846.637044] Lustre: fir-OST0054: Connection restored to (at 10.8.20.34@o2ib6) [Tue Dec 10 20:17:49 2019][136846.644360] Lustre: Skipped 53 previous similar messages [Tue Dec 10 21:36:51 2019][141589.169921] Lustre: fir-OST0054: Connection restored to 55ff50e7-08a4-be07-5499-ccc18f03f2c9 (at 10.8.23.17@o2ib6) [Tue Dec 10 21:36:51 2019][141589.180357] Lustre: Skipped 5 previous similar messages [Tue Dec 10 22:43:25 2019][145583.045993] LustreError: 68031:0:(tgt_grant.c:758:tgt_grant_check()) fir-OST0058: cli c9911b4c-e55e-f4aa-416a-b652019239f7 claims 28672 GRANT, real grant 0 [Tue Dec 10 23:32:58 2019][148556.094351] Lustre: fir-OST0054: Connection restored to 77f07ca8-e3bd-72f6-4ac1-3da8889522b3 (at 10.8.22.19@o2ib6) [Tue Dec 10 23:32:58 2019][148556.094352] Lustre: fir-OST0056: Connection restored to 77f07ca8-e3bd-72f6-4ac1-3da8889522b3 (at 10.8.22.19@o2ib6) [Tue Dec 10 23:32:58 2019][148556.115220] Lustre: Skipped 4 previous similar messages [Tue Dec 10 23:41:20 2019][149058.222678] Lustre: fir-OST0054: Connection restored to 10918197-1d43-5fa6-1aea-d8f3cfbab80a (at 10.8.20.5@o2ib6) [Tue Dec 10 23:41:20 2019][149058.233028] Lustre: Skipped 5 previous similar messages [Wed Dec 11 01:11:55 2019][154493.133417] Lustre: 67842:0:(client.c:2133:ptlrpc_expire_one_request()) @@@ Request sent has timed out for slow reply: [sent 1576055508/real 1576055508] req@ffff88ec68d40480 x1652452551974832/t0(0) o106->fir-OST0056@10.8.22.1@o2ib6:15/16 lens 296/280 e 0 to 1 dl 1576055515 ref 1 fl Rpc:X/0/ffffffff rc 0/-1 [Wed Dec 11 01:12:02 2019][154500.160563] Lustre: 67842:0:(client.c:2133:ptlrpc_expire_one_request()) @@@ Request sent has timed out for slow reply: [sent 1576055515/real 1576055515] req@ffff88ec68d40480 x1652452551974832/t0(0) o106->fir-OST0056@10.8.22.1@o2ib6:15/16 lens 296/280 e 0 to 1 dl 1576055522 ref 1 fl Rpc:X/2/ffffffff rc 0/-1 [Wed Dec 11 01:12:09 2019][154507.187710] Lustre: 67842:0:(client.c:2133:ptlrpc_expire_one_request()) @@@ Request sent has timed out for slow reply: [sent 1576055522/real 1576055522] req@ffff88ec68d40480 x1652452551974832/t0(0) o106->fir-OST0056@10.8.22.1@o2ib6:15/16 lens 296/280 e 0 to 1 dl 1576055529 ref 1 fl Rpc:X/2/ffffffff rc 0/-1 [Wed Dec 11 01:12:16 2019][154514.214855] Lustre: 67842:0:(client.c:2133:ptlrpc_expire_one_request()) @@@ Request sent has timed out for slow reply: [sent 1576055529/real 1576055529] req@ffff88ec68d40480 x1652452551974832/t0(0) o106->fir-OST0056@10.8.22.1@o2ib6:15/16 lens 296/280 e 0 to 1 dl 1576055536 ref 1 fl Rpc:X/2/ffffffff rc 0/-1 [Wed Dec 11 01:12:23 2019][154521.241994] Lustre: 67842:0:(client.c:2133:ptlrpc_expire_one_request()) @@@ Request sent has timed out for slow reply: [sent 1576055536/real 1576055536] req@ffff88ec68d40480 x1652452551974832/t0(0) o106->fir-OST0056@10.8.22.1@o2ib6:15/16 lens 296/280 e 0 to 1 dl 1576055543 ref 1 fl Rpc:X/2/ffffffff rc 0/-1 [Wed Dec 11 01:12:37 2019][154535.269278] Lustre: 67842:0:(client.c:2133:ptlrpc_expire_one_request()) @@@ Request sent has timed out for slow reply: [sent 1576055550/real 1576055550] req@ffff88ec68d40480 x1652452551974832/t0(0) o106->fir-OST0056@10.8.22.1@o2ib6:15/16 lens 296/280 e 0 to 1 dl 1576055557 ref 1 fl Rpc:X/2/ffffffff rc 0/-1 [Wed Dec 11 01:12:37 2019][154535.296553] Lustre: 67842:0:(client.c:2133:ptlrpc_expire_one_request()) Skipped 1 previous similar message [Wed Dec 11 
01:12:58 2019][154556.307704] Lustre: 67842:0:(client.c:2133:ptlrpc_expire_one_request()) @@@ Request sent has timed out for slow reply: [sent 1576055571/real 1576055571] req@ffff88ec68d40480 x1652452551974832/t0(0) o106->fir-OST0056@10.8.22.1@o2ib6:15/16 lens 296/280 e 0 to 1 dl 1576055578 ref 1 fl Rpc:X/2/ffffffff rc 0/-1 [Wed Dec 11 01:12:58 2019][154556.334976] Lustre: 67842:0:(client.c:2133:ptlrpc_expire_one_request()) Skipped 2 previous similar messages [Wed Dec 11 01:13:33 2019][154591.345414] Lustre: 67842:0:(client.c:2133:ptlrpc_expire_one_request()) @@@ Request sent has timed out for slow reply: [sent 1576055606/real 1576055606] req@ffff88ec68d40480 x1652452551974832/t0(0) o106->fir-OST0056@10.8.22.1@o2ib6:15/16 lens 296/280 e 0 to 1 dl 1576055613 ref 1 fl Rpc:X/2/ffffffff rc 0/-1 [Wed Dec 11 01:13:33 2019][154591.372668] Lustre: 67842:0:(client.c:2133:ptlrpc_expire_one_request()) Skipped 4 previous similar messages [Wed Dec 11 01:14:43 2019][154661.384828] Lustre: 67842:0:(client.c:2133:ptlrpc_expire_one_request()) @@@ Request sent has timed out for slow reply: [sent 1576055676/real 1576055676] req@ffff88ec68d40480 x1652452551974832/t0(0) o106->fir-OST0056@10.8.22.1@o2ib6:15/16 lens 296/280 e 0 to 1 dl 1576055683 ref 1 fl Rpc:X/2/ffffffff rc 0/-1 [Wed Dec 11 01:14:43 2019][154661.412114] Lustre: 67842:0:(client.c:2133:ptlrpc_expire_one_request()) Skipped 9 previous similar messages [Wed Dec 11 01:15:08 2019][154686.606341] LNet: Service thread pid 67842 was inactive for 200.46s. The thread might be hung, or it might only be slow and will resume later. Dumping the stack trace for debugging purposes: [Wed Dec 11 01:15:08 2019][154686.623374] Pid: 67842, comm: ll_ost00_056 3.10.0-957.27.2.el7_lustre.pl2.x86_64 #1 SMP Thu Nov 7 15:26:16 PST 2019 [Wed Dec 11 01:15:08 2019][154686.633949] Call Trace: [Wed Dec 11 01:15:08 2019][154686.636506] [] ptlrpc_set_wait+0x480/0x790 [ptlrpc] [Wed Dec 11 01:15:08 2019][154686.643193] [] ldlm_run_ast_work+0xd5/0x3a0 [ptlrpc] [Wed Dec 11 01:15:08 2019][154686.649970] [] ldlm_glimpse_locks+0x3b/0x100 [ptlrpc] [Wed Dec 11 01:15:08 2019][154686.656875] [] ofd_intent_policy+0x69b/0x920 [ofd] [Wed Dec 11 01:15:08 2019][154686.663456] [] ldlm_lock_enqueue+0x356/0xa20 [ptlrpc] [Wed Dec 11 01:15:08 2019][154686.670309] [] ldlm_handle_enqueue0+0xa56/0x15f0 [ptlrpc] [Wed Dec 11 01:15:08 2019][154686.677505] [] tgt_enqueue+0x62/0x210 [ptlrpc] [Wed Dec 11 01:15:08 2019][154686.683824] [] tgt_request_handle+0xaea/0x1580 [ptlrpc] [Wed Dec 11 01:15:08 2019][154686.690877] [] ptlrpc_server_handle_request+0x24b/0xab0 [ptlrpc] [Wed Dec 11 01:15:08 2019][154686.698716] [] ptlrpc_main+0xb2c/0x1460 [ptlrpc] [Wed Dec 11 01:15:08 2019][154686.705172] [] kthread+0xd1/0xe0 [Wed Dec 11 01:15:08 2019][154686.710175] [] ret_from_fork_nospec_begin+0xe/0x21 [Wed Dec 11 01:15:08 2019][154686.716742] [] 0xffffffffffffffff [Wed Dec 11 01:15:08 2019][154686.721919] LustreError: dumping log to /tmp/lustre-log.1576055708.67842 [Wed Dec 11 01:15:15 2019][154692.916159] Lustre: fir-OST005c: haven't heard from client 09a03217-f2a1-2632-097f-38339f6cbc7c (at 10.8.22.1@o2ib6) in 227 seconds. I think it's dead, and I am evicting it. 
exp ffff892260120000, cur 1576055715 expire 1576055565 last 1576055488 [Wed Dec 11 01:15:15 2019][154692.937983] LustreError: 67842:0:(ldlm_lockd.c:681:ldlm_handle_ast_error()) ### client (nid 10.8.22.1@o2ib6) failed to reply to glimpse AST (req@ffff88ec68d40480 x1652452551974832 status 0 rc -5), evict it ns: filter-fir-OST0056_UUID lock: ffff88fdf7a76c00/0x7066c9c18d907795 lrc: 3/0,0 mode: PW/PW res: [0x1880000402:0x2e6270:0x0].0x0 rrc: 3 type: EXT [0->18446744073709551615] (req 67108864->68719476735) flags: 0x40000000000000 nid: 10.8.22.1@o2ib6 remote: 0x891f0e0311d6012b expref: 6 pid: 66438 timeout: 0 lvb_type: 0 [Wed Dec 11 01:15:15 2019][154692.983786] LustreError: 138-a: fir-OST0056: A client on nid 10.8.22.1@o2ib6 was evicted due to a lock glimpse callback time out: rc -5 [Wed Dec 11 01:15:15 2019][154692.996079] LustreError: Skipped 1 previous similar message [Wed Dec 11 01:15:15 2019][154693.001792] LNet: Service thread pid 67842 completed after 206.86s. This indicates the system was overloaded (too many service threads, or there were not enough hardware resources). [Wed Dec 11 01:15:17 2019][154695.053939] Lustre: fir-OST005e: haven't heard from client 09a03217-f2a1-2632-097f-38339f6cbc7c (at 10.8.22.1@o2ib6) in 227 seconds. I think it's dead, and I am evicting it. exp ffff892250855400, cur 1576055717 expire 1576055567 last 1576055490 [Wed Dec 11 01:15:17 2019][154695.075659] Lustre: Skipped 4 previous similar messages [Wed Dec 11 01:16:06 2019][154744.471386] Lustre: fir-OST0054: Connection restored to 37c7e464-6686-fdc0-1c81-eae75026a910 (at 10.8.22.2@o2ib6) [Wed Dec 11 01:16:06 2019][154744.481736] Lustre: Skipped 5 previous similar messages [Wed Dec 11 01:16:39 2019][154776.988984] LustreError: 67983:0:(tgt_grant.c:758:tgt_grant_check()) fir-OST0058: cli 75ca7fbe-4dbb-5345-e1bf-3a337b10784c claims 28672 GRANT, real grant 0 [Wed Dec 11 01:17:07 2019][154805.181795] LustreError: 67824:0:(tgt_grant.c:758:tgt_grant_check()) fir-OST0058: cli 1ca33a17-2a16-9d12-d021-e37db0ce1d5c claims 28672 GRANT, real grant 0 [Wed Dec 11 01:47:41 2019][156639.530128] Lustre: fir-OST0054: Connection restored to 1b1ace85-4b01-f903-bb83-ddb9142a20b0 (at 10.8.23.25@o2ib6) [Wed Dec 11 01:47:41 2019][156639.540596] Lustre: Skipped 5 previous similar messages [Wed Dec 11 01:50:36 2019][156814.118108] Lustre: fir-OST0054: Connection restored to (at 10.8.22.1@o2ib6) [Wed Dec 11 01:50:36 2019][156814.125342] Lustre: Skipped 5 previous similar messages [Wed Dec 11 02:10:21 2019][157998.975312] Lustre: fir-OST0056: haven't heard from client d48dfcab-ce8f-b93c-3409-a3e76df7c945 (at 10.8.23.22@o2ib6) in 227 seconds. I think it's dead, and I am evicting it. exp ffff892251bfa000, cur 1576059021 expire 1576058871 last 1576058794 [Wed Dec 11 02:10:25 2019][158003.001566] Lustre: fir-OST005c: haven't heard from client d48dfcab-ce8f-b93c-3409-a3e76df7c945 (at 10.8.23.22@o2ib6) in 227 seconds. I think it's dead, and I am evicting it. 
exp ffff892276b4d000, cur 1576059025 expire 1576058875 last 1576058798 [Wed Dec 11 02:10:25 2019][158003.023385] Lustre: Skipped 1 previous similar message [Wed Dec 11 02:46:47 2019][160185.543119] Lustre: fir-OST0054: Connection restored to d48dfcab-ce8f-b93c-3409-a3e76df7c945 (at 10.8.23.22@o2ib6) [Wed Dec 11 02:46:47 2019][160185.553555] Lustre: Skipped 5 previous similar messages [Wed Dec 11 07:09:33 2019][175952.204662] Lustre: fir-OST0054: Connection restored to 54375174-855e-4eb5-233f-bff7110a15a5 (at 10.8.22.7@o2ib6) [Wed Dec 11 07:09:33 2019][175952.215018] Lustre: Skipped 5 previous similar messages [Wed Dec 11 07:23:36 2019]SOL session closed by BMC [Wed Dec 11 07:23:36 2019]Error in SOL session [-- Console down -- Wed Dec 11 07:23:36 2019] [-- Console up -- Wed Dec 11 07:23:37 2019] [Wed Dec 11 07:23:37 2019]Acquiring startup lock...done [Wed Dec 11 07:23:37 2019]Info: SOL payload already de-activated [Wed Dec 11 07:23:37 2019][SOL Session operational. Use ~? for help] [Wed Dec 11 08:29:29 2019][180747.446024] Lustre: fir-OST0054: haven't heard from client 5a6b489d-8a0c-1dc7-c222-8c5330c92213 (at 10.8.8.20@o2ib6) in 227 seconds. I think it's dead, and I am evicting it. exp ffff892252f4d800, cur 1576081769 expire 1576081619 last 1576081542 [Wed Dec 11 08:29:29 2019][180747.467764] Lustre: Skipped 3 previous similar messages [Wed Dec 11 08:32:30 2019][180928.431481] Lustre: fir-OST0054: haven't heard from client dcb788f4-67f3-4 (at 10.9.109.25@o2ib4) in 227 seconds. I think it's dead, and I am evicting it. exp ffff88f00a8e3c00, cur 1576081950 expire 1576081800 last 1576081723 [Wed Dec 11 08:32:30 2019][180928.451542] Lustre: Skipped 53 previous similar messages [Wed Dec 11 08:32:38 2019][180936.847782] Lustre: fir-OST0054: Connection restored to (at 10.9.107.20@o2ib4) [Wed Dec 11 08:32:38 2019][180936.855189] Lustre: Skipped 5 previous similar messages [Wed Dec 11 08:36:45 2019][181184.073424] Lustre: fir-OST0054: Connection restored to 2295c161-47a8-c199-f8c4-4e53eff1b957 (at 10.9.110.71@o2ib4) [Wed Dec 11 08:36:45 2019][181184.083951] Lustre: Skipped 5 previous similar messages [Wed Dec 11 08:37:00 2019][181199.055734] Lustre: fir-OST0056: Connection restored to e8872901-9e69-2d9a-e57a-55077a64186b (at 10.9.109.25@o2ib4) [Wed Dec 11 08:37:00 2019][181199.066249] Lustre: Skipped 5 previous similar messages [Wed Dec 11 08:45:05 2019][181684.308691] LustreError: 67945:0:(tgt_grant.c:758:tgt_grant_check()) fir-OST005a: cli da9f6e55-12b4-4 claims 1605632 GRANT, real grant 36864 [Wed Dec 11 08:53:23 2019][182181.803276] Lustre: fir-OST0054: Connection restored to 907ff646-c0ba-4 (at 10.9.117.46@o2ib4) [Wed Dec 11 08:53:23 2019][182181.811976] Lustre: Skipped 5 previous similar messages [Wed Dec 11 08:53:56 2019][182214.739773] Lustre: fir-OST0054: Connection restored to (at 10.8.9.1@o2ib6) [Wed Dec 11 08:53:56 2019][182214.746916] Lustre: Skipped 5 previous similar messages [Wed Dec 11 08:57:30 2019][182428.920092] Lustre: fir-OST0054: Connection restored to 8a77a7b3-28b8-5200-390a-7fe51bf1be0a (at 10.8.7.5@o2ib6) [Wed Dec 11 08:57:30 2019][182428.930351] Lustre: Skipped 5 previous similar messages [Wed Dec 11 08:59:03 2019][182522.314512] Lustre: fir-OST0054: Connection restored to 0df17536-86d5-4 (at 10.9.101.60@o2ib4) [Wed Dec 11 08:59:03 2019][182522.323216] Lustre: Skipped 5 previous similar messages [Wed Dec 11 08:59:20 2019][182538.970365] Lustre: fir-OST0054: Connection restored to 54fd6f2e-cb6c-4 (at 10.9.101.57@o2ib4) [Wed Dec 11 08:59:20 2019][182538.979070] Lustre: 
Skipped 5 previous similar messages [Wed Dec 11 08:59:30 2019][182549.155961] Lustre: fir-OST0054: Connection restored to (at 10.9.101.59@o2ib4) [Wed Dec 11 08:59:30 2019][182549.163362] Lustre: Skipped 5 previous similar messages [Wed Dec 11 09:00:59 2019][182638.397737] Lustre: fir-OST0054: Connection restored to 5a6b489d-8a0c-1dc7-c222-8c5330c92213 (at 10.8.8.20@o2ib6) [Wed Dec 11 09:00:59 2019][182638.408090] Lustre: Skipped 5 previous similar messages [Wed Dec 11 09:04:24 2019][182842.722377] Lustre: fir-OST0054: Connection restored to fc841094-f1fd-2756-1968-f74105b220e6 (at 10.8.8.30@o2ib6) [Wed Dec 11 09:04:24 2019][182842.732727] Lustre: Skipped 5 previous similar messages [Wed Dec 11 09:08:35 2019][183093.965890] Lustre: fir-OST0054: Connection restored to 8393b8d6-d8ea-1574-4a69-552de6648def (at 10.9.102.48@o2ib4) [Wed Dec 11 09:08:35 2019][183093.976416] Lustre: Skipped 16 previous similar messages [Wed Dec 11 09:16:13 2019][183552.110674] Lustre: fir-OST0054: Connection restored to 6676e5f3-c59e-c628-05b4-c9153b23c3f7 (at 10.8.21.16@o2ib6) [Wed Dec 11 09:16:13 2019][183552.121113] Lustre: Skipped 11 previous similar messages [Wed Dec 11 10:23:45 2019][187603.803944] LustreError: 67812:0:(tgt_grant.c:758:tgt_grant_check()) fir-OST0054: cli 57b26761-b79f-628f-0ec2-0a10fd7ac3bd claims 212992 GRANT, real grant 28672 [Wed Dec 11 10:30:05 2019][187984.504138] LustreError: 67963:0:(tgt_grant.c:758:tgt_grant_check()) fir-OST0054: cli da9f6e55-12b4-4 claims 1605632 GRANT, real grant 1597440 [Wed Dec 11 10:38:39 2019][188497.658275] LustreError: 67893:0:(tgt_grant.c:758:tgt_grant_check()) fir-OST005e: cli 837641b0-d89a-c20b-3139-4eb8fe8d733b claims 110592 GRANT, real grant 0 [Wed Dec 11 11:29:00 2019][191519.399673] Lustre: fir-OST0054: Connection restored to 5ce2e68e-76b2-bbc3-75c5-66a5c2b02651 (at 10.8.23.15@o2ib6) [Wed Dec 11 11:29:00 2019][191519.410118] Lustre: Skipped 10 previous similar messages [Wed Dec 11 11:52:30 2019][192928.681711] Lustre: fir-OST005e: haven't heard from client 45ffa07c-203c-dad9-8f0d-e714fc6465b8 (at 10.8.22.11@o2ib6) in 227 seconds. I think it's dead, and I am evicting it. exp ffff892250928400, cur 1576093950 expire 1576093800 last 1576093723 [Wed Dec 11 11:52:30 2019][192928.703538] Lustre: Skipped 11 previous similar messages [Wed Dec 11 12:20:27 2019][194605.727054] Lustre: fir-OST005e: haven't heard from client 704e8622-7442-8eb3-b4e3-c86a69ef45af (at 10.8.20.21@o2ib6) in 227 seconds. I think it's dead, and I am evicting it. exp ffff892250aacc00, cur 1576095627 expire 1576095477 last 1576095400 [Wed Dec 11 12:20:27 2019][194605.748848] Lustre: Skipped 5 previous similar messages [Wed Dec 11 12:20:46 2019][194624.706806] Lustre: fir-OST0054: haven't heard from client 704e8622-7442-8eb3-b4e3-c86a69ef45af (at 10.8.20.21@o2ib6) in 227 seconds. I think it's dead, and I am evicting it. 
exp ffff892253eacc00, cur 1576095646 expire 1576095496 last 1576095419 [Wed Dec 11 12:20:46 2019][194624.728603] Lustre: Skipped 4 previous similar messages [Wed Dec 11 12:26:57 2019][194995.860637] Lustre: fir-OST0054: Connection restored to (at 10.8.22.11@o2ib6) [Wed Dec 11 12:26:57 2019][194995.867949] Lustre: Skipped 5 previous similar messages [Wed Dec 11 12:27:08 2019][195007.252896] Lustre: fir-OST0054: Connection restored to 4f86dcb5-8d8c-1599-bd44-005eb718eb65 (at 10.8.22.10@o2ib6) [Wed Dec 11 12:27:08 2019][195007.263331] Lustre: Skipped 5 previous similar messages [Wed Dec 11 12:28:04 2019][195063.004671] Lustre: fir-OST0054: Connection restored to a8841932-bc4a-ab11-1ace-8e1fdda46930 (at 10.8.23.23@o2ib6) [Wed Dec 11 12:28:04 2019][195063.015129] Lustre: Skipped 5 previous similar messages [Wed Dec 11 12:37:46 2019][195644.721852] Lustre: fir-OST0058: haven't heard from client c3415e6e-dda3-8602-28df-a932f656881d (at 10.9.112.17@o2ib4) in 227 seconds. I think it's dead, and I am evicting it. exp ffff89225390b000, cur 1576096666 expire 1576096516 last 1576096439 [Wed Dec 11 12:55:45 2019][196724.523607] Lustre: fir-OST0054: Connection restored to 704e8622-7442-8eb3-b4e3-c86a69ef45af (at 10.8.20.21@o2ib6) [Wed Dec 11 12:55:45 2019][196724.534047] Lustre: Skipped 5 previous similar messages [Wed Dec 11 12:56:17 2019][196756.062990] Lustre: fir-OST0054: Connection restored to (at 10.9.112.17@o2ib4) [Wed Dec 11 12:56:17 2019][196756.070397] Lustre: Skipped 5 previous similar messages [Wed Dec 11 13:02:17 2019][197116.468514] Lustre: fir-OST0054: Connection restored to (at 10.8.9.1@o2ib6) [Wed Dec 11 13:02:17 2019][197116.468515] Lustre: fir-OST0056: Connection restored to (at 10.8.9.1@o2ib6) [Wed Dec 11 13:02:17 2019][197116.482806] Lustre: Skipped 3 previous similar messages [Wed Dec 11 13:02:32 2019][197131.014869] Lustre: fir-OST0054: Connection restored to bdb2a993-354c-ddce-bf9d-5960b01c7975 (at 10.8.23.13@o2ib6) [Wed Dec 11 13:02:32 2019][197131.025313] Lustre: Skipped 5 previous similar messages [Wed Dec 11 13:04:06 2019][197225.346656] Lustre: fir-OST0054: Connection restored to 37c7e464-6686-fdc0-1c81-eae75026a910 (at 10.8.22.2@o2ib6) [Wed Dec 11 13:04:06 2019][197225.357004] Lustre: Skipped 5 previous similar messages [Wed Dec 11 13:05:44 2019][197323.297435] Lustre: fir-OST0054: Connection restored to (at 10.9.113.13@o2ib4) [Wed Dec 11 13:05:44 2019][197323.304838] Lustre: Skipped 5 previous similar messages [Wed Dec 11 13:07:05 2019][197404.416039] Lustre: fir-OST0054: Connection restored to 0df17536-86d5-4 (at 10.9.101.60@o2ib4) [Wed Dec 11 13:07:05 2019][197404.424752] Lustre: Skipped 5 previous similar messages [Wed Dec 11 13:12:37 2019][197736.574442] Lustre: fir-OST0054: Connection restored to 48f67746-6174-d4eb-bf6b-7295eeca30af (at 10.8.24.7@o2ib6) [Wed Dec 11 13:12:37 2019][197736.574443] Lustre: fir-OST0056: Connection restored to 48f67746-6174-d4eb-bf6b-7295eeca30af (at 10.8.24.7@o2ib6) [Wed Dec 11 13:12:37 2019][197736.574446] Lustre: Skipped 6 previous similar messages [Wed Dec 11 13:12:37 2019][197736.600487] Lustre: Skipped 4 previous similar messages [Wed Dec 11 13:37:40 2019][199238.819613] Lustre: fir-OST0058: haven't heard from client 000d6715-906a-fe00-99d9-1ba39760e7f7 (at 10.8.22.16@o2ib6) in 227 seconds. I think it's dead, and I am evicting it. 
exp ffff892253cbbc00, cur 1576100260 expire 1576100110 last 1576100033 [Wed Dec 11 13:37:40 2019][199238.841428] Lustre: Skipped 5 previous similar messages [Wed Dec 11 13:45:53 2019][199731.802323] Lustre: fir-OST0058: haven't heard from client 85fbdf3d-35db-072c-03b7-e9977baaa2bf (at 10.8.23.12@o2ib6) in 227 seconds. I think it's dead, and I am evicting it. exp ffff892261a48c00, cur 1576100753 expire 1576100603 last 1576100526 [Wed Dec 11 13:45:53 2019][199731.824114] Lustre: Skipped 11 previous similar messages [Wed Dec 11 13:49:24 2019][199942.960635] Lustre: fir-OST0054: Connection restored to (at 10.8.23.12@o2ib6) [Wed Dec 11 13:49:24 2019][199942.967956] Lustre: Skipped 5 previous similar messages [Wed Dec 11 14:12:19 2019][201318.492614] Lustre: fir-OST0054: Connection restored to cea3a46a-6e64-ecd2-2636-1b7611592cd3 (at 10.8.23.8@o2ib6) [Wed Dec 11 14:12:19 2019][201318.502968] Lustre: Skipped 5 previous similar messages [Wed Dec 11 14:12:27 2019][201326.401557] Lustre: fir-OST0054: Connection restored to 60e7dd38-7049-6086-949c-b7f68f3f00ca (at 10.8.23.18@o2ib6) [Wed Dec 11 14:12:27 2019][201326.411991] Lustre: Skipped 5 previous similar messages [Wed Dec 11 14:12:38 2019][201337.051897] Lustre: fir-OST0054: Connection restored to 5bbbecd7-709a-9f29-693e-a19d73c8cefb (at 10.8.22.18@o2ib6) [Wed Dec 11 14:12:38 2019][201337.062358] Lustre: Skipped 5 previous similar messages [Wed Dec 11 14:12:55 2019][201354.235845] Lustre: fir-OST0054: Connection restored to 94396c8b-eccd-7da2-de85-f79420b2e641 (at 10.8.23.33@o2ib6) [Wed Dec 11 14:12:55 2019][201354.246291] Lustre: Skipped 11 previous similar messages [Wed Dec 11 14:30:44 2019][202422.854023] Lustre: fir-OST0054: haven't heard from client 8c2fd243-a078-4 (at 10.9.117.46@o2ib4) in 227 seconds. I think it's dead, and I am evicting it. 
exp ffff8902f96e1000, cur 1576103444 expire 1576103294 last 1576103217 [Wed Dec 11 14:30:44 2019][202422.874103] Lustre: Skipped 5 previous similar messages [Wed Dec 11 14:33:19 2019][202578.475456] Lustre: fir-OST0054: Connection restored to 907ff646-c0ba-4 (at 10.9.117.46@o2ib4) [Wed Dec 11 14:33:19 2019][202578.484188] Lustre: Skipped 17 previous similar messages [Wed Dec 11 14:37:18 2019][202817.834063] Lustre: fir-OST0054: Connection restored to fb63a42c-93f0-576d-f57c-a83fc4375277 (at 10.8.21.2@o2ib6) [Wed Dec 11 14:37:18 2019][202817.844415] Lustre: Skipped 2 previous similar messages [Wed Dec 11 14:37:30 2019][202829.180467] Lustre: fir-OST0056: Connection restored to (at 10.8.22.32@o2ib6) [Wed Dec 11 14:37:30 2019][202829.187793] Lustre: Skipped 5 previous similar messages [Wed Dec 11 15:05:15 2019][204494.660619] Lustre: fir-OST0054: Connection restored to 98b70d1a-7357-ff1b-1e1d-8bd68b6592c2 (at 10.8.23.27@o2ib6) [Wed Dec 11 15:05:15 2019][204494.671065] Lustre: Skipped 5 previous similar messages [Wed Dec 11 15:09:25 2019][204744.214693] Lustre: fir-OST0054: Connection restored to 84c69ebc-7dc0-678f-942c-60a0d29de5a5 (at 10.8.22.27@o2ib6) [Wed Dec 11 15:09:25 2019][204744.225127] Lustre: Skipped 5 previous similar messages [Wed Dec 11 15:18:50 2019][205309.175062] LustreError: 67988:0:(tgt_grant.c:758:tgt_grant_check()) fir-OST005c: cli 8442b5f1-7da8-4 claims 28672 GRANT, real grant 0 [Wed Dec 11 18:58:17 2019][218476.265970] Lustre: fir-OST0054: Connection restored to 0aa269ad-def9-3be3-d596-fd7c0af955fb (at 10.8.20.26@o2ib6) [Wed Dec 11 18:58:17 2019][218476.276413] Lustre: Skipped 5 previous similar messages [Wed Dec 11 19:34:35 2019][220654.286010] Lustre: fir-OST0054: Connection restored to 207217ac-1163-df36-3120-8bf6c3ecbb93 (at 10.8.23.21@o2ib6) [Wed Dec 11 19:34:35 2019][220654.296456] Lustre: Skipped 5 previous similar messages [Wed Dec 11 19:47:16 2019][221415.472120] LustreError: 67718:0:(tgt_grant.c:758:tgt_grant_check()) fir-OST005c: cli 35ba350a-bccc-3fd9-39f0-a94eca80785d claims 16752640 GRANT, real grant 0 [Wed Dec 11 21:40:15 2019][228194.663383] Lustre: fir-OST0054: Connection restored to e15078c5-8209-4 (at 10.8.25.17@o2ib6) [Wed Dec 11 21:40:15 2019][228194.671998] Lustre: Skipped 5 previous similar messages [Wed Dec 11 21:41:08 2019][228247.375046] Lustre: fir-OST0054: haven't heard from client e15078c5-8209-4 (at 10.8.25.17@o2ib6) in 227 seconds. I think it's dead, and I am evicting it. exp ffff892253ace800, cur 1576129268 expire 1576129118 last 1576129041 [Wed Dec 11 21:41:08 2019][228247.395022] Lustre: Skipped 5 previous similar messages [Wed Dec 11 21:47:23 2019][228622.376907] Lustre: fir-OST0054: haven't heard from client 208ccf09-d6ca-4 (at 10.8.25.17@o2ib6) in 227 seconds. I think it's dead, and I am evicting it. exp ffff88f4f5725c00, cur 1576129643 expire 1576129493 last 1576129416 [Wed Dec 11 21:47:23 2019][228622.396920] Lustre: Skipped 5 previous similar messages [Wed Dec 11 22:06:24 2019][229763.991868] Lustre: fir-OST0054: Connection restored to e15078c5-8209-4 (at 10.8.25.17@o2ib6) [Wed Dec 11 22:06:24 2019][229764.000503] Lustre: Skipped 5 previous similar messages [Wed Dec 11 22:14:23 2019][230242.409048] Lustre: fir-OST0056: haven't heard from client 0cfc0c49-f407-4 (at 10.8.25.17@o2ib6) in 227 seconds. I think it's dead, and I am evicting it. 
exp ffff890858777400, cur 1576131263 expire 1576131113 last 1576131036 [Wed Dec 11 22:14:23 2019][230242.429025] Lustre: Skipped 5 previous similar messages [Wed Dec 11 22:54:40 2019][232660.307500] Lustre: fir-OST0054: Connection restored to f8d5264b-1de7-5abd-fef8-60297df9f169 (at 10.8.22.20@o2ib6) [Wed Dec 11 22:54:40 2019][232660.317943] Lustre: Skipped 5 previous similar messages [Wed Dec 11 22:54:46 2019][232666.429524] Lustre: fir-OST0054: Connection restored to bd358c1a-07c6-3f9f-7c84-efdb04e29ef9 (at 10.8.21.1@o2ib6) [Wed Dec 11 22:54:46 2019][232666.439875] Lustre: Skipped 5 previous similar messages [Wed Dec 11 22:55:51 2019][232730.711982] Lustre: fir-OST0054: Connection restored to 26627d4d-9b72-83d5-02a3-73c7f9501a91 (at 10.8.22.26@o2ib6) [Wed Dec 11 22:55:51 2019][232730.722419] Lustre: Skipped 5 previous similar messages [Wed Dec 11 23:57:33 2019][236433.088011] LustreError: 67718:0:(tgt_grant.c:758:tgt_grant_check()) fir-OST005c: cli 8442b5f1-7da8-4 claims 3637248 GRANT, real grant 28672 [Thu Dec 12 00:00:44 2019][236623.856599] LustreError: 67971:0:(tgt_grant.c:758:tgt_grant_check()) fir-OST0056: cli cc645112-3584-d084-5d6b-c64af0bf19ce claims 299008 GRANT, real grant 0 [Thu Dec 12 00:10:47 2019][237227.255864] LNet: Service thread pid 67997 was inactive for 200.48s. The thread might be hung, or it might only be slow and will resume later. Dumping the stack trace for debugging purposes: [Thu Dec 12 00:10:47 2019][237227.272892] Pid: 67997, comm: ll_ost_io00_031 3.10.0-957.27.2.el7_lustre.pl2.x86_64 #1 SMP Thu Nov 7 15:26:16 PST 2019 [Thu Dec 12 00:10:47 2019][237227.283678] Call Trace: [Thu Dec 12 00:10:47 2019][237227.286234] [] bitmap_startwrite+0x1f5/0x210 [Thu Dec 12 00:10:47 2019][237227.292286] [] add_stripe_bio+0x451/0x7f0 [raid456] [Thu Dec 12 00:10:47 2019][237227.298953] [] raid5_make_request+0x1e4/0xca0 [raid456] [Thu Dec 12 00:10:47 2019][237227.305972] [] md_handle_request+0xd0/0x150 [Thu Dec 12 00:10:47 2019][237227.311936] [] md_make_request+0x79/0x190 [Thu Dec 12 00:10:47 2019][237227.317759] [] generic_make_request+0x147/0x380 [Thu Dec 12 00:10:47 2019][237227.324088] [] submit_bio+0x70/0x150 [Thu Dec 12 00:10:47 2019][237227.329487] [] osd_submit_bio+0x1c/0x60 [osd_ldiskfs] [Thu Dec 12 00:10:47 2019][237227.336326] [] osd_do_bio.isra.35+0x9b5/0xab0 [osd_ldiskfs] [Thu Dec 12 00:10:47 2019][237227.343723] [] osd_write_commit+0x3ec/0x8c0 [osd_ldiskfs] [Thu Dec 12 00:10:47 2019][237227.350903] [] ofd_commitrw_write+0xfbe/0x1d40 [ofd] [Thu Dec 12 00:10:47 2019][237227.357653] [] ofd_commitrw+0x48c/0x9e0 [ofd] [Thu Dec 12 00:10:47 2019][237227.363835] [] tgt_brw_write+0x10cb/0x1cf0 [ptlrpc] [Thu Dec 12 00:10:47 2019][237227.370543] [] tgt_request_handle+0xaea/0x1580 [ptlrpc] [Thu Dec 12 00:10:47 2019][237227.377579] [] ptlrpc_server_handle_request+0x24b/0xab0 [ptlrpc] [Thu Dec 12 00:10:47 2019][237227.385406] [] ptlrpc_main+0xb2c/0x1460 [ptlrpc] [Thu Dec 12 00:10:47 2019][237227.391836] [] kthread+0xd1/0xe0 [Thu Dec 12 00:10:47 2019][237227.396855] [] ret_from_fork_nospec_begin+0xe/0x21 [Thu Dec 12 00:10:47 2019][237227.403431] [] 0xffffffffffffffff [Thu Dec 12 00:10:47 2019][237227.408592] LustreError: dumping log to /tmp/lustre-log.1576138247.67997 [Thu Dec 12 00:10:47 2019][237227.417351] Pid: 67875, comm: ll_ost_io01_019 3.10.0-957.27.2.el7_lustre.pl2.x86_64 #1 SMP Thu Nov 7 15:26:16 PST 2019 [Thu Dec 12 00:10:47 2019][237227.428137] Call Trace: [Thu Dec 12 00:10:47 2019][237227.430693] [] osd_trans_stop+0x265/0x8e0 [osd_ldiskfs] [Thu Dec 12 00:10:47 
2019][237227.437702] [] ofd_trans_stop+0x25/0x60 [ofd] [Thu Dec 12 00:10:47 2019][237227.443864] [] ofd_commitrw_write+0x9d4/0x1d40 [ofd] [Thu Dec 12 00:10:47 2019][237227.450606] [] ofd_commitrw+0x48c/0x9e0 [ofd] [Thu Dec 12 00:10:47 2019][237227.456750] [] tgt_brw_write+0x10cb/0x1cf0 [ptlrpc] [Thu Dec 12 00:10:47 2019][237227.463458] [] tgt_request_handle+0xaea/0x1580 [ptlrpc] [Thu Dec 12 00:10:47 2019][237227.470523] [] ptlrpc_server_handle_request+0x24b/0xab0 [ptlrpc] [Thu Dec 12 00:10:47 2019][237227.478351] [] ptlrpc_main+0xb2c/0x1460 [ptlrpc] [Thu Dec 12 00:10:47 2019][237227.484804] [] kthread+0xd1/0xe0 [Thu Dec 12 00:10:47 2019][237227.489810] [] ret_from_fork_nospec_begin+0xe/0x21 [Thu Dec 12 00:10:47 2019][237227.496411] [] 0xffffffffffffffff [Thu Dec 12 00:10:47 2019][237227.501547] Pid: 66127, comm: ll_ost_io01_000 3.10.0-957.27.2.el7_lustre.pl2.x86_64 #1 SMP Thu Nov 7 15:26:16 PST 2019 [Thu Dec 12 00:10:47 2019][237227.512325] Call Trace: [Thu Dec 12 00:10:47 2019][237227.514878] [] bitmap_startwrite+0x1f5/0x210 [Thu Dec 12 00:10:47 2019][237227.520928] [] add_stripe_bio+0x451/0x7f0 [raid456] [Thu Dec 12 00:10:48 2019][237227.527599] [] raid5_make_request+0x1e4/0xca0 [raid456] [Thu Dec 12 00:10:48 2019][237227.534635] [] md_handle_request+0xd0/0x150 [Thu Dec 12 00:10:48 2019][237227.540635] [] md_make_request+0x79/0x190 [Thu Dec 12 00:10:48 2019][237227.546452] [] generic_make_request+0x147/0x380 [Thu Dec 12 00:10:48 2019][237227.552776] [] submit_bio+0x70/0x150 [Thu Dec 12 00:10:48 2019][237227.558158] [] osd_submit_bio+0x1c/0x60 [osd_ldiskfs] [Thu Dec 12 00:10:48 2019][237227.565041] [] osd_do_bio.isra.35+0x489/0xab0 [osd_ldiskfs] [Thu Dec 12 00:10:48 2019][237227.572395] [] osd_write_commit+0x3ec/0x8c0 [osd_ldiskfs] [Thu Dec 12 00:10:48 2019][237227.579591] [] ofd_commitrw_write+0xfbe/0x1d40 [ofd] [Thu Dec 12 00:10:48 2019][237227.586343] [] ofd_commitrw+0x48c/0x9e0 [ofd] [Thu Dec 12 00:10:48 2019][237227.592521] [] tgt_brw_write+0x10cb/0x1cf0 [ptlrpc] [Thu Dec 12 00:10:48 2019][237227.599219] [] tgt_request_handle+0xaea/0x1580 [ptlrpc] [Thu Dec 12 00:10:48 2019][237227.606309] [] ptlrpc_server_handle_request+0x24b/0xab0 [ptlrpc] [Thu Dec 12 00:10:48 2019][237227.614163] [] ptlrpc_main+0xb2c/0x1460 [ptlrpc] [Thu Dec 12 00:10:48 2019][237227.620594] [] kthread+0xd1/0xe0 [Thu Dec 12 00:10:48 2019][237227.625610] [] ret_from_fork_nospec_begin+0xe/0x21 [Thu Dec 12 00:10:48 2019][237227.632189] [] 0xffffffffffffffff [Thu Dec 12 00:10:48 2019][237227.637348] Pid: 67903, comm: ll_ost_io02_025 3.10.0-957.27.2.el7_lustre.pl2.x86_64 #1 SMP Thu Nov 7 15:26:16 PST 2019 [Thu Dec 12 00:10:48 2019][237227.648139] Call Trace: [Thu Dec 12 00:10:48 2019][237227.650703] [] bitmap_startwrite+0x1f5/0x210 [Thu Dec 12 00:10:48 2019][237227.656754] [] add_stripe_bio+0x451/0x7f0 [raid456] [Thu Dec 12 00:10:48 2019][237227.663473] [] raid5_make_request+0x1e4/0xca0 [raid456] [Thu Dec 12 00:10:48 2019][237227.670548] [] md_handle_request+0xd0/0x150 [Thu Dec 12 00:10:48 2019][237227.676521] [] md_make_request+0x79/0x190 [Thu Dec 12 00:10:48 2019][237227.682317] [] generic_make_request+0x147/0x380 [Thu Dec 12 00:10:48 2019][237227.688659] [] submit_bio+0x70/0x150 [Thu Dec 12 00:10:48 2019][237227.694018] [] osd_submit_bio+0x1c/0x60 [osd_ldiskfs] [Thu Dec 12 00:10:48 2019][237227.700923] [] osd_do_bio.isra.35+0x9b5/0xab0 [osd_ldiskfs] [Thu Dec 12 00:10:48 2019][237227.708290] [] osd_write_commit+0x3ec/0x8c0 [osd_ldiskfs] [Thu Dec 12 00:10:48 2019][237227.715539] [] ofd_commitrw_write+0xfbe/0x1d40 [ofd] 
[Thu Dec 12 00:10:48 2019][237227.722288] [] ofd_commitrw+0x48c/0x9e0 [ofd] [Thu Dec 12 00:10:48 2019][237227.728448] [] tgt_brw_write+0x10cb/0x1cf0 [ptlrpc] [Thu Dec 12 00:10:48 2019][237227.735151] [] tgt_request_handle+0xaea/0x1580 [ptlrpc] [Thu Dec 12 00:10:48 2019][237227.742187] [] ptlrpc_server_handle_request+0x24b/0xab0 [ptlrpc] [Thu Dec 12 00:10:48 2019][237227.750006] [] ptlrpc_main+0xb2c/0x1460 [ptlrpc] [Thu Dec 12 00:10:48 2019][237227.756489] [] kthread+0xd1/0xe0 [Thu Dec 12 00:10:48 2019][237227.761507] [] ret_from_fork_nospec_begin+0xe/0x21 [Thu Dec 12 00:10:48 2019][237227.768122] [] 0xffffffffffffffff [Thu Dec 12 00:10:48 2019][237227.773261] LNet: Service thread pid 67874 was inactive for 201.01s. The thread might be hung, or it might only be slow and will resume later. Dumping the stack trace for debugging purposes: [Thu Dec 12 00:10:48 2019][237227.790313] LNet: Skipped 3 previous similar messages [Thu Dec 12 00:10:48 2019][237227.795467] Pid: 67874, comm: ll_ost_io02_019 3.10.0-957.27.2.el7_lustre.pl2.x86_64 #1 SMP Thu Nov 7 15:26:16 PST 2019 [Thu Dec 12 00:10:48 2019][237227.806299] Call Trace: [Thu Dec 12 00:10:48 2019][237227.808854] [] bitmap_startwrite+0x1f5/0x210 [Thu Dec 12 00:10:48 2019][237227.814913] [] add_stripe_bio+0x451/0x7f0 [raid456] [Thu Dec 12 00:10:48 2019][237227.821567] [] raid5_make_request+0x1e4/0xca0 [raid456] [Thu Dec 12 00:10:48 2019][237227.828571] [] md_handle_request+0xd0/0x150 [Thu Dec 12 00:10:48 2019][237227.834545] [] md_make_request+0x79/0x190 [Thu Dec 12 00:10:48 2019][237227.840363] [] generic_make_request+0x147/0x380 [Thu Dec 12 00:10:48 2019][237227.846684] [] submit_bio+0x70/0x150 [Thu Dec 12 00:10:48 2019][237227.852096] [] osd_submit_bio+0x1c/0x60 [osd_ldiskfs] [Thu Dec 12 00:10:48 2019][237227.858932] [] osd_do_bio.isra.35+0x9b5/0xab0 [osd_ldiskfs] [Thu Dec 12 00:10:48 2019][237227.866313] [] osd_write_commit+0x3ec/0x8c0 [osd_ldiskfs] [Thu Dec 12 00:10:48 2019][237227.873502] [] ofd_commitrw_write+0xfbe/0x1d40 [ofd] [Thu Dec 12 00:10:48 2019][237227.880275] [] ofd_commitrw+0x48c/0x9e0 [ofd] [Thu Dec 12 00:10:48 2019][237227.886422] [] tgt_brw_write+0x10cb/0x1cf0 [ptlrpc] [Thu Dec 12 00:10:48 2019][237227.893190] [] tgt_request_handle+0xaea/0x1580 [ptlrpc] [Thu Dec 12 00:10:48 2019][237227.900239] [] ptlrpc_server_handle_request+0x24b/0xab0 [ptlrpc] [Thu Dec 12 00:10:48 2019][237227.908063] [] ptlrpc_main+0xb2c/0x1460 [ptlrpc] [Thu Dec 12 00:10:48 2019][237227.914493] [] kthread+0xd1/0xe0 [Thu Dec 12 00:10:48 2019][237227.919539] [] ret_from_fork_nospec_begin+0xe/0x21 [Thu Dec 12 00:10:48 2019][237227.926108] [] 0xffffffffffffffff [Thu Dec 12 00:10:48 2019][237227.931236] LNet: Service thread pid 66126 was inactive for 201.17s. Watchdog stack traces are limited to 3 per 300 seconds, skipping this one. [Thu Dec 12 00:10:50 2019][237229.815918] LNet: Service thread pid 68032 was inactive for 200.37s. Watchdog stack traces are limited to 3 per 300 seconds, skipping this one. [Thu Dec 12 00:10:50 2019][237229.828881] LustreError: dumping log to /tmp/lustre-log.1576138250.68032 [Thu Dec 12 00:10:51 2019][237230.839941] LNet: Service thread pid 67718 was inactive for 200.50s. Watchdog stack traces are limited to 3 per 300 seconds, skipping this one. 
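
The LNet watchdog dumps above show ll_ost_io service threads inactive for roughly 200 s, with most call traces ending in bitmap_startwrite on the raid456 write path and one in osd_trans_stop. A minimal Python sketch for tallying where each dumped thread is blocked across a full capture follows; the console.log file name, the one-message-per-line layout, and the regexes are illustrative assumptions about the raw conserver output, not anything taken from these messages.

    import re
    from collections import Counter

    # Sketch: count which function each watchdog-dumped service thread is
    # blocked in, using the "Pid: <pid>, comm: <name>" / "Call Trace:" dumps.
    # Assumes a raw console capture saved as 'console.log' (illustrative name)
    # with one kernel message per line.
    pid_re = re.compile(r"Pid:\s+(\d+), comm:\s+(\S+)")
    # Frames look like "[] func+0x1f5/0x210" here (addresses stripped in this
    # capture) or "[<ffffffff...>] func+0x.../0x..." in an unmangled one.
    frame_re = re.compile(r"\[(?:<[0-9a-fx]+>)?\]\s+\??\s*([\w.]+)\+0x[0-9a-f]+/0x[0-9a-f]+")

    blocked_in = Counter()
    current = None  # set while we are inside a dump, cleared at its first frame
    with open("console.log") as log:
        for line in log:
            m = pid_re.search(line)
            if m:
                current = (m.group(1), m.group(2))
                continue
            if current:
                f = frame_re.search(line)
                if f:
                    # The first resolved frame after "Call Trace:" is where the
                    # thread is sitting.
                    blocked_in[f.group(1)] += 1
                    current = None

    for func, count in blocked_in.most_common():
        print(f"{count:3d} thread(s) blocked in {func}")

For the dumps visible above, such a tally would come out as bitmap_startwrite x4, osd_trans_stop x1, and ptlrpc_set_wait x1 (the earlier ll_ost00_056 glimpse-AST dump); the hung-task reports further down show the same write path plus threads waiting in jbd2's wait_transaction_locked.
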
[Thu Dec 12 00:10:51 2019][237230.852901] LNet: Skipped 3 previous similar messages [Thu Dec 12 00:10:51 2019][237230.858054] LustreError: dumping log to /tmp/lustre-log.1576138251.67718 [Thu Dec 12 00:10:52 2019][237231.863959] LustreError: dumping log to /tmp/lustre-log.1576138252.67781 [Thu Dec 12 00:10:53 2019][237232.887976] LNet: Service thread pid 68043 was inactive for 200.48s. Watchdog stack traces are limited to 3 per 300 seconds, skipping this one. [Thu Dec 12 00:10:53 2019][237232.900923] LNet: Skipped 2 previous similar messages [Thu Dec 12 00:10:53 2019][237232.906072] LustreError: dumping log to /tmp/lustre-log.1576138253.68043 [Thu Dec 12 00:10:54 2019][237234.424012] LustreError: dumping log to /tmp/lustre-log.1576138254.67827 [Thu Dec 12 00:10:56 2019][237235.960051] LustreError: dumping log to /tmp/lustre-log.1576138256.67824 [Thu Dec 12 00:10:58 2019][237238.008081] LNet: Service thread pid 67769 was inactive for 200.14s. Watchdog stack traces are limited to 3 per 300 seconds, skipping this one. [Thu Dec 12 00:10:58 2019][237238.021049] LNet: Skipped 4 previous similar messages [Thu Dec 12 00:10:58 2019][237238.026194] LustreError: dumping log to /tmp/lustre-log.1576138258.67769 [Thu Dec 12 00:11:00 2019][237240.056127] LustreError: dumping log to /tmp/lustre-log.1576138260.67905 [Thu Dec 12 00:11:01 2019][237240.568137] LustreError: dumping log to /tmp/lustre-log.1576138261.67944 [Thu Dec 12 00:11:02 2019][237242.104166] LustreError: dumping log to /tmp/lustre-log.1576138262.67688 [Thu Dec 12 00:11:04 2019][237244.152206] LustreError: dumping log to /tmp/lustre-log.1576138264.67885 [Thu Dec 12 00:11:05 2019][237244.664220] LustreError: dumping log to /tmp/lustre-log.1576138265.67597 [Thu Dec 12 00:11:06 2019][237245.930444] INFO: task ll_ost_io00_000:66124 blocked for more than 120 seconds. [Thu Dec 12 00:11:06 2019][237245.937848] "echo 0 > /proc/sys/kernel/hung_task_timeout_secs" disables this message. [Thu Dec 12 00:11:06 2019][237245.945773] ll_ost_io00_000 D ffff8912e629a080 0 66124 2 0x00000080 [Thu Dec 12 00:11:06 2019][237245.952969] Call Trace: [Thu Dec 12 00:11:06 2019][237245.955566] [] schedule+0x29/0x70 [Thu Dec 12 00:11:06 2019][237245.960650] [] wait_transaction_locked+0x85/0xd0 [jbd2] [Thu Dec 12 00:11:06 2019][237245.967669] [] ? wake_up_atomic_t+0x30/0x30 [Thu Dec 12 00:11:06 2019][237245.973595] [] add_transaction_credits+0x268/0x2f0 [jbd2] [Thu Dec 12 00:11:06 2019][237245.980735] [] start_this_handle+0x1a1/0x430 [jbd2] [Thu Dec 12 00:11:06 2019][237245.987362] [] ? osd_declare_xattr_set+0xf1/0x3a0 [osd_ldiskfs] [Thu Dec 12 00:11:06 2019][237245.995045] [] ? kmem_cache_alloc+0x1c2/0x1f0 [Thu Dec 12 00:11:06 2019][237246.001159] [] jbd2__journal_start+0xf3/0x1f0 [jbd2] [Thu Dec 12 00:11:06 2019][237246.007874] [] ? osd_trans_start+0x20e/0x4e0 [osd_ldiskfs] [Thu Dec 12 00:11:06 2019][237246.015146] [] __ldiskfs_journal_start_sb+0x69/0xe0 [ldiskfs] [Thu Dec 12 00:11:06 2019][237246.022649] [] osd_trans_start+0x20e/0x4e0 [osd_ldiskfs] [Thu Dec 12 00:11:06 2019][237246.029726] [] ofd_trans_start+0x75/0xf0 [ofd] [Thu Dec 12 00:11:06 2019][237246.035932] [] ofd_object_punch+0x73d/0xd30 [ofd] [Thu Dec 12 00:11:06 2019][237246.042401] [] ofd_punch_hdl+0x493/0xa30 [ofd] [Thu Dec 12 00:11:06 2019][237246.048642] [] tgt_request_handle+0xaea/0x1580 [ptlrpc] [Thu Dec 12 00:11:06 2019][237246.055670] [] ? ptlrpc_nrs_req_get_nolock0+0xd1/0x170 [ptlrpc] [Thu Dec 12 00:11:06 2019][237246.063354] [] ? 
ktime_get_real_seconds+0xe/0x10 [libcfs] [Thu Dec 12 00:11:06 2019][237246.070549] [] ptlrpc_server_handle_request+0x24b/0xab0 [ptlrpc] [Thu Dec 12 00:11:06 2019][237246.078351] [] ? ptlrpc_wait_event+0xa5/0x360 [ptlrpc] [Thu Dec 12 00:11:06 2019][237246.085236] [] ? __wake_up+0x44/0x50 [Thu Dec 12 00:11:06 2019][237246.090603] [] ptlrpc_main+0xb2c/0x1460 [ptlrpc] [Thu Dec 12 00:11:06 2019][237246.097021] [] ? ptlrpc_register_service+0xf80/0xf80 [ptlrpc] [Thu Dec 12 00:11:06 2019][237246.104504] [] kthread+0xd1/0xe0 [Thu Dec 12 00:11:06 2019][237246.109503] [] ? insert_kthread_work+0x40/0x40 [Thu Dec 12 00:11:06 2019][237246.115704] [] ret_from_fork_nospec_begin+0xe/0x21 [Thu Dec 12 00:11:06 2019][237246.122256] [] ? insert_kthread_work+0x40/0x40 [Thu Dec 12 00:11:06 2019][237246.128463] INFO: task ll_ost_io00_002:66126 blocked for more than 120 seconds. [Thu Dec 12 00:11:06 2019][237246.135860] "echo 0 > /proc/sys/kernel/hung_task_timeout_secs" disables this message. [Thu Dec 12 00:11:06 2019][237246.143801] ll_ost_io00_002 D ffff8903a9ada080 0 66126 2 0x00000080 [Thu Dec 12 00:11:06 2019][237246.150990] Call Trace: [Thu Dec 12 00:11:06 2019][237246.153538] [] schedule+0x29/0x70 [Thu Dec 12 00:11:06 2019][237246.158632] [] bitmap_startwrite+0x1f5/0x210 [Thu Dec 12 00:11:06 2019][237246.164662] [] ? wake_up_atomic_t+0x30/0x30 [Thu Dec 12 00:11:06 2019][237246.170594] [] add_stripe_bio+0x451/0x7f0 [raid456] [Thu Dec 12 00:11:06 2019][237246.177235] [] raid5_make_request+0x1e4/0xca0 [raid456] [Thu Dec 12 00:11:06 2019][237246.184241] [] ? find_get_pages+0x180/0x1d0 [Thu Dec 12 00:11:06 2019][237246.190182] [] ? wake_up_atomic_t+0x30/0x30 [Thu Dec 12 00:11:06 2019][237246.196135] [] ? mempool_alloc_slab+0x15/0x20 [Thu Dec 12 00:11:06 2019][237246.202251] [] md_handle_request+0xd0/0x150 [Thu Dec 12 00:11:06 2019][237246.208205] [] ? generic_make_request_checks+0x2a7/0x440 [Thu Dec 12 00:11:06 2019][237246.215287] [] md_make_request+0x79/0x190 [Thu Dec 12 00:11:06 2019][237246.221066] [] generic_make_request+0x147/0x380 [Thu Dec 12 00:11:06 2019][237246.227341] [] ? md_mergeable_bvec+0x46/0x50 [Thu Dec 12 00:11:06 2019][237246.233373] [] submit_bio+0x70/0x150 [Thu Dec 12 00:11:06 2019][237246.238725] [] ? lprocfs_oh_tally+0x17/0x40 [obdclass] [Thu Dec 12 00:11:06 2019][237246.245612] [] osd_submit_bio+0x1c/0x60 [osd_ldiskfs] [Thu Dec 12 00:11:06 2019][237246.252426] [] osd_do_bio.isra.35+0x489/0xab0 [osd_ldiskfs] [Thu Dec 12 00:11:06 2019][237246.259757] [] ? __find_get_page+0x1e/0xa0 [Thu Dec 12 00:11:06 2019][237246.265603] [] osd_write_commit+0x3ec/0x8c0 [osd_ldiskfs] [Thu Dec 12 00:11:06 2019][237246.272769] [] ? osd_trans_start+0x235/0x4e0 [osd_ldiskfs] [Thu Dec 12 00:11:06 2019][237246.280027] [] ofd_commitrw_write+0xfbe/0x1d40 [ofd] [Thu Dec 12 00:11:06 2019][237246.286754] [] ofd_commitrw+0x48c/0x9e0 [ofd] [Thu Dec 12 00:11:06 2019][237246.292923] [] tgt_brw_write+0x10cb/0x1cf0 [ptlrpc] [Thu Dec 12 00:11:06 2019][237246.299576] [] ? lustre_msg_buf+0x17/0x60 [ptlrpc] [Thu Dec 12 00:11:06 2019][237246.306182] [] ? __req_capsule_get+0x163/0x740 [ptlrpc] [Thu Dec 12 00:11:06 2019][237246.313180] [] ? class_handle2object+0xb9/0x1c0 [obdclass] [Thu Dec 12 00:11:06 2019][237246.320427] [] ? update_curr+0x14c/0x1e0 [Thu Dec 12 00:11:06 2019][237246.326092] [] ? account_entity_dequeue+0xae/0xd0 [Thu Dec 12 00:11:06 2019][237246.332569] [] ? 
target_send_reply_msg+0x170/0x170 [ptlrpc] [Thu Dec 12 00:11:06 2019][237246.339919] [] tgt_request_handle+0xaea/0x1580 [ptlrpc] [Thu Dec 12 00:11:06 2019][237246.346937] [] ? ptlrpc_nrs_req_get_nolock0+0xd1/0x170 [ptlrpc] [Thu Dec 12 00:11:06 2019][237246.354605] [] ? ktime_get_real_seconds+0xe/0x10 [libcfs] [Thu Dec 12 00:11:06 2019][237246.361824] [] ptlrpc_server_handle_request+0x24b/0xab0 [ptlrpc] [Thu Dec 12 00:11:06 2019][237246.369605] [] ? ptlrpc_wait_event+0xa5/0x360 [ptlrpc] [Thu Dec 12 00:11:06 2019][237246.376489] [] ? wake_up_state+0x20/0x20 [Thu Dec 12 00:11:06 2019][237246.382187] [] ptlrpc_main+0xb2c/0x1460 [ptlrpc] [Thu Dec 12 00:11:06 2019][237246.388599] [] ? ptlrpc_register_service+0xf80/0xf80 [ptlrpc] [Thu Dec 12 00:11:06 2019][237246.396088] [] kthread+0xd1/0xe0 [Thu Dec 12 00:11:06 2019][237246.401092] [] ? insert_kthread_work+0x40/0x40 [Thu Dec 12 00:11:06 2019][237246.407284] [] ret_from_fork_nospec_begin+0xe/0x21 [Thu Dec 12 00:11:06 2019][237246.413835] [] ? insert_kthread_work+0x40/0x40 [Thu Dec 12 00:11:06 2019][237246.420038] INFO: task ll_ost_io01_000:66127 blocked for more than 120 seconds. [Thu Dec 12 00:11:06 2019][237246.427467] "echo 0 > /proc/sys/kernel/hung_task_timeout_secs" disables this message. [Thu Dec 12 00:11:06 2019][237246.435405] ll_ost_io01_000 D ffff8912f2f12080 0 66127 2 0x00000080 [Thu Dec 12 00:11:06 2019][237246.442596] Call Trace: [Thu Dec 12 00:11:06 2019][237246.445142] [] schedule+0x29/0x70 [Thu Dec 12 00:11:06 2019][237246.450243] [] bitmap_startwrite+0x1f5/0x210 [Thu Dec 12 00:11:06 2019][237246.456264] [] ? wake_up_atomic_t+0x30/0x30 [Thu Dec 12 00:11:06 2019][237246.462193] [] add_stripe_bio+0x451/0x7f0 [raid456] [Thu Dec 12 00:11:06 2019][237246.468826] [] raid5_make_request+0x1e4/0xca0 [raid456] [Thu Dec 12 00:11:06 2019][237246.475794] [] ? wake_up_atomic_t+0x30/0x30 [Thu Dec 12 00:11:06 2019][237246.481740] [] ? mempool_alloc_slab+0x15/0x20 [Thu Dec 12 00:11:06 2019][237246.487842] [] md_handle_request+0xd0/0x150 [Thu Dec 12 00:11:06 2019][237246.493819] [] ? generic_make_request_checks+0x2a7/0x440 [Thu Dec 12 00:11:06 2019][237246.500868] [] md_make_request+0x79/0x190 [Thu Dec 12 00:11:06 2019][237246.506640] [] generic_make_request+0x147/0x380 [Thu Dec 12 00:11:06 2019][237246.512913] [] ? md_mergeable_bvec+0x46/0x50 [Thu Dec 12 00:11:06 2019][237246.518927] [] submit_bio+0x70/0x150 [Thu Dec 12 00:11:06 2019][237246.524274] [] ? lprocfs_oh_tally+0x17/0x40 [obdclass] [Thu Dec 12 00:11:06 2019][237246.531183] [] osd_submit_bio+0x1c/0x60 [osd_ldiskfs] [Thu Dec 12 00:11:07 2019][237246.537985] [] osd_do_bio.isra.35+0x489/0xab0 [osd_ldiskfs] [Thu Dec 12 00:11:07 2019][237246.545318] [] ? __find_get_page+0x1e/0xa0 [Thu Dec 12 00:11:07 2019][237246.551159] [] osd_write_commit+0x3ec/0x8c0 [osd_ldiskfs] [Thu Dec 12 00:11:07 2019][237246.558307] [] ? osd_trans_start+0x235/0x4e0 [osd_ldiskfs] [Thu Dec 12 00:11:07 2019][237246.565536] [] ofd_commitrw_write+0xfbe/0x1d40 [ofd] [Thu Dec 12 00:11:07 2019][237246.572276] [] ofd_commitrw+0x48c/0x9e0 [ofd] [Thu Dec 12 00:11:07 2019][237246.578433] [] tgt_brw_write+0x10cb/0x1cf0 [ptlrpc] [Thu Dec 12 00:11:07 2019][237246.585121] [] ? lustre_msg_buf+0x17/0x60 [ptlrpc] [Thu Dec 12 00:11:07 2019][237246.591685] [] ? __req_capsule_get+0x163/0x740 [ptlrpc] [Thu Dec 12 00:11:07 2019][237246.598707] [] ? class_handle2object+0xb9/0x1c0 [obdclass] [Thu Dec 12 00:11:07 2019][237246.605937] [] ? update_curr+0x14c/0x1e0 [Thu Dec 12 00:11:07 2019][237246.611602] [] ? 
account_entity_dequeue+0xae/0xd0 [Thu Dec 12 00:11:07 2019][237246.618095] [] ? target_send_reply_msg+0x170/0x170 [ptlrpc] [Thu Dec 12 00:11:07 2019][237246.625443] [] tgt_request_handle+0xaea/0x1580 [ptlrpc] [Thu Dec 12 00:11:07 2019][237246.632471] [] ? ptlrpc_nrs_req_get_nolock0+0xd1/0x170 [ptlrpc] [Thu Dec 12 00:11:07 2019][237246.640132] [] ? ktime_get_real_seconds+0xe/0x10 [libcfs] [Thu Dec 12 00:11:07 2019][237246.647314] [] ptlrpc_server_handle_request+0x24b/0xab0 [ptlrpc] [Thu Dec 12 00:11:07 2019][237246.655089] [] ? ptlrpc_wait_event+0xa5/0x360 [ptlrpc] [Thu Dec 12 00:11:07 2019][237246.661982] [] ? __wake_up+0x44/0x50 [Thu Dec 12 00:11:07 2019][237246.667343] [] ptlrpc_main+0xb2c/0x1460 [ptlrpc] [Thu Dec 12 00:11:07 2019][237246.673762] [] ? ptlrpc_register_service+0xf80/0xf80 [ptlrpc] [Thu Dec 12 00:11:07 2019][237246.681285] [] kthread+0xd1/0xe0 [Thu Dec 12 00:11:07 2019][237246.686274] [] ? insert_kthread_work+0x40/0x40 [Thu Dec 12 00:11:07 2019][237246.692473] [] ret_from_fork_nospec_begin+0xe/0x21 [Thu Dec 12 00:11:07 2019][237246.699022] [] ? insert_kthread_work+0x40/0x40 [Thu Dec 12 00:11:07 2019][237246.705209] INFO: task ll_ost_io03_002:66137 blocked for more than 120 seconds. [Thu Dec 12 00:11:07 2019][237246.712622] "echo 0 > /proc/sys/kernel/hung_task_timeout_secs" disables this message. [Thu Dec 12 00:11:07 2019][237246.720553] ll_ost_io03_002 D ffff89125d170000 0 66137 2 0x00000080 [Thu Dec 12 00:11:07 2019][237246.727754] Call Trace: [Thu Dec 12 00:11:07 2019][237246.730299] [] schedule+0x29/0x70 [Thu Dec 12 00:11:07 2019][237246.735378] [] wait_transaction_locked+0x85/0xd0 [jbd2] [Thu Dec 12 00:11:07 2019][237246.742396] [] ? wake_up_atomic_t+0x30/0x30 [Thu Dec 12 00:11:07 2019][237246.748326] [] add_transaction_credits+0x268/0x2f0 [jbd2] [Thu Dec 12 00:11:07 2019][237246.755505] [] start_this_handle+0x1a1/0x430 [jbd2] [Thu Dec 12 00:11:07 2019][237246.762128] [] ? osd_declare_xattr_set+0xf1/0x3a0 [osd_ldiskfs] [Thu Dec 12 00:11:07 2019][237246.769803] [] ? kmem_cache_alloc+0x1c2/0x1f0 [Thu Dec 12 00:11:07 2019][237246.775906] [] jbd2__journal_start+0xf3/0x1f0 [jbd2] [Thu Dec 12 00:11:07 2019][237246.782647] [] ? osd_trans_start+0x20e/0x4e0 [osd_ldiskfs] [Thu Dec 12 00:11:07 2019][237246.789882] [] __ldiskfs_journal_start_sb+0x69/0xe0 [ldiskfs] [Thu Dec 12 00:11:07 2019][237246.797411] [] osd_trans_start+0x20e/0x4e0 [osd_ldiskfs] [Thu Dec 12 00:11:07 2019][237246.804477] [] ofd_trans_start+0x75/0xf0 [ofd] [Thu Dec 12 00:11:07 2019][237246.810721] [] ofd_commitrw_write+0xa31/0x1d40 [ofd] [Thu Dec 12 00:11:07 2019][237246.817431] [] ofd_commitrw+0x48c/0x9e0 [ofd] [Thu Dec 12 00:11:07 2019][237246.823617] [] tgt_brw_write+0x10cb/0x1cf0 [ptlrpc] [Thu Dec 12 00:11:07 2019][237246.830265] [] ? lustre_msg_buf+0x17/0x60 [ptlrpc] [Thu Dec 12 00:11:07 2019][237246.836853] [] ? __req_capsule_get+0x163/0x740 [ptlrpc] [Thu Dec 12 00:11:07 2019][237246.843861] [] ? class_handle2object+0xb9/0x1c0 [obdclass] [Thu Dec 12 00:11:07 2019][237246.851092] [] ? update_curr+0x14c/0x1e0 [Thu Dec 12 00:11:07 2019][237246.856771] [] ? mutex_lock+0x12/0x2f [Thu Dec 12 00:11:07 2019][237246.862216] [] tgt_request_handle+0xaea/0x1580 [ptlrpc] [Thu Dec 12 00:11:07 2019][237246.869251] [] ? ptlrpc_nrs_req_get_nolock0+0xd1/0x170 [ptlrpc] [Thu Dec 12 00:11:07 2019][237246.876919] [] ? ktime_get_real_seconds+0xe/0x10 [libcfs] [Thu Dec 12 00:11:07 2019][237246.884102] [] ptlrpc_server_handle_request+0x24b/0xab0 [ptlrpc] [Thu Dec 12 00:11:07 2019][237246.891882] [] ? 
ptlrpc_wait_event+0xa5/0x360 [ptlrpc] [Thu Dec 12 00:11:07 2019][237246.898797] [] ? __wake_up+0x44/0x50 [Thu Dec 12 00:11:07 2019][237246.904158] [] ptlrpc_main+0xb2c/0x1460 [ptlrpc] [Thu Dec 12 00:11:07 2019][237246.910586] [] ? ptlrpc_register_service+0xf80/0xf80 [ptlrpc] [Thu Dec 12 00:11:07 2019][237246.918075] [] kthread+0xd1/0xe0 [Thu Dec 12 00:11:07 2019][237246.923079] [] ? insert_kthread_work+0x40/0x40 [Thu Dec 12 00:11:07 2019][237246.929271] [] ret_from_fork_nospec_begin+0xe/0x21 [Thu Dec 12 00:11:07 2019][237246.935851] [] ? insert_kthread_work+0x40/0x40 [Thu Dec 12 00:11:07 2019][237246.942039] INFO: task jbd2/md2-8:66168 blocked for more than 120 seconds. [Thu Dec 12 00:11:07 2019][237246.949004] "echo 0 > /proc/sys/kernel/hung_task_timeout_secs" disables this message. [Thu Dec 12 00:11:07 2019][237246.956928] jbd2/md2-8 D ffff8912f5c68000 0 66168 2 0x00000080 [Thu Dec 12 00:11:07 2019][237246.964142] Call Trace: [Thu Dec 12 00:11:07 2019][237246.966690] [] schedule+0x29/0x70 [Thu Dec 12 00:11:07 2019][237246.971790] [] jbd2_journal_commit_transaction+0x23c/0x19b0 [jbd2] [Thu Dec 12 00:11:07 2019][237246.979729] [] ? dequeue_task_fair+0x41e/0x660 [Thu Dec 12 00:11:07 2019][237246.985943] [] ? __switch_to+0xce/0x580 [Thu Dec 12 00:11:07 2019][237246.991525] [] ? wake_up_atomic_t+0x30/0x30 [Thu Dec 12 00:11:07 2019][237246.997454] [] ? __schedule+0x42a/0x860 [Thu Dec 12 00:11:07 2019][237247.003050] [] ? try_to_del_timer_sync+0x5e/0x90 [Thu Dec 12 00:11:07 2019][237247.009427] [] kjournald2+0xc9/0x260 [jbd2] [Thu Dec 12 00:11:07 2019][237247.015366] [] ? wake_up_atomic_t+0x30/0x30 [Thu Dec 12 00:11:07 2019][237247.021307] [] ? commit_timeout+0x10/0x10 [jbd2] [Thu Dec 12 00:11:07 2019][237247.027681] [] kthread+0xd1/0xe0 [Thu Dec 12 00:11:07 2019][237247.032649] [] ? insert_kthread_work+0x40/0x40 [Thu Dec 12 00:11:07 2019][237247.038853] [] ret_from_fork_nospec_begin+0xe/0x21 [Thu Dec 12 00:11:07 2019][237247.045381] [] ? insert_kthread_work+0x40/0x40 [Thu Dec 12 00:11:07 2019][237247.051588] INFO: task ll_ost02_005:66899 blocked for more than 120 seconds. [Thu Dec 12 00:11:07 2019][237247.058743] "echo 0 > /proc/sys/kernel/hung_task_timeout_secs" disables this message. [Thu Dec 12 00:11:07 2019][237247.066672] ll_ost02_005 D ffff8912c7616180 0 66899 2 0x00000080 [Thu Dec 12 00:11:07 2019][237247.073875] Call Trace: [Thu Dec 12 00:11:07 2019][237247.076430] [] schedule+0x29/0x70 [Thu Dec 12 00:11:07 2019][237247.081491] [] wait_transaction_locked+0x85/0xd0 [jbd2] [Thu Dec 12 00:11:07 2019][237247.088472] [] ? wake_up_atomic_t+0x30/0x30 [Thu Dec 12 00:11:07 2019][237247.094404] [] add_transaction_credits+0x268/0x2f0 [jbd2] [Thu Dec 12 00:11:07 2019][237247.101577] [] start_this_handle+0x1a1/0x430 [jbd2] [Thu Dec 12 00:11:07 2019][237247.108210] [] ? osd_declare_write+0x350/0x490 [osd_ldiskfs] [Thu Dec 12 00:11:07 2019][237247.115646] [] ? kmem_cache_alloc+0x1c2/0x1f0 [Thu Dec 12 00:11:07 2019][237247.121746] [] jbd2__journal_start+0xf3/0x1f0 [jbd2] [Thu Dec 12 00:11:07 2019][237247.128487] [] ? 
osd_trans_start+0x20e/0x4e0 [osd_ldiskfs] [Thu Dec 12 00:11:07 2019][237247.135725] [] __ldiskfs_journal_start_sb+0x69/0xe0 [ldiskfs] [Thu Dec 12 00:11:07 2019][237247.143248] [] osd_trans_start+0x20e/0x4e0 [osd_ldiskfs] [Thu Dec 12 00:11:07 2019][237247.150305] [] ofd_trans_start+0x75/0xf0 [ofd] [Thu Dec 12 00:11:07 2019][237247.156495] [] ofd_attr_set+0x464/0xb60 [ofd] [Thu Dec 12 00:11:07 2019][237247.162595] [] ofd_setattr_hdl+0x31d/0x8e0 [ofd] [Thu Dec 12 00:11:07 2019][237247.169036] [] tgt_request_handle+0xaea/0x1580 [ptlrpc] [Thu Dec 12 00:11:07 2019][237247.176040] [] ? ptlrpc_nrs_req_get_nolock0+0xd1/0x170 [ptlrpc] [Thu Dec 12 00:11:07 2019][237247.183720] [] ? ktime_get_real_seconds+0xe/0x10 [libcfs] [Thu Dec 12 00:11:07 2019][237247.190916] [] ptlrpc_server_handle_request+0x24b/0xab0 [ptlrpc] [Thu Dec 12 00:11:07 2019][237247.198730] [] ? ptlrpc_wait_event+0xa5/0x360 [ptlrpc] [Thu Dec 12 00:11:07 2019][237247.205626] [] ? __wake_up+0x44/0x50 [Thu Dec 12 00:11:07 2019][237247.210993] [] ptlrpc_main+0xb2c/0x1460 [ptlrpc] [Thu Dec 12 00:11:07 2019][237247.217392] [] ? ptlrpc_register_service+0xf80/0xf80 [ptlrpc] [Thu Dec 12 00:11:07 2019][237247.224271] LNet: Service thread pid 67794 was inactive for 200.41s. Watchdog stack traces are limited to 3 per 300 seconds, skipping this one. [Thu Dec 12 00:11:07 2019][237247.224273] LNet: Skipped 11 previous similar messages [Thu Dec 12 00:11:07 2019][237247.224275] LustreError: dumping log to /tmp/lustre-log.1576138267.67794 [Thu Dec 12 00:11:07 2019][237247.249902] [] kthread+0xd1/0xe0 [Thu Dec 12 00:11:07 2019][237247.254877] [] ? insert_kthread_work+0x40/0x40 [Thu Dec 12 00:11:07 2019][237247.261095] [] ret_from_fork_nospec_begin+0xe/0x21 [Thu Dec 12 00:11:07 2019][237247.267641] [] ? insert_kthread_work+0x40/0x40 [Thu Dec 12 00:11:07 2019][237247.273844] INFO: task ll_ost_io00_003:67590 blocked for more than 120 seconds. [Thu Dec 12 00:11:07 2019][237247.281255] "echo 0 > /proc/sys/kernel/hung_task_timeout_secs" disables this message. [Thu Dec 12 00:11:07 2019][237247.289197] ll_ost_io00_003 D ffff8903a99c8000 0 67590 2 0x00000080 [Thu Dec 12 00:11:07 2019][237247.296406] Call Trace: [Thu Dec 12 00:11:07 2019][237247.298950] [] schedule+0x29/0x70 [Thu Dec 12 00:11:07 2019][237247.304052] [] wait_transaction_locked+0x85/0xd0 [jbd2] [Thu Dec 12 00:11:07 2019][237247.311037] [] ? wake_up_atomic_t+0x30/0x30 [Thu Dec 12 00:11:07 2019][237247.316992] [] add_transaction_credits+0x268/0x2f0 [jbd2] [Thu Dec 12 00:11:07 2019][237247.324151] [] ? crypto_mod_get+0x19/0x40 [Thu Dec 12 00:11:07 2019][237247.329921] [] start_this_handle+0x1a1/0x430 [jbd2] [Thu Dec 12 00:11:07 2019][237247.336548] [] ? osd_declare_qid+0x200/0x4a0 [osd_ldiskfs] [Thu Dec 12 00:11:07 2019][237247.343793] [] ? kmem_cache_alloc+0x1c2/0x1f0 [Thu Dec 12 00:11:07 2019][237247.349907] [] jbd2__journal_start+0xf3/0x1f0 [jbd2] [Thu Dec 12 00:11:07 2019][237247.356635] [] ? osd_trans_start+0x20e/0x4e0 [osd_ldiskfs] [Thu Dec 12 00:11:07 2019][237247.363885] [] __ldiskfs_journal_start_sb+0x69/0xe0 [ldiskfs] [Thu Dec 12 00:11:07 2019][237247.371405] [] osd_trans_start+0x20e/0x4e0 [osd_ldiskfs] [Thu Dec 12 00:11:07 2019][237247.378481] [] ofd_commitrw_write+0xf1e/0x1d40 [ofd] [Thu Dec 12 00:11:07 2019][237247.385205] [] ofd_commitrw+0x48c/0x9e0 [ofd] [Thu Dec 12 00:11:07 2019][237247.391384] [] tgt_brw_write+0x10cb/0x1cf0 [ptlrpc] [Thu Dec 12 00:11:07 2019][237247.398040] [] ? lustre_msg_buf+0x17/0x60 [ptlrpc] [Thu Dec 12 00:11:07 2019][237247.404643] [] ? 
__req_capsule_get+0x163/0x740 [ptlrpc] [Thu Dec 12 00:11:07 2019][237247.411612] [] ? __enqueue_entity+0x78/0x80 [Thu Dec 12 00:11:07 2019][237247.417609] [] ? target_send_reply_msg+0x170/0x170 [ptlrpc] [Thu Dec 12 00:11:07 2019][237247.424957] [] tgt_request_handle+0xaea/0x1580 [ptlrpc] [Thu Dec 12 00:11:07 2019][237247.431972] [] ? ptlrpc_nrs_req_get_nolock0+0xd1/0x170 [ptlrpc] [Thu Dec 12 00:11:07 2019][237247.439650] [] ? ktime_get_real_seconds+0xe/0x10 [libcfs] [Thu Dec 12 00:11:07 2019][237247.446832] [] ptlrpc_server_handle_request+0x24b/0xab0 [ptlrpc] [Thu Dec 12 00:11:07 2019][237247.454624] [] ? ptlrpc_wait_event+0xa5/0x360 [ptlrpc] [Thu Dec 12 00:11:07 2019][237247.461529] [] ? __wake_up+0x44/0x50 [Thu Dec 12 00:11:07 2019][237247.466896] [] ptlrpc_main+0xb2c/0x1460 [ptlrpc] [Thu Dec 12 00:11:07 2019][237247.473323] [] ? ptlrpc_register_service+0xf80/0xf80 [ptlrpc] [Thu Dec 12 00:11:07 2019][237247.480864] [] kthread+0xd1/0xe0 [Thu Dec 12 00:11:07 2019][237247.485861] [] ? insert_kthread_work+0x40/0x40 [Thu Dec 12 00:11:07 2019][237247.492049] [] ret_from_fork_nospec_begin+0xe/0x21 [Thu Dec 12 00:11:07 2019][237247.498591] [] ? insert_kthread_work+0x40/0x40 [Thu Dec 12 00:11:07 2019][237247.504796] INFO: task ll_ost_io01_003:67594 blocked for more than 120 seconds. [Thu Dec 12 00:11:07 2019][237247.512214] "echo 0 > /proc/sys/kernel/hung_task_timeout_secs" disables this message. [Thu Dec 12 00:11:08 2019][237247.520135] ll_ost_io01_003 D ffff8912f1b12080 0 67594 2 0x00000080 [Thu Dec 12 00:11:08 2019][237247.527326] Call Trace: [Thu Dec 12 00:11:08 2019][237247.529870] [] schedule+0x29/0x70 [Thu Dec 12 00:11:08 2019][237247.534962] [] wait_transaction_locked+0x85/0xd0 [jbd2] [Thu Dec 12 00:11:08 2019][237247.541940] [] ? wake_up_atomic_t+0x30/0x30 [Thu Dec 12 00:11:08 2019][237247.547888] [] add_transaction_credits+0x268/0x2f0 [jbd2] [Thu Dec 12 00:11:08 2019][237247.555045] [] ? crypto_mod_get+0x19/0x40 [Thu Dec 12 00:11:08 2019][237247.560821] [] start_this_handle+0x1a1/0x430 [jbd2] [Thu Dec 12 00:11:08 2019][237247.567468] [] ? osd_declare_qid+0x200/0x4a0 [osd_ldiskfs] [Thu Dec 12 00:11:08 2019][237247.574710] [] ? kmem_cache_alloc+0x1c2/0x1f0 [Thu Dec 12 00:11:08 2019][237247.580829] [] jbd2__journal_start+0xf3/0x1f0 [jbd2] [Thu Dec 12 00:11:08 2019][237247.587557] [] ? osd_trans_start+0x20e/0x4e0 [osd_ldiskfs] [Thu Dec 12 00:11:08 2019][237247.594794] [] __ldiskfs_journal_start_sb+0x69/0xe0 [ldiskfs] [Thu Dec 12 00:11:08 2019][237247.602325] [] osd_trans_start+0x20e/0x4e0 [osd_ldiskfs] [Thu Dec 12 00:11:08 2019][237247.609385] [] ofd_commitrw_write+0xf1e/0x1d40 [ofd] [Thu Dec 12 00:11:08 2019][237247.616127] [] ofd_commitrw+0x48c/0x9e0 [ofd] [Thu Dec 12 00:11:08 2019][237247.622290] [] tgt_brw_write+0x10cb/0x1cf0 [ptlrpc] [Thu Dec 12 00:11:08 2019][237247.628940] [] ? lustre_msg_buf+0x17/0x60 [ptlrpc] [Thu Dec 12 00:11:08 2019][237247.635523] [] ? __req_capsule_get+0x163/0x740 [ptlrpc] [Thu Dec 12 00:11:08 2019][237247.642489] [] ? __enqueue_entity+0x78/0x80 [Thu Dec 12 00:11:08 2019][237247.648457] [] ? target_send_reply_msg+0x170/0x170 [ptlrpc] [Thu Dec 12 00:11:08 2019][237247.655804] [] tgt_request_handle+0xaea/0x1580 [ptlrpc] [Thu Dec 12 00:11:08 2019][237247.662857] [] ? ptlrpc_nrs_req_get_nolock0+0xd1/0x170 [ptlrpc] [Thu Dec 12 00:11:08 2019][237247.670521] [] ? ktime_get_real_seconds+0xe/0x10 [libcfs] [Thu Dec 12 00:11:08 2019][237247.677729] [] ptlrpc_server_handle_request+0x24b/0xab0 [ptlrpc] [Thu Dec 12 00:11:08 2019][237247.685502] [] ? 
ptlrpc_wait_event+0xa5/0x360 [ptlrpc] [Thu Dec 12 00:11:08 2019][237247.692381] [] ? __wake_up+0x44/0x50 [Thu Dec 12 00:11:08 2019][237247.697762] [] ptlrpc_main+0xb2c/0x1460 [ptlrpc] [Thu Dec 12 00:11:08 2019][237247.704154] [] ? ptlrpc_register_service+0xf80/0xf80 [ptlrpc] [Thu Dec 12 00:11:08 2019][237247.711655] [] kthread+0xd1/0xe0 [Thu Dec 12 00:11:08 2019][237247.716644] [] ? insert_kthread_work+0x40/0x40 [Thu Dec 12 00:11:08 2019][237247.722848] [] ret_from_fork_nospec_begin+0xe/0x21 [Thu Dec 12 00:11:08 2019][237247.729411] [] ? insert_kthread_work+0x40/0x40 [Thu Dec 12 00:11:08 2019][237247.735620] INFO: task ll_ost_io01_004:67597 blocked for more than 120 seconds. [Thu Dec 12 00:11:08 2019][237247.736285] LustreError: dumping log to /tmp/lustre-log.1576138268.67917 [Thu Dec 12 00:11:08 2019][237247.749831] "echo 0 > /proc/sys/kernel/hung_task_timeout_secs" disables this message. [Thu Dec 12 00:11:08 2019][237247.757760] ll_ost_io01_004 D ffff8912f4bd9040 0 67597 2 0x00000080 [Thu Dec 12 00:11:08 2019][237247.764963] Call Trace: [Thu Dec 12 00:11:08 2019][237247.767520] [] schedule+0x29/0x70 [Thu Dec 12 00:11:08 2019][237247.772586] [] wait_transaction_locked+0x85/0xd0 [jbd2] [Thu Dec 12 00:11:08 2019][237247.779603] [] ? wake_up_atomic_t+0x30/0x30 [Thu Dec 12 00:11:08 2019][237247.785536] [] add_transaction_credits+0x268/0x2f0 [jbd2] [Thu Dec 12 00:11:08 2019][237247.792692] [] start_this_handle+0x1a1/0x430 [jbd2] [Thu Dec 12 00:11:08 2019][237247.799318] [] ? osd_declare_xattr_set+0xf1/0x3a0 [osd_ldiskfs] [Thu Dec 12 00:11:08 2019][237247.807018] [] ? kmem_cache_alloc+0x1c2/0x1f0 [Thu Dec 12 00:11:08 2019][237247.813123] [] jbd2__journal_start+0xf3/0x1f0 [jbd2] [Thu Dec 12 00:11:08 2019][237247.819871] [] ? osd_trans_start+0x20e/0x4e0 [osd_ldiskfs] [Thu Dec 12 00:11:08 2019][237247.827108] [] __ldiskfs_journal_start_sb+0x69/0xe0 [ldiskfs] [Thu Dec 12 00:11:08 2019][237247.834623] [] osd_trans_start+0x20e/0x4e0 [osd_ldiskfs] [Thu Dec 12 00:11:08 2019][237247.841698] [] ofd_trans_start+0x75/0xf0 [ofd] [Thu Dec 12 00:11:08 2019][237247.847884] [] ofd_commitrw_write+0xa31/0x1d40 [ofd] [Thu Dec 12 00:11:08 2019][237247.854608] [] ofd_commitrw+0x48c/0x9e0 [ofd] [Thu Dec 12 00:11:08 2019][237247.860778] [] tgt_brw_write+0x10cb/0x1cf0 [ptlrpc] [Thu Dec 12 00:11:08 2019][237247.867430] [] ? lustre_msg_buf+0x17/0x60 [ptlrpc] [Thu Dec 12 00:11:08 2019][237247.873991] [] ? __req_capsule_get+0x163/0x740 [ptlrpc] [Thu Dec 12 00:11:08 2019][237247.880973] [] ? mutex_lock+0x12/0x2f [Thu Dec 12 00:11:08 2019][237247.886432] [] tgt_request_handle+0xaea/0x1580 [ptlrpc] [Thu Dec 12 00:11:08 2019][237247.893448] [] ? ptlrpc_nrs_req_get_nolock0+0xd1/0x170 [ptlrpc] [Thu Dec 12 00:11:08 2019][237247.901127] [] ? ktime_get_real_seconds+0xe/0x10 [libcfs] [Thu Dec 12 00:11:08 2019][237247.908295] [] ptlrpc_server_handle_request+0x24b/0xab0 [ptlrpc] [Thu Dec 12 00:11:08 2019][237247.916082] [] ? ptlrpc_wait_event+0xa5/0x360 [ptlrpc] [Thu Dec 12 00:11:08 2019][237247.922993] [] ? __wake_up+0x44/0x50 [Thu Dec 12 00:11:08 2019][237247.928341] [] ptlrpc_main+0xb2c/0x1460 [ptlrpc] [Thu Dec 12 00:11:08 2019][237247.934743] [] ? ptlrpc_register_service+0xf80/0xf80 [ptlrpc] [Thu Dec 12 00:11:08 2019][237247.942231] [] kthread+0xd1/0xe0 [Thu Dec 12 00:11:08 2019][237247.947218] [] ? insert_kthread_work+0x40/0x40 [Thu Dec 12 00:11:08 2019][237247.953433] [] ret_from_fork_nospec_begin+0xe/0x21 [Thu Dec 12 00:11:08 2019][237247.959961] [] ? 
insert_kthread_work+0x40/0x40 [Thu Dec 12 00:11:08 2019][237247.966184] INFO: task ll_ost03_027:67684 blocked for more than 120 seconds. [Thu Dec 12 00:11:08 2019][237247.973326] "echo 0 > /proc/sys/kernel/hung_task_timeout_secs" disables this message. [Thu Dec 12 00:11:08 2019][237247.981279] ll_ost03_027 D ffff8922cb341040 0 67684 2 0x00000080 [Thu Dec 12 00:11:08 2019][237247.988473] Call Trace: [Thu Dec 12 00:11:08 2019][237247.991025] [] ? fid_is_on_ost+0x3f4/0x420 [osd_ldiskfs] [Thu Dec 12 00:11:08 2019][237247.998109] [] schedule+0x29/0x70 [Thu Dec 12 00:11:08 2019][237248.003179] [] wait_transaction_locked+0x85/0xd0 [jbd2] [Thu Dec 12 00:11:08 2019][237248.010159] [] ? wake_up_atomic_t+0x30/0x30 [Thu Dec 12 00:11:08 2019][237248.016098] [] add_transaction_credits+0x268/0x2f0 [jbd2] [Thu Dec 12 00:11:08 2019][237248.023269] [] ? lprocfs_counter_add+0xf9/0x160 [obdclass] [Thu Dec 12 00:11:08 2019][237248.030529] [] start_this_handle+0x1a1/0x430 [jbd2] [Thu Dec 12 00:11:08 2019][237248.037185] [] ? osd_declare_qid+0x200/0x4a0 [osd_ldiskfs] [Thu Dec 12 00:11:08 2019][237248.044415] [] ? kmem_cache_alloc+0x1c2/0x1f0 [Thu Dec 12 00:11:08 2019][237248.050545] [] jbd2__journal_start+0xf3/0x1f0 [jbd2] [Thu Dec 12 00:11:08 2019][237248.057273] [] ? osd_trans_start+0x20e/0x4e0 [osd_ldiskfs] [Thu Dec 12 00:11:08 2019][237248.064524] [] __ldiskfs_journal_start_sb+0x69/0xe0 [ldiskfs] [Thu Dec 12 00:11:08 2019][237248.072039] [] osd_trans_start+0x20e/0x4e0 [osd_ldiskfs] [Thu Dec 12 00:11:08 2019][237248.079108] [] ofd_precreate_objects+0xa57/0x1d80 [ofd] [Thu Dec 12 00:11:08 2019][237248.086096] [] ofd_create_hdl+0x474/0x20e0 [ofd] [Thu Dec 12 00:11:08 2019][237248.092515] [] ? lustre_pack_reply_v2+0x135/0x290 [ptlrpc] [Thu Dec 12 00:11:08 2019][237248.099797] [] tgt_request_handle+0xaea/0x1580 [ptlrpc] [Thu Dec 12 00:11:08 2019][237248.106809] [] ? ptlrpc_nrs_req_get_nolock0+0xd1/0x170 [ptlrpc] [Thu Dec 12 00:11:08 2019][237248.114494] [] ? ktime_get_real_seconds+0xe/0x10 [libcfs] [Thu Dec 12 00:11:08 2019][237248.121697] [] ptlrpc_server_handle_request+0x24b/0xab0 [ptlrpc] [Thu Dec 12 00:11:08 2019][237248.129494] [] ? ptlrpc_wait_event+0xa5/0x360 [ptlrpc] [Thu Dec 12 00:11:08 2019][237248.136392] [] ? __wake_up+0x44/0x50 [Thu Dec 12 00:11:08 2019][237248.141753] [] ptlrpc_main+0xb2c/0x1460 [ptlrpc] [Thu Dec 12 00:11:08 2019][237248.148156] [] ? ptlrpc_register_service+0xf80/0xf80 [ptlrpc] [Thu Dec 12 00:11:08 2019][237248.155646] [] kthread+0xd1/0xe0 [Thu Dec 12 00:11:08 2019][237248.160645] [] ? insert_kthread_work+0x40/0x40 [Thu Dec 12 00:11:08 2019][237248.166833] [] ret_from_fork_nospec_begin+0xe/0x21 [Thu Dec 12 00:11:08 2019][237248.173395] [] ? 
insert_kthread_work+0x40/0x40 [Thu Dec 12 00:11:09 2019][237249.272305] LustreError: dumping log to /tmp/lustre-log.1576138269.67928 [Thu Dec 12 00:11:10 2019][237249.784318] LustreError: dumping log to /tmp/lustre-log.1576138270.68019 [Thu Dec 12 00:11:11 2019][237251.320348] LustreError: dumping log to /tmp/lustre-log.1576138271.68044 [Thu Dec 12 00:11:14 2019][237254.392416] LustreError: dumping log to /tmp/lustre-log.1576138274.67962 [Thu Dec 12 00:11:15 2019][237255.416430] LustreError: dumping log to /tmp/lustre-log.1576138275.68025 [Thu Dec 12 00:11:16 2019][237256.440464] LustreError: dumping log to /tmp/lustre-log.1576138276.67720 [Thu Dec 12 00:11:17 2019][237257.464471] LustreError: dumping log to /tmp/lustre-log.1576138277.68037 [Thu Dec 12 00:11:18 2019][237257.976484] LustreError: dumping log to /tmp/lustre-log.1576138278.68000 [Thu Dec 12 00:11:24 2019][237263.608604] LNet: Service thread pid 67732 was inactive for 200.10s. Watchdog stack traces are limited to 3 per 300 seconds, skipping this one. [Thu Dec 12 00:11:24 2019][237263.621573] LNet: Skipped 20 previous similar messages [Thu Dec 12 00:11:24 2019][237263.626807] LustreError: dumping log to /tmp/lustre-log.1576138284.67732 [Thu Dec 12 00:11:28 2019][237267.704683] LustreError: dumping log to /tmp/lustre-log.1576138288.67971 [Thu Dec 12 00:11:32 2019][237272.312775] LustreError: dumping log to /tmp/lustre-log.1576138292.68046 [Thu Dec 12 00:11:34 2019][237273.848807] LustreError: dumping log to /tmp/lustre-log.1576138294.67728 [Thu Dec 12 00:11:35 2019][237274.872830] LustreError: dumping log to /tmp/lustre-log.1576138295.68018 [Thu Dec 12 00:11:36 2019][237275.896850] LustreError: dumping log to /tmp/lustre-log.1576138296.67959 [Thu Dec 12 00:11:39 2019][237278.968904] LustreError: dumping log to /tmp/lustre-log.1576138299.112549 [Thu Dec 12 00:11:44 2019][237284.089008] LustreError: dumping log to /tmp/lustre-log.1576138304.68001 [Thu Dec 12 00:11:46 2019][237285.625037] LustreError: dumping log to /tmp/lustre-log.1576138306.66124 [Thu Dec 12 00:11:47 2019][237286.649056] LustreError: dumping log to /tmp/lustre-log.1576138307.67880 [Thu Dec 12 00:11:50 2019][237290.233133] LustreError: dumping log to /tmp/lustre-log.1576138310.67770 [Thu Dec 12 00:11:59 2019][237299.449316] LNet: Service thread pid 67594 was inactive for 200.23s. Watchdog stack traces are limited to 3 per 300 seconds, skipping this one. 
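The repeated "LustreError: dumping log to /tmp/lustre-log.<epoch>.<pid>" records above are Lustre's automatic debug dumps, one per stalled service thread; the filename suffix carries the epoch time of the dump and the pid of the thread being reported (for example, pid 67732 is flagged inactive and its dump goes to /tmp/lustre-log.1576138284.67732). On a host with the Lustre utilities installed, such a binary dump can typically be converted to text with "lctl debug_file /tmp/lustre-log.<epoch>.<pid> out.txt". The following is a minimal Python sketch, not part of the console output, for tallying these dump records per thread; "console.log" is a hypothetical filename for the raw console text.

# Minimal sketch: count the Lustre debug dumps per service thread.
# "console.log" is a hypothetical name for the saved console output above.
import re
from collections import Counter

DUMP_RE = re.compile(r"dumping log to /tmp/lustre-log\.(\d+)\.(\d+)")

counts = Counter()
first_epoch = {}
with open("console.log") as f:
    for line in f:
        for epoch, pid in DUMP_RE.findall(line):
            counts[pid] += 1
            first_epoch.setdefault(pid, int(epoch))

for pid, n in counts.most_common(10):
    print(f"pid {pid}: {n} dump(s), first at epoch {first_epoch[pid]}")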
[Thu Dec 12 00:11:59 2019][237299.462289] LNet: Skipped 17 previous similar messages [Thu Dec 12 00:11:59 2019][237299.467526] LustreError: dumping log to /tmp/lustre-log.1576138319.67594 [Thu Dec 12 00:12:00 2019][237300.473336] LustreError: dumping log to /tmp/lustre-log.1576138320.67713 [Thu Dec 12 00:12:03 2019][237303.033394] LustreError: dumping log to /tmp/lustre-log.1576138323.67951 [Thu Dec 12 00:12:06 2019][237305.593441] LustreError: dumping log to /tmp/lustre-log.1576138326.67912 [Thu Dec 12 00:12:17 2019][237316.857665] LustreError: dumping log to /tmp/lustre-log.1576138337.67722 [Thu Dec 12 00:12:18 2019][237317.881688] LustreError: dumping log to /tmp/lustre-log.1576138338.67976 [Thu Dec 12 00:12:19 2019][237318.905707] LustreError: dumping log to /tmp/lustre-log.1576138339.67887 [Thu Dec 12 00:12:24 2019][237323.513803] LustreError: dumping log to /tmp/lustre-log.1576138343.66899 [Thu Dec 12 00:12:26 2019][237326.073852] LustreError: dumping log to /tmp/lustre-log.1576138346.87151 [Thu Dec 12 00:12:27 2019][237327.097865] LustreError: dumping log to /tmp/lustre-log.1576138347.67983 [Thu Dec 12 00:12:28 2019][237327.609877] LustreError: dumping log to /tmp/lustre-log.1576138348.67684 [Thu Dec 12 00:12:30 2019][237330.169932] LustreError: dumping log to /tmp/lustre-log.1576138350.67715 [Thu Dec 12 00:12:31 2019][237331.193955] LustreError: dumping log to /tmp/lustre-log.1576138351.67590 [Thu Dec 12 00:12:36 2019][237335.802048] LustreError: dumping log to /tmp/lustre-log.1576138356.67945 [Thu Dec 12 00:12:38 2019][237338.362100] LustreError: dumping log to /tmp/lustre-log.1576138358.68035 [Thu Dec 12 00:12:39 2019][237339.386117] LustreError: dumping log to /tmp/lustre-log.1576138359.68013 [Thu Dec 12 00:12:40 2019][237339.898126] LustreError: dumping log to /tmp/lustre-log.1576138360.67859 [Thu Dec 12 00:12:42 2019][237341.699169] LustreError: 67688:0:(ldlm_request.c:129:ldlm_expired_completion_wait()) ### lock timed out (enqueued at 1576138062, 300s ago); not entering recovery in server code, just going back to sleep ns: filter-fir-OST0056_UUID lock: ffff89018ca4ee40/0x7066c9c1908b2888 lrc: 3/0,1 mode: --/PW res: [0x1880000401:0x11129f:0x0].0x0 rrc: 3 type: EXT [0->18446744073709551615] (req 0->18446744073709551615) flags: 0x40010080000000 nid: local remote: 0x0 expref: -99 pid: 67688 timeout: 0 lvb_type: 0 [Thu Dec 12 00:12:42 2019][237341.742968] LustreError: dumping log to /tmp/lustre-log.1576138362.67688 [Thu Dec 12 00:12:53 2019][237353.210396] LustreError: dumping log to /tmp/lustre-log.1576138373.67618 [Thu Dec 12 00:13:04 2019][237363.962609] LNet: Service thread pid 67892 was inactive for 236.60s. Watchdog stack traces are limited to 3 per 300 seconds, skipping this one. 
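The ldlm lock-timeout record above is stamped with the kernel uptime clock ([237341.699169]), while the enqueue time it reports (1576138062) and the dump filenames use epoch seconds. A reference pair from that same record block (uptime 237341.742968 alongside dump epoch 1576138362) lets the two clocks be mapped onto each other; the sketch below does the conversion, treating the sub-second slack between the bracketed stamp and the whole-second filename epoch as negligible.

# Minimal sketch: map bracketed uptime stamps (seconds since boot) onto epoch seconds,
# using a reference pair taken from the records above. The dump filename only carries
# whole seconds, so the mapping is accurate to roughly a second.
REF_UPTIME = 237341.742968   # from "[237341.742968] LustreError: dumping log to ..."
REF_EPOCH = 1576138362       # from "/tmp/lustre-log.1576138362.67688" in the same record

def uptime_to_epoch(uptime_s: float) -> float:
    return REF_EPOCH + (uptime_s - REF_UPTIME)

# The lock timed out at uptime 237341.699169 and reports "enqueued at 1576138062, 300s ago":
print(uptime_to_epoch(237341.699169) - 1576138062)   # ~300 seconds, consistent with the message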
[Thu Dec 12 00:13:04 2019][237363.975560] LNet: Skipped 21 previous similar messages [Thu Dec 12 00:13:04 2019][237363.980831] LustreError: dumping log to /tmp/lustre-log.1576138384.67892 [Thu Dec 12 00:13:05 2019][237364.986631] LustreError: dumping log to /tmp/lustre-log.1576138385.67904 [Thu Dec 12 00:13:18 2019][237378.279904] LustreError: 112549:0:(ldlm_request.c:129:ldlm_expired_completion_wait()) ### lock timed out (enqueued at 1576138098, 300s ago); not entering recovery in server code, just going back to sleep ns: filter-fir-OST0056_UUID lock: ffff8920d77c7740/0x7066c9c1908c0acd lrc: 3/0,1 mode: --/PW res: [0x1880000402:0x2efb73:0x0].0x0 rrc: 3 type: EXT [0->18446744073709551615] (req 0->18446744073709551615) flags: 0x40010080000000 nid: local remote: 0x0 expref: -99 pid: 112549 timeout: 0 lvb_type: 0 [Thu Dec 12 00:13:18 2019][237378.323825] LustreError: 112549:0:(ldlm_request.c:129:ldlm_expired_completion_wait()) Skipped 1 previous similar message [Thu Dec 12 00:13:35 2019][237394.683221] LustreError: dumping log to /tmp/lustre-log.1576138415.67940 [Thu Dec 12 00:13:37 2019][237396.731277] LustreError: dumping log to /tmp/lustre-log.1576138417.67754 [Thu Dec 12 00:13:42 2019][237401.851365] LustreError: dumping log to /tmp/lustre-log.1576138422.67970 [Thu Dec 12 00:13:44 2019][237403.899424] LustreError: dumping log to /tmp/lustre-log.1576138424.67894 [Thu Dec 12 00:13:46 2019][237405.947447] LustreError: dumping log to /tmp/lustre-log.1576138426.67982 [Thu Dec 12 00:13:47 2019][237406.971470] LustreError: dumping log to /tmp/lustre-log.1576138427.66134 [Thu Dec 12 00:13:49 2019][237409.019513] LustreError: dumping log to /tmp/lustre-log.1576138429.68042 [Thu Dec 12 00:13:53 2019][237413.115598] LustreError: dumping log to /tmp/lustre-log.1576138433.68031 [Thu Dec 12 00:14:17 2019][237436.668067] LustreError: dumping log to /tmp/lustre-log.1576138457.67989 [Thu Dec 12 00:14:21 2019][237440.744152] LustreError: 67618:0:(ldlm_request.c:129:ldlm_expired_completion_wait()) ### lock timed out (enqueued at 1576138161, 300s ago); not entering recovery in server code, just going back to sleep ns: filter-fir-OST0056_UUID lock: ffff88f482c39f80/0x7066c9c1908d9f29 lrc: 3/0,1 mode: --/PW res: [0x1880000401:0x1112a1:0x0].0x0 rrc: 3 type: EXT [0->18446744073709551615] (req 0->18446744073709551615) flags: 0x40010080000000 nid: local remote: 0x0 expref: -99 pid: 67618 timeout: 0 lvb_type: 0 [Thu Dec 12 00:14:48 2019][237468.412705] LustreError: dumping log to /tmp/lustre-log.1576138488.68023 [Thu Dec 12 00:14:58 2019][237477.628890] LustreError: dumping log to /tmp/lustre-log.1576138498.67963 [Thu Dec 12 00:15:08 2019][237487.869097] LustreError: dumping log to /tmp/lustre-log.1576138508.113352 [Thu Dec 12 00:15:16 2019][237496.061263] LNet: Service thread pid 112519 was inactive for 313.06s. Watchdog stack traces are limited to 3 per 300 seconds, skipping this one. 
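The LNet watchdog keeps reporting longer and longer inactivity for the same pool of ptlrpc service threads (200 s, 236 s, 313 s above, rising past 1200 s further down). A small sketch for pulling the worst offenders out of the console text, again assuming it is saved under the hypothetical name "console.log":

# Minimal sketch: track the maximum reported inactivity per service thread from the
# "Service thread pid <N> was inactive for <X>s" watchdog records.
import re

INACTIVE_RE = re.compile(r"Service thread pid (\d+) was inactive for ([\d.]+)s")

worst = {}
with open("console.log") as f:
    for line in f:
        for pid, secs in INACTIVE_RE.findall(line):
            worst[pid] = max(worst.get(pid, 0.0), float(secs))

for pid, secs in sorted(worst.items(), key=lambda kv: -kv[1])[:10]:
    print(f"pid {pid}: inactive for up to {secs:.2f}s")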
[Thu Dec 12 00:15:16 2019][237496.074297] LNet: Skipped 20 previous similar messages [Thu Dec 12 00:15:16 2019][237496.079557] LustreError: dumping log to /tmp/lustre-log.1576138516.112519 [Thu Dec 12 00:15:35 2019][237514.549421] Lustre: fir-OST0054: Connection restored to 687b1eea-b865-b791-9de5-a67096eac725 (at 10.8.23.26@o2ib6) [Thu Dec 12 00:15:35 2019][237514.559886] Lustre: Skipped 2 previous similar messages [Thu Dec 12 00:15:48 2019][237527.781354] Lustre: fir-OST0054: Connection restored to ca09bd61-a4b3-111c-b997-9c7823236764 (at 10.8.22.17@o2ib6) [Thu Dec 12 00:15:48 2019][237527.791798] Lustre: Skipped 4 previous similar messages [Thu Dec 12 00:15:50 2019][237530.102940] Lustre: fir-OST0054: Connection restored to 00850750-7463-78da-94ee-623be2781c44 (at 10.8.22.22@o2ib6) [Thu Dec 12 00:15:50 2019][237530.113398] Lustre: Skipped 4 previous similar messages [Thu Dec 12 00:16:01 2019][237541.118188] LNet: Service thread pid 67986 was inactive for 362.65s. The thread might be hung, or it might only be slow and will resume later. Dumping the stack trace for debugging purposes: [Thu Dec 12 00:16:01 2019][237541.135213] Pid: 67986, comm: ll_ost_io01_039 3.10.0-957.27.2.el7_lustre.pl2.x86_64 #1 SMP Thu Nov 7 15:26:16 PST 2019 [Thu Dec 12 00:16:01 2019][237541.145992] Call Trace: [Thu Dec 12 00:16:01 2019][237541.148566] [] wait_transaction_locked+0x85/0xd0 [jbd2] [Thu Dec 12 00:16:01 2019][237541.155561] [] add_transaction_credits+0x268/0x2f0 [jbd2] [Thu Dec 12 00:16:01 2019][237541.162728] [] start_this_handle+0x1a1/0x430 [jbd2] [Thu Dec 12 00:16:01 2019][237541.169368] [] jbd2__journal_start+0xf3/0x1f0 [jbd2] [Thu Dec 12 00:16:01 2019][237541.176102] [] __ldiskfs_journal_start_sb+0x69/0xe0 [ldiskfs] [Thu Dec 12 00:16:01 2019][237541.183617] [] osd_trans_start+0x20e/0x4e0 [osd_ldiskfs] [Thu Dec 12 00:16:01 2019][237541.190700] [] ofd_commitrw_write+0xf1e/0x1d40 [ofd] [Thu Dec 12 00:16:01 2019][237541.197433] [] ofd_commitrw+0x48c/0x9e0 [ofd] [Thu Dec 12 00:16:01 2019][237541.203561] [] tgt_brw_write+0x10cb/0x1cf0 [ptlrpc] [Thu Dec 12 00:16:01 2019][237541.210260] [] tgt_request_handle+0xaea/0x1580 [ptlrpc] [Thu Dec 12 00:16:01 2019][237541.217289] [] ptlrpc_server_handle_request+0x24b/0xab0 [ptlrpc] [Thu Dec 12 00:16:01 2019][237541.225106] [] ptlrpc_main+0xb2c/0x1460 [ptlrpc] [Thu Dec 12 00:16:01 2019][237541.231519] [] kthread+0xd1/0xe0 [Thu Dec 12 00:16:01 2019][237541.236535] [] ret_from_fork_nospec_begin+0xe/0x21 [Thu Dec 12 00:16:01 2019][237541.243098] [] 0xffffffffffffffff [Thu Dec 12 00:16:01 2019][237541.248197] LustreError: dumping log to /tmp/lustre-log.1576138561.67986 [Thu Dec 12 00:16:02 2019][237542.142185] Pid: 67591, comm: ll_ost_io02_003 3.10.0-957.27.2.el7_lustre.pl2.x86_64 #1 SMP Thu Nov 7 15:26:16 PST 2019 [Thu Dec 12 00:16:02 2019][237542.152975] Call Trace: [Thu Dec 12 00:16:02 2019][237542.155545] [] wait_transaction_locked+0x85/0xd0 [jbd2] [Thu Dec 12 00:16:02 2019][237542.162566] [] add_transaction_credits+0x268/0x2f0 [jbd2] [Thu Dec 12 00:16:02 2019][237542.169736] [] start_this_handle+0x1a1/0x430 [jbd2] [Thu Dec 12 00:16:02 2019][237542.176415] [] jbd2__journal_start+0xf3/0x1f0 [jbd2] [Thu Dec 12 00:16:02 2019][237542.183151] [] __ldiskfs_journal_start_sb+0x69/0xe0 [ldiskfs] [Thu Dec 12 00:16:02 2019][237542.190720] [] osd_trans_start+0x20e/0x4e0 [osd_ldiskfs] [Thu Dec 12 00:16:02 2019][237542.197808] [] ofd_commitrw_write+0xf1e/0x1d40 [ofd] [Thu Dec 12 00:16:02 2019][237542.204580] [] ofd_commitrw+0x48c/0x9e0 [ofd] [Thu Dec 12 00:16:02 
2019][237542.210728] [] tgt_brw_write+0x10cb/0x1cf0 [ptlrpc] [Thu Dec 12 00:16:02 2019][237542.217443] [] tgt_request_handle+0xaea/0x1580 [ptlrpc] [Thu Dec 12 00:16:02 2019][237542.224476] [] ptlrpc_server_handle_request+0x24b/0xab0 [ptlrpc] [Thu Dec 12 00:16:02 2019][237542.232306] [] ptlrpc_main+0xb2c/0x1460 [ptlrpc] [Thu Dec 12 00:16:02 2019][237542.238722] [] kthread+0xd1/0xe0 [Thu Dec 12 00:16:02 2019][237542.243738] [] ret_from_fork_nospec_begin+0xe/0x21 [Thu Dec 12 00:16:02 2019][237542.250318] [] 0xffffffffffffffff [Thu Dec 12 00:16:02 2019][237542.255444] LustreError: dumping log to /tmp/lustre-log.1576138562.67591 [Thu Dec 12 00:16:07 2019][237546.903790] Lustre: fir-OST0054: Connection restored to a507eb44-8ff1-13e2-fab8-30d1823663f8 (at 10.8.22.24@o2ib6) [Thu Dec 12 00:16:07 2019][237546.914227] Lustre: Skipped 4 previous similar messages [Thu Dec 12 00:16:09 2019][237549.311321] LNet: Service thread pid 67746 was inactive for 362.01s. The thread might be hung, or it might only be slow and will resume later. Dumping the stack trace for debugging purposes: [Thu Dec 12 00:16:09 2019][237549.328363] LNet: Skipped 1 previous similar message [Thu Dec 12 00:16:09 2019][237549.333426] Pid: 67746, comm: ll_ost02_036 3.10.0-957.27.2.el7_lustre.pl2.x86_64 #1 SMP Thu Nov 7 15:26:16 PST 2019 [Thu Dec 12 00:16:09 2019][237549.343970] Call Trace: [Thu Dec 12 00:16:09 2019][237549.346539] [] wait_transaction_locked+0x85/0xd0 [jbd2] [Thu Dec 12 00:16:09 2019][237549.353560] [] add_transaction_credits+0x268/0x2f0 [jbd2] [Thu Dec 12 00:16:09 2019][237549.360748] [] start_this_handle+0x1a1/0x430 [jbd2] [Thu Dec 12 00:16:09 2019][237549.367413] [] jbd2__journal_start+0xf3/0x1f0 [jbd2] [Thu Dec 12 00:16:09 2019][237549.374169] [] __ldiskfs_journal_start_sb+0x69/0xe0 [ldiskfs] [Thu Dec 12 00:16:09 2019][237549.381704] [] osd_trans_start+0x20e/0x4e0 [osd_ldiskfs] [Thu Dec 12 00:16:09 2019][237549.388816] [] ofd_trans_start+0x75/0xf0 [ofd] [Thu Dec 12 00:16:09 2019][237549.395050] [] ofd_attr_set+0x464/0xb60 [ofd] [Thu Dec 12 00:16:09 2019][237549.401208] [] ofd_setattr_hdl+0x31d/0x8e0 [ofd] [Thu Dec 12 00:16:09 2019][237549.407622] [] tgt_request_handle+0xaea/0x1580 [ptlrpc] [Thu Dec 12 00:16:09 2019][237549.414743] [] ptlrpc_server_handle_request+0x24b/0xab0 [ptlrpc] [Thu Dec 12 00:16:09 2019][237549.422570] [] ptlrpc_main+0xb2c/0x1460 [ptlrpc] [Thu Dec 12 00:16:09 2019][237549.429009] [] kthread+0xd1/0xe0 [Thu Dec 12 00:16:09 2019][237549.434011] [] ret_from_fork_nospec_begin+0xe/0x21 [Thu Dec 12 00:16:09 2019][237549.440573] [] 0xffffffffffffffff [Thu Dec 12 00:16:09 2019][237549.445687] LustreError: dumping log to /tmp/lustre-log.1576138569.67746 [Thu Dec 12 00:16:13 2019][237553.406410] LNet: Service thread pid 67937 was inactive for 362.21s. The thread might be hung, or it might only be slow and will resume later. 
Dumping the stack trace for debugging purposes: [Thu Dec 12 00:16:13 2019][237553.423451] Pid: 67937, comm: ll_ost_io03_025 3.10.0-957.27.2.el7_lustre.pl2.x86_64 #1 SMP Thu Nov 7 15:26:16 PST 2019 [Thu Dec 12 00:16:13 2019][237553.434231] Call Trace: [Thu Dec 12 00:16:13 2019][237553.436808] [] wait_transaction_locked+0x85/0xd0 [jbd2] [Thu Dec 12 00:16:13 2019][237553.443803] [] add_transaction_credits+0x268/0x2f0 [jbd2] [Thu Dec 12 00:16:13 2019][237553.450988] [] start_this_handle+0x1a1/0x430 [jbd2] [Thu Dec 12 00:16:13 2019][237553.457639] [] jbd2__journal_start+0xf3/0x1f0 [jbd2] [Thu Dec 12 00:16:13 2019][237553.464381] [] __ldiskfs_journal_start_sb+0x69/0xe0 [ldiskfs] [Thu Dec 12 00:16:13 2019][237553.471910] [] osd_trans_start+0x20e/0x4e0 [osd_ldiskfs] [Thu Dec 12 00:16:13 2019][237553.479017] [] ofd_commitrw_write+0xf1e/0x1d40 [ofd] [Thu Dec 12 00:16:13 2019][237553.485759] [] ofd_commitrw+0x48c/0x9e0 [ofd] [Thu Dec 12 00:16:13 2019][237553.491940] [] tgt_brw_write+0x10cb/0x1cf0 [ptlrpc] [Thu Dec 12 00:16:13 2019][237553.498659] [] tgt_request_handle+0xaea/0x1580 [ptlrpc] [Thu Dec 12 00:16:13 2019][237553.505703] [] ptlrpc_server_handle_request+0x24b/0xab0 [ptlrpc] [Thu Dec 12 00:16:13 2019][237553.513505] [] ptlrpc_main+0xb2c/0x1460 [ptlrpc] [Thu Dec 12 00:16:13 2019][237553.519944] [] kthread+0xd1/0xe0 [Thu Dec 12 00:16:13 2019][237553.524967] [] ret_from_fork_nospec_begin+0xe/0x21 [Thu Dec 12 00:16:13 2019][237553.531529] [] 0xffffffffffffffff [Thu Dec 12 00:16:13 2019][237553.536641] LustreError: dumping log to /tmp/lustre-log.1576138573.67937 [Thu Dec 12 00:16:15 2019][237555.454451] Pid: 67964, comm: ll_ost_io02_043 3.10.0-957.27.2.el7_lustre.pl2.x86_64 #1 SMP Thu Nov 7 15:26:16 PST 2019 [Thu Dec 12 00:16:15 2019][237555.465236] Call Trace: [Thu Dec 12 00:16:15 2019][237555.467815] [] wait_transaction_locked+0x85/0xd0 [jbd2] [Thu Dec 12 00:16:15 2019][237555.474837] [] add_transaction_credits+0x268/0x2f0 [jbd2] [Thu Dec 12 00:16:15 2019][237555.482026] [] start_this_handle+0x1a1/0x430 [jbd2] [Thu Dec 12 00:16:15 2019][237555.488692] [] jbd2__journal_start+0xf3/0x1f0 [jbd2] [Thu Dec 12 00:16:15 2019][237555.495444] [] __ldiskfs_journal_start_sb+0x69/0xe0 [ldiskfs] [Thu Dec 12 00:16:15 2019][237555.502984] [] osd_trans_start+0x20e/0x4e0 [osd_ldiskfs] [Thu Dec 12 00:16:15 2019][237555.510090] [] ofd_commitrw_write+0xf1e/0x1d40 [ofd] [Thu Dec 12 00:16:15 2019][237555.516869] [] ofd_commitrw+0x48c/0x9e0 [ofd] [Thu Dec 12 00:16:15 2019][237555.523014] [] tgt_brw_write+0x10cb/0x1cf0 [ptlrpc] [Thu Dec 12 00:16:15 2019][237555.529756] [] tgt_request_handle+0xaea/0x1580 [ptlrpc] [Thu Dec 12 00:16:15 2019][237555.536842] [] ptlrpc_server_handle_request+0x24b/0xab0 [ptlrpc] [Thu Dec 12 00:16:16 2019][237555.544687] [] ptlrpc_main+0xb2c/0x1460 [ptlrpc] [Thu Dec 12 00:16:16 2019][237555.551136] [] kthread+0xd1/0xe0 [Thu Dec 12 00:16:16 2019][237555.556185] [] ret_from_fork_nospec_begin+0xe/0x21 [Thu Dec 12 00:16:16 2019][237555.562777] [] 0xffffffffffffffff [Thu Dec 12 00:16:16 2019][237555.567895] LustreError: dumping log to /tmp/lustre-log.1576138576.67964 [Thu Dec 12 00:16:18 2019][237557.502496] LustreError: dumping log to /tmp/lustre-log.1576138577.67973 [Thu Dec 12 00:16:19 2019][237558.526510] LustreError: dumping log to /tmp/lustre-log.1576138578.67960 [Thu Dec 12 00:16:22 2019][237561.598572] LustreError: dumping log to /tmp/lustre-log.1576138582.68049 [Thu Dec 12 00:16:24 2019][237563.646619] LustreError: dumping log to /tmp/lustre-log.1576138584.113613 [Thu Dec 12 00:16:25 
2019][237565.308284] Lustre: fir-OST0056: Export ffff89036eda0800 already connecting from 10.8.23.26@o2ib6 [Thu Dec 12 00:16:38 2019][237578.501849] Lustre: fir-OST0056: Export ffff88fc8fff5800 already connecting from 10.8.22.17@o2ib6 [Thu Dec 12 00:16:41 2019][237580.894384] Lustre: fir-OST0056: Export ffff88e42919e800 already connecting from 10.8.22.22@o2ib6 [Thu Dec 12 00:16:50 2019][237590.271148] LustreError: dumping log to /tmp/lustre-log.1576138610.68026 [Thu Dec 12 00:16:58 2019][237597.632231] Lustre: fir-OST0056: Export ffff8902f7fafc00 already connecting from 10.8.22.24@o2ib6 [Thu Dec 12 00:17:16 2019][237615.485143] Lustre: fir-OST0056: Export ffff89036eda0800 already connecting from 10.8.23.26@o2ib6 [Thu Dec 12 00:17:19 2019][237618.943722] LustreError: dumping log to /tmp/lustre-log.1576138639.68050 [Thu Dec 12 00:17:21 2019][237620.991766] LustreError: dumping log to /tmp/lustre-log.1576138641.67955 [Thu Dec 12 00:17:22 2019][237621.801792] Lustre: 67968:0:(service.c:1372:ptlrpc_at_send_early_reply()) @@@ Couldn't add any time (5/5), not sending early reply [Thu Dec 12 00:17:22 2019][237621.801792] req@ffff890f1b8c4850 x1652591481752896/t0(0) o4->0f4b0f7a-80c1-4@10.9.110.62@o2ib4:647/0 lens 1352/664 e 24 to 0 dl 1576138647 ref 2 fl Interpret:/0/0 rc 0/0 [Thu Dec 12 00:17:22 2019][237622.509804] Lustre: 68038:0:(service.c:1372:ptlrpc_at_send_early_reply()) @@@ Couldn't add any time (5/5), not sending early reply [Thu Dec 12 00:17:22 2019][237622.509804] req@ffff88e4f04ec050 x1652591481755520/t0(0) o4->0f4b0f7a-80c1-4@10.9.110.62@o2ib4:647/0 lens 1352/664 e 24 to 0 dl 1576138647 ref 2 fl Interpret:/0/0 rc 0/0 [Thu Dec 12 00:17:22 2019][237622.536983] Lustre: 68038:0:(service.c:1372:ptlrpc_at_send_early_reply()) Skipped 3 previous similar messages [Thu Dec 12 00:17:24 2019][237623.807832] Lustre: 67953:0:(service.c:1372:ptlrpc_at_send_early_reply()) @@@ Couldn't add any time (5/5), not sending early reply [Thu Dec 12 00:17:24 2019][237623.807832] req@ffff890712200050 x1648846345542000/t0(0) o4->75b6516e-d912-63bd-698a-8f68fc05bdf0@10.9.110.15@o2ib4:649/0 lens 488/448 e 24 to 0 dl 1576138649 ref 2 fl Interpret:/0/0 rc 0/0 [Thu Dec 12 00:17:24 2019][237623.836762] Lustre: 67953:0:(service.c:1372:ptlrpc_at_send_early_reply()) Skipped 1 previous similar message [Thu Dec 12 00:17:25 2019][237625.087842] LustreError: dumping log to /tmp/lustre-log.1576138645.67994 [Thu Dec 12 00:17:27 2019][237627.203908] Lustre: 67877:0:(service.c:1372:ptlrpc_at_send_early_reply()) @@@ Couldn't add any time (5/5), not sending early reply [Thu Dec 12 00:17:27 2019][237627.203908] req@ffff89224f575050 x1649530969658688/t0(0) o4->1c192c26-6a2d-8fff-8f45-c6fac242e547@10.9.104.15@o2ib4:652/0 lens 488/448 e 24 to 0 dl 1576138652 ref 2 fl Interpret:/0/0 rc 0/0 [Thu Dec 12 00:17:27 2019][237627.232835] Lustre: 67877:0:(service.c:1372:ptlrpc_at_send_early_reply()) Skipped 4 previous similar messages [Thu Dec 12 00:17:28 2019][237628.030913] Lustre: fir-OST0056: Client 0f4b0f7a-80c1-4 (at 10.9.110.62@o2ib4) reconnecting [Thu Dec 12 00:17:28 2019][237628.039357] Lustre: Skipped 5 previous similar messages [Thu Dec 12 00:17:28 2019][237628.044707] Lustre: fir-OST0056: Connection restored to d849fafe-3a33-7fd6-08c1-09a87a8abd8b (at 10.9.110.62@o2ib4) [Thu Dec 12 00:17:29 2019][237628.678722] Lustre: fir-OST0056: Export ffff88fc8fff5800 already connecting from 10.8.22.17@o2ib6 [Thu Dec 12 00:17:29 2019][237629.183925] LustreError: dumping log to /tmp/lustre-log.1576138649.67768 [Thu Dec 12 00:17:31 
2019][237631.231975] LustreError: dumping log to /tmp/lustre-log.1576138651.67995 [Thu Dec 12 00:17:33 2019][237633.050024] Lustre: 68022:0:(service.c:1372:ptlrpc_at_send_early_reply()) @@@ Couldn't add any time (5/5), not sending early reply [Thu Dec 12 00:17:33 2019][237633.050024] req@ffff8902fd4f3850 x1648532634593856/t0(0) o4->a5082367-d733-7058-3cc5-0eedec6c0c1c@10.8.30.16@o2ib6:658/0 lens 2488/448 e 24 to 0 dl 1576138658 ref 2 fl Interpret:/0/0 rc 0/0 [Thu Dec 12 00:17:33 2019][237633.078952] Lustre: 68022:0:(service.c:1372:ptlrpc_at_send_early_reply()) Skipped 6 previous similar messages [Thu Dec 12 00:17:33 2019][237633.280013] LustreError: dumping log to /tmp/lustre-log.1576138653.67952 [Thu Dec 12 00:17:34 2019][237634.304028] LustreError: dumping log to /tmp/lustre-log.1576138654.68005 [Thu Dec 12 00:17:35 2019][237635.328055] LustreError: dumping log to /tmp/lustre-log.1576138655.67897 [Thu Dec 12 00:17:37 2019][237637.376092] LustreError: dumping log to /tmp/lustre-log.1576138657.68039 [Thu Dec 12 00:17:38 2019][237638.085392] Lustre: fir-OST0056: Connection restored to (at 10.9.108.56@o2ib4) [Thu Dec 12 00:17:38 2019][237638.092791] Lustre: Skipped 9 previous similar messages [Thu Dec 12 00:17:39 2019][237639.424133] LustreError: dumping log to /tmp/lustre-log.1576138659.68006 [Thu Dec 12 00:17:42 2019][237642.232218] Lustre: 66135:0:(service.c:1372:ptlrpc_at_send_early_reply()) @@@ Couldn't add any time (5/5), not sending early reply [Thu Dec 12 00:17:42 2019][237642.232218] req@ffff891a4f63b050 x1649046890263936/t0(0) o10->7126efc2-9676-1db9-94d0-ae09c1520697@10.9.101.26@o2ib4:667/0 lens 440/432 e 17 to 0 dl 1576138667 ref 2 fl Interpret:/0/0 rc 0/0 [Thu Dec 12 00:17:42 2019][237642.261241] Lustre: 66135:0:(service.c:1372:ptlrpc_at_send_early_reply()) Skipped 10 previous similar messages [Thu Dec 12 00:17:43 2019][237642.629898] Lustre: fir-OST0056: deleting orphan objects from 0x1880000401:1118894 to 0x1880000401:1118913 [Thu Dec 12 00:17:46 2019][237645.568265] LustreError: dumping log to /tmp/lustre-log.1576138666.112531 [Thu Dec 12 00:17:48 2019][237647.809088] Lustre: fir-OST0056: Export ffff8902f7fafc00 already connecting from 10.8.22.24@o2ib6 [Thu Dec 12 00:17:48 2019][237647.818055] Lustre: Skipped 1 previous similar message [Thu Dec 12 00:17:55 2019][237654.608801] Lustre: fir-OST0056: Connection restored to 8d232f07-b6ab-bc70-4dd8-277e82f65db5 (at 10.9.107.58@o2ib4) [Thu Dec 12 00:17:55 2019][237654.619327] Lustre: Skipped 9 previous similar messages [Thu Dec 12 00:18:00 2019][237659.904550] LustreError: dumping log to /tmp/lustre-log.1576138680.67598 [Thu Dec 12 00:18:02 2019][237661.888598] Lustre: 67966:0:(service.c:1372:ptlrpc_at_send_early_reply()) @@@ Couldn't add any time (5/5), not sending early reply [Thu Dec 12 00:18:02 2019][237661.888598] req@ffff8903a9aa0850 x1649656820965840/t0(0) o4->d5b9405e-1c60-945f-2d9d-6a877d61380f@10.8.30.30@o2ib6:687/0 lens 1720/448 e 12 to 0 dl 1576138687 ref 2 fl Interpret:/0/0 rc 0/0 [Thu Dec 12 00:18:02 2019][237661.917553] Lustre: 67966:0:(service.c:1372:ptlrpc_at_send_early_reply()) Skipped 19 previous similar messages [Thu Dec 12 00:18:02 2019][237661.952589] LustreError: dumping log to /tmp/lustre-log.1576138682.68014 [Thu Dec 12 00:18:21 2019][237681.247793] Lustre: fir-OST0056: Export ffff88e42919e800 already connecting from 10.8.22.22@o2ib6 [Thu Dec 12 00:18:21 2019][237681.256757] Lustre: Skipped 2 previous similar messages [Thu Dec 12 00:18:27 2019][237687.211349] Lustre: fir-OST0056: Connection restored to 
ec8d663e-70c3-0c7c-9511-dfaaba3f32c1 (at 10.9.104.45@o2ib4) [Thu Dec 12 00:18:27 2019][237687.221878] Lustre: Skipped 7 previous similar messages [Thu Dec 12 00:18:34 2019][237693.697223] LustreError: dumping log to /tmp/lustre-log.1576138714.67884 [Thu Dec 12 00:18:34 2019][237694.529253] Lustre: 26939:0:(service.c:1372:ptlrpc_at_send_early_reply()) @@@ Couldn't add any time (5/5), not sending early reply [Thu Dec 12 00:18:35 2019][237694.529253] req@ffff8902569da050 x1649291658560656/t0(0) o4->89038d66-847b-1ff4-ff67-a551d70b6de8@10.9.110.70@o2ib4:719/0 lens 488/448 e 8 to 0 dl 1576138719 ref 2 fl Interpret:/0/0 rc 0/0 [Thu Dec 12 00:18:35 2019][237694.558093] Lustre: 26939:0:(service.c:1372:ptlrpc_at_send_early_reply()) Skipped 11 previous similar messages [Thu Dec 12 00:18:37 2019][237696.573443] Lustre: fir-OST0056: Client 72ec26e6-8490-9625-4bfc-aa584f79f189 (at 10.9.102.25@o2ib4) reconnecting [Thu Dec 12 00:18:37 2019][237696.583710] Lustre: Skipped 29 previous similar messages [Thu Dec 12 00:18:38 2019][237697.793305] LustreError: dumping log to /tmp/lustre-log.1576138718.67730 [Thu Dec 12 00:18:41 2019][237700.865378] LustreError: dumping log to /tmp/lustre-log.1576138721.67820 [Thu Dec 12 00:18:43 2019][237702.913411] LustreError: dumping log to /tmp/lustre-log.1576138723.67907 [Thu Dec 12 00:18:47 2019][237707.009489] LustreError: dumping log to /tmp/lustre-log.1576138727.68003 [Thu Dec 12 00:18:51 2019][237711.105570] LustreError: dumping log to /tmp/lustre-log.1576138731.67906 [Thu Dec 12 00:19:21 2019][237740.560319] Lustre: fir-OST0056: haven't heard from client 7e5bcac9-70c5-4 (at ) in 227 seconds. I think it's dead, and I am evicting it. exp ffff89036eda0800, cur 1576138761 expire 1576138611 last 1576138534 [Thu Dec 12 00:19:21 2019][237740.579755] Lustre: Skipped 5 previous similar messages [Thu Dec 12 00:19:28 2019][237748.163230] Lustre: fir-OST0056: Export ffff8902f7fafc00 already connecting from 10.8.22.24@o2ib6 [Thu Dec 12 00:19:28 2019][237748.172193] Lustre: Skipped 4 previous similar messages [Thu Dec 12 00:19:31 2019][237751.479667] Lustre: fir-OST0056: Connection restored to 860089bf-2de2-f0b4-c239-a266e1c756b4 (at 10.9.102.54@o2ib4) [Thu Dec 12 00:19:31 2019][237751.490187] Lustre: Skipped 19 previous similar messages [Thu Dec 12 00:19:40 2019][237760.386572] Lustre: 27017:0:(service.c:1372:ptlrpc_at_send_early_reply()) @@@ Couldn't add any time (5/5), not sending early reply [Thu Dec 12 00:19:40 2019][237760.386572] req@ffff89079a151850 x1648858224189808/t0(0) o4->8e4fe161-7440-1bc3-60cf-ef16452a7501@10.9.105.43@o2ib4:30/0 lens 6576/448 e 4 to 0 dl 1576138785 ref 2 fl Interpret:/0/0 rc 0/0 [Thu Dec 12 00:19:40 2019][237760.415403] Lustre: 27017:0:(service.c:1372:ptlrpc_at_send_early_reply()) Skipped 37 previous similar messages [Thu Dec 12 00:19:56 2019][237775.706527] Lustre: fir-OST0056: deleting orphan objects from 0x0:27479877 to 0x0:27479905 [Thu Dec 12 00:19:57 2019][237776.642884] LNet: Service thread pid 67991 was inactive for 512.09s. Watchdog stack traces are limited to 3 per 300 seconds, skipping this one. 
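Alongside the stalled threads, the target logs a steady stream of client churn: an eviction ("haven't heard from client ... in 227 seconds"), repeated "Export ... already connecting" records, and reconnect / "Connection restored" pairs. The rough sketch below groups those records by client NID; it splits the console text on the bracketed date stamp that begins each record, and "console.log" is again a hypothetical filename for the saved output.

# Rough sketch: summarize connection churn per client NID. Each console record starts
# with a bracketed date stamp, so the text is split on that pattern first.
import re
from collections import Counter

REC_SPLIT = re.compile(r"\[(?:Mon|Tue|Wed|Thu|Fri|Sat|Sun) [A-Z][a-z]{2}\s+\d+ [\d:]{8} \d{4}\]")
NID_RE = re.compile(r"\d+\.\d+\.\d+\.\d+@o2ib\d*")
KEYWORDS = ("reconnecting", "Connection restored", "already connecting", "evicting")

events = Counter()
with open("console.log") as f:
    text = f.read()
for record in REC_SPLIT.split(text):
    if any(k in record for k in KEYWORDS):
        for nid in NID_RE.findall(record):
            events[nid] += 1

for nid, n in events.most_common(10):
    print(f"{nid}: {n} connection-related record(s)")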
[Thu Dec 12 00:19:57 2019][237776.655854] LNet: Skipped 42 previous similar messages [Thu Dec 12 00:19:57 2019][237776.661088] LustreError: dumping log to /tmp/lustre-log.1576138797.67991 [Thu Dec 12 00:20:07 2019][237786.883094] LustreError: dumping log to /tmp/lustre-log.1576138807.67753 [Thu Dec 12 00:20:09 2019][237788.931149] LustreError: dumping log to /tmp/lustre-log.1576138809.67822 [Thu Dec 12 00:20:29 2019][237809.411550] LustreError: dumping log to /tmp/lustre-log.1576138829.68008 [Thu Dec 12 00:20:31 2019][237811.459584] LustreError: dumping log to /tmp/lustre-log.1576138831.67931 [Thu Dec 12 00:20:46 2019][237826.287005] Lustre: fir-OST0056: Client 7520ece1-a22b-161c-9a9c-7f1c99e6d5c6 (at 10.9.108.37@o2ib4) reconnecting [Thu Dec 12 00:20:46 2019][237826.297268] Lustre: Skipped 29 previous similar messages [Thu Dec 12 00:21:21 2019][237860.612574] LNet: Service thread pid 68004 was inactive for 563.65s. The thread might be hung, or it might only be slow and will resume later. Dumping the stack trace for debugging purposes: [Thu Dec 12 00:21:21 2019][237860.629597] LNet: Skipped 1 previous similar message [Thu Dec 12 00:21:21 2019][237860.634660] Pid: 68004, comm: ll_ost_io00_034 3.10.0-957.27.2.el7_lustre.pl2.x86_64 #1 SMP Thu Nov 7 15:26:16 PST 2019 [Thu Dec 12 00:21:21 2019][237860.645456] Call Trace: [Thu Dec 12 00:21:21 2019][237860.648027] [] wait_transaction_locked+0x85/0xd0 [jbd2] [Thu Dec 12 00:21:21 2019][237860.655041] [] add_transaction_credits+0x268/0x2f0 [jbd2] [Thu Dec 12 00:21:21 2019][237860.662208] [] start_this_handle+0x1a1/0x430 [jbd2] [Thu Dec 12 00:21:21 2019][237860.668886] [] jbd2__journal_start+0xf3/0x1f0 [jbd2] [Thu Dec 12 00:21:21 2019][237860.675626] [] __ldiskfs_journal_start_sb+0x69/0xe0 [ldiskfs] [Thu Dec 12 00:21:21 2019][237860.683185] [] osd_trans_start+0x20e/0x4e0 [osd_ldiskfs] [Thu Dec 12 00:21:21 2019][237860.690278] [] ofd_commitrw_write+0xf1e/0x1d40 [ofd] [Thu Dec 12 00:21:21 2019][237860.697014] [] ofd_commitrw+0x48c/0x9e0 [ofd] [Thu Dec 12 00:21:21 2019][237860.703141] [] tgt_brw_write+0x10cb/0x1cf0 [ptlrpc] [Thu Dec 12 00:21:21 2019][237860.709830] [] tgt_request_handle+0xaea/0x1580 [ptlrpc] [Thu Dec 12 00:21:21 2019][237860.716876] [] ptlrpc_server_handle_request+0x24b/0xab0 [ptlrpc] [Thu Dec 12 00:21:21 2019][237860.724678] [] ptlrpc_main+0xb2c/0x1460 [ptlrpc] [Thu Dec 12 00:21:21 2019][237860.731107] [] kthread+0xd1/0xe0 [Thu Dec 12 00:21:21 2019][237860.736109] [] ret_from_fork_nospec_begin+0xe/0x21 [Thu Dec 12 00:21:21 2019][237860.742703] [] 0xffffffffffffffff [Thu Dec 12 00:21:21 2019][237860.747812] LustreError: dumping log to /tmp/lustre-log.1576138881.68004 [Thu Dec 12 00:21:31 2019][237870.852778] Pid: 67777, comm: ll_ost01_049 3.10.0-957.27.2.el7_lustre.pl2.x86_64 #1 SMP Thu Nov 7 15:26:16 PST 2019 [Thu Dec 12 00:21:31 2019][237870.863330] Call Trace: [Thu Dec 12 00:21:31 2019][237870.865907] [] wait_transaction_locked+0x85/0xd0 [jbd2] [Thu Dec 12 00:21:31 2019][237870.872901] [] add_transaction_credits+0x268/0x2f0 [jbd2] [Thu Dec 12 00:21:31 2019][237870.880114] [] start_this_handle+0x1a1/0x430 [jbd2] [Thu Dec 12 00:21:31 2019][237870.886779] [] jbd2__journal_start+0xf3/0x1f0 [jbd2] [Thu Dec 12 00:21:31 2019][237870.893527] [] __ldiskfs_journal_start_sb+0x69/0xe0 [ldiskfs] [Thu Dec 12 00:21:31 2019][237870.901052] [] osd_trans_start+0x20e/0x4e0 [osd_ldiskfs] [Thu Dec 12 00:21:31 2019][237870.908193] [] ofd_trans_start+0x75/0xf0 [ofd] [Thu Dec 12 00:21:31 2019][237870.914407] [] ofd_destroy+0x5d0/0x960 [ofd] [Thu Dec 12 
00:21:31 2019][237870.920445] [] ofd_destroy_by_fid+0x1f4/0x4a0 [ofd] [Thu Dec 12 00:21:31 2019][237870.927094] [] ofd_destroy_hdl+0x267/0x970 [ofd] [Thu Dec 12 00:21:31 2019][237870.933480] [] tgt_request_handle+0xaea/0x1580 [ptlrpc] [Thu Dec 12 00:21:31 2019][237870.940528] [] ptlrpc_server_handle_request+0x24b/0xab0 [ptlrpc] [Thu Dec 12 00:21:31 2019][237870.948336] [] ptlrpc_main+0xb2c/0x1460 [ptlrpc] [Thu Dec 12 00:21:31 2019][237870.954762] [] kthread+0xd1/0xe0 [Thu Dec 12 00:21:31 2019][237870.959769] [] ret_from_fork_nospec_begin+0xe/0x21 [Thu Dec 12 00:21:31 2019][237870.966329] [] 0xffffffffffffffff [Thu Dec 12 00:21:31 2019][237870.971437] LustreError: dumping log to /tmp/lustre-log.1576138891.67777 [Thu Dec 12 00:21:31 2019][237870.978842] Pid: 67725, comm: ll_ost01_041 3.10.0-957.27.2.el7_lustre.pl2.x86_64 #1 SMP Thu Nov 7 15:26:16 PST 2019 [Thu Dec 12 00:21:31 2019][237870.989398] Call Trace: [Thu Dec 12 00:21:31 2019][237870.991974] [] wait_transaction_locked+0x85/0xd0 [jbd2] [Thu Dec 12 00:21:31 2019][237870.998970] [] add_transaction_credits+0x268/0x2f0 [jbd2] [Thu Dec 12 00:21:31 2019][237871.006135] [] start_this_handle+0x1a1/0x430 [jbd2] [Thu Dec 12 00:21:31 2019][237871.012783] [] jbd2__journal_start+0xf3/0x1f0 [jbd2] [Thu Dec 12 00:21:31 2019][237871.019533] [] __ldiskfs_journal_start_sb+0x69/0xe0 [ldiskfs] [Thu Dec 12 00:21:31 2019][237871.027059] [] osd_trans_start+0x20e/0x4e0 [osd_ldiskfs] [Thu Dec 12 00:21:31 2019][237871.034147] [] ofd_trans_start+0x75/0xf0 [ofd] [Thu Dec 12 00:21:31 2019][237871.040378] [] ofd_destroy+0x5d0/0x960 [ofd] [Thu Dec 12 00:21:31 2019][237871.046436] [] ofd_destroy_by_fid+0x1f4/0x4a0 [ofd] [Thu Dec 12 00:21:31 2019][237871.053092] [] ofd_destroy_hdl+0x267/0x970 [ofd] [Thu Dec 12 00:21:31 2019][237871.059482] [] tgt_request_handle+0xaea/0x1580 [ptlrpc] [Thu Dec 12 00:21:31 2019][237871.066575] [] ptlrpc_server_handle_request+0x24b/0xab0 [ptlrpc] [Thu Dec 12 00:21:31 2019][237871.074388] [] ptlrpc_main+0xb2c/0x1460 [ptlrpc] [Thu Dec 12 00:21:31 2019][237871.080810] [] kthread+0xd1/0xe0 [Thu Dec 12 00:21:31 2019][237871.085811] [] ret_from_fork_nospec_begin+0xe/0x21 [Thu Dec 12 00:21:31 2019][237871.092398] [] 0xffffffffffffffff [Thu Dec 12 00:21:40 2019][237879.563699] Lustre: fir-OST0056: Export ffff88fc8fff5800 already connecting from 10.8.22.17@o2ib6 [Thu Dec 12 00:21:40 2019][237879.572669] Lustre: Skipped 8 previous similar messages [Thu Dec 12 00:21:52 2019][237891.845210] Lustre: 27074:0:(service.c:1372:ptlrpc_at_send_early_reply()) @@@ Couldn't add any time (5/5), not sending early reply [Thu Dec 12 00:21:52 2019][237891.845210] req@ffff88f2fbd54050 x1650958282745904/t0(0) o4->0c302cf4-1147-d945-dfa2-e9bc796b3175@10.9.101.32@o2ib4:162/0 lens 7904/448 e 3 to 0 dl 1576138917 ref 2 fl Interpret:/0/0 rc 0/0 [Thu Dec 12 00:21:52 2019][237891.874134] Lustre: 27074:0:(service.c:1372:ptlrpc_at_send_early_reply()) Skipped 52 previous similar messages [Thu Dec 12 00:22:07 2019][237907.467083] Lustre: fir-OST0056: Connection restored to b4c9913c-f59e-b8ac-70a9-c2d8d6c39257 (at 10.9.101.34@o2ib4) [Thu Dec 12 00:22:07 2019][237907.477608] Lustre: Skipped 23 previous similar messages [Thu Dec 12 00:22:18 2019][237917.957722] LNet: Service thread pid 68041 was inactive for 612.09s. The thread might be hung, or it might only be slow and will resume later. 
Dumping the stack trace for debugging purposes: [Thu Dec 12 00:22:18 2019][237917.974748] LNet: Skipped 2 previous similar messages [Thu Dec 12 00:22:18 2019][237917.979913] Pid: 68041, comm: ll_ost_io00_053 3.10.0-957.27.2.el7_lustre.pl2.x86_64 #1 SMP Thu Nov 7 15:26:16 PST 2019 [Thu Dec 12 00:22:18 2019][237917.990726] Call Trace: [Thu Dec 12 00:22:18 2019][237917.993304] [] wait_transaction_locked+0x85/0xd0 [jbd2] [Thu Dec 12 00:22:18 2019][237918.000304] [] add_transaction_credits+0x268/0x2f0 [jbd2] [Thu Dec 12 00:22:18 2019][237918.007520] [] start_this_handle+0x1a1/0x430 [jbd2] [Thu Dec 12 00:22:18 2019][237918.014166] [] jbd2__journal_start+0xf3/0x1f0 [jbd2] [Thu Dec 12 00:22:18 2019][237918.020918] [] __ldiskfs_journal_start_sb+0x69/0xe0 [ldiskfs] [Thu Dec 12 00:22:18 2019][237918.028459] [] osd_trans_start+0x20e/0x4e0 [osd_ldiskfs] [Thu Dec 12 00:22:18 2019][237918.035563] [] ofd_commitrw_write+0xf1e/0x1d40 [ofd] [Thu Dec 12 00:22:18 2019][237918.042325] [] ofd_commitrw+0x48c/0x9e0 [ofd] [Thu Dec 12 00:22:18 2019][237918.048452] [] tgt_brw_write+0x10cb/0x1cf0 [ptlrpc] [Thu Dec 12 00:22:18 2019][237918.055153] [] tgt_request_handle+0xaea/0x1580 [ptlrpc] [Thu Dec 12 00:22:18 2019][237918.062197] [] ptlrpc_server_handle_request+0x24b/0xab0 [ptlrpc] [Thu Dec 12 00:22:18 2019][237918.070008] [] ptlrpc_main+0xb2c/0x1460 [ptlrpc] [Thu Dec 12 00:22:18 2019][237918.076475] [] kthread+0xd1/0xe0 [Thu Dec 12 00:22:18 2019][237918.081484] [] ret_from_fork_nospec_begin+0xe/0x21 [Thu Dec 12 00:22:18 2019][237918.088079] [] 0xffffffffffffffff [Thu Dec 12 00:22:18 2019][237918.093185] LustreError: dumping log to /tmp/lustre-log.1576138938.68041 [Thu Dec 12 00:22:20 2019][237920.005760] Pid: 67941, comm: ll_ost_io01_029 3.10.0-957.27.2.el7_lustre.pl2.x86_64 #1 SMP Thu Nov 7 15:26:16 PST 2019 [Thu Dec 12 00:22:20 2019][237920.016543] Call Trace: [Thu Dec 12 00:22:20 2019][237920.019118] [] wait_transaction_locked+0x85/0xd0 [jbd2] [Thu Dec 12 00:22:20 2019][237920.026145] [] add_transaction_credits+0x268/0x2f0 [jbd2] [Thu Dec 12 00:22:20 2019][237920.033333] [] start_this_handle+0x1a1/0x430 [jbd2] [Thu Dec 12 00:22:20 2019][237920.040001] [] jbd2__journal_start+0xf3/0x1f0 [jbd2] [Thu Dec 12 00:22:20 2019][237920.046772] [] __ldiskfs_journal_start_sb+0x69/0xe0 [ldiskfs] [Thu Dec 12 00:22:20 2019][237920.054298] [] osd_trans_start+0x20e/0x4e0 [osd_ldiskfs] [Thu Dec 12 00:22:20 2019][237920.061400] [] ofd_trans_start+0x75/0xf0 [ofd] [Thu Dec 12 00:22:20 2019][237920.067637] [] ofd_object_punch+0x73d/0xd30 [ofd] [Thu Dec 12 00:22:20 2019][237920.074111] [] ofd_punch_hdl+0x493/0xa30 [ofd] [Thu Dec 12 00:22:20 2019][237920.080340] [] tgt_request_handle+0xaea/0x1580 [ptlrpc] [Thu Dec 12 00:22:20 2019][237920.087405] [] ptlrpc_server_handle_request+0x24b/0xab0 [ptlrpc] [Thu Dec 12 00:22:20 2019][237920.095229] [] ptlrpc_main+0xb2c/0x1460 [ptlrpc] [Thu Dec 12 00:22:20 2019][237920.101681] [] kthread+0xd1/0xe0 [Thu Dec 12 00:22:20 2019][237920.106680] [] ret_from_fork_nospec_begin+0xe/0x21 [Thu Dec 12 00:22:20 2019][237920.113267] [] 0xffffffffffffffff [Thu Dec 12 00:22:20 2019][237920.118406] LustreError: dumping log to /tmp/lustre-log.1576138940.67941 [Thu Dec 12 00:22:28 2019][237928.197929] LustreError: dumping log to /tmp/lustre-log.1576138948.67750 [Thu Dec 12 00:23:48 2019][238008.071530] LustreError: dumping log to /tmp/lustre-log.1576139028.112488 [Thu Dec 12 00:23:52 2019][238011.731862] Lustre: fir-OST0056: deleting orphan objects from 0x1880000400:11777327 to 0x1880000400:11777377 [Thu Dec 12 
00:23:54 2019][238014.215651] LustreError: dumping log to /tmp/lustre-log.1576139034.112522 [Thu Dec 12 00:24:49 2019][238069.512760] LustreError: dumping log to /tmp/lustre-log.1576139089.67896 [Thu Dec 12 00:24:56 2019][238075.656890] LustreError: dumping log to /tmp/lustre-log.1576139096.67930 [Thu Dec 12 00:25:07 2019][238087.489627] Lustre: fir-OST0056: Client dffc1cc0-26ab-9b78-f3a0-8d9b8d410b62 (at 10.9.108.46@o2ib4) reconnecting [Thu Dec 12 00:25:07 2019][238087.499887] Lustre: Skipped 17 previous similar messages [Thu Dec 12 00:26:02 2019][238142.343776] Lustre: fir-OST0056: Export ffff891d99820800 already connecting from 10.8.23.26@o2ib6 [Thu Dec 12 00:26:02 2019][238142.352760] Lustre: Skipped 26 previous similar messages [Thu Dec 12 00:26:14 2019][238153.610462] Lustre: 27127:0:(service.c:1372:ptlrpc_at_send_early_reply()) @@@ Couldn't add any time (5/5), not sending early reply [Thu Dec 12 00:26:14 2019][238153.610462] req@ffff88f2acc5c050 x1650958603467184/t0(0) o4->717fa73e-8071-a76f-931e-8957a8ca32aa@10.9.101.41@o2ib4:424/0 lens 2056/448 e 2 to 0 dl 1576139179 ref 2 fl Interpret:/0/0 rc 0/0 [Thu Dec 12 00:26:14 2019][238153.639401] Lustre: 27127:0:(service.c:1372:ptlrpc_at_send_early_reply()) Skipped 21 previous similar messages [Thu Dec 12 00:26:15 2019][238155.530496] LustreError: dumping log to /tmp/lustre-log.1576139175.67974 [Thu Dec 12 00:26:20 2019][238159.626567] LustreError: dumping log to /tmp/lustre-log.1576139180.67949 [Thu Dec 12 00:26:34 2019][238173.612853] Lustre: fir-OST0056: Connection restored to bb7d080c-8ae8-f7ed-5d33-d34ca54d93de (at 10.9.108.19@o2ib4) [Thu Dec 12 00:26:34 2019][238173.623374] Lustre: Skipped 16 previous similar messages [Thu Dec 12 00:26:38 2019][238178.058942] LNet: Service thread pid 67807 was inactive for 763.33s. The thread might be hung, or it might only be slow and will resume later. 
Dumping the stack trace for debugging purposes: [Thu Dec 12 00:26:38 2019][238178.075968] LNet: Skipped 1 previous similar message [Thu Dec 12 00:26:38 2019][238178.081030] Pid: 67807, comm: ll_ost01_056 3.10.0-957.27.2.el7_lustre.pl2.x86_64 #1 SMP Thu Nov 7 15:26:16 PST 2019 [Thu Dec 12 00:26:38 2019][238178.091578] Call Trace: [Thu Dec 12 00:26:38 2019][238178.094150] [] wait_transaction_locked+0x85/0xd0 [jbd2] [Thu Dec 12 00:26:38 2019][238178.101149] [] add_transaction_credits+0x268/0x2f0 [jbd2] [Thu Dec 12 00:26:38 2019][238178.108346] [] start_this_handle+0x1a1/0x430 [jbd2] [Thu Dec 12 00:26:38 2019][238178.114996] [] jbd2__journal_start+0xf3/0x1f0 [jbd2] [Thu Dec 12 00:26:38 2019][238178.121761] [] __ldiskfs_journal_start_sb+0x69/0xe0 [ldiskfs] [Thu Dec 12 00:26:38 2019][238178.129290] [] osd_trans_start+0x20e/0x4e0 [osd_ldiskfs] [Thu Dec 12 00:26:38 2019][238178.136394] [] ofd_trans_start+0x75/0xf0 [ofd] [Thu Dec 12 00:26:38 2019][238178.142611] [] ofd_destroy+0x5d0/0x960 [ofd] [Thu Dec 12 00:26:38 2019][238178.148665] [] ofd_destroy_by_fid+0x1f4/0x4a0 [ofd] [Thu Dec 12 00:26:38 2019][238178.155314] [] ofd_destroy_hdl+0x267/0x970 [ofd] [Thu Dec 12 00:26:38 2019][238178.161718] [] tgt_request_handle+0xaea/0x1580 [ptlrpc] [Thu Dec 12 00:26:38 2019][238178.168767] [] ptlrpc_server_handle_request+0x24b/0xab0 [ptlrpc] [Thu Dec 12 00:26:38 2019][238178.176590] [] ptlrpc_main+0xb2c/0x1460 [ptlrpc] [Thu Dec 12 00:26:38 2019][238178.183006] [] kthread+0xd1/0xe0 [Thu Dec 12 00:26:38 2019][238178.188023] [] ret_from_fork_nospec_begin+0xe/0x21 [Thu Dec 12 00:26:38 2019][238178.194586] [] 0xffffffffffffffff [Thu Dec 12 00:26:38 2019][238178.199699] LustreError: dumping log to /tmp/lustre-log.1576139198.67807 [Thu Dec 12 00:27:21 2019][238221.067805] Pid: 87152, comm: ll_ost_io01_067 3.10.0-957.27.2.el7_lustre.pl2.x86_64 #1 SMP Thu Nov 7 15:26:16 PST 2019 [Thu Dec 12 00:27:21 2019][238221.078590] Call Trace: [Thu Dec 12 00:27:21 2019][238221.081166] [] wait_transaction_locked+0x85/0xd0 [jbd2] [Thu Dec 12 00:27:21 2019][238221.088162] [] add_transaction_credits+0x268/0x2f0 [jbd2] [Thu Dec 12 00:27:21 2019][238221.095345] [] start_this_handle+0x1a1/0x430 [jbd2] [Thu Dec 12 00:27:21 2019][238221.101992] [] jbd2__journal_start+0xf3/0x1f0 [jbd2] [Thu Dec 12 00:27:21 2019][238221.108728] [] __ldiskfs_journal_start_sb+0x69/0xe0 [ldiskfs] [Thu Dec 12 00:27:21 2019][238221.116254] [] osd_trans_start+0x20e/0x4e0 [osd_ldiskfs] [Thu Dec 12 00:27:21 2019][238221.123365] [] ofd_trans_start+0x75/0xf0 [ofd] [Thu Dec 12 00:27:21 2019][238221.129581] [] ofd_commitrw_write+0xa31/0x1d40 [ofd] [Thu Dec 12 00:27:21 2019][238221.136316] [] ofd_commitrw+0x48c/0x9e0 [ofd] [Thu Dec 12 00:27:21 2019][238221.142458] [] tgt_brw_write+0x10cb/0x1cf0 [ptlrpc] [Thu Dec 12 00:27:21 2019][238221.149174] [] tgt_request_handle+0xaea/0x1580 [ptlrpc] [Thu Dec 12 00:27:21 2019][238221.156206] [] ptlrpc_server_handle_request+0x24b/0xab0 [ptlrpc] [Thu Dec 12 00:27:21 2019][238221.164013] [] ptlrpc_main+0xb2c/0x1460 [ptlrpc] [Thu Dec 12 00:27:21 2019][238221.170453] [] kthread+0xd1/0xe0 [Thu Dec 12 00:27:21 2019][238221.175456] [] ret_from_fork_nospec_begin+0xe/0x21 [Thu Dec 12 00:27:21 2019][238221.182057] [] 0xffffffffffffffff [Thu Dec 12 00:27:21 2019][238221.187166] LustreError: dumping log to /tmp/lustre-log.1576139241.87152 [Thu Dec 12 00:27:23 2019][238223.115840] Pid: 67714, comm: ll_ost_io03_004 3.10.0-957.27.2.el7_lustre.pl2.x86_64 #1 SMP Thu Nov 7 15:26:16 PST 2019 [Thu Dec 12 00:27:23 2019][238223.126625] Call Trace: [Thu Dec 
12 00:27:23 2019][238223.129215] [] wait_transaction_locked+0x85/0xd0 [jbd2] [Thu Dec 12 00:27:23 2019][238223.136213] [] add_transaction_credits+0x268/0x2f0 [jbd2] [Thu Dec 12 00:27:23 2019][238223.143412] [] start_this_handle+0x1a1/0x430 [jbd2] [Thu Dec 12 00:27:23 2019][238223.150064] [] jbd2__journal_start+0xf3/0x1f0 [jbd2] [Thu Dec 12 00:27:23 2019][238223.156814] [] __ldiskfs_journal_start_sb+0x69/0xe0 [ldiskfs] [Thu Dec 12 00:27:23 2019][238223.164354] [] osd_trans_start+0x20e/0x4e0 [osd_ldiskfs] [Thu Dec 12 00:27:23 2019][238223.171463] [] ofd_commitrw_write+0xf1e/0x1d40 [ofd] [Thu Dec 12 00:27:23 2019][238223.178227] [] ofd_commitrw+0x48c/0x9e0 [ofd] [Thu Dec 12 00:27:23 2019][238223.184378] [] tgt_brw_write+0x10cb/0x1cf0 [ptlrpc] [Thu Dec 12 00:27:23 2019][238223.191099] [] tgt_request_handle+0xaea/0x1580 [ptlrpc] [Thu Dec 12 00:27:23 2019][238223.198145] [] ptlrpc_server_handle_request+0x24b/0xab0 [ptlrpc] [Thu Dec 12 00:27:23 2019][238223.205958] [] ptlrpc_main+0xb2c/0x1460 [ptlrpc] [Thu Dec 12 00:27:23 2019][238223.212387] [] kthread+0xd1/0xe0 [Thu Dec 12 00:27:23 2019][238223.217389] [] ret_from_fork_nospec_begin+0xe/0x21 [Thu Dec 12 00:27:23 2019][238223.224000] [] 0xffffffffffffffff [Thu Dec 12 00:27:23 2019][238223.229100] LustreError: dumping log to /tmp/lustre-log.1576139243.67714 [Thu Dec 12 00:27:25 2019][238225.163886] Pid: 67657, comm: ll_ost02_023 3.10.0-957.27.2.el7_lustre.pl2.x86_64 #1 SMP Thu Nov 7 15:26:16 PST 2019 [Thu Dec 12 00:27:25 2019][238225.174436] Call Trace: [Thu Dec 12 00:27:25 2019][238225.177012] [] wait_transaction_locked+0x85/0xd0 [jbd2] [Thu Dec 12 00:27:25 2019][238225.184019] [] add_transaction_credits+0x268/0x2f0 [jbd2] [Thu Dec 12 00:27:25 2019][238225.191282] [] start_this_handle+0x1a1/0x430 [jbd2] [Thu Dec 12 00:27:25 2019][238225.197939] [] jbd2__journal_start+0xf3/0x1f0 [jbd2] [Thu Dec 12 00:27:25 2019][238225.204725] [] __ldiskfs_journal_start_sb+0x69/0xe0 [ldiskfs] [Thu Dec 12 00:27:25 2019][238225.212282] [] osd_trans_start+0x20e/0x4e0 [osd_ldiskfs] [Thu Dec 12 00:27:25 2019][238225.219407] [] ofd_trans_start+0x75/0xf0 [ofd] [Thu Dec 12 00:27:25 2019][238225.225636] [] ofd_destroy+0x5d0/0x960 [ofd] [Thu Dec 12 00:27:25 2019][238225.231696] [] ofd_destroy_by_fid+0x1f4/0x4a0 [ofd] [Thu Dec 12 00:27:25 2019][238225.238360] [] ofd_destroy_hdl+0x267/0x970 [ofd] [Thu Dec 12 00:27:25 2019][238225.244768] [] tgt_request_handle+0xaea/0x1580 [ptlrpc] [Thu Dec 12 00:27:25 2019][238225.251844] [] ptlrpc_server_handle_request+0x24b/0xab0 [ptlrpc] [Thu Dec 12 00:27:25 2019][238225.259685] [] ptlrpc_main+0xb2c/0x1460 [ptlrpc] [Thu Dec 12 00:27:25 2019][238225.266158] [] kthread+0xd1/0xe0 [Thu Dec 12 00:27:25 2019][238225.271178] [] ret_from_fork_nospec_begin+0xe/0x21 [Thu Dec 12 00:27:25 2019][238225.277742] [] 0xffffffffffffffff [Thu Dec 12 00:27:25 2019][238225.282870] LustreError: dumping log to /tmp/lustre-log.1576139245.67657 [Thu Dec 12 00:29:53 2019][238372.622843] LNet: Service thread pid 67943 was inactive for 912.97s. The thread might be hung, or it might only be slow and will resume later. 
Dumping the stack trace for debugging purposes: [Thu Dec 12 00:29:53 2019][238372.639862] LNet: Skipped 3 previous similar messages [Thu Dec 12 00:29:53 2019][238372.645009] Pid: 67943, comm: ll_ost_io01_030 3.10.0-957.27.2.el7_lustre.pl2.x86_64 #1 SMP Thu Nov 7 15:26:16 PST 2019 [Thu Dec 12 00:29:53 2019][238372.655808] Call Trace: [Thu Dec 12 00:29:53 2019][238372.658383] [] wait_transaction_locked+0x85/0xd0 [jbd2] [Thu Dec 12 00:29:53 2019][238372.665382] [] add_transaction_credits+0x268/0x2f0 [jbd2] [Thu Dec 12 00:29:53 2019][238372.672563] [] start_this_handle+0x1a1/0x430 [jbd2] [Thu Dec 12 00:29:53 2019][238372.679214] [] jbd2__journal_start+0xf3/0x1f0 [jbd2] [Thu Dec 12 00:29:53 2019][238372.685964] [] __ldiskfs_journal_start_sb+0x69/0xe0 [ldiskfs] [Thu Dec 12 00:29:53 2019][238372.693489] [] osd_trans_start+0x20e/0x4e0 [osd_ldiskfs] [Thu Dec 12 00:29:53 2019][238372.700596] [] ofd_trans_start+0x75/0xf0 [ofd] [Thu Dec 12 00:29:53 2019][238372.706810] [] ofd_object_punch+0x73d/0xd30 [ofd] [Thu Dec 12 00:29:53 2019][238372.713299] [] ofd_punch_hdl+0x493/0xa30 [ofd] [Thu Dec 12 00:29:53 2019][238372.719514] [] tgt_request_handle+0xaea/0x1580 [ptlrpc] [Thu Dec 12 00:29:53 2019][238372.726575] [] ptlrpc_server_handle_request+0x24b/0xab0 [ptlrpc] [Thu Dec 12 00:29:53 2019][238372.734379] [] ptlrpc_main+0xb2c/0x1460 [ptlrpc] [Thu Dec 12 00:29:53 2019][238372.740806] [] kthread+0xd1/0xe0 [Thu Dec 12 00:29:53 2019][238372.745809] [] ret_from_fork_nospec_begin+0xe/0x21 [Thu Dec 12 00:29:53 2019][238372.752386] [] 0xffffffffffffffff [Thu Dec 12 00:29:53 2019][238372.757485] LustreError: dumping log to /tmp/lustre-log.1576139393.67943 [Thu Dec 12 00:31:10 2019][238450.448399] LNet: Service thread pid 67762 was inactive for 964.37s. Watchdog stack traces are limited to 3 per 300 seconds, skipping this one. [Thu Dec 12 00:31:10 2019][238450.461346] LNet: Skipped 13 previous similar messages [Thu Dec 12 00:31:10 2019][238450.466583] LustreError: dumping log to /tmp/lustre-log.1576139470.67762 [Thu Dec 12 00:31:19 2019][238458.640557] LustreError: dumping log to /tmp/lustre-log.1576139479.67745 [Thu Dec 12 00:32:20 2019][238520.081793] LNet: Service thread pid 66128 was inactive for 1015.91s. The thread might be hung, or it might only be slow and will resume later. 
Dumping the stack trace for debugging purposes: [Thu Dec 12 00:32:20 2019][238520.098899] Pid: 66128, comm: ll_ost_io01_001 3.10.0-957.27.2.el7_lustre.pl2.x86_64 #1 SMP Thu Nov 7 15:26:16 PST 2019 [Thu Dec 12 00:32:20 2019][238520.109680] Call Trace: [Thu Dec 12 00:32:20 2019][238520.112254] [] wait_transaction_locked+0x85/0xd0 [jbd2] [Thu Dec 12 00:32:20 2019][238520.119251] [] add_transaction_credits+0x268/0x2f0 [jbd2] [Thu Dec 12 00:32:20 2019][238520.126436] [] start_this_handle+0x1a1/0x430 [jbd2] [Thu Dec 12 00:32:20 2019][238520.133085] [] jbd2__journal_start+0xf3/0x1f0 [jbd2] [Thu Dec 12 00:32:20 2019][238520.139835] [] __ldiskfs_journal_start_sb+0x69/0xe0 [ldiskfs] [Thu Dec 12 00:32:20 2019][238520.147359] [] osd_trans_start+0x20e/0x4e0 [osd_ldiskfs] [Thu Dec 12 00:32:20 2019][238520.154463] [] ofd_trans_start+0x75/0xf0 [ofd] [Thu Dec 12 00:32:20 2019][238520.160680] [] ofd_commitrw_write+0xa31/0x1d40 [ofd] [Thu Dec 12 00:32:20 2019][238520.167431] [] ofd_commitrw+0x48c/0x9e0 [ofd] [Thu Dec 12 00:32:20 2019][238520.173560] [] tgt_brw_write+0x10cb/0x1cf0 [ptlrpc] [Thu Dec 12 00:32:20 2019][238520.180275] [] tgt_request_handle+0xaea/0x1580 [ptlrpc] [Thu Dec 12 00:32:20 2019][238520.187319] [] ptlrpc_server_handle_request+0x24b/0xab0 [ptlrpc] [Thu Dec 12 00:32:20 2019][238520.195137] [] ptlrpc_main+0xb2c/0x1460 [ptlrpc] [Thu Dec 12 00:32:20 2019][238520.201552] [] kthread+0xd1/0xe0 [Thu Dec 12 00:32:20 2019][238520.206553] [] ret_from_fork_nospec_begin+0xe/0x21 [Thu Dec 12 00:32:20 2019][238520.213122] [] 0xffffffffffffffff [Thu Dec 12 00:32:20 2019][238520.218222] LustreError: dumping log to /tmp/lustre-log.1576139540.66128 [Thu Dec 12 00:32:28 2019][238528.273957] Pid: 112533, comm: ll_ost02_080 3.10.0-957.27.2.el7_lustre.pl2.x86_64 #1 SMP Thu Nov 7 15:26:16 PST 2019 [Thu Dec 12 00:32:28 2019][238528.284564] Call Trace: [Thu Dec 12 00:32:28 2019][238528.287134] [] wait_transaction_locked+0x85/0xd0 [jbd2] [Thu Dec 12 00:32:28 2019][238528.294131] [] add_transaction_credits+0x268/0x2f0 [jbd2] [Thu Dec 12 00:32:28 2019][238528.301329] [] start_this_handle+0x1a1/0x430 [jbd2] [Thu Dec 12 00:32:28 2019][238528.307979] [] jbd2__journal_start+0xf3/0x1f0 [jbd2] [Thu Dec 12 00:32:28 2019][238528.314729] [] __ldiskfs_journal_start_sb+0x69/0xe0 [ldiskfs] [Thu Dec 12 00:32:28 2019][238528.322255] [] osd_trans_start+0x20e/0x4e0 [osd_ldiskfs] [Thu Dec 12 00:32:28 2019][238528.329349] [] tgt_client_data_update+0x303/0x5e0 [ptlrpc] [Thu Dec 12 00:32:28 2019][238528.336660] [] tgt_client_new+0x41b/0x610 [ptlrpc] [Thu Dec 12 00:32:28 2019][238528.343270] [] ofd_obd_connect+0x3a3/0x4c0 [ofd] [Thu Dec 12 00:32:28 2019][238528.349658] [] target_handle_connect+0xecb/0x2b10 [ptlrpc] [Thu Dec 12 00:32:28 2019][238528.356953] [] tgt_request_handle+0x50a/0x1580 [ptlrpc] [Thu Dec 12 00:32:28 2019][238528.363985] [] ptlrpc_server_handle_request+0x24b/0xab0 [ptlrpc] [Thu Dec 12 00:32:28 2019][238528.371814] [] ptlrpc_main+0xb2c/0x1460 [ptlrpc] [Thu Dec 12 00:32:28 2019][238528.378225] [] kthread+0xd1/0xe0 [Thu Dec 12 00:32:28 2019][238528.383241] [] ret_from_fork_nospec_begin+0xe/0x21 [Thu Dec 12 00:32:28 2019][238528.389805] [] 0xffffffffffffffff [Thu Dec 12 00:32:28 2019][238528.394917] LustreError: dumping log to /tmp/lustre-log.1576139548.112533 [Thu Dec 12 00:32:40 2019][238540.562197] Pid: 67868, comm: ll_ost01_068 3.10.0-957.27.2.el7_lustre.pl2.x86_64 #1 SMP Thu Nov 7 15:26:16 PST 2019 [Thu Dec 12 00:32:40 2019][238540.572718] Call Trace: [Thu Dec 12 00:32:41 2019][238540.575287] [] 
wait_transaction_locked+0x85/0xd0 [jbd2] [Thu Dec 12 00:32:41 2019][238540.582297] [] add_transaction_credits+0x268/0x2f0 [jbd2] [Thu Dec 12 00:32:41 2019][238540.589492] [] start_this_handle+0x1a1/0x430 [jbd2] [Thu Dec 12 00:32:41 2019][238540.596161] [] jbd2__journal_start+0xf3/0x1f0 [jbd2] [Thu Dec 12 00:32:41 2019][238540.602912] [] __ldiskfs_journal_start_sb+0x69/0xe0 [ldiskfs] [Thu Dec 12 00:32:41 2019][238540.610438] [] osd_trans_start+0x20e/0x4e0 [osd_ldiskfs] [Thu Dec 12 00:32:41 2019][238540.617541] [] tgt_client_data_update+0x303/0x5e0 [ptlrpc] [Thu Dec 12 00:32:41 2019][238540.624850] [] tgt_client_new+0x41b/0x610 [ptlrpc] [Thu Dec 12 00:32:41 2019][238540.631461] [] ofd_obd_connect+0x3a3/0x4c0 [ofd] [Thu Dec 12 00:32:41 2019][238540.637849] [] target_handle_connect+0xecb/0x2b10 [ptlrpc] [Thu Dec 12 00:32:41 2019][238540.645143] [] tgt_request_handle+0x50a/0x1580 [ptlrpc] [Thu Dec 12 00:32:41 2019][238540.652178] [] ptlrpc_server_handle_request+0x24b/0xab0 [ptlrpc] [Thu Dec 12 00:32:41 2019][238540.660000] [] ptlrpc_main+0xb2c/0x1460 [ptlrpc] [Thu Dec 12 00:32:41 2019][238540.666434] [] kthread+0xd1/0xe0 [Thu Dec 12 00:32:41 2019][238540.671448] [] ret_from_fork_nospec_begin+0xe/0x21 [Thu Dec 12 00:32:41 2019][238540.678013] [] 0xffffffffffffffff [Thu Dec 12 00:32:41 2019][238540.683124] LustreError: dumping log to /tmp/lustre-log.1576139561.67868 [Thu Dec 12 00:32:45 2019][238544.658296] Pid: 67671, comm: ll_ost00_024 3.10.0-957.27.2.el7_lustre.pl2.x86_64 #1 SMP Thu Nov 7 15:26:16 PST 2019 [Thu Dec 12 00:32:45 2019][238544.668816] Call Trace: [Thu Dec 12 00:32:45 2019][238544.671388] [] wait_transaction_locked+0x85/0xd0 [jbd2] [Thu Dec 12 00:32:45 2019][238544.678391] [] add_transaction_credits+0x268/0x2f0 [jbd2] [Thu Dec 12 00:32:45 2019][238544.685574] [] start_this_handle+0x1a1/0x430 [jbd2] [Thu Dec 12 00:32:45 2019][238544.692225] [] jbd2__journal_start+0xf3/0x1f0 [jbd2] [Thu Dec 12 00:32:45 2019][238544.698987] [] __ldiskfs_journal_start_sb+0x69/0xe0 [ldiskfs] [Thu Dec 12 00:32:45 2019][238544.706532] [] osd_trans_start+0x20e/0x4e0 [osd_ldiskfs] [Thu Dec 12 00:32:45 2019][238544.713636] [] tgt_client_data_update+0x303/0x5e0 [ptlrpc] [Thu Dec 12 00:32:45 2019][238544.720941] [] tgt_client_new+0x41b/0x610 [ptlrpc] [Thu Dec 12 00:32:45 2019][238544.727567] [] ofd_obd_connect+0x3a3/0x4c0 [ofd] [Thu Dec 12 00:32:45 2019][238544.733956] [] target_handle_connect+0xecb/0x2b10 [ptlrpc] [Thu Dec 12 00:32:45 2019][238544.741259] [] tgt_request_handle+0x50a/0x1580 [ptlrpc] [Thu Dec 12 00:32:45 2019][238544.748283] [] ptlrpc_server_handle_request+0x24b/0xab0 [ptlrpc] [Thu Dec 12 00:32:45 2019][238544.756097] [] ptlrpc_main+0xb2c/0x1460 [ptlrpc] [Thu Dec 12 00:32:45 2019][238544.762514] [] kthread+0xd1/0xe0 [Thu Dec 12 00:32:45 2019][238544.767543] [] ret_from_fork_nospec_begin+0xe/0x21 [Thu Dec 12 00:32:45 2019][238544.774117] [] 0xffffffffffffffff [Thu Dec 12 00:32:45 2019][238544.779231] LustreError: dumping log to /tmp/lustre-log.1576139565.67671 [Thu Dec 12 00:33:05 2019][238565.138725] Pid: 26830, comm: ll_ost_io03_048 3.10.0-957.27.2.el7_lustre.pl2.x86_64 #1 SMP Thu Nov 7 15:26:16 PST 2019 [Thu Dec 12 00:33:05 2019][238565.149503] Call Trace: [Thu Dec 12 00:33:05 2019][238565.152072] [] wait_transaction_locked+0x85/0xd0 [jbd2] [Thu Dec 12 00:33:05 2019][238565.159070] [] add_transaction_credits+0x268/0x2f0 [jbd2] [Thu Dec 12 00:33:05 2019][238565.166256] [] start_this_handle+0x1a1/0x430 [jbd2] [Thu Dec 12 00:33:05 2019][238565.172919] [] jbd2__journal_start+0xf3/0x1f0 [jbd2] 
[Thu Dec 12 00:33:05 2019][238565.179669] [] __ldiskfs_journal_start_sb+0x69/0xe0 [ldiskfs] [Thu Dec 12 00:33:05 2019][238565.187195] [] osd_trans_start+0x20e/0x4e0 [osd_ldiskfs] [Thu Dec 12 00:33:05 2019][238565.194299] [] ofd_trans_start+0x75/0xf0 [ofd] [Thu Dec 12 00:33:05 2019][238565.200515] [] ofd_object_punch+0x73d/0xd30 [ofd] [Thu Dec 12 00:33:05 2019][238565.207004] [] ofd_punch_hdl+0x493/0xa30 [ofd] [Thu Dec 12 00:33:05 2019][238565.213221] [] tgt_request_handle+0xaea/0x1580 [ptlrpc] [Thu Dec 12 00:33:05 2019][238565.220281] [] ptlrpc_server_handle_request+0x24b/0xab0 [ptlrpc] [Thu Dec 12 00:33:05 2019][238565.228085] [] ptlrpc_main+0xb2c/0x1460 [ptlrpc] [Thu Dec 12 00:33:05 2019][238565.234513] [] kthread+0xd1/0xe0 [Thu Dec 12 00:33:05 2019][238565.239520] [] ret_from_fork_nospec_begin+0xe/0x21 [Thu Dec 12 00:33:05 2019][238565.246118] [] 0xffffffffffffffff [Thu Dec 12 00:33:05 2019][238565.251220] LustreError: dumping log to /tmp/lustre-log.1576139585.26830 [Thu Dec 12 00:33:50 2019][238610.195609] LustreError: dumping log to /tmp/lustre-log.1576139630.112514 [Thu Dec 12 00:33:51 2019][238610.850728] Lustre: fir-OST0056: Client 8c3ccd99-dc20-24d2-79f4-f8d2c0329cfb (at 10.8.24.19@o2ib6) reconnecting [Thu Dec 12 00:33:51 2019][238610.860907] Lustre: Skipped 92 previous similar messages [Thu Dec 12 00:34:46 2019][238666.472729] Lustre: 112543:0:(service.c:1372:ptlrpc_at_send_early_reply()) @@@ Couldn't add any time (5/-150), not sending early reply [Thu Dec 12 00:34:46 2019][238666.472729] req@ffff88f2af871850 x1652161335594432/t0(0) o19->ae1d0080-04fa-5436-e145-ffdf0db9990d@10.0.10.3@o2ib7:181/0 lens 336/336 e 0 to 0 dl 1576139691 ref 2 fl Interpret:/0/0 rc 0/0 [Thu Dec 12 00:34:47 2019][238666.501801] Lustre: 112543:0:(service.c:1372:ptlrpc_at_send_early_reply()) Skipped 192 previous similar messages [Thu Dec 12 00:34:56 2019][238675.732910] LustreError: dumping log to /tmp/lustre-log.1576139696.68007 [Thu Dec 12 00:35:00 2019][238679.828995] LustreError: dumping log to /tmp/lustre-log.1576139700.67771 [Thu Dec 12 00:35:02 2019][238682.395663] Lustre: fir-OST0056: Export ffff88fc8fff5800 already connecting from 10.8.22.17@o2ib6 [Thu Dec 12 00:35:02 2019][238682.404631] Lustre: Skipped 52 previous similar messages [Thu Dec 12 00:35:04 2019][238683.925084] LustreError: dumping log to /tmp/lustre-log.1576139704.67958 [Thu Dec 12 00:35:09 2019][238688.549482] Lustre: fir-OST0056: Connection restored to dffc1cc0-26ab-9b78-f3a0-8d9b8d410b62 (at 10.9.108.46@o2ib4) [Thu Dec 12 00:35:09 2019][238688.560020] Lustre: Skipped 91 previous similar messages [Thu Dec 12 00:36:05 2019][238745.366307] LustreError: dumping log to /tmp/lustre-log.1576139765.67936 [Thu Dec 12 00:36:09 2019][238749.462393] LustreError: dumping log to /tmp/lustre-log.1576139769.67813 [Thu Dec 12 00:36:14 2019][238753.558481] LustreError: dumping log to /tmp/lustre-log.1576139773.68021 [Thu Dec 12 00:36:18 2019][238757.654552] LustreError: dumping log to /tmp/lustre-log.1576139778.67946 [Thu Dec 12 00:36:22 2019][238761.750633] LustreError: dumping log to /tmp/lustre-log.1576139782.67662 [Thu Dec 12 00:36:26 2019][238765.846715] LustreError: dumping log to /tmp/lustre-log.1576139786.112537 [Thu Dec 12 00:37:07 2019][238806.807531] LustreError: dumping log to /tmp/lustre-log.1576139827.67957 [Thu Dec 12 00:37:31 2019][238831.384032] LNet: Service thread pid 68010 was inactive for 1201.76s. The thread might be hung, or it might only be slow and will resume later. 
Dumping the stack trace for debugging purposes: [Thu Dec 12 00:37:31 2019][238831.401160] LNet: Skipped 4 previous similar messages [Thu Dec 12 00:37:31 2019][238831.406327] Pid: 68010, comm: ll_ost_io00_036 3.10.0-957.27.2.el7_lustre.pl2.x86_64 #1 SMP Thu Nov 7 15:26:16 PST 2019 [Thu Dec 12 00:37:31 2019][238831.417115] Call Trace: [Thu Dec 12 00:37:31 2019][238831.419670] [] call_rwsem_down_read_failed+0x18/0x30 [Thu Dec 12 00:37:31 2019][238831.426409] [] osd_read_lock+0x5c/0xe0 [osd_ldiskfs] [Thu Dec 12 00:37:31 2019][238831.433181] [] ofd_preprw_write.isra.31+0xd3/0xea0 [ofd] [Thu Dec 12 00:37:31 2019][238831.440274] [] ofd_preprw+0x422/0x11b0 [ofd] [Thu Dec 12 00:37:31 2019][238831.446328] [] tgt_brw_write+0xc7c/0x1cf0 [ptlrpc] [Thu Dec 12 00:37:31 2019][238831.452947] [] tgt_request_handle+0xaea/0x1580 [ptlrpc] [Thu Dec 12 00:37:31 2019][238831.459990] [] ptlrpc_server_handle_request+0x24b/0xab0 [ptlrpc] [Thu Dec 12 00:37:31 2019][238831.467792] [] ptlrpc_main+0xb2c/0x1460 [ptlrpc] [Thu Dec 12 00:37:31 2019][238831.474222] [] kthread+0xd1/0xe0 [Thu Dec 12 00:37:31 2019][238831.479224] [] ret_from_fork_nospec_begin+0xe/0x21 [Thu Dec 12 00:37:31 2019][238831.485800] [] 0xffffffffffffffff [Thu Dec 12 00:37:31 2019][238831.490910] LustreError: dumping log to /tmp/lustre-log.1576139851.68010 [Thu Dec 12 00:37:31 2019][238831.498312] Pid: 67948, comm: ll_ost_io03_026 3.10.0-957.27.2.el7_lustre.pl2.x86_64 #1 SMP Thu Nov 7 15:26:16 PST 2019 [Thu Dec 12 00:37:31 2019][238831.509122] Call Trace: [Thu Dec 12 00:37:31 2019][238831.511670] [] __lock_page+0x74/0x90 [Thu Dec 12 00:37:31 2019][238831.517017] [] __find_lock_page+0x54/0x70 [Thu Dec 12 00:37:31 2019][238831.522805] [] find_or_create_page+0x34/0xa0 [Thu Dec 12 00:37:31 2019][238831.528841] [] osd_bufs_get+0x413/0x870 [osd_ldiskfs] [Thu Dec 12 00:37:31 2019][238831.535684] [] ofd_preprw_write.isra.31+0x476/0xea0 [ofd] [Thu Dec 12 00:37:31 2019][238831.542856] [] ofd_preprw+0x422/0x11b0 [ofd] [Thu Dec 12 00:37:31 2019][238831.548910] [] tgt_brw_write+0xc7c/0x1cf0 [ptlrpc] [Thu Dec 12 00:37:31 2019][238831.555499] [] tgt_request_handle+0xaea/0x1580 [ptlrpc] [Thu Dec 12 00:37:31 2019][238831.562550] [] ptlrpc_server_handle_request+0x24b/0xab0 [ptlrpc] [Thu Dec 12 00:37:32 2019][238831.570346] [] ptlrpc_main+0xb2c/0x1460 [ptlrpc] [Thu Dec 12 00:37:32 2019][238831.576776] [] kthread+0xd1/0xe0 [Thu Dec 12 00:37:32 2019][238831.581769] [] ret_from_fork_nospec_begin+0xe/0x21 [Thu Dec 12 00:37:32 2019][238831.588337] [] 0xffffffffffffffff [Thu Dec 12 00:37:32 2019][238831.593430] Pid: 68051, comm: ll_ost_io01_063 3.10.0-957.27.2.el7_lustre.pl2.x86_64 #1 SMP Thu Nov 7 15:26:16 PST 2019 [Thu Dec 12 00:37:32 2019][238831.604226] Call Trace: [Thu Dec 12 00:37:32 2019][238831.606770] [] __lock_page+0x74/0x90 [Thu Dec 12 00:37:32 2019][238831.612113] [] __find_lock_page+0x54/0x70 [Thu Dec 12 00:37:32 2019][238831.617900] [] find_or_create_page+0x34/0xa0 [Thu Dec 12 00:37:32 2019][238831.623950] [] osd_bufs_get+0x413/0x870 [osd_ldiskfs] [Thu Dec 12 00:37:32 2019][238831.630785] [] ofd_preprw_write.isra.31+0x476/0xea0 [ofd] [Thu Dec 12 00:37:32 2019][238831.637957] [] ofd_preprw+0x422/0x11b0 [ofd] [Thu Dec 12 00:37:32 2019][238831.644016] [] tgt_brw_write+0xc7c/0x1cf0 [ptlrpc] [Thu Dec 12 00:37:32 2019][238831.650602] [] tgt_request_handle+0xaea/0x1580 [ptlrpc] [Thu Dec 12 00:37:32 2019][238831.657639] [] ptlrpc_server_handle_request+0x24b/0xab0 [ptlrpc] [Thu Dec 12 00:37:32 2019][238831.665438] [] ptlrpc_main+0xb2c/0x1460 [ptlrpc] [Thu Dec 12 00:37:32 
2019][238831.671867] [] kthread+0xd1/0xe0 [Thu Dec 12 00:37:32 2019][238831.676863] [] ret_from_fork_nospec_begin+0xe/0x21 [Thu Dec 12 00:37:32 2019][238831.683430] [] 0xffffffffffffffff [Thu Dec 12 00:37:32 2019][238831.688529] Pid: 26875, comm: ll_ost_io03_050 3.10.0-957.27.2.el7_lustre.pl2.x86_64 #1 SMP Thu Nov 7 15:26:16 PST 2019 [Thu Dec 12 00:37:32 2019][238831.699317] Call Trace: [Thu Dec 12 00:37:32 2019][238831.701860] [] call_rwsem_down_read_failed+0x18/0x30 [Thu Dec 12 00:37:32 2019][238831.708584] [] osd_read_lock+0x5c/0xe0 [osd_ldiskfs] [Thu Dec 12 00:37:32 2019][238831.715335] [] ofd_preprw_write.isra.31+0xd3/0xea0 [ofd] [Thu Dec 12 00:37:32 2019][238831.722415] [] ofd_preprw+0x422/0x11b0 [ofd] [Thu Dec 12 00:37:32 2019][238831.728474] [] tgt_brw_write+0xc7c/0x1cf0 [ptlrpc] [Thu Dec 12 00:37:32 2019][238831.735060] [] tgt_request_handle+0xaea/0x1580 [ptlrpc] [Thu Dec 12 00:37:32 2019][238831.742095] [] ptlrpc_server_handle_request+0x24b/0xab0 [ptlrpc] [Thu Dec 12 00:37:32 2019][238831.749898] [] ptlrpc_main+0xb2c/0x1460 [ptlrpc] [Thu Dec 12 00:37:32 2019][238831.756328] [] kthread+0xd1/0xe0 [Thu Dec 12 00:37:32 2019][238831.761321] [] ret_from_fork_nospec_begin+0xe/0x21 [Thu Dec 12 00:37:32 2019][238831.767887] [] 0xffffffffffffffff [Thu Dec 12 00:37:32 2019][238831.772981] Pid: 67980, comm: ll_ost_io00_027 3.10.0-957.27.2.el7_lustre.pl2.x86_64 #1 SMP Thu Nov 7 15:26:16 PST 2019 [Thu Dec 12 00:37:32 2019][238831.783777] Call Trace: [Thu Dec 12 00:37:32 2019][238831.786323] [] __lock_page+0x74/0x90 [Thu Dec 12 00:37:32 2019][238831.791664] [] __find_lock_page+0x54/0x70 [Thu Dec 12 00:37:32 2019][238831.797452] [] find_or_create_page+0x34/0xa0 [Thu Dec 12 00:37:32 2019][238831.803485] [] osd_bufs_get+0x413/0x870 [osd_ldiskfs] [Thu Dec 12 00:37:32 2019][238831.810322] [] ofd_preprw_write.isra.31+0x476/0xea0 [ofd] [Thu Dec 12 00:37:32 2019][238831.817507] [] ofd_preprw+0x422/0x11b0 [ofd] [Thu Dec 12 00:37:32 2019][238831.823563] [] tgt_brw_write+0xc7c/0x1cf0 [ptlrpc] [Thu Dec 12 00:37:32 2019][238831.830153] [] tgt_request_handle+0xaea/0x1580 [ptlrpc] [Thu Dec 12 00:37:32 2019][238831.837187] [] ptlrpc_server_handle_request+0x24b/0xab0 [ptlrpc] [Thu Dec 12 00:37:32 2019][238831.844990] [] ptlrpc_main+0xb2c/0x1460 [ptlrpc] [Thu Dec 12 00:37:32 2019][238831.851418] [] kthread+0xd1/0xe0 [Thu Dec 12 00:37:32 2019][238831.856414] [] ret_from_fork_nospec_begin+0xe/0x21 [Thu Dec 12 00:37:32 2019][238831.862967] [] 0xffffffffffffffff [Thu Dec 12 00:37:35 2019][238835.480122] LustreError: dumping log to /tmp/lustre-log.1576139855.26895 [Thu Dec 12 00:37:40 2019][238839.576203] LustreError: dumping log to /tmp/lustre-log.1576139859.26898 [Thu Dec 12 00:37:44 2019][238843.672276] LustreError: dumping log to /tmp/lustre-log.1576139864.67947 [Thu Dec 12 00:37:48 2019][238847.768367] LustreError: dumping log to /tmp/lustre-log.1576139868.26831 [Thu Dec 12 00:37:52 2019][238851.864439] LustreError: dumping log to /tmp/lustre-log.1576139872.68047 [Thu Dec 12 00:37:56 2019][238855.960522] LustreError: dumping log to /tmp/lustre-log.1576139876.67817 [Thu Dec 12 00:38:00 2019][238860.056602] LustreError: dumping log to /tmp/lustre-log.1576139880.67942 [Thu Dec 12 00:38:12 2019][238872.344849] LustreError: dumping log to /tmp/lustre-log.1576139892.26937 [Thu Dec 12 00:38:16 2019][238876.440930] LustreError: dumping log to /tmp/lustre-log.1576139896.67966 [Thu Dec 12 00:38:20 2019][238880.537017] LustreError: dumping log to /tmp/lustre-log.1576139900.26919 [Thu Dec 12 00:38:25 2019][238884.633101] 
LustreError: dumping log to /tmp/lustre-log.1576139905.26943 [Thu Dec 12 00:38:29 2019][238888.729178] LustreError: dumping log to /tmp/lustre-log.1576139909.26936 [Thu Dec 12 00:38:37 2019][238896.921342] LustreError: dumping log to /tmp/lustre-log.1576139917.26948 [Thu Dec 12 00:38:41 2019][238901.017427] LustreError: dumping log to /tmp/lustre-log.1576139921.67987 [Thu Dec 12 00:38:45 2019][238905.113505] LustreError: dumping log to /tmp/lustre-log.1576139925.26969 [Thu Dec 12 00:38:49 2019][238909.209608] LustreError: dumping log to /tmp/lustre-log.1576139929.26907 [Thu Dec 12 00:38:53 2019][238913.305672] LustreError: dumping log to /tmp/lustre-log.1576139933.113359 [Thu Dec 12 00:38:57 2019][238917.401754] LustreError: dumping log to /tmp/lustre-log.1576139937.26971 [Thu Dec 12 00:39:01 2019][238921.497840] LustreError: dumping log to /tmp/lustre-log.1576139941.26973 [Thu Dec 12 00:39:06 2019][238925.593923] LustreError: dumping log to /tmp/lustre-log.1576139946.67758 [Thu Dec 12 00:39:10 2019][238929.689999] LustreError: dumping log to /tmp/lustre-log.1576139950.67755 [Thu Dec 12 00:39:13 2019][238933.234077] LustreError: 67644:0:(ldlm_request.c:129:ldlm_expired_completion_wait()) ### lock timed out (enqueued at 1576139653, 300s ago); not entering recovery in server code, just going back to sleep ns: filter-fir-OST0054_UUID lock: ffff88e87d178480/0x7066c9c190adca24 lrc: 3/0,1 mode: --/PW res: [0x1800000402:0x110c27:0x0].0x0 rrc: 3 type: EXT [0->18446744073709551615] (req 0->18446744073709551615) flags: 0x40010080000000 nid: local remote: 0x0 expref: -99 pid: 67644 timeout: 0 lvb_type: 0 [Thu Dec 12 00:39:18 2019][238937.882165] LustreError: dumping log to /tmp/lustre-log.1576139958.26985 [Thu Dec 12 00:39:22 2019][238941.978246] LustreError: dumping log to /tmp/lustre-log.1576139962.67881 [Thu Dec 12 00:39:26 2019][238946.074325] LustreError: dumping log to /tmp/lustre-log.1576139966.26952 [Thu Dec 12 00:39:30 2019][238950.170405] LustreError: dumping log to /tmp/lustre-log.1576139970.26989 [Thu Dec 12 00:39:34 2019][238954.266489] LustreError: dumping log to /tmp/lustre-log.1576139974.66132 [Thu Dec 12 00:39:51 2019][238970.650816] LustreError: dumping log to /tmp/lustre-log.1576139991.67834 [Thu Dec 12 00:39:59 2019][238978.842981] LustreError: dumping log to /tmp/lustre-log.1576139999.67698 [Thu Dec 12 00:40:03 2019][238982.939064] LustreError: dumping log to /tmp/lustre-log.1576140003.26966 [Thu Dec 12 00:40:07 2019][238987.035146] LustreError: dumping log to /tmp/lustre-log.1576140007.27043 [Thu Dec 12 00:40:11 2019][238991.131230] LustreError: dumping log to /tmp/lustre-log.1576140011.26987 [Thu Dec 12 00:40:15 2019][238995.227321] LustreError: dumping log to /tmp/lustre-log.1576140015.26997 [Thu Dec 12 00:40:32 2019][239011.611649] LustreError: dumping log to /tmp/lustre-log.1576140032.27049 [Thu Dec 12 00:40:36 2019][239015.707840] LustreError: dumping log to /tmp/lustre-log.1576140036.26916 [Thu Dec 12 00:40:40 2019][239019.803804] LustreError: dumping log to /tmp/lustre-log.1576140040.27044 [Thu Dec 12 00:40:44 2019][239023.899892] LustreError: dumping log to /tmp/lustre-log.1576140044.26918 [Thu Dec 12 00:40:48 2019][239027.995968] LustreError: dumping log to /tmp/lustre-log.1576140048.26944 [Thu Dec 12 00:41:00 2019][239040.284214] LustreError: dumping log to /tmp/lustre-log.1576140060.27021 [Thu Dec 12 00:41:09 2019][239048.476388] LustreError: dumping log to /tmp/lustre-log.1576140068.27004 [Thu Dec 12 00:41:33 2019][239072.928876] LustreError: 
67683:0:(ldlm_request.c:129:ldlm_expired_completion_wait()) ### lock timed out (enqueued at 1576139793, 300s ago); not entering recovery in server code, just going back to sleep ns: filter-fir-OST0054_UUID lock: ffff88fc261efbc0/0x7066c9c190add4e3 lrc: 3/0,1 mode: --/PW res: [0x1800000402:0x110c28:0x0].0x0 rrc: 3 type: EXT [0->18446744073709551615] (req 0->18446744073709551615) flags: 0x40010080000000 nid: local remote: 0x0 expref: -99 pid: 67683 timeout: 0 lvb_type: 0 [Thu Dec 12 00:41:33 2019][239073.052881] LNet: Service thread pid 26990 was inactive for 1201.65s. Watchdog stack traces are limited to 3 per 300 seconds, skipping this one. [Thu Dec 12 00:41:33 2019][239073.065932] LNet: Skipped 178 previous similar messages [Thu Dec 12 00:41:33 2019][239073.071248] LustreError: dumping log to /tmp/lustre-log.1576140093.26990 [Thu Dec 12 00:41:37 2019][239077.148952] LustreError: dumping log to /tmp/lustre-log.1576140097.27090 [Thu Dec 12 00:42:18 2019][239118.109781] LustreError: dumping log to /tmp/lustre-log.1576140138.112535 [Thu Dec 12 00:42:22 2019][239122.205857] LustreError: dumping log to /tmp/lustre-log.1576140142.27112 [Thu Dec 12 00:42:43 2019][239142.686281] Pid: 27079, comm: ll_ost_io03_076 3.10.0-957.27.2.el7_lustre.pl2.x86_64 #1 SMP Thu Nov 7 15:26:16 PST 2019 [Thu Dec 12 00:42:43 2019][239142.697061] Call Trace: [Thu Dec 12 00:42:43 2019][239142.699631] [] wait_transaction_locked+0x85/0xd0 [jbd2] [Thu Dec 12 00:42:43 2019][239142.706638] [] add_transaction_credits+0x268/0x2f0 [jbd2] [Thu Dec 12 00:42:43 2019][239142.713838] [] start_this_handle+0x1a1/0x430 [jbd2] [Thu Dec 12 00:42:43 2019][239142.720485] [] jbd2__journal_start+0xf3/0x1f0 [jbd2] [Thu Dec 12 00:42:43 2019][239142.727235] [] __ldiskfs_journal_start_sb+0x69/0xe0 [ldiskfs] [Thu Dec 12 00:42:43 2019][239142.734761] [] osd_trans_start+0x20e/0x4e0 [osd_ldiskfs] [Thu Dec 12 00:42:43 2019][239142.741866] [] ofd_commitrw_write+0xf1e/0x1d40 [ofd] [Thu Dec 12 00:42:43 2019][239142.748611] [] ofd_commitrw+0x48c/0x9e0 [ofd] [Thu Dec 12 00:42:43 2019][239142.754752] [] tgt_brw_write+0x10cb/0x1cf0 [ptlrpc] [Thu Dec 12 00:42:43 2019][239142.761455] [] tgt_request_handle+0xaea/0x1580 [ptlrpc] [Thu Dec 12 00:42:43 2019][239142.768511] [] ptlrpc_server_handle_request+0x24b/0xab0 [ptlrpc] [Thu Dec 12 00:42:43 2019][239142.776310] [] ptlrpc_main+0xb2c/0x1460 [ptlrpc] [Thu Dec 12 00:42:43 2019][239142.782738] [] kthread+0xd1/0xe0 [Thu Dec 12 00:42:43 2019][239142.787742] [] ret_from_fork_nospec_begin+0xe/0x21 [Thu Dec 12 00:42:43 2019][239142.794317] [] 0xffffffffffffffff [Thu Dec 12 00:42:43 2019][239142.799419] LustreError: dumping log to /tmp/lustre-log.1576140163.27079 [Thu Dec 12 00:42:51 2019][239150.878428] Pid: 27113, comm: ll_ost_io03_080 3.10.0-957.27.2.el7_lustre.pl2.x86_64 #1 SMP Thu Nov 7 15:26:16 PST 2019 [Thu Dec 12 00:42:51 2019][239150.889206] Call Trace: [Thu Dec 12 00:42:51 2019][239150.891762] [] wait_transaction_locked+0x85/0xd0 [jbd2] [Thu Dec 12 00:42:51 2019][239150.898764] [] add_transaction_credits+0x268/0x2f0 [jbd2] [Thu Dec 12 00:42:51 2019][239150.905958] [] start_this_handle+0x1a1/0x430 [jbd2] [Thu Dec 12 00:42:51 2019][239150.912603] [] jbd2__journal_start+0xf3/0x1f0 [jbd2] [Thu Dec 12 00:42:51 2019][239150.919352] [] __ldiskfs_journal_start_sb+0x69/0xe0 [ldiskfs] [Thu Dec 12 00:42:51 2019][239150.926879] [] osd_trans_start+0x20e/0x4e0 [osd_ldiskfs] [Thu Dec 12 00:42:51 2019][239150.933990] [] ofd_commitrw_write+0xf1e/0x1d40 [ofd] [Thu Dec 12 00:42:51 2019][239150.940728] [] ofd_commitrw+0x48c/0x9e0 
[ofd] [Thu Dec 12 00:42:51 2019][239150.946884] [] tgt_brw_write+0x10cb/0x1cf0 [ptlrpc] [Thu Dec 12 00:42:51 2019][239150.953571] [] tgt_request_handle+0xaea/0x1580 [ptlrpc] [Thu Dec 12 00:42:51 2019][239150.960606] [] ptlrpc_server_handle_request+0x24b/0xab0 [ptlrpc] [Thu Dec 12 00:42:51 2019][239150.968427] [] ptlrpc_main+0xb2c/0x1460 [ptlrpc] [Thu Dec 12 00:42:51 2019][239150.974839] [] kthread+0xd1/0xe0 [Thu Dec 12 00:42:51 2019][239150.979882] [] ret_from_fork_nospec_begin+0xe/0x21 [Thu Dec 12 00:42:51 2019][239150.986438] [] 0xffffffffffffffff [Thu Dec 12 00:42:51 2019][239150.991541] LustreError: dumping log to /tmp/lustre-log.1576140171.27113 [Thu Dec 12 00:43:28 2019][239187.743171] Pid: 27093, comm: ll_ost_io02_095 3.10.0-957.27.2.el7_lustre.pl2.x86_64 #1 SMP Thu Nov 7 15:26:16 PST 2019 [Thu Dec 12 00:43:28 2019][239187.753955] Call Trace: [Thu Dec 12 00:43:28 2019][239187.756530] [] wait_transaction_locked+0x85/0xd0 [jbd2] [Thu Dec 12 00:43:28 2019][239187.763530] [] add_transaction_credits+0x268/0x2f0 [jbd2] [Thu Dec 12 00:43:28 2019][239187.770712] [] start_this_handle+0x1a1/0x430 [jbd2] [Thu Dec 12 00:43:28 2019][239187.777374] [] jbd2__journal_start+0xf3/0x1f0 [jbd2] [Thu Dec 12 00:43:28 2019][239187.784126] [] __ldiskfs_journal_start_sb+0x69/0xe0 [ldiskfs] [Thu Dec 12 00:43:28 2019][239187.791652] [] osd_trans_start+0x20e/0x4e0 [osd_ldiskfs] [Thu Dec 12 00:43:28 2019][239187.798756] [] ofd_trans_start+0x75/0xf0 [ofd] [Thu Dec 12 00:43:28 2019][239187.804972] [] ofd_commitrw_write+0xa31/0x1d40 [ofd] [Thu Dec 12 00:43:28 2019][239187.811721] [] ofd_commitrw+0x48c/0x9e0 [ofd] [Thu Dec 12 00:43:28 2019][239187.817851] [] tgt_brw_write+0x10cb/0x1cf0 [ptlrpc] [Thu Dec 12 00:43:28 2019][239187.824564] [] tgt_request_handle+0xaea/0x1580 [ptlrpc] [Thu Dec 12 00:43:28 2019][239187.831588] [] ptlrpc_server_handle_request+0x24b/0xab0 [ptlrpc] [Thu Dec 12 00:43:28 2019][239187.839418] [] ptlrpc_main+0xb2c/0x1460 [ptlrpc] [Thu Dec 12 00:43:28 2019][239187.845838] [] kthread+0xd1/0xe0 [Thu Dec 12 00:43:28 2019][239187.850851] [] ret_from_fork_nospec_begin+0xe/0x21 [Thu Dec 12 00:43:28 2019][239187.857418] [] 0xffffffffffffffff [Thu Dec 12 00:43:28 2019][239187.862528] LustreError: dumping log to /tmp/lustre-log.1576140208.27093 [Thu Dec 12 00:43:32 2019][239191.839250] Pid: 27066, comm: ll_ost_io03_073 3.10.0-957.27.2.el7_lustre.pl2.x86_64 #1 SMP Thu Nov 7 15:26:16 PST 2019 [Thu Dec 12 00:43:32 2019][239191.850035] Call Trace: [Thu Dec 12 00:43:32 2019][239191.852596] [] __lock_page+0x74/0x90 [Thu Dec 12 00:43:32 2019][239191.857946] [] __find_lock_page+0x54/0x70 [Thu Dec 12 00:43:32 2019][239191.863747] [] find_or_create_page+0x34/0xa0 [Thu Dec 12 00:43:32 2019][239191.869808] [] osd_bufs_get+0x413/0x870 [osd_ldiskfs] [Thu Dec 12 00:43:32 2019][239191.876656] [] ofd_preprw_write.isra.31+0x476/0xea0 [ofd] [Thu Dec 12 00:43:32 2019][239191.883824] [] ofd_preprw+0x422/0x11b0 [ofd] [Thu Dec 12 00:43:32 2019][239191.889878] [] tgt_brw_write+0xc7c/0x1cf0 [ptlrpc] [Thu Dec 12 00:43:32 2019][239191.896494] [] tgt_request_handle+0xaea/0x1580 [ptlrpc] [Thu Dec 12 00:43:32 2019][239191.903529] [] ptlrpc_server_handle_request+0x24b/0xab0 [ptlrpc] [Thu Dec 12 00:43:32 2019][239191.911333] [] ptlrpc_main+0xb2c/0x1460 [ptlrpc] [Thu Dec 12 00:43:32 2019][239191.917762] [] kthread+0xd1/0xe0 [Thu Dec 12 00:43:32 2019][239191.922765] [] ret_from_fork_nospec_begin+0xe/0x21 [Thu Dec 12 00:43:32 2019][239191.929341] [] 0xffffffffffffffff [Thu Dec 12 00:43:32 2019][239191.934464] LustreError: dumping log to 
/tmp/lustre-log.1576140212.27066 [Thu Dec 12 00:43:32 2019][239191.941836] Pid: 27089, comm: ll_ost_io00_088 3.10.0-957.27.2.el7_lustre.pl2.x86_64 #1 SMP Thu Nov 7 15:26:16 PST 2019 [Thu Dec 12 00:43:32 2019][239191.952648] Call Trace: [Thu Dec 12 00:43:32 2019][239191.955194] [] __lock_page+0x74/0x90 [Thu Dec 12 00:43:32 2019][239191.960543] [] __find_lock_page+0x54/0x70 [Thu Dec 12 00:43:32 2019][239191.966329] [] find_or_create_page+0x34/0xa0 [Thu Dec 12 00:43:32 2019][239191.972373] [] osd_bufs_get+0x413/0x870 [osd_ldiskfs] [Thu Dec 12 00:43:32 2019][239191.979210] [] ofd_preprw_write.isra.31+0x476/0xea0 [ofd] [Thu Dec 12 00:43:32 2019][239191.986379] [] ofd_preprw+0x422/0x11b0 [ofd] [Thu Dec 12 00:43:32 2019][239191.992435] [] tgt_brw_write+0xc7c/0x1cf0 [ptlrpc] [Thu Dec 12 00:43:32 2019][239191.999039] [] tgt_request_handle+0xaea/0x1580 [ptlrpc] [Thu Dec 12 00:43:32 2019][239192.006076] [] ptlrpc_server_handle_request+0x24b/0xab0 [ptlrpc] [Thu Dec 12 00:43:32 2019][239192.013883] [] ptlrpc_main+0xb2c/0x1460 [ptlrpc] [Thu Dec 12 00:43:32 2019][239192.020316] [] kthread+0xd1/0xe0 [Thu Dec 12 00:43:32 2019][239192.025309] [] ret_from_fork_nospec_begin+0xe/0x21 [Thu Dec 12 00:43:32 2019][239192.031887] [] 0xffffffffffffffff [Thu Dec 12 00:43:52 2019][239211.879472] Lustre: fir-OST0056: Client 8c3ccd99-dc20-24d2-79f4-f8d2c0329cfb (at 10.8.24.19@o2ib6) reconnecting [Thu Dec 12 00:43:52 2019][239211.889654] Lustre: Skipped 241 previous similar messages [Thu Dec 12 00:43:52 2019][239212.319668] LustreError: dumping log to /tmp/lustre-log.1576140232.27054 [Thu Dec 12 00:43:56 2019][239216.415741] LustreError: dumping log to /tmp/lustre-log.1576140236.27070 [Thu Dec 12 00:44:33 2019][239252.857485] LustreError: 67592:0:(ldlm_request.c:129:ldlm_expired_completion_wait()) ### lock timed out (enqueued at 1576139973, 300s ago); not entering recovery in server code, just going back to sleep ns: filter-fir-OST0054_UUID lock: ffff88f028303a80/0x7066c9c190addd87 lrc: 3/0,1 mode: --/PW res: [0x1800000401:0xb3d37e:0x0].0x0 rrc: 3 type: EXT [0->18446744073709551615] (req 0->18446744073709551615) flags: 0x40010080000000 nid: local remote: 0x0 expref: -99 pid: 67592 timeout: 0 lvb_type: 0 [Thu Dec 12 00:44:46 2019][239265.568733] LustreError: dumping log to /tmp/lustre-log.1576140285.27053 [Thu Dec 12 00:44:47 2019][239266.586759] Lustre: 27252:0:(service.c:1372:ptlrpc_at_send_early_reply()) @@@ Couldn't add any time (5/5), not sending early reply [Thu Dec 12 00:44:47 2019][239266.586759] req@ffff8912fbf3d050 x1651926262561472/t0(0) o4->360200e4-9bb2-dc52-b96e-5f48834c2e13@10.8.27.21@o2ib6:26/0 lens 488/0 e 1 to 0 dl 1576140291 ref 2 fl New:/0/ffffffff rc 0/-1 [Thu Dec 12 00:44:47 2019][239266.615399] Lustre: 27252:0:(service.c:1372:ptlrpc_at_send_early_reply()) Skipped 530 previous similar messages [Thu Dec 12 00:45:04 2019][239284.519561] Lustre: fir-OST0056: Export ffff88fc8fff5800 already connecting from 10.8.22.17@o2ib6 [Thu Dec 12 00:45:04 2019][239284.528524] Lustre: Skipped 59 previous similar messages [Thu Dec 12 00:45:09 2019][239288.861084] Lustre: fir-OST0058: Connection restored to c93954af-761b-f1eb-f651-9881322a7a72 (at 10.9.108.51@o2ib4) [Thu Dec 12 00:45:09 2019][239288.871607] Lustre: Skipped 286 previous similar messages [Thu Dec 12 00:45:10 2019][239290.145220] LustreError: dumping log to /tmp/lustre-log.1576140310.27073 [Thu Dec 12 00:45:31 2019][239310.625632] LustreError: dumping log to /tmp/lustre-log.1576140331.27186 [Thu Dec 12 00:45:35 2019][239314.721711] LustreError: dumping 
log to /tmp/lustre-log.1576140335.27126 [Thu Dec 12 00:45:48 2019][239328.033980] LustreError: dumping log to /tmp/lustre-log.1576140348.67696 [Thu Dec 12 00:46:17 2019][239356.626551] LustreError: 112566:0:(ldlm_request.c:129:ldlm_expired_completion_wait()) ### lock timed out (enqueued at 1576140077, 300s ago); not entering recovery in server code, just going back to sleep ns: filter-fir-OST005a_UUID lock: ffff8905d3ecc5c0/0x7066c9c190ade131 lrc: 3/0,1 mode: --/PW res: [0x1980000401:0xb4b138:0x0].0x0 rrc: 3 type: EXT [0->18446744073709551615] (req 0->18446744073709551615) flags: 0x40010080000000 nid: local remote: 0x0 expref: -99 pid: 112566 timeout: 0 lvb_type: 0 [Thu Dec 12 00:46:24 2019][239363.874698] LustreError: dumping log to /tmp/lustre-log.1576140384.27076 [Thu Dec 12 00:46:36 2019][239376.162942] LustreError: dumping log to /tmp/lustre-log.1576140396.26988 [Thu Dec 12 00:46:48 2019][239388.451179] LustreError: dumping log to /tmp/lustre-log.1576140408.27185 [Thu Dec 12 00:46:49 2019][239389.256104] Lustre: fir-OST0054: deleting orphan objects from 0x1800000402:1117226 to 0x1800000402:1117281 [Thu Dec 12 00:47:05 2019][239404.835509] LustreError: dumping log to /tmp/lustre-log.1576140425.27219 [Thu Dec 12 00:47:27 2019][239427.358966] LustreError: 67696:0:(ldlm_request.c:129:ldlm_expired_completion_wait()) ### lock timed out (enqueued at 1576140147, 300s ago); not entering recovery in server code, just going back to sleep ns: filter-fir-OST005a_UUID lock: ffff88fcfae90000/0x7066c9c190ade2a4 lrc: 3/0,1 mode: --/PW res: [0x1980000402:0x2f245a:0x0].0x0 rrc: 3 type: EXT [0->18446744073709551615] (req 0->18446744073709551615) flags: 0x40010080000000 nid: local remote: 0x0 expref: -99 pid: 67696 timeout: 0 lvb_type: 0 [Thu Dec 12 00:47:29 2019][239429.412004] LustreError: dumping log to /tmp/lustre-log.1576140449.27226 [Thu Dec 12 00:47:33 2019][239433.508086] LustreError: dumping log to /tmp/lustre-log.1576140453.27075 [Thu Dec 12 00:47:38 2019][239437.604174] LustreError: dumping log to /tmp/lustre-log.1576140458.27254 [Thu Dec 12 00:47:42 2019][239441.700246] LustreError: dumping log to /tmp/lustre-log.1576140462.27259 [Thu Dec 12 00:47:46 2019][239445.796336] LNet: Service thread pid 27255 was inactive for 1202.86s. The thread might be hung, or it might only be slow and will resume later. 
Dumping the stack trace for debugging purposes: [Thu Dec 12 00:47:46 2019][239445.813444] LNet: Skipped 9 previous similar messages [Thu Dec 12 00:47:46 2019][239445.818595] Pid: 27255, comm: ll_ost_io00_103 3.10.0-957.27.2.el7_lustre.pl2.x86_64 #1 SMP Thu Nov 7 15:26:16 PST 2019 [Thu Dec 12 00:47:46 2019][239445.829372] Call Trace: [Thu Dec 12 00:47:46 2019][239445.831922] [] __lock_page+0x74/0x90 [Thu Dec 12 00:47:46 2019][239445.837275] [] __find_lock_page+0x54/0x70 [Thu Dec 12 00:47:46 2019][239445.843064] [] find_or_create_page+0x34/0xa0 [Thu Dec 12 00:47:46 2019][239445.849096] [] osd_bufs_get+0x413/0x870 [osd_ldiskfs] [Thu Dec 12 00:47:46 2019][239445.855943] [] ofd_preprw_write.isra.31+0x476/0xea0 [ofd] [Thu Dec 12 00:47:46 2019][239445.863114] [] ofd_preprw+0x422/0x11b0 [ofd] [Thu Dec 12 00:47:46 2019][239445.869166] [] tgt_brw_write+0xc7c/0x1cf0 [ptlrpc] [Thu Dec 12 00:47:46 2019][239445.875774] [] tgt_request_handle+0xaea/0x1580 [ptlrpc] [Thu Dec 12 00:47:46 2019][239445.882816] [] ptlrpc_server_handle_request+0x24b/0xab0 [ptlrpc] [Thu Dec 12 00:47:46 2019][239445.890620] [] ptlrpc_main+0xb2c/0x1460 [ptlrpc] [Thu Dec 12 00:47:46 2019][239445.897049] [] kthread+0xd1/0xe0 [Thu Dec 12 00:47:46 2019][239445.902053] [] ret_from_fork_nospec_begin+0xe/0x21 [Thu Dec 12 00:47:46 2019][239445.908636] [] 0xffffffffffffffff [Thu Dec 12 00:47:46 2019][239445.913744] LustreError: dumping log to /tmp/lustre-log.1576140466.27255 [Thu Dec 12 00:47:50 2019][239449.892412] Pid: 27231, comm: ll_ost_io01_097 3.10.0-957.27.2.el7_lustre.pl2.x86_64 #1 SMP Thu Nov 7 15:26:16 PST 2019 [Thu Dec 12 00:47:50 2019][239449.903196] Call Trace: [Thu Dec 12 00:47:50 2019][239449.905748] [] __lock_page+0x74/0x90 [Thu Dec 12 00:47:50 2019][239449.911099] [] __find_lock_page+0x54/0x70 [Thu Dec 12 00:47:50 2019][239449.916898] [] find_or_create_page+0x34/0xa0 [Thu Dec 12 00:47:50 2019][239449.922931] [] osd_bufs_get+0x413/0x870 [osd_ldiskfs] [Thu Dec 12 00:47:50 2019][239449.929775] [] ofd_preprw_write.isra.31+0x476/0xea0 [ofd] [Thu Dec 12 00:47:50 2019][239449.936958] [] ofd_preprw+0x422/0x11b0 [ofd] [Thu Dec 12 00:47:50 2019][239449.943015] [] tgt_brw_write+0xc7c/0x1cf0 [ptlrpc] [Thu Dec 12 00:47:50 2019][239449.949633] [] tgt_request_handle+0xaea/0x1580 [ptlrpc] [Thu Dec 12 00:47:50 2019][239449.956667] [] ptlrpc_server_handle_request+0x24b/0xab0 [ptlrpc] [Thu Dec 12 00:47:50 2019][239449.964469] [] ptlrpc_main+0xb2c/0x1460 [ptlrpc] [Thu Dec 12 00:47:50 2019][239449.970900] [] kthread+0xd1/0xe0 [Thu Dec 12 00:47:50 2019][239449.975903] [] ret_from_fork_nospec_begin+0xe/0x21 [Thu Dec 12 00:47:50 2019][239449.982477] [] 0xffffffffffffffff [Thu Dec 12 00:47:50 2019][239449.987588] LustreError: dumping log to /tmp/lustre-log.1576140470.27231 [Thu Dec 12 00:47:50 2019][239449.994976] Pid: 27258, comm: ll_ost_io00_106 3.10.0-957.27.2.el7_lustre.pl2.x86_64 #1 SMP Thu Nov 7 15:26:16 PST 2019 [Thu Dec 12 00:47:50 2019][239450.005769] Call Trace: [Thu Dec 12 00:47:50 2019][239450.008313] [] __lock_page+0x74/0x90 [Thu Dec 12 00:47:50 2019][239450.013653] [] __find_lock_page+0x54/0x70 [Thu Dec 12 00:47:50 2019][239450.019440] [] find_or_create_page+0x34/0xa0 [Thu Dec 12 00:47:50 2019][239450.025477] [] osd_bufs_get+0x413/0x870 [osd_ldiskfs] [Thu Dec 12 00:47:50 2019][239450.032320] [] ofd_preprw_write.isra.31+0x476/0xea0 [ofd] [Thu Dec 12 00:47:50 2019][239450.039490] [] ofd_preprw+0x422/0x11b0 [ofd] [Thu Dec 12 00:47:50 2019][239450.045546] [] tgt_brw_write+0xc7c/0x1cf0 [ptlrpc] [Thu Dec 12 00:47:50 2019][239450.052145] [] 
tgt_request_handle+0xaea/0x1580 [ptlrpc] [Thu Dec 12 00:47:50 2019][239450.059180] [] ptlrpc_server_handle_request+0x24b/0xab0 [ptlrpc] [Thu Dec 12 00:47:50 2019][239450.067000] [] ptlrpc_main+0xb2c/0x1460 [ptlrpc] [Thu Dec 12 00:47:50 2019][239450.073428] [] kthread+0xd1/0xe0 [Thu Dec 12 00:47:50 2019][239450.078422] [] ret_from_fork_nospec_begin+0xe/0x21 [Thu Dec 12 00:47:50 2019][239450.084989] [] 0xffffffffffffffff [Thu Dec 12 00:47:50 2019][239450.090081] Pid: 27071, comm: ll_ost_io00_083 3.10.0-957.27.2.el7_lustre.pl2.x86_64 #1 SMP Thu Nov 7 15:26:16 PST 2019 [Thu Dec 12 00:47:50 2019][239450.100868] Call Trace: [Thu Dec 12 00:47:50 2019][239450.103413] [] __lock_page+0x74/0x90 [Thu Dec 12 00:47:50 2019][239450.108755] [] __find_lock_page+0x54/0x70 [Thu Dec 12 00:47:50 2019][239450.114527] [] find_or_create_page+0x34/0xa0 [Thu Dec 12 00:47:50 2019][239450.120583] [] osd_bufs_get+0x413/0x870 [osd_ldiskfs] [Thu Dec 12 00:47:50 2019][239450.127437] [] ofd_preprw_write.isra.31+0x476/0xea0 [ofd] [Thu Dec 12 00:47:50 2019][239450.134623] [] ofd_preprw+0x422/0x11b0 [ofd] [Thu Dec 12 00:47:50 2019][239450.140667] [] tgt_brw_write+0xc7c/0x1cf0 [ptlrpc] [Thu Dec 12 00:47:50 2019][239450.147270] [] tgt_request_handle+0xaea/0x1580 [ptlrpc] [Thu Dec 12 00:47:50 2019][239450.154293] [] ptlrpc_server_handle_request+0x24b/0xab0 [ptlrpc] [Thu Dec 12 00:47:50 2019][239450.162105] [] ptlrpc_main+0xb2c/0x1460 [ptlrpc] [Thu Dec 12 00:47:50 2019][239450.168523] [] kthread+0xd1/0xe0 [Thu Dec 12 00:47:50 2019][239450.173529] [] ret_from_fork_nospec_begin+0xe/0x21 [Thu Dec 12 00:47:50 2019][239450.180084] [] 0xffffffffffffffff [Thu Dec 12 00:47:50 2019][239450.185189] Pid: 27072, comm: ll_ost_io03_074 3.10.0-957.27.2.el7_lustre.pl2.x86_64 #1 SMP Thu Nov 7 15:26:16 PST 2019 [Thu Dec 12 00:47:50 2019][239450.195977] Call Trace: [Thu Dec 12 00:47:50 2019][239450.198525] [] __lock_page+0x74/0x90 [Thu Dec 12 00:47:50 2019][239450.203867] [] __find_lock_page+0x54/0x70 [Thu Dec 12 00:47:50 2019][239450.209653] [] find_or_create_page+0x34/0xa0 [Thu Dec 12 00:47:50 2019][239450.215688] [] osd_bufs_get+0x413/0x870 [osd_ldiskfs] [Thu Dec 12 00:47:50 2019][239450.222524] [] ofd_preprw_write.isra.31+0x476/0xea0 [ofd] [Thu Dec 12 00:47:50 2019][239450.229694] [] ofd_preprw+0x422/0x11b0 [ofd] [Thu Dec 12 00:47:50 2019][239450.235748] [] tgt_brw_write+0xc7c/0x1cf0 [ptlrpc] [Thu Dec 12 00:47:50 2019][239450.242338] [] tgt_request_handle+0xaea/0x1580 [ptlrpc] [Thu Dec 12 00:47:50 2019][239450.249373] [] ptlrpc_server_handle_request+0x24b/0xab0 [ptlrpc] [Thu Dec 12 00:47:50 2019][239450.257191] [] ptlrpc_main+0xb2c/0x1460 [ptlrpc] [Thu Dec 12 00:47:50 2019][239450.263622] [] kthread+0xd1/0xe0 [Thu Dec 12 00:47:50 2019][239450.268617] [] ret_from_fork_nospec_begin+0xe/0x21 [Thu Dec 12 00:47:50 2019][239450.275183] [] 0xffffffffffffffff [Thu Dec 12 00:47:54 2019][239453.988496] LustreError: dumping log to /tmp/lustre-log.1576140474.27250 [Thu Dec 12 00:47:57 2019][239457.037560] LustreError: 67629:0:(ldlm_request.c:129:ldlm_expired_completion_wait()) ### lock timed out (enqueued at 1576140177, 300s ago); not entering recovery in server code, just going back to sleep ns: filter-fir-OST0058_UUID lock: ffff890987068fc0/0x7066c9c190ade472 lrc: 3/0,1 mode: --/PW res: [0x1900000401:0xb402ed:0x0].0x0 rrc: 3 type: EXT [0->18446744073709551615] (req 0->18446744073709551615) flags: 0x40010080000000 nid: local remote: 0x0 expref: -99 pid: 67629 timeout: 0 lvb_type: 0 [Thu Dec 12 00:47:58 2019][239458.084571] LustreError: dumping log to 
/tmp/lustre-log.1576140478.27279 [Thu Dec 12 00:48:02 2019][239462.180656] LustreError: dumping log to /tmp/lustre-log.1576140482.27263 [Thu Dec 12 00:48:10 2019][239470.372840] LustreError: dumping log to /tmp/lustre-log.1576140490.27260 [Thu Dec 12 00:48:19 2019][239478.564989] LustreError: dumping log to /tmp/lustre-log.1576140498.27264 [Thu Dec 12 00:48:27 2019][239486.757152] LustreError: dumping log to /tmp/lustre-log.1576140507.27275 [Thu Dec 12 00:48:31 2019][239490.853236] LustreError: dumping log to /tmp/lustre-log.1576140511.27261 [Thu Dec 12 00:48:35 2019][239494.949315] LustreError: dumping log to /tmp/lustre-log.1576140515.27225 [Thu Dec 12 00:48:43 2019][239503.141478] LustreError: dumping log to /tmp/lustre-log.1576140523.27278 [Thu Dec 12 00:48:47 2019][239507.237556] LustreError: dumping log to /tmp/lustre-log.1576140527.27310 [Thu Dec 12 00:48:51 2019][239511.333635] LustreError: dumping log to /tmp/lustre-log.1576140531.27312 [Thu Dec 12 00:48:55 2019][239515.429722] LustreError: dumping log to /tmp/lustre-log.1576140535.27085 [Thu Dec 12 00:48:59 2019][239519.525802] LustreError: dumping log to /tmp/lustre-log.1576140539.27272 [Thu Dec 12 00:49:04 2019][239523.621887] LustreError: dumping log to /tmp/lustre-log.1576140544.27232 [Thu Dec 12 00:49:08 2019][239527.717968] LustreError: dumping log to /tmp/lustre-log.1576140548.27298 [Thu Dec 12 00:49:12 2019][239531.814059] LustreError: dumping log to /tmp/lustre-log.1576140552.27314 [Thu Dec 12 00:49:32 2019][239552.294461] LustreError: dumping log to /tmp/lustre-log.1576140572.27324 [Thu Dec 12 00:49:36 2019][239556.390543] LustreError: dumping log to /tmp/lustre-log.1576140576.27322 [Thu Dec 12 00:49:53 2019][239572.671876] LustreError: 112528:0:(ldlm_request.c:129:ldlm_expired_completion_wait()) ### lock timed out (enqueued at 1576140293, 300s ago); not entering recovery in server code, just going back to sleep ns: filter-fir-OST005e_UUID lock: ffff890020f49440/0x7066c9c190ade74a lrc: 3/0,1 mode: --/PW res: [0x1a80000401:0x111eb5:0x0].0x0 rrc: 3 type: EXT [0->18446744073709551615] (req 0->18446744073709551615) flags: 0x40010080000000 nid: local remote: 0x0 expref: -99 pid: 112528 timeout: 0 lvb_type: 0 [Thu Dec 12 00:50:09 2019][239589.159211] LustreError: dumping log to /tmp/lustre-log.1576140609.67861 [Thu Dec 12 00:50:17 2019][239597.351356] LustreError: dumping log to /tmp/lustre-log.1576140617.27336 [Thu Dec 12 00:50:21 2019][239601.447451] LustreError: dumping log to /tmp/lustre-log.1576140621.112525 [Thu Dec 12 00:50:25 2019][239605.543530] LustreError: dumping log to /tmp/lustre-log.1576140625.67921 [Thu Dec 12 00:50:34 2019][239613.735683] LustreError: dumping log to /tmp/lustre-log.1576140634.27339 [Thu Dec 12 00:50:42 2019][239621.927847] LustreError: dumping log to /tmp/lustre-log.1576140642.27357 [Thu Dec 12 00:51:35 2019][239675.176916] LNet: Service thread pid 67602 was inactive for 1201.88s. Watchdog stack traces are limited to 3 per 300 seconds, skipping this one. 
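The "LNet: Service thread pid N was inactive for X s" watchdog lines and the paired "LustreError: dumping log to /tmp/lustre-log.<epoch>.<pid>" lines that repeat above follow a fixed format. Below is a minimal Python sketch for tallying them per PID when triaging a capture like this one; the regexes and the embedded sample lines are assumptions drawn from the entries shown here, not output of any Lustre tool.

import re
from collections import Counter

# Sample lines copied from the capture above (illustrative only).
sample = """
[Thu Dec 12 00:37:31 2019][238831.384032] LNet: Service thread pid 68010 was inactive for 1201.76s. The thread might be hung, or it might only be slow and will resume later.
[Thu Dec 12 00:47:46 2019][239445.796336] LNet: Service thread pid 27255 was inactive for 1202.86s. The thread might be hung, or it might only be slow and will resume later.
[Thu Dec 12 00:37:31 2019][238831.490910] LustreError: dumping log to /tmp/lustre-log.1576139851.68010
[Thu Dec 12 00:47:46 2019][239445.913744] LustreError: dumping log to /tmp/lustre-log.1576140466.27255
"""

inactive = re.compile(r"Service thread pid (\d+) was inactive for ([\d.]+)s")
dump = re.compile(r"dumping log to /tmp/lustre-log\.(\d+)\.(\d+)")

hung = {}          # pid -> longest reported inactivity in seconds
dumps = Counter()  # pid -> number of debug logs dumped for that pid

for line in sample.splitlines():
    m = inactive.search(line)
    if m:
        pid, secs = m.group(1), float(m.group(2))
        hung[pid] = max(secs, hung.get(pid, 0.0))
    m = dump.search(line)
    if m:
        dumps[m.group(2)] += 1

for pid, secs in sorted(hung.items(), key=lambda kv: -kv[1]):
    print(f"pid {pid}: inactive {secs:.2f}s, {dumps.get(pid, 0)} debug dump(s)")

Run against the full capture, this kind of tally makes it easier to see that the same ll_ost_io*/ll_ost* service threads keep reappearing in the watchdog reports rather than new ones each time.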
[Thu Dec 12 00:51:35 2019][239675.189983] LNet: Skipped 118 previous similar messages [Thu Dec 12 00:51:35 2019][239675.195311] LustreError: dumping log to /tmp/lustre-log.1576140695.67602 [Thu Dec 12 00:52:09 2019][239709.333597] Lustre: fir-OST0054: deleting orphan objects from 0x1800000401:11785095 to 0x1800000401:11785185 [Thu Dec 12 00:52:16 2019][239716.137736] LustreError: dumping log to /tmp/lustre-log.1576140736.67843 [Thu Dec 12 00:52:37 2019][239736.618140] LustreError: dumping log to /tmp/lustre-log.1576140757.67614 [Thu Dec 12 00:53:52 2019][239811.969335] Lustre: fir-OST005a: Client 882378af-0b41-73ee-5c10-5cc51464645c (at 10.9.108.22@o2ib4) reconnecting [Thu Dec 12 00:53:52 2019][239811.979602] Lustre: Skipped 300 previous similar messages [Thu Dec 12 00:53:53 2019][239813.287543] Lustre: fir-OST005a: deleting orphan objects from 0x1980000401:11841861 to 0x1980000401:11841953 [Thu Dec 12 00:53:55 2019][239815.080705] LustreError: 112521:0:(ldlm_request.c:129:ldlm_expired_completion_wait()) ### lock timed out (enqueued at 1576140535, 300s ago); not entering recovery in server code, just going back to sleep ns: filter-fir-OST005e_UUID lock: ffff890fa9aae9c0/0x7066c9c190adf5ba lrc: 3/0,1 mode: --/PW res: [0x1a31e13:0x0:0x0].0x0 rrc: 3 type: EXT [0->18446744073709551615] (req 0->18446744073709551615) flags: 0x40010080000000 nid: local remote: 0x0 expref: -99 pid: 112521 timeout: 0 lvb_type: 0 [Thu Dec 12 00:54:15 2019][239834.924106] Pid: 67644, comm: ll_ost01_022 3.10.0-957.27.2.el7_lustre.pl2.x86_64 #1 SMP Thu Nov 7 15:26:16 PST 2019 [Thu Dec 12 00:54:15 2019][239834.934621] Call Trace: [Thu Dec 12 00:54:15 2019][239834.937173] [] ldlm_completion_ast+0x4e5/0x860 [ptlrpc] [Thu Dec 12 00:54:15 2019][239834.944215] [] ldlm_cli_enqueue_local+0x231/0x830 [ptlrpc] [Thu Dec 12 00:54:15 2019][239834.951514] [] ofd_destroy_by_fid+0x1dd/0x4a0 [ofd] [Thu Dec 12 00:54:15 2019][239834.958160] [] ofd_destroy_hdl+0x267/0x970 [ofd] [Thu Dec 12 00:54:15 2019][239834.964565] [] tgt_request_handle+0xaea/0x1580 [ptlrpc] [Thu Dec 12 00:54:15 2019][239834.971602] [] ptlrpc_server_handle_request+0x24b/0xab0 [ptlrpc] [Thu Dec 12 00:54:15 2019][239834.979418] [] ptlrpc_main+0xb2c/0x1460 [ptlrpc] [Thu Dec 12 00:54:15 2019][239834.985834] [] kthread+0xd1/0xe0 [Thu Dec 12 00:54:15 2019][239834.990835] [] ret_from_fork_nospec_begin+0xe/0x21 [Thu Dec 12 00:54:15 2019][239834.997402] [] 0xffffffffffffffff [Thu Dec 12 00:54:15 2019][239835.002510] LustreError: dumping log to /tmp/lustre-log.1576140855.67644 [Thu Dec 12 00:54:47 2019][239866.600723] Lustre: 27257:0:(service.c:1372:ptlrpc_at_send_early_reply()) @@@ Couldn't add any time (5/5), not sending early reply [Thu Dec 12 00:54:47 2019][239866.600723] req@ffff88f2b0727850 x1650929716577312/t0(0) o10->68d8bbaa-aab1-a78f-323f-aaff4c375c30@10.8.7.4@o2ib6:626/0 lens 440/0 e 0 to 0 dl 1576140891 ref 2 fl New:/2/ffffffff rc 0/-1 [Thu Dec 12 00:54:47 2019][239866.629355] Lustre: 27257:0:(service.c:1372:ptlrpc_at_send_early_reply()) Skipped 710 previous similar messages [Thu Dec 12 00:54:52 2019][239871.788812] Pid: 66094, comm: ll_ost00_001 3.10.0-957.27.2.el7_lustre.pl2.x86_64 #1 SMP Thu Nov 7 15:26:16 PST 2019 [Thu Dec 12 00:54:52 2019][239871.799329] Call Trace: [Thu Dec 12 00:54:52 2019][239871.801896] [] wait_transaction_locked+0x85/0xd0 [jbd2] [Thu Dec 12 00:54:52 2019][239871.808894] [] add_transaction_credits+0x268/0x2f0 [jbd2] [Thu Dec 12 00:54:52 2019][239871.816096] [] start_this_handle+0x1a1/0x430 [jbd2] [Thu Dec 12 00:54:52 2019][239871.822745] [] 
jbd2__journal_start+0xf3/0x1f0 [jbd2] [Thu Dec 12 00:54:52 2019][239871.829494] [] __ldiskfs_journal_start_sb+0x69/0xe0 [ldiskfs] [Thu Dec 12 00:54:52 2019][239871.837020] [] ldiskfs_acquire_dquot+0x53/0xb0 [ldiskfs] [Thu Dec 12 00:54:52 2019][239871.844115] [] dqget+0x3fa/0x450 [Thu Dec 12 00:54:52 2019][239871.849119] [] dquot_get_dqblk+0x14/0x1f0 [Thu Dec 12 00:54:52 2019][239871.854927] [] osd_acct_index_lookup+0x235/0x480 [osd_ldiskfs] [Thu Dec 12 00:54:52 2019][239871.862549] [] lquotactl_slv+0x27d/0x9d0 [lquota] [Thu Dec 12 00:54:52 2019][239871.869040] [] ofd_quotactl+0x13c/0x380 [ofd] [Thu Dec 12 00:54:52 2019][239871.875172] [] tgt_request_handle+0xaea/0x1580 [ptlrpc] [Thu Dec 12 00:54:52 2019][239871.882215] [] ptlrpc_server_handle_request+0x24b/0xab0 [ptlrpc] [Thu Dec 12 00:54:52 2019][239871.890017] [] ptlrpc_main+0xb2c/0x1460 [ptlrpc] [Thu Dec 12 00:54:52 2019][239871.896446] [] kthread+0xd1/0xe0 [Thu Dec 12 00:54:52 2019][239871.901449] [] ret_from_fork_nospec_begin+0xe/0x21 [Thu Dec 12 00:54:52 2019][239871.908025] [] 0xffffffffffffffff [Thu Dec 12 00:54:52 2019][239871.913127] LustreError: dumping log to /tmp/lustre-log.1576140892.66094 [Thu Dec 12 00:54:56 2019][239875.884892] Pid: 67782, comm: ll_ost00_045 3.10.0-957.27.2.el7_lustre.pl2.x86_64 #1 SMP Thu Nov 7 15:26:16 PST 2019 [Thu Dec 12 00:54:56 2019][239875.895408] Call Trace: [Thu Dec 12 00:54:56 2019][239875.897977] [] wait_transaction_locked+0x85/0xd0 [jbd2] [Thu Dec 12 00:54:56 2019][239875.904976] [] add_transaction_credits+0x268/0x2f0 [jbd2] [Thu Dec 12 00:54:56 2019][239875.912158] [] start_this_handle+0x1a1/0x430 [jbd2] [Thu Dec 12 00:54:56 2019][239875.918808] [] jbd2__journal_start+0xf3/0x1f0 [jbd2] [Thu Dec 12 00:54:56 2019][239875.925557] [] __ldiskfs_journal_start_sb+0x69/0xe0 [ldiskfs] [Thu Dec 12 00:54:56 2019][239875.933082] [] ldiskfs_acquire_dquot+0x53/0xb0 [ldiskfs] [Thu Dec 12 00:54:56 2019][239875.940179] [] dqget+0x3fa/0x450 [Thu Dec 12 00:54:56 2019][239875.945181] [] dquot_get_dqblk+0x14/0x1f0 [Thu Dec 12 00:54:56 2019][239875.950971] [] osd_acct_index_lookup+0x235/0x480 [osd_ldiskfs] [Thu Dec 12 00:54:56 2019][239875.958594] [] lquotactl_slv+0x27d/0x9d0 [lquota] [Thu Dec 12 00:54:56 2019][239875.965075] [] ofd_quotactl+0x13c/0x380 [ofd] [Thu Dec 12 00:54:56 2019][239875.971226] [] tgt_request_handle+0xaea/0x1580 [ptlrpc] [Thu Dec 12 00:54:56 2019][239875.978263] [] ptlrpc_server_handle_request+0x24b/0xab0 [ptlrpc] [Thu Dec 12 00:54:56 2019][239875.986064] [] ptlrpc_main+0xb2c/0x1460 [ptlrpc] [Thu Dec 12 00:54:56 2019][239875.992490] [] kthread+0xd1/0xe0 [Thu Dec 12 00:54:56 2019][239875.997487] [] ret_from_fork_nospec_begin+0xe/0x21 [Thu Dec 12 00:54:56 2019][239876.004062] [] 0xffffffffffffffff [Thu Dec 12 00:54:56 2019][239876.009155] LustreError: dumping log to /tmp/lustre-log.1576140896.67782 [Thu Dec 12 00:55:04 2019][239883.857031] Lustre: fir-OST005a: deleting orphan objects from 0x1980000402:3089507 to 0x1980000402:3089569 [Thu Dec 12 00:55:07 2019][239886.643484] Lustre: fir-OST0056: Export ffff88fc8fff5800 already connecting from 10.8.22.17@o2ib6 [Thu Dec 12 00:55:07 2019][239886.652449] Lustre: Skipped 62 previous similar messages [Thu Dec 12 00:55:09 2019][239888.989286] Lustre: fir-OST005a: Connection restored to f5f4e1fb-09a1-cbb0-925c-f94d3727005b (at 10.9.101.44@o2ib4) [Thu Dec 12 00:55:09 2019][239888.999806] Lustre: Skipped 345 previous similar messages [Thu Dec 12 00:55:33 2019][239913.208552] Lustre: fir-OST0058: deleting orphan objects from 0x1900000401:11797241 to 
0x1900000401:11797281 [Thu Dec 12 00:55:37 2019][239916.845706] Pid: 67901, comm: ll_ost01_070 3.10.0-957.27.2.el7_lustre.pl2.x86_64 #1 SMP Thu Nov 7 15:26:16 PST 2019 [Thu Dec 12 00:55:37 2019][239916.856230] Call Trace: [Thu Dec 12 00:55:37 2019][239916.858809] [] wait_transaction_locked+0x85/0xd0 [jbd2] [Thu Dec 12 00:55:37 2019][239916.865806] [] add_transaction_credits+0x268/0x2f0 [jbd2] [Thu Dec 12 00:55:37 2019][239916.872991] [] start_this_handle+0x1a1/0x430 [jbd2] [Thu Dec 12 00:55:37 2019][239916.879638] [] jbd2__journal_start+0xf3/0x1f0 [jbd2] [Thu Dec 12 00:55:37 2019][239916.886386] [] __ldiskfs_journal_start_sb+0x69/0xe0 [ldiskfs] [Thu Dec 12 00:55:37 2019][239916.893911] [] ldiskfs_acquire_dquot+0x53/0xb0 [ldiskfs] [Thu Dec 12 00:55:37 2019][239916.901005] [] dqget+0x3fa/0x450 [Thu Dec 12 00:55:37 2019][239916.906011] [] dquot_get_dqblk+0x14/0x1f0 [Thu Dec 12 00:55:37 2019][239916.911796] [] osd_acct_index_lookup+0x235/0x480 [osd_ldiskfs] [Thu Dec 12 00:55:37 2019][239916.919410] [] lquotactl_slv+0x27d/0x9d0 [lquota] [Thu Dec 12 00:55:37 2019][239916.925908] [] ofd_quotactl+0x13c/0x380 [ofd] [Thu Dec 12 00:55:37 2019][239916.932051] [] tgt_request_handle+0xaea/0x1580 [ptlrpc] [Thu Dec 12 00:55:37 2019][239916.939115] [] ptlrpc_server_handle_request+0x24b/0xab0 [ptlrpc] [Thu Dec 12 00:55:37 2019][239916.946919] [] ptlrpc_main+0xb2c/0x1460 [ptlrpc] [Thu Dec 12 00:55:37 2019][239916.953348] [] kthread+0xd1/0xe0 [Thu Dec 12 00:55:37 2019][239916.958349] [] ret_from_fork_nospec_begin+0xe/0x21 [Thu Dec 12 00:55:37 2019][239916.964925] [] 0xffffffffffffffff [Thu Dec 12 00:55:37 2019][239916.970027] LustreError: dumping log to /tmp/lustre-log.1576140937.67901 [Thu Dec 12 00:55:41 2019][239920.941801] Pid: 67702, comm: ll_ost03_031 3.10.0-957.27.2.el7_lustre.pl2.x86_64 #1 SMP Thu Nov 7 15:26:16 PST 2019 [Thu Dec 12 00:55:41 2019][239920.952318] Call Trace: [Thu Dec 12 00:55:41 2019][239920.954889] [] wait_transaction_locked+0x85/0xd0 [jbd2] [Thu Dec 12 00:55:41 2019][239920.961886] [] add_transaction_credits+0x268/0x2f0 [jbd2] [Thu Dec 12 00:55:41 2019][239920.969068] [] start_this_handle+0x1a1/0x430 [jbd2] [Thu Dec 12 00:55:41 2019][239920.975717] [] jbd2__journal_start+0xf3/0x1f0 [jbd2] [Thu Dec 12 00:55:41 2019][239920.982468] [] __ldiskfs_journal_start_sb+0x69/0xe0 [ldiskfs] [Thu Dec 12 00:55:41 2019][239920.989991] [] ldiskfs_acquire_dquot+0x53/0xb0 [ldiskfs] [Thu Dec 12 00:55:41 2019][239920.997096] [] dqget+0x3fa/0x450 [Thu Dec 12 00:55:41 2019][239921.002099] [] dquot_get_dqblk+0x14/0x1f0 [Thu Dec 12 00:55:41 2019][239921.007885] [] osd_acct_index_lookup+0x235/0x480 [osd_ldiskfs] [Thu Dec 12 00:55:41 2019][239921.015498] [] lquotactl_slv+0x27d/0x9d0 [lquota] [Thu Dec 12 00:55:41 2019][239921.021986] [] ofd_quotactl+0x13c/0x380 [ofd] [Thu Dec 12 00:55:41 2019][239921.028131] [] tgt_request_handle+0xaea/0x1580 [ptlrpc] [Thu Dec 12 00:55:41 2019][239921.035196] [] ptlrpc_server_handle_request+0x24b/0xab0 [ptlrpc] [Thu Dec 12 00:55:41 2019][239921.042998] [] ptlrpc_main+0xb2c/0x1460 [ptlrpc] [Thu Dec 12 00:55:41 2019][239921.049426] [] kthread+0xd1/0xe0 [Thu Dec 12 00:55:41 2019][239921.054430] [] ret_from_fork_nospec_begin+0xe/0x21 [Thu Dec 12 00:55:41 2019][239921.061006] [] 0xffffffffffffffff [Thu Dec 12 00:55:41 2019][239921.066106] LustreError: dumping log to /tmp/lustre-log.1576140941.67702 [Thu Dec 12 00:55:49 2019][239929.133944] LustreError: dumping log to /tmp/lustre-log.1576140949.67900 [Thu Dec 12 00:56:19 2019][239959.099553] LustreError: 
67744:0:(ldlm_request.c:129:ldlm_expired_completion_wait()) ### lock timed out (enqueued at 1576140679, 300s ago); not entering recovery in server code, just going back to sleep ns: filter-fir-OST005a_UUID lock: ffff88e670859680/0x7066c9c190ae10c5 lrc: 3/0,1 mode: --/PW res: [0x1980000400:0x112c96:0x0].0x0 rrc: 3 type: EXT [0->18446744073709551615] (req 0->18446744073709551615) flags: 0x40010080000000 nid: local remote: 0x0 expref: -99 pid: 67744 timeout: 0 lvb_type: 0 [Thu Dec 12 00:56:30 2019][239970.094745] LustreError: dumping log to /tmp/lustre-log.1576140990.67694 [Thu Dec 12 00:56:34 2019][239974.190821] LustreError: dumping log to /tmp/lustre-log.1576140994.67683 [Thu Dec 12 00:57:29 2019][240028.686582] Lustre: fir-OST005e: deleting orphan objects from 0x1a80000401:1121977 to 0x1a80000401:1122017 [Thu Dec 12 00:59:25 2019][240145.394789] Lustre: fir-OST0054: deleting orphan objects from 0x1800000402:1117286 to 0x1800000402:1117313 [Thu Dec 12 00:59:34 2019][240154.418384] LNet: Service thread pid 67592 was inactive for 1201.54s. The thread might be hung, or it might only be slow and will resume later. Dumping the stack trace for debugging purposes: [Thu Dec 12 00:59:34 2019][240154.435493] LNet: Skipped 9 previous similar messages [Thu Dec 12 00:59:34 2019][240154.440636] Pid: 67592, comm: ll_ost02_009 3.10.0-957.27.2.el7_lustre.pl2.x86_64 #1 SMP Thu Nov 7 15:26:16 PST 2019 [Thu Dec 12 00:59:34 2019][240154.451171] Call Trace: [Thu Dec 12 00:59:34 2019][240154.453730] [] ldlm_completion_ast+0x4e5/0x860 [ptlrpc] [Thu Dec 12 00:59:34 2019][240154.460772] [] ldlm_cli_enqueue_local+0x231/0x830 [ptlrpc] [Thu Dec 12 00:59:34 2019][240154.468068] [] ofd_destroy_by_fid+0x1dd/0x4a0 [ofd] [Thu Dec 12 00:59:34 2019][240154.474717] [] ofd_destroy_hdl+0x267/0x970 [ofd] [Thu Dec 12 00:59:34 2019][240154.481120] [] tgt_request_handle+0xaea/0x1580 [ptlrpc] [Thu Dec 12 00:59:34 2019][240154.488159] [] ptlrpc_server_handle_request+0x24b/0xab0 [ptlrpc] [Thu Dec 12 00:59:34 2019][240154.495974] [] ptlrpc_main+0xb2c/0x1460 [ptlrpc] [Thu Dec 12 00:59:34 2019][240154.502392] [] kthread+0xd1/0xe0 [Thu Dec 12 00:59:34 2019][240154.507404] [] ret_from_fork_nospec_begin+0xe/0x21 [Thu Dec 12 00:59:34 2019][240154.513970] [] 0xffffffffffffffff [Thu Dec 12 00:59:34 2019][240154.519101] LustreError: dumping log to /tmp/lustre-log.1576141174.67592 [Thu Dec 12 01:00:52 2019][240232.243942] Pid: 112494, comm: ll_ost00_075 3.10.0-957.27.2.el7_lustre.pl2.x86_64 #1 SMP Thu Nov 7 15:26:16 PST 2019 [Thu Dec 12 01:00:52 2019][240232.254546] Call Trace: [Thu Dec 12 01:00:52 2019][240232.257127] [] wait_transaction_locked+0x85/0xd0 [jbd2] [Thu Dec 12 01:00:52 2019][240232.264131] [] add_transaction_credits+0x268/0x2f0 [jbd2] [Thu Dec 12 01:00:52 2019][240232.271314] [] start_this_handle+0x1a1/0x430 [jbd2] [Thu Dec 12 01:00:52 2019][240232.277961] [] jbd2__journal_start+0xf3/0x1f0 [jbd2] [Thu Dec 12 01:00:52 2019][240232.284716] [] __ldiskfs_journal_start_sb+0x69/0xe0 [ldiskfs] [Thu Dec 12 01:00:52 2019][240232.292237] [] ldiskfs_acquire_dquot+0x53/0xb0 [ldiskfs] [Thu Dec 12 01:00:52 2019][240232.299340] [] dqget+0x3fa/0x450 [Thu Dec 12 01:00:52 2019][240232.304366] [] dquot_get_dqblk+0x14/0x1f0 [Thu Dec 12 01:00:52 2019][240232.310150] [] osd_acct_index_lookup+0x235/0x480 [osd_ldiskfs] [Thu Dec 12 01:00:52 2019][240232.317777] [] lquotactl_slv+0x27d/0x9d0 [lquota] [Thu Dec 12 01:00:52 2019][240232.324253] [] ofd_quotactl+0x13c/0x380 [ofd] [Thu Dec 12 01:00:52 2019][240232.330395] [] tgt_request_handle+0xaea/0x1580 
[ptlrpc] [Thu Dec 12 01:00:52 2019][240232.337433] [] ptlrpc_server_handle_request+0x24b/0xab0 [ptlrpc] [Thu Dec 12 01:00:52 2019][240232.345257] [] ptlrpc_main+0xb2c/0x1460 [ptlrpc] [Thu Dec 12 01:00:52 2019][240232.351674] [] kthread+0xd1/0xe0 [Thu Dec 12 01:00:52 2019][240232.356674] [] ret_from_fork_nospec_begin+0xe/0x21 [Thu Dec 12 01:00:52 2019][240232.363252] [] 0xffffffffffffffff [Thu Dec 12 01:00:52 2019][240232.368350] LustreError: dumping log to /tmp/lustre-log.1576141252.112494 [Thu Dec 12 01:00:56 2019][240236.340028] Pid: 67736, comm: ll_ost02_034 3.10.0-957.27.2.el7_lustre.pl2.x86_64 #1 SMP Thu Nov 7 15:26:16 PST 2019 [Thu Dec 12 01:00:56 2019][240236.350548] Call Trace: [Thu Dec 12 01:00:56 2019][240236.353118] [] wait_transaction_locked+0x85/0xd0 [jbd2] [Thu Dec 12 01:00:56 2019][240236.360121] [] add_transaction_credits+0x268/0x2f0 [jbd2] [Thu Dec 12 01:00:56 2019][240236.367304] [] start_this_handle+0x1a1/0x430 [jbd2] [Thu Dec 12 01:00:56 2019][240236.373954] [] jbd2__journal_start+0xf3/0x1f0 [jbd2] [Thu Dec 12 01:00:56 2019][240236.380704] [] __ldiskfs_journal_start_sb+0x69/0xe0 [ldiskfs] [Thu Dec 12 01:00:56 2019][240236.388229] [] ldiskfs_acquire_dquot+0x53/0xb0 [ldiskfs] [Thu Dec 12 01:00:56 2019][240236.395323] [] dqget+0x3fa/0x450 [Thu Dec 12 01:00:56 2019][240236.400344] [] dquot_get_dqblk+0x14/0x1f0 [Thu Dec 12 01:00:56 2019][240236.406140] [] osd_acct_index_lookup+0x235/0x480 [osd_ldiskfs] [Thu Dec 12 01:00:56 2019][240236.413754] [] lquotactl_slv+0x27d/0x9d0 [lquota] [Thu Dec 12 01:00:56 2019][240236.420242] [] ofd_quotactl+0x13c/0x380 [ofd] [Thu Dec 12 01:00:56 2019][240236.426372] [] tgt_request_handle+0xaea/0x1580 [ptlrpc] [Thu Dec 12 01:00:56 2019][240236.433433] [] ptlrpc_server_handle_request+0x24b/0xab0 [ptlrpc] [Thu Dec 12 01:00:56 2019][240236.441235] [] ptlrpc_main+0xb2c/0x1460 [ptlrpc] [Thu Dec 12 01:00:56 2019][240236.447664] [] kthread+0xd1/0xe0 [Thu Dec 12 01:00:56 2019][240236.452667] [] ret_from_fork_nospec_begin+0xe/0x21 [Thu Dec 12 01:00:56 2019][240236.459243] [] 0xffffffffffffffff [Thu Dec 12 01:00:56 2019][240236.464356] LustreError: dumping log to /tmp/lustre-log.1576141256.67736 [Thu Dec 12 01:01:00 2019][240240.436112] Pid: 67785, comm: ll_ost02_040 3.10.0-957.27.2.el7_lustre.pl2.x86_64 #1 SMP Thu Nov 7 15:26:16 PST 2019 [Thu Dec 12 01:01:00 2019][240240.446628] Call Trace: [Thu Dec 12 01:01:00 2019][240240.449190] [] wait_transaction_locked+0x85/0xd0 [jbd2] [Thu Dec 12 01:01:00 2019][240240.456184] [] add_transaction_credits+0x268/0x2f0 [jbd2] [Thu Dec 12 01:01:00 2019][240240.463367] [] start_this_handle+0x1a1/0x430 [jbd2] [Thu Dec 12 01:01:00 2019][240240.470018] [] jbd2__journal_start+0xf3/0x1f0 [jbd2] [Thu Dec 12 01:01:00 2019][240240.476766] [] __ldiskfs_journal_start_sb+0x69/0xe0 [ldiskfs] [Thu Dec 12 01:01:00 2019][240240.484291] [] ldiskfs_acquire_dquot+0x53/0xb0 [ldiskfs] [Thu Dec 12 01:01:00 2019][240240.491390] [] dqget+0x3fa/0x450 [Thu Dec 12 01:01:00 2019][240240.496391] [] dquot_get_dqblk+0x14/0x1f0 [Thu Dec 12 01:01:00 2019][240240.502199] [] osd_acct_index_lookup+0x235/0x480 [osd_ldiskfs] [Thu Dec 12 01:01:00 2019][240240.509799] [] lquotactl_slv+0x27d/0x9d0 [lquota] [Thu Dec 12 01:01:00 2019][240240.516286] [] ofd_quotactl+0x13c/0x380 [ofd] [Thu Dec 12 01:01:00 2019][240240.522417] [] tgt_request_handle+0xaea/0x1580 [ptlrpc] [Thu Dec 12 01:01:00 2019][240240.529469] [] ptlrpc_server_handle_request+0x24b/0xab0 [ptlrpc] [Thu Dec 12 01:01:00 2019][240240.537271] [] ptlrpc_main+0xb2c/0x1460 [ptlrpc] [Thu Dec 12 01:01:00 
2019][240240.543699] [] kthread+0xd1/0xe0 [Thu Dec 12 01:01:01 2019][240240.548704] [] ret_from_fork_nospec_begin+0xe/0x21 [Thu Dec 12 01:01:01 2019][240240.555269] [] 0xffffffffffffffff [Thu Dec 12 01:01:01 2019][240240.560377] LustreError: dumping log to /tmp/lustre-log.1576141260.67785 [Thu Dec 12 01:01:17 2019][240256.820444] Pid: 112566, comm: ll_ost02_086 3.10.0-957.27.2.el7_lustre.pl2.x86_64 #1 SMP Thu Nov 7 15:26:16 PST 2019 [Thu Dec 12 01:01:17 2019][240256.831052] Call Trace: [Thu Dec 12 01:01:17 2019][240256.833606] [] ldlm_completion_ast+0x4e5/0x860 [ptlrpc] [Thu Dec 12 01:01:17 2019][240256.840646] [] ldlm_cli_enqueue_local+0x231/0x830 [ptlrpc] [Thu Dec 12 01:01:17 2019][240256.847941] [] ofd_destroy_by_fid+0x1dd/0x4a0 [ofd] [Thu Dec 12 01:01:17 2019][240256.854590] [] ofd_destroy_hdl+0x267/0x970 [ofd] [Thu Dec 12 01:01:17 2019][240256.860991] [] tgt_request_handle+0xaea/0x1580 [ptlrpc] [Thu Dec 12 01:01:17 2019][240256.868034] [] ptlrpc_server_handle_request+0x24b/0xab0 [ptlrpc] [Thu Dec 12 01:01:17 2019][240256.875847] [] ptlrpc_main+0xb2c/0x1460 [ptlrpc] [Thu Dec 12 01:01:17 2019][240256.882282] [] kthread+0xd1/0xe0 [Thu Dec 12 01:01:17 2019][240256.887282] [] ret_from_fork_nospec_begin+0xe/0x21 [Thu Dec 12 01:01:17 2019][240256.893850] [] 0xffffffffffffffff [Thu Dec 12 01:01:17 2019][240256.898960] LustreError: dumping log to /tmp/lustre-log.1576141277.112566 [Thu Dec 12 01:01:20 2019][240260.434521] Lustre: fir-OST005a: deleting orphan objects from 0x1980000400:1125531 to 0x1980000400:1125569 [Thu Dec 12 01:01:31 2019][240270.998164] Lustre: fir-OST005e: deleting orphan objects from 0x0:27467286 to 0x0:27467329 [Thu Dec 12 01:02:15 2019][240314.616600] LustreError: 67741:0:(ldlm_request.c:129:ldlm_expired_completion_wait()) ### lock timed out (enqueued at 1576141035, 300s ago); not entering recovery in server code, just going back to sleep ns: filter-fir-OST005a_UUID lock: ffff88e75d68f740/0x7066c9c190ae6a91 lrc: 3/0,1 mode: --/PW res: [0x1980000400:0x112c98:0x0].0x0 rrc: 3 type: EXT [0->18446744073709551615] (req 0->18446744073709551615) flags: 0x40010080000000 nid: local remote: 0x0 expref: -99 pid: 67741 timeout: 0 lvb_type: 0 [Thu Dec 12 01:02:18 2019][240318.261695] LNet: Service thread pid 67675 was inactive for 1203.78s. Watchdog stack traces are limited to 3 per 300 seconds, skipping this one. 
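The ldlm_expired_completion_wait() "lock timed out" records above all carry the same fields: reporting PID, enqueue time, age, lock namespace, and resource. A short sketch for pulling those fields out of one such line follows; the pattern and the example line are taken from this capture and are illustrative only, not a published Lustre log schema.

import re

pattern = re.compile(
    r"(?P<pid>\d+):0:\(ldlm_request\.c:\d+:ldlm_expired_completion_wait\(\)\).*?"
    r"lock timed out \(enqueued at (?P<enqueued>\d+), (?P<age>\d+)s ago\).*?"
    r"ns: (?P<namespace>\S+) .*?res: \[(?P<resource>[^\]]+)\]"
)

# Example line copied from this capture (wrapped here for readability).
line = (
    "67741:0:(ldlm_request.c:129:ldlm_expired_completion_wait()) ### lock timed out "
    "(enqueued at 1576141035, 300s ago); not entering recovery in server code, just going "
    "back to sleep ns: filter-fir-OST005a_UUID lock: ffff88e75d68f740/0x7066c9c190ae6a91 "
    "lrc: 3/0,1 mode: --/PW res: [0x1980000400:0x112c98:0x0].0x0"
)

m = pattern.search(line)
if m:
    print(m.groupdict())
    # {'pid': '67741', 'enqueued': '1576141035', 'age': '300',
    #  'namespace': 'filter-fir-OST005a_UUID', 'resource': '0x1980000400:0x112c98:0x0'}

Grouping these records by namespace (filter-fir-OST0054_UUID, OST0058, OST005a, OST005e above) shows the timeouts are spread across several targets on this OSS rather than confined to one.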
[Thu Dec 12 01:02:18 2019][240318.274731] LNet: Skipped 6 previous similar messages [Thu Dec 12 01:02:18 2019][240318.279879] LustreError: dumping log to /tmp/lustre-log.1576141338.67675 [Thu Dec 12 01:02:47 2019][240346.934237] LustreError: dumping log to /tmp/lustre-log.1576141367.112557 [Thu Dec 12 01:02:55 2019][240355.126422] LustreError: dumping log to /tmp/lustre-log.1576141375.67687 [Thu Dec 12 01:02:59 2019][240359.222485] LustreError: dumping log to /tmp/lustre-log.1576141379.67629 [Thu Dec 12 01:03:03 2019][240363.318573] LustreError: dumping log to /tmp/lustre-log.1576141383.67682 [Thu Dec 12 01:03:11 2019][240371.510737] LustreError: dumping log to /tmp/lustre-log.1576141391.67599 [Thu Dec 12 01:03:15 2019][240375.606814] LustreError: dumping log to /tmp/lustre-log.1576141395.112499 [Thu Dec 12 01:03:54 2019][240414.463197] Lustre: fir-OST0056: Client 8c3ccd99-dc20-24d2-79f4-f8d2c0329cfb (at 10.8.24.19@o2ib6) reconnecting [Thu Dec 12 01:03:54 2019][240414.473370] Lustre: Skipped 412 previous similar messages [Thu Dec 12 01:03:56 2019][240416.567635] LustreError: dumping log to /tmp/lustre-log.1576141436.112543 [Thu Dec 12 01:04:05 2019][240424.759800] LustreError: dumping log to /tmp/lustre-log.1576141445.67709 [Thu Dec 12 01:04:09 2019][240428.855879] LustreError: dumping log to /tmp/lustre-log.1576141449.67814 [Thu Dec 12 01:04:13 2019][240432.952007] LustreError: dumping log to /tmp/lustre-log.1576141453.112517 [Thu Dec 12 01:04:45 2019][240465.432406] Lustre: fir-OST0054: deleting orphan objects from 0x1800000401:11785188 to 0x1800000401:11785217 [Thu Dec 12 01:04:47 2019][240467.576679] Lustre: 26941:0:(service.c:1372:ptlrpc_at_send_early_reply()) @@@ Couldn't add any time (5/5), not sending early reply [Thu Dec 12 01:04:47 2019][240467.576679] req@ffff88fd6e0f9050 x1650929716577312/t0(0) o10->68d8bbaa-aab1-a78f-323f-aaff4c375c30@10.8.7.4@o2ib6:472/0 lens 440/0 e 0 to 0 dl 1576141492 ref 2 fl New:/2/ffffffff rc 0/-1 [Thu Dec 12 01:04:48 2019][240467.605315] Lustre: 26941:0:(service.c:1372:ptlrpc_at_send_early_reply()) Skipped 700 previous similar messages [Thu Dec 12 01:04:54 2019][240473.912790] Pid: 112528, comm: ll_ost01_085 3.10.0-957.27.2.el7_lustre.pl2.x86_64 #1 SMP Thu Nov 7 15:26:16 PST 2019 [Thu Dec 12 01:04:54 2019][240473.923403] Call Trace: [Thu Dec 12 01:04:54 2019][240473.925961] [] ldlm_completion_ast+0x4e5/0x860 [ptlrpc] [Thu Dec 12 01:04:54 2019][240473.933002] [] ldlm_cli_enqueue_local+0x231/0x830 [ptlrpc] [Thu Dec 12 01:04:54 2019][240473.940317] [] ofd_destroy_by_fid+0x1dd/0x4a0 [ofd] [Thu Dec 12 01:04:54 2019][240473.946979] [] ofd_destroy_hdl+0x267/0x970 [ofd] [Thu Dec 12 01:04:54 2019][240473.953396] [] tgt_request_handle+0xaea/0x1580 [ptlrpc] [Thu Dec 12 01:04:54 2019][240473.960439] [] ptlrpc_server_handle_request+0x24b/0xab0 [ptlrpc] [Thu Dec 12 01:04:54 2019][240473.968271] [] ptlrpc_main+0xb2c/0x1460 [ptlrpc] [Thu Dec 12 01:04:54 2019][240473.974688] [] kthread+0xd1/0xe0 [Thu Dec 12 01:04:54 2019][240473.979689] [] ret_from_fork_nospec_begin+0xe/0x21 [Thu Dec 12 01:04:54 2019][240473.986265] [] 0xffffffffffffffff [Thu Dec 12 01:04:54 2019][240473.991373] LustreError: dumping log to /tmp/lustre-log.1576141494.112528 [Thu Dec 12 01:04:54 2019][240473.998923] Pid: 67595, comm: ll_ost00_008 3.10.0-957.27.2.el7_lustre.pl2.x86_64 #1 SMP Thu Nov 7 15:26:16 PST 2019 [Thu Dec 12 01:04:54 2019][240474.009462] Call Trace: [Thu Dec 12 01:04:54 2019][240474.012021] [] wait_transaction_locked+0x85/0xd0 [jbd2] [Thu Dec 12 01:04:54 2019][240474.019018] [] 
add_transaction_credits+0x268/0x2f0 [jbd2] [Thu Dec 12 01:04:54 2019][240474.026201] [] start_this_handle+0x1a1/0x430 [jbd2] [Thu Dec 12 01:04:54 2019][240474.032851] [] jbd2__journal_start+0xf3/0x1f0 [jbd2] [Thu Dec 12 01:04:54 2019][240474.039586] [] __ldiskfs_journal_start_sb+0x69/0xe0 [ldiskfs] [Thu Dec 12 01:04:54 2019][240474.047123] [] ldiskfs_acquire_dquot+0x53/0xb0 [ldiskfs] [Thu Dec 12 01:04:54 2019][240474.054216] [] dqget+0x3fa/0x450 [Thu Dec 12 01:04:54 2019][240474.059231] [] dquot_get_dqblk+0x14/0x1f0 [Thu Dec 12 01:04:54 2019][240474.065006] [] osd_acct_index_lookup+0x235/0x480 [osd_ldiskfs] [Thu Dec 12 01:04:54 2019][240474.072629] [] lquotactl_slv+0x27d/0x9d0 [lquota] [Thu Dec 12 01:04:54 2019][240474.079106] [] ofd_quotactl+0x13c/0x380 [ofd] [Thu Dec 12 01:04:54 2019][240474.085249] [] tgt_request_handle+0xaea/0x1580 [ptlrpc] [Thu Dec 12 01:04:54 2019][240474.092279] [] ptlrpc_server_handle_request+0x24b/0xab0 [ptlrpc] [Thu Dec 12 01:04:54 2019][240474.100094] [] ptlrpc_main+0xb2c/0x1460 [ptlrpc] [Thu Dec 12 01:04:54 2019][240474.106510] [] kthread+0xd1/0xe0 [Thu Dec 12 01:04:54 2019][240474.111503] [] ret_from_fork_nospec_begin+0xe/0x21 [Thu Dec 12 01:04:54 2019][240474.118073] [] 0xffffffffffffffff [Thu Dec 12 01:05:09 2019][240488.767686] Lustre: fir-OST0056: Export ffff88fc8fff5800 already connecting from 10.8.22.17@o2ib6 [Thu Dec 12 01:05:09 2019][240488.776652] Lustre: Skipped 71 previous similar messages [Thu Dec 12 01:05:10 2019][240489.986614] Lustre: fir-OST005c: Connection restored to cec884d3-ca4b-8127-2f6b-7762665aa5f8 (at 10.9.0.64@o2ib4) [Thu Dec 12 01:05:10 2019][240489.996967] Lustre: Skipped 365 previous similar messages [Thu Dec 12 01:05:14 2019][240494.393191] Pid: 67651, comm: ll_ost02_021 3.10.0-957.27.2.el7_lustre.pl2.x86_64 #1 SMP Thu Nov 7 15:26:16 PST 2019 [Thu Dec 12 01:05:14 2019][240494.403725] Call Trace: [Thu Dec 12 01:05:14 2019][240494.406297] [] ofd_create_hdl+0xcb3/0x20e0 [ofd] [Thu Dec 12 01:05:14 2019][240494.412703] [] tgt_request_handle+0xaea/0x1580 [ptlrpc] [Thu Dec 12 01:05:14 2019][240494.419773] [] ptlrpc_server_handle_request+0x24b/0xab0 [ptlrpc] [Thu Dec 12 01:05:14 2019][240494.427610] [] ptlrpc_main+0xb2c/0x1460 [ptlrpc] [Thu Dec 12 01:05:14 2019][240494.434032] [] kthread+0xd1/0xe0 [Thu Dec 12 01:05:14 2019][240494.439037] [] ret_from_fork_nospec_begin+0xe/0x21 [Thu Dec 12 01:05:14 2019][240494.445631] [] 0xffffffffffffffff [Thu Dec 12 01:05:14 2019][240494.450750] LustreError: dumping log to /tmp/lustre-log.1576141514.67651 [Thu Dec 12 01:06:30 2019][240569.618538] Lustre: fir-OST005a: deleting orphan objects from 0x1980000401:11841959 to 0x1980000401:11841985 [Thu Dec 12 01:06:44 2019][240584.506993] Pid: 67889, comm: ll_ost02_059 3.10.0-957.27.2.el7_lustre.pl2.x86_64 #1 SMP Thu Nov 7 15:26:16 PST 2019 [Thu Dec 12 01:06:44 2019][240584.517517] Call Trace: [Thu Dec 12 01:06:44 2019][240584.520095] [] wait_transaction_locked+0x85/0xd0 [jbd2] [Thu Dec 12 01:06:44 2019][240584.527093] [] add_transaction_credits+0x268/0x2f0 [jbd2] [Thu Dec 12 01:06:44 2019][240584.534273] [] start_this_handle+0x1a1/0x430 [jbd2] [Thu Dec 12 01:06:44 2019][240584.540924] [] jbd2__journal_start+0xf3/0x1f0 [jbd2] [Thu Dec 12 01:06:44 2019][240584.547672] [] __ldiskfs_journal_start_sb+0x69/0xe0 [ldiskfs] [Thu Dec 12 01:06:44 2019][240584.555212] [] ldiskfs_acquire_dquot+0x53/0xb0 [ldiskfs] [Thu Dec 12 01:06:44 2019][240584.562310] [] dqget+0x3fa/0x450 [Thu Dec 12 01:06:44 2019][240584.567314] [] dquot_get_dqblk+0x14/0x1f0 [Thu Dec 12 01:06:44 
2019][240584.573108] [] osd_acct_index_lookup+0x235/0x480 [osd_ldiskfs] [Thu Dec 12 01:06:44 2019][240584.580721] [] lquotactl_slv+0x27d/0x9d0 [lquota] [Thu Dec 12 01:06:44 2019][240584.587209] [] ofd_quotactl+0x13c/0x380 [ofd] [Thu Dec 12 01:06:44 2019][240584.593339] [] tgt_request_handle+0xaea/0x1580 [ptlrpc] [Thu Dec 12 01:06:45 2019][240584.600401] [] ptlrpc_server_handle_request+0x24b/0xab0 [ptlrpc] [Thu Dec 12 01:06:45 2019][240584.608205] [] ptlrpc_main+0xb2c/0x1460 [ptlrpc] [Thu Dec 12 01:06:45 2019][240584.614634] [] kthread+0xd1/0xe0 [Thu Dec 12 01:06:45 2019][240584.619649] [] ret_from_fork_nospec_begin+0xe/0x21 [Thu Dec 12 01:06:45 2019][240584.626228] [] 0xffffffffffffffff [Thu Dec 12 01:06:45 2019][240584.631329] LustreError: dumping log to /tmp/lustre-log.1576141605.67889 [Thu Dec 12 01:06:53 2019][240592.699158] Pid: 66242, comm: ll_ost03_003 3.10.0-957.27.2.el7_lustre.pl2.x86_64 #1 SMP Thu Nov 7 15:26:16 PST 2019 [Thu Dec 12 01:06:53 2019][240592.709677] Call Trace: [Thu Dec 12 01:06:53 2019][240592.712246] [] wait_transaction_locked+0x85/0xd0 [jbd2] [Thu Dec 12 01:06:53 2019][240592.719242] [] add_transaction_credits+0x268/0x2f0 [jbd2] [Thu Dec 12 01:06:53 2019][240592.726431] [] start_this_handle+0x1a1/0x430 [jbd2] [Thu Dec 12 01:06:53 2019][240592.733078] [] jbd2__journal_start+0xf3/0x1f0 [jbd2] [Thu Dec 12 01:06:53 2019][240592.739825] [] __ldiskfs_journal_start_sb+0x69/0xe0 [ldiskfs] [Thu Dec 12 01:06:53 2019][240592.747365] [] ldiskfs_acquire_dquot+0x53/0xb0 [ldiskfs] [Thu Dec 12 01:06:53 2019][240592.754463] [] dqget+0x3fa/0x450 [Thu Dec 12 01:06:53 2019][240592.759467] [] dquot_get_dqblk+0x14/0x1f0 [Thu Dec 12 01:06:53 2019][240592.765254] [] osd_acct_index_lookup+0x235/0x480 [osd_ldiskfs] [Thu Dec 12 01:06:53 2019][240592.772865] [] lquotactl_slv+0x27d/0x9d0 [lquota] [Thu Dec 12 01:06:53 2019][240592.779355] [] ofd_quotactl+0x13c/0x380 [ofd] [Thu Dec 12 01:06:53 2019][240592.785485] [] tgt_request_handle+0xaea/0x1580 [ptlrpc] [Thu Dec 12 01:06:53 2019][240592.792544] [] ptlrpc_server_handle_request+0x24b/0xab0 [ptlrpc] [Thu Dec 12 01:06:53 2019][240592.800350] [] ptlrpc_main+0xb2c/0x1460 [ptlrpc] [Thu Dec 12 01:06:53 2019][240592.806778] [] kthread+0xd1/0xe0 [Thu Dec 12 01:06:53 2019][240592.811794] [] ret_from_fork_nospec_begin+0xe/0x21 [Thu Dec 12 01:06:53 2019][240592.818373] [] 0xffffffffffffffff [Thu Dec 12 01:06:53 2019][240592.823464] LustreError: dumping log to /tmp/lustre-log.1576141613.66242 [Thu Dec 12 01:06:57 2019][240596.795243] LustreError: dumping log to /tmp/lustre-log.1576141617.67716 [Thu Dec 12 01:07:25 2019][240625.467815] LustreError: dumping log to /tmp/lustre-log.1576141645.67677 [Thu Dec 12 01:07:29 2019][240629.563893] LustreError: dumping log to /tmp/lustre-log.1576141649.67878 [Thu Dec 12 01:07:34 2019][240633.659971] LustreError: dumping log to /tmp/lustre-log.1576141654.67778 [Thu Dec 12 01:07:38 2019][240637.756058] LustreError: dumping log to /tmp/lustre-log.1576141658.112501 [Thu Dec 12 01:07:40 2019][240640.052001] Lustre: fir-OST005a: deleting orphan objects from 0x1980000402:3089578 to 0x1980000402:3089601 [Thu Dec 12 01:08:09 2019][240669.116318] Lustre: fir-OST0058: deleting orphan objects from 0x1900000401:11797283 to 0x1900000401:11797313 [Thu Dec 12 01:08:56 2019][240715.581615] LustreError: dumping log to /tmp/lustre-log.1576141735.112521 [Thu Dec 12 01:09:08 2019][240727.869858] LustreError: dumping log to /tmp/lustre-log.1576141748.67607 [Thu Dec 12 01:09:13 2019][240733.302978] LustreError: 
112496:0:(ldlm_request.c:129:ldlm_expired_completion_wait()) ### lock timed out (enqueued at 1576141453, 300s ago); not entering recovery in server code, just going back to sleep ns: filter-fir-OST0058_UUID lock: ffff89134edb5580/0x7066c9c190ae820e lrc: 3/0,1 mode: --/PW res: [0x1a3b7cb:0x0:0x0].0x0 rrc: 3 type: EXT [0->18446744073709551615] (req 0->18446744073709551615) flags: 0x40010080000000 nid: local remote: 0x0 expref: -99 pid: 112496 timeout: 0 lvb_type: 0 [Thu Dec 12 01:10:05 2019][240784.663581] Lustre: fir-OST005e: deleting orphan objects from 0x1a80000401:1122023 to 0x1a80000401:1122049 [Thu Dec 12 01:11:21 2019][240861.033099] Lustre: fir-OST005a: deleting orphan objects from 0x1980000400:1125573 to 0x1980000400:1125601 [Thu Dec 12 01:11:23 2019][240863.040457] LNet: Service thread pid 67744 was inactive for 1203.92s. The thread might be hung, or it might only be slow and will resume later. Dumping the stack trace for debugging purposes: [Thu Dec 12 01:11:23 2019][240863.057568] LNet: Skipped 9 previous similar messages [Thu Dec 12 01:11:23 2019][240863.062720] Pid: 67744, comm: ll_ost01_045 3.10.0-957.27.2.el7_lustre.pl2.x86_64 #1 SMP Thu Nov 7 15:26:16 PST 2019 [Thu Dec 12 01:11:23 2019][240863.073256] Call Trace: [Thu Dec 12 01:11:23 2019][240863.075816] [] ldlm_completion_ast+0x4e5/0x860 [ptlrpc] [Thu Dec 12 01:11:23 2019][240863.082855] [] ldlm_cli_enqueue_local+0x231/0x830 [ptlrpc] [Thu Dec 12 01:11:23 2019][240863.090152] [] ofd_destroy_by_fid+0x1dd/0x4a0 [ofd] [Thu Dec 12 01:11:23 2019][240863.096808] [] ofd_destroy_hdl+0x267/0x970 [ofd] [Thu Dec 12 01:11:23 2019][240863.103211] [] tgt_request_handle+0xaea/0x1580 [ptlrpc] [Thu Dec 12 01:11:23 2019][240863.110268] [] ptlrpc_server_handle_request+0x24b/0xab0 [ptlrpc] [Thu Dec 12 01:11:23 2019][240863.118083] [] ptlrpc_main+0xb2c/0x1460 [ptlrpc] [Thu Dec 12 01:11:23 2019][240863.124499] [] kthread+0xd1/0xe0 [Thu Dec 12 01:11:23 2019][240863.129515] [] ret_from_fork_nospec_begin+0xe/0x21 [Thu Dec 12 01:11:23 2019][240863.136080] [] 0xffffffffffffffff [Thu Dec 12 01:11:23 2019][240863.141201] LustreError: dumping log to /tmp/lustre-log.1576141883.67744 [Thu Dec 12 01:12:00 2019][240899.905137] Pid: 67865, comm: ll_ost02_057 3.10.0-957.27.2.el7_lustre.pl2.x86_64 #1 SMP Thu Nov 7 15:26:16 PST 2019 [Thu Dec 12 01:12:00 2019][240899.915661] Call Trace: [Thu Dec 12 01:12:00 2019][240899.918230] [] wait_transaction_locked+0x85/0xd0 [jbd2] [Thu Dec 12 01:12:00 2019][240899.925235] [] add_transaction_credits+0x268/0x2f0 [jbd2] [Thu Dec 12 01:12:00 2019][240899.932424] [] start_this_handle+0x1a1/0x430 [jbd2] [Thu Dec 12 01:12:00 2019][240899.939095] [] jbd2__journal_start+0xf3/0x1f0 [jbd2] [Thu Dec 12 01:12:00 2019][240899.945845] [] __ldiskfs_journal_start_sb+0x69/0xe0 [ldiskfs] [Thu Dec 12 01:12:00 2019][240899.953371] [] osd_trans_start+0x20e/0x4e0 [osd_ldiskfs] [Thu Dec 12 01:12:00 2019][240899.960483] [] tgt_client_data_update+0x303/0x5e0 [ptlrpc] [Thu Dec 12 01:12:00 2019][240899.967802] [] tgt_client_new+0x41b/0x610 [ptlrpc] [Thu Dec 12 01:12:00 2019][240899.974420] [] ofd_obd_connect+0x3a3/0x4c0 [ofd] [Thu Dec 12 01:12:00 2019][240899.980811] [] target_handle_connect+0xecb/0x2b10 [ptlrpc] [Thu Dec 12 01:12:00 2019][240899.988112] [] tgt_request_handle+0x50a/0x1580 [ptlrpc] [Thu Dec 12 01:12:00 2019][240899.995145] [] ptlrpc_server_handle_request+0x24b/0xab0 [ptlrpc] [Thu Dec 12 01:12:00 2019][240900.002973] [] ptlrpc_main+0xb2c/0x1460 [ptlrpc] [Thu Dec 12 01:12:00 2019][240900.009393] [] kthread+0xd1/0xe0 [Thu Dec 12 01:12:00 
2019][240900.014410] [] ret_from_fork_nospec_begin+0xe/0x21 [Thu Dec 12 01:12:00 2019][240900.020973] [] 0xffffffffffffffff [Thu Dec 12 01:12:00 2019][240900.026094] LustreError: dumping log to /tmp/lustre-log.1576141920.67865 [Thu Dec 12 01:12:01 2019][240901.521950] Lustre: fir-OST0054: deleting orphan objects from 0x1800000402:1117319 to 0x1800000402:1117345 [Thu Dec 12 01:12:16 2019][240916.289468] Pid: 112529, comm: ll_ost00_086 3.10.0-957.27.2.el7_lustre.pl2.x86_64 #1 SMP Thu Nov 7 15:26:16 PST 2019 [Thu Dec 12 01:12:16 2019][240916.300076] Call Trace: [Thu Dec 12 01:12:16 2019][240916.302647] [] wait_transaction_locked+0x85/0xd0 [jbd2] [Thu Dec 12 01:12:16 2019][240916.309643] [] add_transaction_credits+0x268/0x2f0 [jbd2] [Thu Dec 12 01:12:16 2019][240916.316830] [] start_this_handle+0x1a1/0x430 [jbd2] [Thu Dec 12 01:12:16 2019][240916.323491] [] jbd2__journal_start+0xf3/0x1f0 [jbd2] [Thu Dec 12 01:12:16 2019][240916.330243] [] __ldiskfs_journal_start_sb+0x69/0xe0 [ldiskfs] [Thu Dec 12 01:12:16 2019][240916.337767] [] ldiskfs_acquire_dquot+0x53/0xb0 [ldiskfs] [Thu Dec 12 01:12:16 2019][240916.344864] [] dqget+0x3fa/0x450 [Thu Dec 12 01:12:16 2019][240916.349868] [] dquot_get_dqblk+0x14/0x1f0 [Thu Dec 12 01:12:16 2019][240916.355638] [] osd_acct_index_lookup+0x235/0x480 [osd_ldiskfs] [Thu Dec 12 01:12:16 2019][240916.363264] [] lquotactl_slv+0x27d/0x9d0 [lquota] [Thu Dec 12 01:12:16 2019][240916.369739] [] ofd_quotactl+0x13c/0x380 [ofd] [Thu Dec 12 01:12:16 2019][240916.375883] [] tgt_request_handle+0xaea/0x1580 [ptlrpc] [Thu Dec 12 01:12:16 2019][240916.382922] [] ptlrpc_server_handle_request+0x24b/0xab0 [ptlrpc] [Thu Dec 12 01:12:16 2019][240916.390749] [] ptlrpc_main+0xb2c/0x1460 [ptlrpc] [Thu Dec 12 01:12:16 2019][240916.397162] [] kthread+0xd1/0xe0 [Thu Dec 12 01:12:16 2019][240916.402176] [] ret_from_fork_nospec_begin+0xe/0x21 [Thu Dec 12 01:12:16 2019][240916.408739] [] 0xffffffffffffffff [Thu Dec 12 01:12:16 2019][240916.413853] LustreError: dumping log to /tmp/lustre-log.1576141936.112529 [Thu Dec 12 01:13:01 2019][240961.346362] Pid: 66097, comm: ll_ost01_000 3.10.0-957.27.2.el7_lustre.pl2.x86_64 #1 SMP Thu Nov 7 15:26:16 PST 2019 [Thu Dec 12 01:13:01 2019][240961.356882] Call Trace: [Thu Dec 12 01:13:01 2019][240961.359434] [] ofd_create_hdl+0xcb3/0x20e0 [ofd] [Thu Dec 12 01:13:01 2019][240961.365827] [] tgt_request_handle+0xaea/0x1580 [ptlrpc] [Thu Dec 12 01:13:01 2019][240961.372875] [] ptlrpc_server_handle_request+0x24b/0xab0 [ptlrpc] [Thu Dec 12 01:13:01 2019][240961.380680] [] ptlrpc_main+0xb2c/0x1460 [ptlrpc] [Thu Dec 12 01:13:01 2019][240961.387113] [] kthread+0xd1/0xe0 [Thu Dec 12 01:13:01 2019][240961.392113] [] ret_from_fork_nospec_begin+0xe/0x21 [Thu Dec 12 01:13:01 2019][240961.398689] [] 0xffffffffffffffff [Thu Dec 12 01:13:01 2019][240961.403796] LustreError: dumping log to /tmp/lustre-log.1576141981.66097 [Thu Dec 12 01:13:55 2019][241015.435044] Lustre: fir-OST0056: Client 8c3ccd99-dc20-24d2-79f4-f8d2c0329cfb (at 10.8.24.19@o2ib6) reconnecting [Thu Dec 12 01:13:55 2019][241015.445224] Lustre: Skipped 494 previous similar messages [Thu Dec 12 01:14:08 2019][241028.022137] Lustre: fir-OST005e: deleting orphan objects from 0x0:27467286 to 0x0:27467361 [Thu Dec 12 01:14:29 2019][241048.732841] Lustre: fir-OST005c: deleting orphan objects from 0x1a00000402:1104956 to 0x1a00000402:1105313 [Thu Dec 12 01:14:48 2019][241068.036509] Lustre: 26941:0:(service.c:1372:ptlrpc_at_send_early_reply()) @@@ Couldn't add any time (5/5), not sending early reply [Thu Dec 12 
01:14:48 2019][241068.036509] req@ffff88f8ca67e050 x1650929716577312/t0(0) o10->68d8bbaa-aab1-a78f-323f-aaff4c375c30@10.8.7.4@o2ib6:318/0 lens 440/0 e 0 to 0 dl 1576142093 ref 2 fl New:/2/ffffffff rc 0/-1 [Thu Dec 12 01:14:48 2019][241068.065164] Lustre: 26941:0:(service.c:1372:ptlrpc_at_send_early_reply()) Skipped 970 previous similar messages [Thu Dec 12 01:14:52 2019][241071.940600] Pid: 67604, comm: ll_ost00_010 3.10.0-957.27.2.el7_lustre.pl2.x86_64 #1 SMP Thu Nov 7 15:26:16 PST 2019 [Thu Dec 12 01:14:52 2019][241071.951129] Call Trace: [Thu Dec 12 01:14:52 2019][241071.953697] [] wait_transaction_locked+0x85/0xd0 [jbd2] [Thu Dec 12 01:14:52 2019][241071.960713] [] add_transaction_credits+0x268/0x2f0 [jbd2] [Thu Dec 12 01:14:52 2019][241071.967898] [] start_this_handle+0x1a1/0x430 [jbd2] [Thu Dec 12 01:14:52 2019][241071.974544] [] jbd2__journal_start+0xf3/0x1f0 [jbd2] [Thu Dec 12 01:14:52 2019][241071.981293] [] __ldiskfs_journal_start_sb+0x69/0xe0 [ldiskfs] [Thu Dec 12 01:14:52 2019][241071.988820] [] ldiskfs_acquire_dquot+0x53/0xb0 [ldiskfs] [Thu Dec 12 01:14:52 2019][241071.995916] [] dqget+0x3fa/0x450 [Thu Dec 12 01:14:52 2019][241072.000920] [] dquot_get_dqblk+0x14/0x1f0 [Thu Dec 12 01:14:52 2019][241072.006692] [] osd_acct_index_lookup+0x235/0x480 [osd_ldiskfs] [Thu Dec 12 01:14:52 2019][241072.014309] [] lquotactl_slv+0x27d/0x9d0 [lquota] [Thu Dec 12 01:14:52 2019][241072.020783] [] ofd_quotactl+0x13c/0x380 [ofd] [Thu Dec 12 01:14:52 2019][241072.026926] [] tgt_request_handle+0xaea/0x1580 [ptlrpc] [Thu Dec 12 01:14:52 2019][241072.033966] [] ptlrpc_server_handle_request+0x24b/0xab0 [ptlrpc] [Thu Dec 12 01:14:52 2019][241072.041781] [] ptlrpc_main+0xb2c/0x1460 [ptlrpc] [Thu Dec 12 01:14:52 2019][241072.048197] [] kthread+0xd1/0xe0 [Thu Dec 12 01:14:52 2019][241072.053213] [] ret_from_fork_nospec_begin+0xe/0x21 [Thu Dec 12 01:14:52 2019][241072.059790] [] 0xffffffffffffffff [Thu Dec 12 01:14:52 2019][241072.064906] LustreError: dumping log to /tmp/lustre-log.1576142092.67604 [Thu Dec 12 01:15:11 2019][241090.891516] Lustre: fir-OST0056: Export ffff88fc8fff5800 already connecting from 10.8.22.17@o2ib6 [Thu Dec 12 01:15:11 2019][241090.900478] Lustre: Skipped 71 previous similar messages [Thu Dec 12 01:15:11 2019][241090.992913] Lustre: fir-OST005c: Connection restored to cec884d3-ca4b-8127-2f6b-7762665aa5f8 (at 10.9.0.64@o2ib4) [Thu Dec 12 01:15:11 2019][241091.003253] Lustre: Skipped 502 previous similar messages [Thu Dec 12 01:15:21 2019][241100.613140] LNet: Service thread pid 67739 was inactive for 1201.53s. Watchdog stack traces are limited to 3 per 300 seconds, skipping this one. 
[Thu Dec 12 01:15:21 2019][241100.626170] LNet: Skipped 22 previous similar messages [Thu Dec 12 01:15:21 2019][241100.631400] LustreError: dumping log to /tmp/lustre-log.1576142121.67739 [Thu Dec 12 01:16:49 2019][241188.704342] Lustre: fir-OST0058: deleting orphan objects from 0x0:27506640 to 0x0:27506657 [Thu Dec 12 01:16:51 2019][241190.726934] Pid: 67743, comm: ll_ost03_040 3.10.0-957.27.2.el7_lustre.pl2.x86_64 #1 SMP Thu Nov 7 15:26:16 PST 2019 [Thu Dec 12 01:16:51 2019][241190.737459] Call Trace: [Thu Dec 12 01:16:51 2019][241190.740019] [] ofd_create_hdl+0xcb3/0x20e0 [ofd] [Thu Dec 12 01:16:51 2019][241190.746416] [] tgt_request_handle+0xaea/0x1580 [ptlrpc] [Thu Dec 12 01:16:51 2019][241190.753481] [] ptlrpc_server_handle_request+0x24b/0xab0 [ptlrpc] [Thu Dec 12 01:16:51 2019][241190.761282] [] ptlrpc_main+0xb2c/0x1460 [ptlrpc] [Thu Dec 12 01:16:51 2019][241190.767717] [] kthread+0xd1/0xe0 [Thu Dec 12 01:16:51 2019][241190.772738] [] ret_from_fork_nospec_begin+0xe/0x21 [Thu Dec 12 01:16:51 2019][241190.779333] [] 0xffffffffffffffff [Thu Dec 12 01:16:51 2019][241190.784484] LustreError: dumping log to /tmp/lustre-log.1576142211.67743 [Thu Dec 12 01:17:15 2019][241215.303418] Pid: 67741, comm: ll_ost01_044 3.10.0-957.27.2.el7_lustre.pl2.x86_64 #1 SMP Thu Nov 7 15:26:16 PST 2019 [Thu Dec 12 01:17:15 2019][241215.313938] Call Trace: [Thu Dec 12 01:17:15 2019][241215.316490] [] ldlm_completion_ast+0x4e5/0x860 [ptlrpc] [Thu Dec 12 01:17:15 2019][241215.323533] [] ldlm_cli_enqueue_local+0x231/0x830 [ptlrpc] [Thu Dec 12 01:17:15 2019][241215.330828] [] ofd_destroy_by_fid+0x1dd/0x4a0 [ofd] [Thu Dec 12 01:17:15 2019][241215.337476] [] ofd_destroy_hdl+0x267/0x970 [ofd] [Thu Dec 12 01:17:15 2019][241215.343878] [] tgt_request_handle+0xaea/0x1580 [ptlrpc] [Thu Dec 12 01:17:15 2019][241215.350937] [] ptlrpc_server_handle_request+0x24b/0xab0 [ptlrpc] [Thu Dec 12 01:17:15 2019][241215.358753] [] ptlrpc_main+0xb2c/0x1460 [ptlrpc] [Thu Dec 12 01:17:15 2019][241215.365168] [] kthread+0xd1/0xe0 [Thu Dec 12 01:17:15 2019][241215.370167] [] ret_from_fork_nospec_begin+0xe/0x21 [Thu Dec 12 01:17:15 2019][241215.376736] [] 0xffffffffffffffff [Thu Dec 12 01:17:15 2019][241215.381845] LustreError: dumping log to /tmp/lustre-log.1576142235.67741 [Thu Dec 12 01:17:21 2019][241220.735308] Lustre: fir-OST0054: deleting orphan objects from 0x1800000401:11785223 to 0x1800000401:11785249 [Thu Dec 12 01:17:27 2019][241227.591675] Pid: 112495, comm: ll_ost00_076 3.10.0-957.27.2.el7_lustre.pl2.x86_64 #1 SMP Thu Nov 7 15:26:16 PST 2019 [Thu Dec 12 01:17:27 2019][241227.602275] Call Trace: [Thu Dec 12 01:17:27 2019][241227.604843] [] wait_transaction_locked+0x85/0xd0 [jbd2] [Thu Dec 12 01:17:27 2019][241227.611842] [] add_transaction_credits+0x268/0x2f0 [jbd2] [Thu Dec 12 01:17:28 2019][241227.619027] [] start_this_handle+0x1a1/0x430 [jbd2] [Thu Dec 12 01:17:28 2019][241227.625674] [] jbd2__journal_start+0xf3/0x1f0 [jbd2] [Thu Dec 12 01:17:28 2019][241227.632425] [] __ldiskfs_journal_start_sb+0x69/0xe0 [ldiskfs] [Thu Dec 12 01:17:28 2019][241227.639949] [] ldiskfs_acquire_dquot+0x53/0xb0 [ldiskfs] [Thu Dec 12 01:17:28 2019][241227.647045] [] dqget+0x3fa/0x450 [Thu Dec 12 01:17:28 2019][241227.652048] [] dquot_get_dqblk+0x14/0x1f0 [Thu Dec 12 01:17:28 2019][241227.657821] [] osd_acct_index_lookup+0x235/0x480 [osd_ldiskfs] [Thu Dec 12 01:17:28 2019][241227.665445] [] lquotactl_slv+0x27d/0x9d0 [lquota] [Thu Dec 12 01:17:28 2019][241227.671929] [] ofd_quotactl+0x13c/0x380 [ofd] [Thu Dec 12 01:17:28 2019][241227.678072] [] 
tgt_request_handle+0xaea/0x1580 [ptlrpc] [Thu Dec 12 01:17:28 2019][241227.685112] [] ptlrpc_server_handle_request+0x24b/0xab0 [ptlrpc] [Thu Dec 12 01:17:28 2019][241227.692927] [] ptlrpc_main+0xb2c/0x1460 [ptlrpc] [Thu Dec 12 01:17:28 2019][241227.699342] [] kthread+0xd1/0xe0 [Thu Dec 12 01:17:28 2019][241227.704359] [] ret_from_fork_nospec_begin+0xe/0x21 [Thu Dec 12 01:17:28 2019][241227.710923] [] 0xffffffffffffffff [Thu Dec 12 01:17:28 2019][241227.716033] LustreError: dumping log to /tmp/lustre-log.1576142248.112495 [Thu Dec 12 01:17:48 2019][241248.072071] Pid: 67793, comm: ll_ost02_042 3.10.0-957.27.2.el7_lustre.pl2.x86_64 #1 SMP Thu Nov 7 15:26:16 PST 2019 [Thu Dec 12 01:17:48 2019][241248.082590] Call Trace: [Thu Dec 12 01:17:48 2019][241248.085141] [] ofd_create_hdl+0xcb3/0x20e0 [ofd] [Thu Dec 12 01:17:48 2019][241248.091542] [] tgt_request_handle+0xaea/0x1580 [ptlrpc] [Thu Dec 12 01:17:48 2019][241248.098620] [] ptlrpc_server_handle_request+0x24b/0xab0 [ptlrpc] [Thu Dec 12 01:17:48 2019][241248.106423] [] ptlrpc_main+0xb2c/0x1460 [ptlrpc] [Thu Dec 12 01:17:48 2019][241248.112852] [] kthread+0xd1/0xe0 [Thu Dec 12 01:17:48 2019][241248.117854] [] ret_from_fork_nospec_begin+0xe/0x21 [Thu Dec 12 01:17:48 2019][241248.124429] [] 0xffffffffffffffff [Thu Dec 12 01:17:48 2019][241248.129539] LustreError: dumping log to /tmp/lustre-log.1576142268.67793 [Thu Dec 12 01:18:21 2019][241280.840714] Pid: 113357, comm: ll_ost02_095 3.10.0-957.27.2.el7_lustre.pl2.x86_64 #1 SMP Thu Nov 7 15:26:16 PST 2019 [Thu Dec 12 01:18:21 2019][241280.851317] Call Trace: [Thu Dec 12 01:18:21 2019][241280.853885] [] wait_transaction_locked+0x85/0xd0 [jbd2] [Thu Dec 12 01:18:21 2019][241280.860889] [] add_transaction_credits+0x268/0x2f0 [jbd2] [Thu Dec 12 01:18:21 2019][241280.868082] [] start_this_handle+0x1a1/0x430 [jbd2] [Thu Dec 12 01:18:21 2019][241280.874750] [] jbd2__journal_start+0xf3/0x1f0 [jbd2] [Thu Dec 12 01:18:21 2019][241280.881501] [] __ldiskfs_journal_start_sb+0x69/0xe0 [ldiskfs] [Thu Dec 12 01:18:21 2019][241280.889027] [] ldiskfs_acquire_dquot+0x53/0xb0 [ldiskfs] [Thu Dec 12 01:18:21 2019][241280.896121] [] dqget+0x3fa/0x450 [Thu Dec 12 01:18:21 2019][241280.901126] [] dquot_get_dqblk+0x14/0x1f0 [Thu Dec 12 01:18:21 2019][241280.906899] [] osd_acct_index_lookup+0x235/0x480 [osd_ldiskfs] [Thu Dec 12 01:18:21 2019][241280.914517] [] lquotactl_slv+0x27d/0x9d0 [lquota] [Thu Dec 12 01:18:21 2019][241280.920991] [] ofd_quotactl+0x13c/0x380 [ofd] [Thu Dec 12 01:18:21 2019][241280.927133] [] tgt_request_handle+0xaea/0x1580 [ptlrpc] [Thu Dec 12 01:18:21 2019][241280.934194] [] ptlrpc_server_handle_request+0x24b/0xab0 [ptlrpc] [Thu Dec 12 01:18:21 2019][241280.942014] [] ptlrpc_main+0xb2c/0x1460 [ptlrpc] [Thu Dec 12 01:18:21 2019][241280.948429] [] kthread+0xd1/0xe0 [Thu Dec 12 01:18:21 2019][241280.953431] [] ret_from_fork_nospec_begin+0xe/0x21 [Thu Dec 12 01:18:21 2019][241280.960000] [] 0xffffffffffffffff [Thu Dec 12 01:18:21 2019][241280.965098] LustreError: dumping log to /tmp/lustre-log.1576142301.113357 [Thu Dec 12 01:18:25 2019][241284.936832] LustreError: dumping log to /tmp/lustre-log.1576142305.67407 [Thu Dec 12 01:18:29 2019][241289.014892] LustreError: 67855:0:(ldlm_request.c:129:ldlm_expired_completion_wait()) ### lock timed out (enqueued at 1576142009, 300s ago); not entering recovery in server code, just going back to sleep ns: filter-fir-OST005a_UUID lock: ffff890f4781a640/0x7066c9c190aea1b8 lrc: 3/0,1 mode: --/PW res: [0x1980000400:0x112ce2:0x0].0x0 rrc: 3 type: EXT 
[0->18446744073709551615] (req 0->18446744073709551615) flags: 0x40010080000000 nid: local remote: 0x0 expref: -99 pid: 67855 timeout: 0 lvb_type: 0 [Thu Dec 12 01:18:29 2019][241289.032891] LustreError: dumping log to /tmp/lustre-log.1576142309.67795 [Thu Dec 12 01:18:29 2019][241289.065418] LustreError: 67855:0:(ldlm_request.c:129:ldlm_expired_completion_wait()) Skipped 1 previous similar message [Thu Dec 12 01:19:06 2019][241326.198724] Lustre: fir-OST005a: deleting orphan objects from 0x1980000401:11841995 to 0x1980000401:11842017 [Thu Dec 12 01:20:07 2019][241387.338834] LustreError: dumping log to /tmp/lustre-log.1576142407.67700 [Thu Dec 12 01:20:16 2019][241396.226912] Lustre: fir-OST005a: deleting orphan objects from 0x1980000402:3089607 to 0x1980000402:3089633 [Thu Dec 12 01:20:45 2019][241425.227585] LustreError: dumping log to /tmp/lustre-log.1576142445.67588 [Thu Dec 12 01:20:45 2019][241425.571294] Lustre: fir-OST0058: deleting orphan objects from 0x1900000401:11797320 to 0x1900000401:11797345 [Thu Dec 12 01:21:22 2019][241462.040565] Lustre: fir-OST005a: deleting orphan objects from 0x1980000400:1125604 to 0x1980000400:1125633 [Thu Dec 12 01:21:46 2019][241485.644788] LustreError: dumping log to /tmp/lustre-log.1576142506.67790 [Thu Dec 12 01:22:18 2019][241518.413451] LNet: Service thread pid 67699 was inactive for 1203.91s. The thread might be hung, or it might only be slow and will resume later. Dumping the stack trace for debugging purposes: [Thu Dec 12 01:22:18 2019][241518.430559] LNet: Skipped 9 previous similar messages [Thu Dec 12 01:22:18 2019][241518.435707] Pid: 67699, comm: ll_ost00_034 3.10.0-957.27.2.el7_lustre.pl2.x86_64 #1 SMP Thu Nov 7 15:26:16 PST 2019 [Thu Dec 12 01:22:18 2019][241518.446243] Call Trace: [Thu Dec 12 01:22:18 2019][241518.448817] [] wait_transaction_locked+0x85/0xd0 [jbd2] [Thu Dec 12 01:22:18 2019][241518.455823] [] add_transaction_credits+0x268/0x2f0 [jbd2] [Thu Dec 12 01:22:18 2019][241518.463006] [] start_this_handle+0x1a1/0x430 [jbd2] [Thu Dec 12 01:22:18 2019][241518.469657] [] jbd2__journal_start+0xf3/0x1f0 [jbd2] [Thu Dec 12 01:22:18 2019][241518.476405] [] __ldiskfs_journal_start_sb+0x69/0xe0 [ldiskfs] [Thu Dec 12 01:22:18 2019][241518.483945] [] ldiskfs_acquire_dquot+0x53/0xb0 [ldiskfs] [Thu Dec 12 01:22:18 2019][241518.491050] [] dqget+0x3fa/0x450 [Thu Dec 12 01:22:18 2019][241518.496054] [] dquot_get_dqblk+0x14/0x1f0 [Thu Dec 12 01:22:18 2019][241518.501826] [] osd_acct_index_lookup+0x235/0x480 [osd_ldiskfs] [Thu Dec 12 01:22:18 2019][241518.509447] [] lquotactl_slv+0x27d/0x9d0 [lquota] [Thu Dec 12 01:22:18 2019][241518.515929] [] ofd_quotactl+0x13c/0x380 [ofd] [Thu Dec 12 01:22:18 2019][241518.522073] [] tgt_request_handle+0xaea/0x1580 [ptlrpc] [Thu Dec 12 01:22:18 2019][241518.529109] [] ptlrpc_server_handle_request+0x24b/0xab0 [ptlrpc] [Thu Dec 12 01:22:18 2019][241518.536911] [] ptlrpc_main+0xb2c/0x1460 [ptlrpc] [Thu Dec 12 01:22:18 2019][241518.543345] [] kthread+0xd1/0xe0 [Thu Dec 12 01:22:18 2019][241518.548351] [] ret_from_fork_nospec_begin+0xe/0x21 [Thu Dec 12 01:22:18 2019][241518.554920] [] 0xffffffffffffffff [Thu Dec 12 01:22:18 2019][241518.560034] LustreError: dumping log to /tmp/lustre-log.1576142538.67699 [Thu Dec 12 01:22:41 2019][241540.726529] Lustre: fir-OST005e: deleting orphan objects from 0x1a80000401:1122052 to 0x1a80000401:1122081 [Thu Dec 12 01:23:56 2019][241616.406994] Lustre: fir-OST0056: Client 8c3ccd99-dc20-24d2-79f4-f8d2c0329cfb (at 10.8.24.19@o2ib6) reconnecting [Thu Dec 12 01:23:56 
2019][241616.417195] Lustre: Skipped 536 previous similar messages [Thu Dec 12 01:24:08 2019][241627.983609] Pid: 67775, comm: ll_ost03_044 3.10.0-957.27.2.el7_lustre.pl2.x86_64 #1 SMP Thu Nov 7 15:26:16 PST 2019 [Thu Dec 12 01:24:08 2019][241627.994133] Call Trace: [Thu Dec 12 01:24:08 2019][241627.996685] [] ldlm_completion_ast+0x430/0x860 [ptlrpc] [Thu Dec 12 01:24:08 2019][241628.003726] [] ldlm_cli_enqueue_local+0x231/0x830 [ptlrpc] [Thu Dec 12 01:24:08 2019][241628.011024] [] ofd_destroy_by_fid+0x1dd/0x4a0 [ofd] [Thu Dec 12 01:24:08 2019][241628.017680] [] ofd_destroy_hdl+0x267/0x970 [ofd] [Thu Dec 12 01:24:08 2019][241628.024081] [] tgt_request_handle+0xaea/0x1580 [ptlrpc] [Thu Dec 12 01:24:08 2019][241628.031122] [] ptlrpc_server_handle_request+0x24b/0xab0 [ptlrpc] [Thu Dec 12 01:24:08 2019][241628.038937] [] ptlrpc_main+0xb2c/0x1460 [ptlrpc] [Thu Dec 12 01:24:08 2019][241628.045351] [] kthread+0xd1/0xe0 [Thu Dec 12 01:24:08 2019][241628.050353] [] ret_from_fork_nospec_begin+0xe/0x21 [Thu Dec 12 01:24:08 2019][241628.056928] [] 0xffffffffffffffff [Thu Dec 12 01:24:08 2019][241628.062037] LustreError: dumping log to /tmp/lustre-log.1576142648.67775 [Thu Dec 12 01:24:17 2019][241637.199792] Pid: 112496, comm: ll_ost02_068 3.10.0-957.27.2.el7_lustre.pl2.x86_64 #1 SMP Thu Nov 7 15:26:16 PST 2019 [Thu Dec 12 01:24:17 2019][241637.210396] Call Trace: [Thu Dec 12 01:24:17 2019][241637.212948] [] ldlm_completion_ast+0x4e5/0x860 [ptlrpc] [Thu Dec 12 01:24:17 2019][241637.219990] [] ldlm_cli_enqueue_local+0x231/0x830 [ptlrpc] [Thu Dec 12 01:24:17 2019][241637.227285] [] ofd_destroy_by_fid+0x1dd/0x4a0 [ofd] [Thu Dec 12 01:24:17 2019][241637.233935] [] ofd_destroy_hdl+0x267/0x970 [ofd] [Thu Dec 12 01:24:17 2019][241637.240335] [] tgt_request_handle+0xaea/0x1580 [ptlrpc] [Thu Dec 12 01:24:17 2019][241637.247377] [] ptlrpc_server_handle_request+0x24b/0xab0 [ptlrpc] [Thu Dec 12 01:24:17 2019][241637.255191] [] ptlrpc_main+0xb2c/0x1460 [ptlrpc] [Thu Dec 12 01:24:17 2019][241637.261609] [] kthread+0xd1/0xe0 [Thu Dec 12 01:24:17 2019][241637.266608] [] ret_from_fork_nospec_begin+0xe/0x21 [Thu Dec 12 01:24:17 2019][241637.273201] [] 0xffffffffffffffff [Thu Dec 12 01:24:17 2019][241637.278320] LustreError: dumping log to /tmp/lustre-log.1576142657.112496 [Thu Dec 12 01:24:29 2019][241649.488033] Pid: 67845, comm: ll_ost01_063 3.10.0-957.27.2.el7_lustre.pl2.x86_64 #1 SMP Thu Nov 7 15:26:16 PST 2019 [Thu Dec 12 01:24:29 2019][241649.498553] Call Trace: [Thu Dec 12 01:24:29 2019][241649.501114] [] ldlm_completion_ast+0x4e5/0x860 [ptlrpc] [Thu Dec 12 01:24:29 2019][241649.508154] [] ldlm_cli_enqueue_local+0x231/0x830 [ptlrpc] [Thu Dec 12 01:24:29 2019][241649.515451] [] ofd_destroy_by_fid+0x1dd/0x4a0 [ofd] [Thu Dec 12 01:24:29 2019][241649.522108] [] ofd_destroy_hdl+0x267/0x970 [ofd] [Thu Dec 12 01:24:29 2019][241649.528513] [] tgt_request_handle+0xaea/0x1580 [ptlrpc] [Thu Dec 12 01:24:29 2019][241649.535568] [] ptlrpc_server_handle_request+0x24b/0xab0 [ptlrpc] [Thu Dec 12 01:24:29 2019][241649.543384] [] ptlrpc_main+0xb2c/0x1460 [ptlrpc] [Thu Dec 12 01:24:29 2019][241649.549799] [] kthread+0xd1/0xe0 [Thu Dec 12 01:24:29 2019][241649.554800] [] ret_from_fork_nospec_begin+0xe/0x21 [Thu Dec 12 01:24:29 2019][241649.561377] [] 0xffffffffffffffff [Thu Dec 12 01:24:29 2019][241649.566486] LustreError: dumping log to /tmp/lustre-log.1576142669.67845 [Thu Dec 12 01:24:30 2019][241650.328678] Lustre: fir-OST005c: deleting orphan objects from 0x1a00000402:1105317 to 0x1a00000402:1105345 [Thu Dec 12 01:24:37 
2019][241657.552831] Lustre: fir-OST0054: deleting orphan objects from 0x1800000402:1117354 to 0x1800000402:1117377 [Thu Dec 12 01:24:49 2019][241668.648423] Lustre: 27257:0:(service.c:1372:ptlrpc_at_send_early_reply()) @@@ Couldn't add any time (5/5), not sending early reply [Thu Dec 12 01:24:49 2019][241668.648423] req@ffff88f298f11050 x1650929716577312/t0(0) o10->68d8bbaa-aab1-a78f-323f-aaff4c375c30@10.8.7.4@o2ib6:164/0 lens 440/0 e 0 to 0 dl 1576142694 ref 2 fl New:/2/ffffffff rc 0/-1 [Thu Dec 12 01:24:49 2019][241668.677062] Lustre: 27257:0:(service.c:1372:ptlrpc_at_send_early_reply()) Skipped 1066 previous similar messages [Thu Dec 12 01:24:54 2019][241674.064536] Pid: 67692, comm: ll_ost00_032 3.10.0-957.27.2.el7_lustre.pl2.x86_64 #1 SMP Thu Nov 7 15:26:16 PST 2019 [Thu Dec 12 01:24:54 2019][241674.075058] Call Trace: [Thu Dec 12 01:24:54 2019][241674.077624] [] wait_transaction_locked+0x85/0xd0 [jbd2] [Thu Dec 12 01:24:54 2019][241674.084616] [] add_transaction_credits+0x268/0x2f0 [jbd2] [Thu Dec 12 01:24:54 2019][241674.091819] [] start_this_handle+0x1a1/0x430 [jbd2] [Thu Dec 12 01:24:54 2019][241674.098483] [] jbd2__journal_start+0xf3/0x1f0 [jbd2] [Thu Dec 12 01:24:54 2019][241674.105234] [] __ldiskfs_journal_start_sb+0x69/0xe0 [ldiskfs] [Thu Dec 12 01:24:54 2019][241674.112775] [] ldiskfs_acquire_dquot+0x53/0xb0 [ldiskfs] [Thu Dec 12 01:24:54 2019][241674.119871] [] dqget+0x3fa/0x450 [Thu Dec 12 01:24:54 2019][241674.124874] [] dquot_get_dqblk+0x14/0x1f0 [Thu Dec 12 01:24:54 2019][241674.130647] [] osd_acct_index_lookup+0x235/0x480 [osd_ldiskfs] [Thu Dec 12 01:24:54 2019][241674.138271] [] lquotactl_slv+0x27d/0x9d0 [lquota] [Thu Dec 12 01:24:54 2019][241674.144757] [] ofd_quotactl+0x13c/0x380 [ofd] [Thu Dec 12 01:24:54 2019][241674.150899] [] tgt_request_handle+0xaea/0x1580 [ptlrpc] [Thu Dec 12 01:24:54 2019][241674.157938] [] ptlrpc_server_handle_request+0x24b/0xab0 [ptlrpc] [Thu Dec 12 01:24:54 2019][241674.165755] [] ptlrpc_main+0xb2c/0x1460 [ptlrpc] [Thu Dec 12 01:24:54 2019][241674.172170] [] kthread+0xd1/0xe0 [Thu Dec 12 01:24:54 2019][241674.177185] [] ret_from_fork_nospec_begin+0xe/0x21 [Thu Dec 12 01:24:54 2019][241674.183748] [] 0xffffffffffffffff [Thu Dec 12 01:24:54 2019][241674.188862] LustreError: dumping log to /tmp/lustre-log.1576142694.67692 [Thu Dec 12 01:25:12 2019][241691.962836] Lustre: fir-OST005c: Connection restored to cec884d3-ca4b-8127-2f6b-7762665aa5f8 (at 10.9.0.64@o2ib4) [Thu Dec 12 01:25:12 2019][241691.973189] Lustre: Skipped 546 previous similar messages [Thu Dec 12 01:25:13 2019][241693.015409] Lustre: fir-OST0056: Export ffff88fc8fff5800 already connecting from 10.8.22.17@o2ib6 [Thu Dec 12 01:25:13 2019][241693.024375] Lustre: Skipped 71 previous similar messages [Thu Dec 12 01:25:39 2019][241719.121448] LNet: Service thread pid 67918 was inactive for 1203.27s. Watchdog stack traces are limited to 3 per 300 seconds, skipping this one. 
[Thu Dec 12 01:25:39 2019][241719.134481] LNet: Skipped 6 previous similar messages [Thu Dec 12 01:25:39 2019][241719.139625] LustreError: dumping log to /tmp/lustre-log.1576142739.67918 [Thu Dec 12 01:26:45 2019][241784.612142] Lustre: fir-OST005e: deleting orphan objects from 0x0:27467363 to 0x0:27467393 [Thu Dec 12 01:27:30 2019][241829.715609] Pid: 67652, comm: ll_ost00_020 3.10.0-957.27.2.el7_lustre.pl2.x86_64 #1 SMP Thu Nov 7 15:26:16 PST 2019 [Thu Dec 12 01:27:30 2019][241829.726126] Call Trace: [Thu Dec 12 01:27:30 2019][241829.728695] [] wait_transaction_locked+0x85/0xd0 [jbd2] [Thu Dec 12 01:27:30 2019][241829.735694] [] add_transaction_credits+0x268/0x2f0 [jbd2] [Thu Dec 12 01:27:30 2019][241829.742896] [] start_this_handle+0x1a1/0x430 [jbd2] [Thu Dec 12 01:27:30 2019][241829.749543] [] jbd2__journal_start+0xf3/0x1f0 [jbd2] [Thu Dec 12 01:27:30 2019][241829.756296] [] __ldiskfs_journal_start_sb+0x69/0xe0 [ldiskfs] [Thu Dec 12 01:27:30 2019][241829.763816] [] ldiskfs_acquire_dquot+0x53/0xb0 [ldiskfs] [Thu Dec 12 01:27:30 2019][241829.770912] [] dqget+0x3fa/0x450 [Thu Dec 12 01:27:30 2019][241829.775929] [] dquot_get_dqblk+0x14/0x1f0 [Thu Dec 12 01:27:30 2019][241829.781733] [] osd_acct_index_lookup+0x235/0x480 [osd_ldiskfs] [Thu Dec 12 01:27:30 2019][241829.789340] [] lquotactl_slv+0x27d/0x9d0 [lquota] [Thu Dec 12 01:27:30 2019][241829.795828] [] ofd_quotactl+0x13c/0x380 [ofd] [Thu Dec 12 01:27:30 2019][241829.801975] [] tgt_request_handle+0xaea/0x1580 [ptlrpc] [Thu Dec 12 01:27:30 2019][241829.809029] [] ptlrpc_server_handle_request+0x24b/0xab0 [ptlrpc] [Thu Dec 12 01:27:30 2019][241829.816831] [] ptlrpc_main+0xb2c/0x1460 [ptlrpc] [Thu Dec 12 01:27:30 2019][241829.823258] [] kthread+0xd1/0xe0 [Thu Dec 12 01:27:30 2019][241829.828255] [] ret_from_fork_nospec_begin+0xe/0x21 [Thu Dec 12 01:27:30 2019][241829.834832] [] 0xffffffffffffffff [Thu Dec 12 01:27:30 2019][241829.839945] LustreError: dumping log to /tmp/lustre-log.1576142850.67652 [Thu Dec 12 01:27:58 2019][241858.388217] Pid: 112491, comm: ll_ost00_072 3.10.0-957.27.2.el7_lustre.pl2.x86_64 #1 SMP Thu Nov 7 15:26:16 PST 2019 [Thu Dec 12 01:27:58 2019][241858.398823] Call Trace: [Thu Dec 12 01:27:58 2019][241858.401395] [] wait_transaction_locked+0x85/0xd0 [jbd2] [Thu Dec 12 01:27:58 2019][241858.408393] [] add_transaction_credits+0x268/0x2f0 [jbd2] [Thu Dec 12 01:27:58 2019][241858.415577] [] start_this_handle+0x1a1/0x430 [jbd2] [Thu Dec 12 01:27:58 2019][241858.422217] [] jbd2__journal_start+0xf3/0x1f0 [jbd2] [Thu Dec 12 01:27:58 2019][241858.428966] [] __ldiskfs_journal_start_sb+0x69/0xe0 [ldiskfs] [Thu Dec 12 01:27:58 2019][241858.436498] [] ldiskfs_acquire_dquot+0x53/0xb0 [ldiskfs] [Thu Dec 12 01:27:58 2019][241858.443580] [] dqget+0x3fa/0x450 [Thu Dec 12 01:27:58 2019][241858.448588] [] dquot_get_dqblk+0x14/0x1f0 [Thu Dec 12 01:27:58 2019][241858.454362] [] osd_acct_index_lookup+0x235/0x480 [osd_ldiskfs] [Thu Dec 12 01:27:58 2019][241858.461986] [] lquotactl_slv+0x27d/0x9d0 [lquota] [Thu Dec 12 01:27:58 2019][241858.468462] [] ofd_quotactl+0x13c/0x380 [ofd] [Thu Dec 12 01:27:58 2019][241858.474608] [] tgt_request_handle+0xaea/0x1580 [ptlrpc] [Thu Dec 12 01:27:58 2019][241858.481643] [] ptlrpc_server_handle_request+0x24b/0xab0 [ptlrpc] [Thu Dec 12 01:27:58 2019][241858.489444] [] ptlrpc_main+0xb2c/0x1460 [ptlrpc] [Thu Dec 12 01:27:58 2019][241858.495874] [] kthread+0xd1/0xe0 [Thu Dec 12 01:27:58 2019][241858.500867] [] ret_from_fork_nospec_begin+0xe/0x21 [Thu Dec 12 01:27:58 2019][241858.507435] [] 0xffffffffffffffff 
[Thu Dec 12 01:27:58 2019][241858.512539] LustreError: dumping log to /tmp/lustre-log.1576142878.112491 [Thu Dec 12 01:29:24 2019][241944.405877] Pid: 66337, comm: ll_ost03_004 3.10.0-957.27.2.el7_lustre.pl2.x86_64 #1 SMP Thu Nov 7 15:26:16 PST 2019 [Thu Dec 12 01:29:24 2019][241944.416400] Call Trace: [Thu Dec 12 01:29:24 2019][241944.418953] [] ofd_create_hdl+0xcb3/0x20e0 [ofd] [Thu Dec 12 01:29:24 2019][241944.425371] [] tgt_request_handle+0xaea/0x1580 [ptlrpc] [Thu Dec 12 01:29:24 2019][241944.432432] [] ptlrpc_server_handle_request+0x24b/0xab0 [ptlrpc] [Thu Dec 12 01:29:24 2019][241944.440246] [] ptlrpc_main+0xb2c/0x1460 [ptlrpc] [Thu Dec 12 01:29:24 2019][241944.446677] [] kthread+0xd1/0xe0 [Thu Dec 12 01:29:24 2019][241944.451682] [] ret_from_fork_nospec_begin+0xe/0x21 [Thu Dec 12 01:29:24 2019][241944.458276] [] 0xffffffffffffffff [Thu Dec 12 01:29:24 2019][241944.463397] LustreError: dumping log to /tmp/lustre-log.1576142964.66337 [Thu Dec 12 01:29:24 2019][241944.471028] Pid: 112544, comm: ll_ost03_072 3.10.0-957.27.2.el7_lustre.pl2.x86_64 #1 SMP Thu Nov 7 15:26:16 PST 2019 [Thu Dec 12 01:29:24 2019][241944.481667] Call Trace: [Thu Dec 12 01:29:24 2019][241944.484218] [] wait_transaction_locked+0x85/0xd0 [jbd2] [Thu Dec 12 01:29:24 2019][241944.491210] [] add_transaction_credits+0x268/0x2f0 [jbd2] [Thu Dec 12 01:29:24 2019][241944.498399] [] start_this_handle+0x1a1/0x430 [jbd2] [Thu Dec 12 01:29:24 2019][241944.505049] [] jbd2__journal_start+0xf3/0x1f0 [jbd2] [Thu Dec 12 01:29:24 2019][241944.511815] [] __ldiskfs_journal_start_sb+0x69/0xe0 [ldiskfs] [Thu Dec 12 01:29:24 2019][241944.519343] [] ldiskfs_acquire_dquot+0x53/0xb0 [ldiskfs] [Thu Dec 12 01:29:24 2019][241944.526450] [] dqget+0x3fa/0x450 [Thu Dec 12 01:29:24 2019][241944.531442] [] dquot_get_dqblk+0x14/0x1f0 [Thu Dec 12 01:29:24 2019][241944.537216] [] osd_acct_index_lookup+0x235/0x480 [osd_ldiskfs] [Thu Dec 12 01:29:24 2019][241944.544832] [] lquotactl_slv+0x27d/0x9d0 [lquota] [Thu Dec 12 01:29:24 2019][241944.551304] [] ofd_quotactl+0x13c/0x380 [ofd] [Thu Dec 12 01:29:24 2019][241944.557449] [] tgt_request_handle+0xaea/0x1580 [ptlrpc] [Thu Dec 12 01:29:24 2019][241944.564470] [] ptlrpc_server_handle_request+0x24b/0xab0 [ptlrpc] [Thu Dec 12 01:29:24 2019][241944.572286] [] ptlrpc_main+0xb2c/0x1460 [ptlrpc] [Thu Dec 12 01:29:24 2019][241944.578701] [] kthread+0xd1/0xe0 [Thu Dec 12 01:29:24 2019][241944.583695] [] ret_from_fork_nospec_begin+0xe/0x21 [Thu Dec 12 01:29:25 2019][241944.590277] [] 0xffffffffffffffff [Thu Dec 12 01:29:25 2019][241945.106104] Lustre: fir-OST0058: deleting orphan objects from 0x0:27506661 to 0x0:27506689 [Thu Dec 12 01:29:29 2019][241948.940996] LustreError: 67630:0:(ldlm_request.c:129:ldlm_expired_completion_wait()) ### lock timed out (enqueued at 1576142669, 300s ago); not entering recovery in server code, just going back to sleep ns: filter-fir-OST0058_UUID lock: ffff88fcea0e69c0/0x7066c9c190aed28e lrc: 3/0,1 mode: --/PW res: [0x1900000400:0x1127a7:0x0].0x0 rrc: 3 type: EXT [0->18446744073709551615] (req 0->18446744073709551615) flags: 0x40010080000000 nid: local remote: 0x0 expref: -99 pid: 67630 timeout: 0 lvb_type: 0 [Thu Dec 12 01:29:29 2019][241948.984752] LustreError: 67630:0:(ldlm_request.c:129:ldlm_expired_completion_wait()) Skipped 8 previous similar messages [Thu Dec 12 01:29:32 2019][241952.598043] Pid: 67707, comm: ll_ost03_033 3.10.0-957.27.2.el7_lustre.pl2.x86_64 #1 SMP Thu Nov 7 15:26:16 PST 2019 [Thu Dec 12 01:29:32 2019][241952.608559] Call Trace: [Thu Dec 12 01:29:32 
2019][241952.611130] [] wait_transaction_locked+0x85/0xd0 [jbd2] [Thu Dec 12 01:29:32 2019][241952.618128] [] add_transaction_credits+0x268/0x2f0 [jbd2] [Thu Dec 12 01:29:32 2019][241952.625311] [] start_this_handle+0x1a1/0x430 [jbd2] [Thu Dec 12 01:29:32 2019][241952.631959] [] jbd2__journal_start+0xf3/0x1f0 [jbd2] [Thu Dec 12 01:29:33 2019][241952.638709] [] __ldiskfs_journal_start_sb+0x69/0xe0 [ldiskfs] [Thu Dec 12 01:29:33 2019][241952.646234] [] ldiskfs_acquire_dquot+0x53/0xb0 [ldiskfs] [Thu Dec 12 01:29:33 2019][241952.653331] [] dqget+0x3fa/0x450 [Thu Dec 12 01:29:33 2019][241952.658333] [] dquot_get_dqblk+0x14/0x1f0 [Thu Dec 12 01:29:33 2019][241952.664119] [] osd_acct_index_lookup+0x235/0x480 [osd_ldiskfs] [Thu Dec 12 01:29:33 2019][241952.671732] [] lquotactl_slv+0x27d/0x9d0 [lquota] [Thu Dec 12 01:29:33 2019][241952.678229] [] ofd_quotactl+0x13c/0x380 [ofd] [Thu Dec 12 01:29:33 2019][241952.684359] [] tgt_request_handle+0xaea/0x1580 [ptlrpc] [Thu Dec 12 01:29:33 2019][241952.691421] [] ptlrpc_server_handle_request+0x24b/0xab0 [ptlrpc] [Thu Dec 12 01:29:33 2019][241952.699223] [] ptlrpc_main+0xb2c/0x1460 [ptlrpc] [Thu Dec 12 01:29:33 2019][241952.705666] [] kthread+0xd1/0xe0 [Thu Dec 12 01:29:33 2019][241952.710672] [] ret_from_fork_nospec_begin+0xe/0x21 [Thu Dec 12 01:29:33 2019][241952.717248] [] 0xffffffffffffffff [Thu Dec 12 01:29:33 2019][241952.722349] LustreError: dumping log to /tmp/lustre-log.1576142973.67707 [Thu Dec 12 01:29:37 2019][241956.694121] LustreError: dumping log to /tmp/lustre-log.1576142977.67791 [Thu Dec 12 01:29:45 2019][241964.886285] LustreError: dumping log to /tmp/lustre-log.1576142985.67628 [Thu Dec 12 01:29:57 2019][241976.707678] Lustre: fir-OST0054: deleting orphan objects from 0x1800000401:11785256 to 0x1800000401:11785281 [Thu Dec 12 01:30:01 2019][241980.782343] Lustre: fir-OST005c: deleting orphan objects from 0x1a00000401:3048832 to 0x1a00000401:3048865 [Thu Dec 12 01:30:05 2019][241985.366702] LustreError: dumping log to /tmp/lustre-log.1576143005.67871 [Thu Dec 12 01:30:26 2019][242005.847095] LustreError: dumping log to /tmp/lustre-log.1576143026.67797 [Thu Dec 12 01:31:23 2019][242063.108364] Lustre: fir-OST005a: deleting orphan objects from 0x1980000400:1125639 to 0x1980000400:1125665 [Thu Dec 12 01:31:42 2019][242082.336325] Lustre: fir-OST005a: deleting orphan objects from 0x1980000401:11842023 to 0x1980000401:11842049 [Thu Dec 12 01:32:16 2019][242116.441290] LustreError: dumping log to /tmp/lustre-log.1576143136.66715 [Thu Dec 12 01:32:41 2019][242141.017776] LNet: Service thread pid 67842 was inactive for 1200.82s. The thread might be hung, or it might only be slow and will resume later. 
Dumping the stack trace for debugging purposes: [Thu Dec 12 01:32:41 2019][242141.034883] LNet: Skipped 9 previous similar messages [Thu Dec 12 01:32:41 2019][242141.040025] Pid: 67842, comm: ll_ost00_056 3.10.0-957.27.2.el7_lustre.pl2.x86_64 #1 SMP Thu Nov 7 15:26:16 PST 2019 [Thu Dec 12 01:32:41 2019][242141.050578] Call Trace: [Thu Dec 12 01:32:41 2019][242141.053143] [] wait_transaction_locked+0x85/0xd0 [jbd2] [Thu Dec 12 01:32:41 2019][242141.060138] [] add_transaction_credits+0x268/0x2f0 [jbd2] [Thu Dec 12 01:32:41 2019][242141.067318] [] start_this_handle+0x1a1/0x430 [jbd2] [Thu Dec 12 01:32:41 2019][242141.073971] [] jbd2__journal_start+0xf3/0x1f0 [jbd2] [Thu Dec 12 01:32:41 2019][242141.080719] [] __ldiskfs_journal_start_sb+0x69/0xe0 [ldiskfs] [Thu Dec 12 01:32:41 2019][242141.088245] [] ldiskfs_acquire_dquot+0x53/0xb0 [ldiskfs] [Thu Dec 12 01:32:41 2019][242141.095339] [] dqget+0x3fa/0x450 [Thu Dec 12 01:32:41 2019][242141.100342] [] dquot_get_dqblk+0x14/0x1f0 [Thu Dec 12 01:32:41 2019][242141.106114] [] osd_acct_index_lookup+0x235/0x480 [osd_ldiskfs] [Thu Dec 12 01:32:41 2019][242141.113742] [] lquotactl_slv+0x27d/0x9d0 [lquota] [Thu Dec 12 01:32:41 2019][242141.120217] [] ofd_quotactl+0x13c/0x380 [ofd] [Thu Dec 12 01:32:41 2019][242141.126371] [] tgt_request_handle+0xaea/0x1580 [ptlrpc] [Thu Dec 12 01:32:41 2019][242141.133407] [] ptlrpc_server_handle_request+0x24b/0xab0 [ptlrpc] [Thu Dec 12 01:32:41 2019][242141.141220] [] ptlrpc_main+0xb2c/0x1460 [ptlrpc] [Thu Dec 12 01:32:41 2019][242141.147638] [] kthread+0xd1/0xe0 [Thu Dec 12 01:32:41 2019][242141.152639] [] ret_from_fork_nospec_begin+0xe/0x21 [Thu Dec 12 01:32:41 2019][242141.159209] [] 0xffffffffffffffff [Thu Dec 12 01:32:41 2019][242141.164306] LustreError: dumping log to /tmp/lustre-log.1576143161.67842 [Thu Dec 12 01:32:41 2019][242141.171832] Pid: 67619, comm: ll_ost00_013 3.10.0-957.27.2.el7_lustre.pl2.x86_64 #1 SMP Thu Nov 7 15:26:16 PST 2019 [Thu Dec 12 01:32:41 2019][242141.182378] Call Trace: [Thu Dec 12 01:32:41 2019][242141.184931] [] wait_transaction_locked+0x85/0xd0 [jbd2] [Thu Dec 12 01:32:41 2019][242141.191924] [] add_transaction_credits+0x268/0x2f0 [jbd2] [Thu Dec 12 01:32:41 2019][242141.199123] [] start_this_handle+0x1a1/0x430 [jbd2] [Thu Dec 12 01:32:41 2019][242141.205779] [] jbd2__journal_start+0xf3/0x1f0 [jbd2] [Thu Dec 12 01:32:41 2019][242141.212532] [] __ldiskfs_journal_start_sb+0x69/0xe0 [ldiskfs] [Thu Dec 12 01:32:41 2019][242141.220049] [] ldiskfs_acquire_dquot+0x53/0xb0 [ldiskfs] [Thu Dec 12 01:32:41 2019][242141.227144] [] dqget+0x3fa/0x450 [Thu Dec 12 01:32:41 2019][242141.232144] [] dquot_get_dqblk+0x14/0x1f0 [Thu Dec 12 01:32:41 2019][242141.237920] [] osd_acct_index_lookup+0x235/0x480 [osd_ldiskfs] [Thu Dec 12 01:32:41 2019][242141.245538] [] lquotactl_slv+0x27d/0x9d0 [lquota] [Thu Dec 12 01:32:41 2019][242141.252016] [] ofd_quotactl+0x13c/0x380 [ofd] [Thu Dec 12 01:32:41 2019][242141.258180] [] tgt_request_handle+0xaea/0x1580 [ptlrpc] [Thu Dec 12 01:32:41 2019][242141.265216] [] ptlrpc_server_handle_request+0x24b/0xab0 [ptlrpc] [Thu Dec 12 01:32:41 2019][242141.273022] [] ptlrpc_main+0xb2c/0x1460 [ptlrpc] [Thu Dec 12 01:32:41 2019][242141.279454] [] kthread+0xd1/0xe0 [Thu Dec 12 01:32:41 2019][242141.284445] [] ret_from_fork_nospec_begin+0xe/0x21 [Thu Dec 12 01:32:41 2019][242141.291016] [] 0xffffffffffffffff [Thu Dec 12 01:32:52 2019][242151.665744] Lustre: fir-OST005a: deleting orphan objects from 0x1980000402:3089644 to 0x1980000402:3089665 [Thu Dec 12 01:33:21 2019][242180.690249] 
Lustre: fir-OST0058: deleting orphan objects from 0x1900000401:11797355 to 0x1900000401:11797377 [Thu Dec 12 01:33:24 2019][242183.682367] Lustre: fir-OST0058: deleting orphan objects from 0x1900000402:3085791 to 0x1900000402:3085857 [Thu Dec 12 01:33:30 2019][242190.170744] Pid: 67855, comm: ll_ost01_065 3.10.0-957.27.2.el7_lustre.pl2.x86_64 #1 SMP Thu Nov 7 15:26:16 PST 2019 [Thu Dec 12 01:33:30 2019][242190.181262] Call Trace: [Thu Dec 12 01:33:30 2019][242190.183819] [] ldlm_completion_ast+0x4e5/0x860 [ptlrpc] [Thu Dec 12 01:33:30 2019][242190.190863] [] ldlm_cli_enqueue_local+0x231/0x830 [ptlrpc] [Thu Dec 12 01:33:30 2019][242190.198161] [] ofd_destroy_by_fid+0x1dd/0x4a0 [ofd] [Thu Dec 12 01:33:30 2019][242190.204816] [] ofd_destroy_hdl+0x267/0x970 [ofd] [Thu Dec 12 01:33:30 2019][242190.211220] [] tgt_request_handle+0xaea/0x1580 [ptlrpc] [Thu Dec 12 01:33:30 2019][242190.218258] [] ptlrpc_server_handle_request+0x24b/0xab0 [ptlrpc] [Thu Dec 12 01:33:30 2019][242190.226084] [] ptlrpc_main+0xb2c/0x1460 [ptlrpc] [Thu Dec 12 01:33:30 2019][242190.232514] [] kthread+0xd1/0xe0 [Thu Dec 12 01:33:30 2019][242190.237516] [] ret_from_fork_nospec_begin+0xe/0x21 [Thu Dec 12 01:33:30 2019][242190.244093] [] 0xffffffffffffffff [Thu Dec 12 01:33:30 2019][242190.249202] LustreError: dumping log to /tmp/lustre-log.1576143210.67855 [Thu Dec 12 01:33:34 2019][242194.266835] Pid: 67672, comm: ll_ost03_024 3.10.0-957.27.2.el7_lustre.pl2.x86_64 #1 SMP Thu Nov 7 15:26:16 PST 2019 [Thu Dec 12 01:33:34 2019][242194.277357] Call Trace: [Thu Dec 12 01:33:34 2019][242194.279928] [] wait_transaction_locked+0x85/0xd0 [jbd2] [Thu Dec 12 01:33:34 2019][242194.286932] [] add_transaction_credits+0x268/0x2f0 [jbd2] [Thu Dec 12 01:33:34 2019][242194.294120] [] start_this_handle+0x1a1/0x430 [jbd2] [Thu Dec 12 01:33:34 2019][242194.300765] [] jbd2__journal_start+0xf3/0x1f0 [jbd2] [Thu Dec 12 01:33:34 2019][242194.307515] [] __ldiskfs_journal_start_sb+0x69/0xe0 [ldiskfs] [Thu Dec 12 01:33:34 2019][242194.315041] [] ldiskfs_acquire_dquot+0x53/0xb0 [ldiskfs] [Thu Dec 12 01:33:34 2019][242194.322161] [] dqget+0x3fa/0x450 [Thu Dec 12 01:33:34 2019][242194.327165] [] dquot_get_dqblk+0x14/0x1f0 [Thu Dec 12 01:33:34 2019][242194.332936] [] osd_acct_index_lookup+0x235/0x480 [osd_ldiskfs] [Thu Dec 12 01:33:34 2019][242194.340568] [] lquotactl_slv+0x27d/0x9d0 [lquota] [Thu Dec 12 01:33:34 2019][242194.347047] [] ofd_quotactl+0x13c/0x380 [ofd] [Thu Dec 12 01:33:34 2019][242194.353189] [] tgt_request_handle+0xaea/0x1580 [ptlrpc] [Thu Dec 12 01:33:34 2019][242194.360236] [] ptlrpc_server_handle_request+0x24b/0xab0 [ptlrpc] [Thu Dec 12 01:33:34 2019][242194.368054] [] ptlrpc_main+0xb2c/0x1460 [ptlrpc] [Thu Dec 12 01:33:34 2019][242194.374484] [] kthread+0xd1/0xe0 [Thu Dec 12 01:33:34 2019][242194.379501] [] ret_from_fork_nospec_begin+0xe/0x21 [Thu Dec 12 01:33:34 2019][242194.386064] [] 0xffffffffffffffff [Thu Dec 12 01:33:34 2019][242194.391177] LustreError: dumping log to /tmp/lustre-log.1576143214.67672 [Thu Dec 12 01:33:38 2019][242198.362921] Pid: 67052, comm: ll_ost02_007 3.10.0-957.27.2.el7_lustre.pl2.x86_64 #1 SMP Thu Nov 7 15:26:16 PST 2019 [Thu Dec 12 01:33:38 2019][242198.373438] Call Trace: [Thu Dec 12 01:33:38 2019][242198.376007] [] wait_transaction_locked+0x85/0xd0 [jbd2] [Thu Dec 12 01:33:38 2019][242198.383005] [] add_transaction_credits+0x268/0x2f0 [jbd2] [Thu Dec 12 01:33:38 2019][242198.390186] [] start_this_handle+0x1a1/0x430 [jbd2] [Thu Dec 12 01:33:38 2019][242198.396840] [] jbd2__journal_start+0xf3/0x1f0 [jbd2] 
[Thu Dec 12 01:33:38 2019][242198.403578] [] __ldiskfs_journal_start_sb+0x69/0xe0 [ldiskfs] [Thu Dec 12 01:33:38 2019][242198.411118] [] ldiskfs_acquire_dquot+0x53/0xb0 [ldiskfs] [Thu Dec 12 01:33:38 2019][242198.418225] [] dqget+0x3fa/0x450 [Thu Dec 12 01:33:38 2019][242198.423227] [] dquot_get_dqblk+0x14/0x1f0 [Thu Dec 12 01:33:38 2019][242198.429015] [] osd_acct_index_lookup+0x235/0x480 [osd_ldiskfs] [Thu Dec 12 01:33:38 2019][242198.436626] [] lquotactl_slv+0x27d/0x9d0 [lquota] [Thu Dec 12 01:33:38 2019][242198.443115] [] ofd_quotactl+0x13c/0x380 [ofd] [Thu Dec 12 01:33:38 2019][242198.449247] [] tgt_request_handle+0xaea/0x1580 [ptlrpc] [Thu Dec 12 01:33:38 2019][242198.456307] [] ptlrpc_server_handle_request+0x24b/0xab0 [ptlrpc] [Thu Dec 12 01:33:38 2019][242198.464110] [] ptlrpc_main+0xb2c/0x1460 [ptlrpc] [Thu Dec 12 01:33:38 2019][242198.470551] [] kthread+0xd1/0xe0 [Thu Dec 12 01:33:38 2019][242198.475548] [] ret_from_fork_nospec_begin+0xe/0x21 [Thu Dec 12 01:33:38 2019][242198.482125] [] 0xffffffffffffffff [Thu Dec 12 01:33:38 2019][242198.487227] LustreError: dumping log to /tmp/lustre-log.1576143218.67052 [Thu Dec 12 01:33:42 2019][242202.459007] LustreError: dumping log to /tmp/lustre-log.1576143222.67784 [Thu Dec 12 01:33:57 2019][242217.378925] Lustre: fir-OST0056: Client 8c3ccd99-dc20-24d2-79f4-f8d2c0329cfb (at 10.8.24.19@o2ib6) reconnecting [Thu Dec 12 01:33:57 2019][242217.389106] Lustre: Skipped 568 previous similar messages [Thu Dec 12 01:34:23 2019][242243.419813] LustreError: dumping log to /tmp/lustre-log.1576143263.113358 [Thu Dec 12 01:34:31 2019][242250.676570] Lustre: fir-OST005c: deleting orphan objects from 0x1a00000402:1105356 to 0x1a00000402:1105377 [Thu Dec 12 01:34:50 2019][242270.612364] Lustre: 26941:0:(service.c:1372:ptlrpc_at_send_early_reply()) @@@ Couldn't add any time (5/5), not sending early reply [Thu Dec 12 01:34:50 2019][242270.612364] req@ffff88f60b563050 x1650929716577312/t0(0) o10->68d8bbaa-aab1-a78f-323f-aaff4c375c30@10.8.7.4@o2ib6:10/0 lens 440/0 e 0 to 0 dl 1576143295 ref 2 fl New:/2/ffffffff rc 0/-1 [Thu Dec 12 01:34:51 2019][242270.640911] Lustre: 26941:0:(service.c:1372:ptlrpc_at_send_early_reply()) Skipped 1185 previous similar messages [Thu Dec 12 01:34:52 2019][242272.092384] LustreError: dumping log to /tmp/lustre-log.1576143292.67804 [Thu Dec 12 01:35:04 2019][242284.380627] LustreError: dumping log to /tmp/lustre-log.1576143304.67883 [Thu Dec 12 01:35:12 2019][242292.487208] Lustre: fir-OST0058: Connection restored to a8841932-bc4a-ab11-1ace-8e1fdda46930 (at 10.8.23.23@o2ib6) [Thu Dec 12 01:35:12 2019][242292.497646] Lustre: Skipped 591 previous similar messages [Thu Dec 12 01:35:15 2019][242295.139417] Lustre: fir-OST0056: Export ffff88fc8fff5800 already connecting from 10.8.22.17@o2ib6 [Thu Dec 12 01:35:15 2019][242295.148382] Lustre: Skipped 71 previous similar messages [Thu Dec 12 01:35:17 2019][242296.949481] Lustre: fir-OST005e: deleting orphan objects from 0x1a80000401:1122085 to 0x1a80000401:1122113 [Thu Dec 12 01:36:14 2019][242354.014010] LNet: Service thread pid 67633 was inactive for 1200.75s. Watchdog stack traces are limited to 3 per 300 seconds, skipping this one. 
[Thu Dec 12 01:36:14 2019][242354.027048] LNet: Skipped 9 previous similar messages [Thu Dec 12 01:36:14 2019][242354.032195] LustreError: dumping log to /tmp/lustre-log.1576143374.67633 [Thu Dec 12 01:36:22 2019][242362.206172] LustreError: dumping log to /tmp/lustre-log.1576143382.67841 [Thu Dec 12 01:36:59 2019][242399.070925] LustreError: dumping log to /tmp/lustre-log.1576143419.112508 [Thu Dec 12 01:37:05 2019][242405.463637] Lustre: fir-OST0058: deleting orphan objects from 0x1900000400:1124270 to 0x1900000400:1124353 [Thu Dec 12 01:37:07 2019][242407.263092] LustreError: dumping log to /tmp/lustre-log.1576143427.67849 [Thu Dec 12 01:37:11 2019][242411.359167] LustreError: dumping log to /tmp/lustre-log.1576143431.66108 [Thu Dec 12 01:37:13 2019][242412.759801] Lustre: fir-OST0054: deleting orphan objects from 0x1800000402:1117382 to 0x1800000402:1117409 [Thu Dec 12 01:37:24 2019][242423.910203] Lustre: fir-OST005e: deleting orphan objects from 0x1a80000402:3081886 to 0x1a80000402:3081953 [Thu Dec 12 01:37:28 2019][242427.743515] LustreError: dumping log to /tmp/lustre-log.1576143448.67697 [Thu Dec 12 01:38:17 2019][242476.896501] Pid: 67037, comm: ll_ost01_006 3.10.0-957.27.2.el7_lustre.pl2.x86_64 #1 SMP Thu Nov 7 15:26:16 PST 2019 [Thu Dec 12 01:38:17 2019][242476.907023] Call Trace: [Thu Dec 12 01:38:17 2019][242476.909577] [] ofd_create_hdl+0xcb3/0x20e0 [ofd] [Thu Dec 12 01:38:17 2019][242476.915976] [] tgt_request_handle+0xaea/0x1580 [ptlrpc] [Thu Dec 12 01:38:17 2019][242476.923040] [] ptlrpc_server_handle_request+0x24b/0xab0 [ptlrpc] [Thu Dec 12 01:38:17 2019][242476.930846] [] ptlrpc_main+0xb2c/0x1460 [ptlrpc] [Thu Dec 12 01:38:17 2019][242476.937273] [] kthread+0xd1/0xe0 [Thu Dec 12 01:38:17 2019][242476.942278] [] ret_from_fork_nospec_begin+0xe/0x21 [Thu Dec 12 01:38:17 2019][242476.948852] [] 0xffffffffffffffff [Thu Dec 12 01:38:17 2019][242476.953963] LustreError: dumping log to /tmp/lustre-log.1576143497.67037 [Thu Dec 12 01:38:33 2019][242493.280855] Pid: 66105, comm: ll_ost01_001 3.10.0-957.27.2.el7_lustre.pl2.x86_64 #1 SMP Thu Nov 7 15:26:16 PST 2019 [Thu Dec 12 01:38:33 2019][242493.291379] Call Trace: [Thu Dec 12 01:38:33 2019][242493.293954] [] wait_transaction_locked+0x85/0xd0 [jbd2] [Thu Dec 12 01:38:33 2019][242493.300954] [] add_transaction_credits+0x268/0x2f0 [jbd2] [Thu Dec 12 01:38:33 2019][242493.308138] [] start_this_handle+0x1a1/0x430 [jbd2] [Thu Dec 12 01:38:33 2019][242493.314786] [] jbd2__journal_start+0xf3/0x1f0 [jbd2] [Thu Dec 12 01:38:33 2019][242493.321534] [] __ldiskfs_journal_start_sb+0x69/0xe0 [ldiskfs] [Thu Dec 12 01:38:33 2019][242493.329059] [] ldiskfs_acquire_dquot+0x53/0xb0 [ldiskfs] [Thu Dec 12 01:38:33 2019][242493.336156] [] dqget+0x3fa/0x450 [Thu Dec 12 01:38:33 2019][242493.341161] [] dquot_get_dqblk+0x14/0x1f0 [Thu Dec 12 01:38:33 2019][242493.346931] [] osd_acct_index_lookup+0x235/0x480 [osd_ldiskfs] [Thu Dec 12 01:38:33 2019][242493.354550] [] lquotactl_slv+0x27d/0x9d0 [lquota] [Thu Dec 12 01:38:33 2019][242493.361024] [] ofd_quotactl+0x13c/0x380 [ofd] [Thu Dec 12 01:38:33 2019][242493.367169] [] tgt_request_handle+0xaea/0x1580 [ptlrpc] [Thu Dec 12 01:38:33 2019][242493.374214] [] ptlrpc_server_handle_request+0x24b/0xab0 [ptlrpc] [Thu Dec 12 01:38:33 2019][242493.382030] [] ptlrpc_main+0xb2c/0x1460 [ptlrpc] [Thu Dec 12 01:38:33 2019][242493.388447] [] kthread+0xd1/0xe0 [Thu Dec 12 01:38:33 2019][242493.393461] [] ret_from_fork_nospec_begin+0xe/0x21 [Thu Dec 12 01:38:33 2019][242493.400026] [] 0xffffffffffffffff [Thu Dec 12 01:38:33 
2019][242493.405139] LustreError: dumping log to /tmp/lustre-log.1576143513.66105 [Thu Dec 12 01:38:51 2019][242511.352773] Lustre: fir-OST005e: deleting orphan objects from 0x1a80000400:11800376 to 0x1a80000400:11800449 [Thu Dec 12 01:39:20 2019][242539.832472] Lustre: fir-OST005e: deleting orphan objects from 0x0:27467396 to 0x0:27467425 [Thu Dec 12 01:39:39 2019][242558.818134] Pid: 67831, comm: ll_ost02_049 3.10.0-957.27.2.el7_lustre.pl2.x86_64 #1 SMP Thu Nov 7 15:26:16 PST 2019 [Thu Dec 12 01:39:39 2019][242558.828654] Call Trace: [Thu Dec 12 01:39:39 2019][242558.831215] [] ldlm_completion_ast+0x4e5/0x860 [ptlrpc] [Thu Dec 12 01:39:39 2019][242558.838253] [] ldlm_cli_enqueue_local+0x231/0x830 [ptlrpc] [Thu Dec 12 01:39:39 2019][242558.845552] [] ofd_destroy_by_fid+0x1dd/0x4a0 [ofd] [Thu Dec 12 01:39:39 2019][242558.852200] [] ofd_destroy_hdl+0x267/0x970 [ofd] [Thu Dec 12 01:39:39 2019][242558.858614] [] tgt_request_handle+0xaea/0x1580 [ptlrpc] [Thu Dec 12 01:39:39 2019][242558.865652] [] ptlrpc_server_handle_request+0x24b/0xab0 [ptlrpc] [Thu Dec 12 01:39:39 2019][242558.873464] [] ptlrpc_main+0xb2c/0x1460 [ptlrpc] [Thu Dec 12 01:39:39 2019][242558.879881] [] kthread+0xd1/0xe0 [Thu Dec 12 01:39:39 2019][242558.884882] [] ret_from_fork_nospec_begin+0xe/0x21 [Thu Dec 12 01:39:39 2019][242558.891451] [] 0xffffffffffffffff [Thu Dec 12 01:39:39 2019][242558.896558] LustreError: dumping log to /tmp/lustre-log.1576143579.67831 [Thu Dec 12 01:40:03 2019][242583.394627] Pid: 112493, comm: ll_ost00_074 3.10.0-957.27.2.el7_lustre.pl2.x86_64 #1 SMP Thu Nov 7 15:26:16 PST 2019 [Thu Dec 12 01:40:03 2019][242583.405241] Call Trace: [Thu Dec 12 01:40:03 2019][242583.407815] [] wait_transaction_locked+0x85/0xd0 [jbd2] [Thu Dec 12 01:40:03 2019][242583.414814] [] add_transaction_credits+0x268/0x2f0 [jbd2] [Thu Dec 12 01:40:03 2019][242583.422015] [] start_this_handle+0x1a1/0x430 [jbd2] [Thu Dec 12 01:40:03 2019][242583.428666] [] jbd2__journal_start+0xf3/0x1f0 [jbd2] [Thu Dec 12 01:40:03 2019][242583.435414] [] __ldiskfs_journal_start_sb+0x69/0xe0 [ldiskfs] [Thu Dec 12 01:40:03 2019][242583.442938] [] ldiskfs_acquire_dquot+0x53/0xb0 [ldiskfs] [Thu Dec 12 01:40:03 2019][242583.450034] [] dqget+0x3fa/0x450 [Thu Dec 12 01:40:03 2019][242583.455038] [] dquot_get_dqblk+0x14/0x1f0 [Thu Dec 12 01:40:03 2019][242583.460818] [] osd_acct_index_lookup+0x235/0x480 [osd_ldiskfs] [Thu Dec 12 01:40:03 2019][242583.468434] [] lquotactl_slv+0x27d/0x9d0 [lquota] [Thu Dec 12 01:40:03 2019][242583.474911] [] ofd_quotactl+0x13c/0x380 [ofd] [Thu Dec 12 01:40:03 2019][242583.481068] [] tgt_request_handle+0xaea/0x1580 [ptlrpc] [Thu Dec 12 01:40:03 2019][242583.488118] [] ptlrpc_server_handle_request+0x24b/0xab0 [ptlrpc] [Thu Dec 12 01:40:03 2019][242583.495933] [] ptlrpc_main+0xb2c/0x1460 [ptlrpc] [Thu Dec 12 01:40:03 2019][242583.502349] [] kthread+0xd1/0xe0 [Thu Dec 12 01:40:03 2019][242583.507349] [] ret_from_fork_nospec_begin+0xe/0x21 [Thu Dec 12 01:40:03 2019][242583.513927] [] 0xffffffffffffffff [Thu Dec 12 01:40:03 2019][242583.519027] LustreError: dumping log to /tmp/lustre-log.1576143603.112493 [Thu Dec 12 01:40:28 2019][242607.971110] Pid: 67805, comm: ll_ost02_045 3.10.0-957.27.2.el7_lustre.pl2.x86_64 #1 SMP Thu Nov 7 15:26:16 PST 2019 [Thu Dec 12 01:40:28 2019][242607.981631] Call Trace: [Thu Dec 12 01:40:28 2019][242607.984183] [] ofd_create_hdl+0xcb3/0x20e0 [ofd] [Thu Dec 12 01:40:28 2019][242607.990576] [] tgt_request_handle+0xaea/0x1580 [ptlrpc] [Thu Dec 12 01:40:28 2019][242607.997652] [] 
ptlrpc_server_handle_request+0x24b/0xab0 [ptlrpc] [Thu Dec 12 01:40:28 2019][242608.005455] [] ptlrpc_main+0xb2c/0x1460 [ptlrpc] [Thu Dec 12 01:40:28 2019][242608.011883] [] kthread+0xd1/0xe0 [Thu Dec 12 01:40:28 2019][242608.016887] [] ret_from_fork_nospec_begin+0xe/0x21 [Thu Dec 12 01:40:28 2019][242608.023463] [] 0xffffffffffffffff [Thu Dec 12 01:40:28 2019][242608.028574] LustreError: dumping log to /tmp/lustre-log.1576143628.67805 [Thu Dec 12 01:40:32 2019][242612.067191] LustreError: dumping log to /tmp/lustre-log.1576143632.112505 [Thu Dec 12 01:40:48 2019][242627.572508] LustreError: 67625:0:(ldlm_request.c:129:ldlm_expired_completion_wait()) ### lock timed out (enqueued at 1576143347, 300s ago); not entering recovery in server code, just going back to sleep ns: filter-fir-OST005c_UUID lock: ffff8910b6c8a880/0x7066c9c190af09a7 lrc: 3/0,1 mode: --/PW res: [0x1a00000401:0x2e857d:0x0].0x0 rrc: 3 type: EXT [0->18446744073709551615] (req 0->18446744073709551615) flags: 0x40010080000000 nid: local remote: 0x0 expref: -99 pid: 67625 timeout: 0 lvb_type: 0 [Thu Dec 12 01:40:48 2019][242627.616229] LustreError: 67625:0:(ldlm_request.c:129:ldlm_expired_completion_wait()) Skipped 3 previous similar messages [Thu Dec 12 01:40:52 2019][242632.547610] LustreError: dumping log to /tmp/lustre-log.1576143652.112539 [Thu Dec 12 01:41:24 2019][242664.408412] Lustre: fir-OST005a: deleting orphan objects from 0x1980000400:1125670 to 0x1980000400:1125697 [Thu Dec 12 01:42:01 2019][242701.134334] Lustre: fir-OST0058: deleting orphan objects from 0x0:27506691 to 0x0:27506721 [Thu Dec 12 01:42:02 2019][242702.181039] LustreError: dumping log to /tmp/lustre-log.1576143722.67765 [Thu Dec 12 01:42:18 2019][242718.565377] LustreError: dumping log to /tmp/lustre-log.1576143738.67767 [Thu Dec 12 01:42:33 2019][242733.250818] Lustre: fir-OST0054: deleting orphan objects from 0x1800000401:11785289 to 0x1800000401:11785313 [Thu Dec 12 01:42:37 2019][242737.645523] Lustre: fir-OST005c: deleting orphan objects from 0x1a00000401:3048873 to 0x1a00000401:3048897 [Thu Dec 12 01:42:39 2019][242739.045809] LustreError: dumping log to /tmp/lustre-log.1576143759.67673 [Thu Dec 12 01:43:58 2019][242818.350993] Lustre: fir-OST0056: Client 8c3ccd99-dc20-24d2-79f4-f8d2c0329cfb (at 10.8.24.19@o2ib6) reconnecting [Thu Dec 12 01:43:58 2019][242818.361176] Lustre: Skipped 565 previous similar messages [Thu Dec 12 01:44:17 2019][242837.351756] LNet: Service thread pid 112532 was inactive for 1204.12s. The thread might be hung, or it might only be slow and will resume later. 
Dumping the stack trace for debugging purposes: [Thu Dec 12 01:44:17 2019][242837.368972] LNet: Skipped 9 previous similar messages [Thu Dec 12 01:44:17 2019][242837.374125] Pid: 112532, comm: ll_ost01_086 3.10.0-957.27.2.el7_lustre.pl2.x86_64 #1 SMP Thu Nov 7 15:26:16 PST 2019 [Thu Dec 12 01:44:17 2019][242837.384932] Call Trace: [Thu Dec 12 01:44:17 2019][242837.387513] [] wait_transaction_locked+0x85/0xd0 [jbd2] [Thu Dec 12 01:44:17 2019][242837.394522] [] add_transaction_credits+0x268/0x2f0 [jbd2] [Thu Dec 12 01:44:17 2019][242837.401822] [] start_this_handle+0x1a1/0x430 [jbd2] [Thu Dec 12 01:44:17 2019][242837.408564] [] jbd2__journal_start+0xf3/0x1f0 [jbd2] [Thu Dec 12 01:44:17 2019][242837.415305] [] __ldiskfs_journal_start_sb+0x69/0xe0 [ldiskfs] [Thu Dec 12 01:44:17 2019][242837.422887] [] ldiskfs_acquire_dquot+0x53/0xb0 [ldiskfs] [Thu Dec 12 01:44:17 2019][242837.429981] [] dqget+0x3fa/0x450 [Thu Dec 12 01:44:17 2019][242837.435086] [] dquot_get_dqblk+0x14/0x1f0 [Thu Dec 12 01:44:17 2019][242837.440882] [] osd_acct_index_lookup+0x235/0x480 [osd_ldiskfs] [Thu Dec 12 01:44:17 2019][242837.448552] [] lquotactl_slv+0x27d/0x9d0 [lquota] [Thu Dec 12 01:44:17 2019][242837.455035] [] ofd_quotactl+0x13c/0x380 [ofd] [Thu Dec 12 01:44:17 2019][242837.461281] [] tgt_request_handle+0xaea/0x1580 [ptlrpc] [Thu Dec 12 01:44:17 2019][242837.468357] [] ptlrpc_server_handle_request+0x24b/0xab0 [ptlrpc] [Thu Dec 12 01:44:17 2019][242837.476239] [] ptlrpc_main+0xb2c/0x1460 [ptlrpc] [Thu Dec 12 01:44:17 2019][242837.482674] [] kthread+0xd1/0xe0 [Thu Dec 12 01:44:17 2019][242837.487708] [] ret_from_fork_nospec_begin+0xe/0x21 [Thu Dec 12 01:44:17 2019][242837.494294] [] 0xffffffffffffffff [Thu Dec 12 01:44:17 2019][242837.499501] LustreError: dumping log to /tmp/lustre-log.1576143857.112532 [Thu Dec 12 01:44:18 2019][242837.887424] Lustre: fir-OST005a: deleting orphan objects from 0x1980000401:11842056 to 0x1980000401:11842081 [Thu Dec 12 01:44:25 2019][242845.543923] Pid: 67620, comm: ll_ost01_014 3.10.0-957.27.2.el7_lustre.pl2.x86_64 #1 SMP Thu Nov 7 15:26:16 PST 2019 [Thu Dec 12 01:44:25 2019][242845.554442] Call Trace: [Thu Dec 12 01:44:25 2019][242845.557016] [] wait_transaction_locked+0x85/0xd0 [jbd2] [Thu Dec 12 01:44:25 2019][242845.564008] [] add_transaction_credits+0x268/0x2f0 [jbd2] [Thu Dec 12 01:44:25 2019][242845.571198] [] start_this_handle+0x1a1/0x430 [jbd2] [Thu Dec 12 01:44:25 2019][242845.577866] [] jbd2__journal_start+0xf3/0x1f0 [jbd2] [Thu Dec 12 01:44:25 2019][242845.584618] [] __ldiskfs_journal_start_sb+0x69/0xe0 [ldiskfs] [Thu Dec 12 01:44:25 2019][242845.592143] [] ldiskfs_acquire_dquot+0x53/0xb0 [ldiskfs] [Thu Dec 12 01:44:25 2019][242845.599236] [] dqget+0x3fa/0x450 [Thu Dec 12 01:44:25 2019][242845.604242] [] dquot_get_dqblk+0x14/0x1f0 [Thu Dec 12 01:44:25 2019][242845.610027] [] osd_acct_index_lookup+0x235/0x480 [osd_ldiskfs] [Thu Dec 12 01:44:25 2019][242845.617639] [] lquotactl_slv+0x27d/0x9d0 [lquota] [Thu Dec 12 01:44:25 2019][242845.624127] [] ofd_quotactl+0x13c/0x380 [ofd] [Thu Dec 12 01:44:25 2019][242845.630259] [] tgt_request_handle+0xaea/0x1580 [ptlrpc] [Thu Dec 12 01:44:26 2019][242845.637333] [] ptlrpc_server_handle_request+0x24b/0xab0 [ptlrpc] [Thu Dec 12 01:44:26 2019][242845.645131] [] ptlrpc_main+0xb2c/0x1460 [ptlrpc] [Thu Dec 12 01:44:26 2019][242845.651558] [] kthread+0xd1/0xe0 [Thu Dec 12 01:44:26 2019][242845.656564] [] ret_from_fork_nospec_begin+0xe/0x21 [Thu Dec 12 01:44:26 2019][242845.663147] [] 0xffffffffffffffff [Thu Dec 12 01:44:26 
2019][242845.668257] LustreError: dumping log to /tmp/lustre-log.1576143866.67620 [Thu Dec 12 01:44:30 2019][242849.640011] Pid: 67925, comm: ll_ost00_068 3.10.0-957.27.2.el7_lustre.pl2.x86_64 #1 SMP Thu Nov 7 15:26:16 PST 2019 [Thu Dec 12 01:44:30 2019][242849.650531] Call Trace: [Thu Dec 12 01:44:30 2019][242849.653101] [] wait_transaction_locked+0x85/0xd0 [jbd2] [Thu Dec 12 01:44:30 2019][242849.660099] [] add_transaction_credits+0x268/0x2f0 [jbd2] [Thu Dec 12 01:44:30 2019][242849.667299] [] start_this_handle+0x1a1/0x430 [jbd2] [Thu Dec 12 01:44:30 2019][242849.673949] [] jbd2__journal_start+0xf3/0x1f0 [jbd2] [Thu Dec 12 01:44:30 2019][242849.680695] [] __ldiskfs_journal_start_sb+0x69/0xe0 [ldiskfs] [Thu Dec 12 01:44:30 2019][242849.688225] [] ldiskfs_acquire_dquot+0x53/0xb0 [ldiskfs] [Thu Dec 12 01:44:30 2019][242849.695320] [] dqget+0x3fa/0x450 [Thu Dec 12 01:44:30 2019][242849.700321] [] dquot_get_dqblk+0x14/0x1f0 [Thu Dec 12 01:44:30 2019][242849.706110] [] osd_acct_index_lookup+0x235/0x480 [osd_ldiskfs] [Thu Dec 12 01:44:30 2019][242849.713720] [] lquotactl_slv+0x27d/0x9d0 [lquota] [Thu Dec 12 01:44:30 2019][242849.720219] [] ofd_quotactl+0x13c/0x380 [ofd] [Thu Dec 12 01:44:30 2019][242849.726349] [] tgt_request_handle+0xaea/0x1580 [ptlrpc] [Thu Dec 12 01:44:30 2019][242849.733420] [] ptlrpc_server_handle_request+0x24b/0xab0 [ptlrpc] [Thu Dec 12 01:44:30 2019][242849.741229] [] ptlrpc_main+0xb2c/0x1460 [ptlrpc] [Thu Dec 12 01:44:30 2019][242849.747643] [] kthread+0xd1/0xe0 [Thu Dec 12 01:44:30 2019][242849.752659] [] ret_from_fork_nospec_begin+0xe/0x21 [Thu Dec 12 01:44:30 2019][242849.759223] [] 0xffffffffffffffff [Thu Dec 12 01:44:30 2019][242849.764335] LustreError: dumping log to /tmp/lustre-log.1576143870.67925 [Thu Dec 12 01:44:30 2019][242849.771700] Pid: 67630, comm: ll_ost01_018 3.10.0-957.27.2.el7_lustre.pl2.x86_64 #1 SMP Thu Nov 7 15:26:16 PST 2019 [Thu Dec 12 01:44:30 2019][242849.782223] Call Trace: [Thu Dec 12 01:44:30 2019][242849.784770] [] ldlm_completion_ast+0x4e5/0x860 [ptlrpc] [Thu Dec 12 01:44:30 2019][242849.791792] [] ldlm_cli_enqueue_local+0x231/0x830 [ptlrpc] [Thu Dec 12 01:44:30 2019][242849.799117] [] ofd_destroy_by_fid+0x1dd/0x4a0 [ofd] [Thu Dec 12 01:44:30 2019][242849.805763] [] ofd_destroy_hdl+0x267/0x970 [ofd] [Thu Dec 12 01:44:30 2019][242849.812178] [] tgt_request_handle+0xaea/0x1580 [ptlrpc] [Thu Dec 12 01:44:30 2019][242849.819204] [] ptlrpc_server_handle_request+0x24b/0xab0 [ptlrpc] [Thu Dec 12 01:44:30 2019][242849.827033] [] ptlrpc_main+0xb2c/0x1460 [ptlrpc] [Thu Dec 12 01:44:30 2019][242849.833445] [] kthread+0xd1/0xe0 [Thu Dec 12 01:44:30 2019][242849.838451] [] ret_from_fork_nospec_begin+0xe/0x21 [Thu Dec 12 01:44:30 2019][242849.845007] [] 0xffffffffffffffff [Thu Dec 12 01:44:32 2019][242852.570711] Lustre: fir-OST005c: deleting orphan objects from 0x1a00000402:1105381 to 0x1a00000402:1105409 [Thu Dec 12 01:44:51 2019][242871.236451] Lustre: 27257:0:(service.c:1372:ptlrpc_at_send_early_reply()) @@@ Couldn't add any time (5/5), not sending early reply [Thu Dec 12 01:44:51 2019][242871.236451] req@ffff88f103e63850 x1650929716577312/t0(0) o10->68d8bbaa-aab1-a78f-323f-aaff4c375c30@10.8.7.4@o2ib6:611/0 lens 440/0 e 0 to 0 dl 1576143896 ref 2 fl New:/2/ffffffff rc 0/-1 [Thu Dec 12 01:44:51 2019][242871.265086] Lustre: 27257:0:(service.c:1372:ptlrpc_at_send_early_reply()) Skipped 1183 previous similar messages [Thu Dec 12 01:44:54 2019][242874.216504] Pid: 67631, comm: ll_ost00_016 3.10.0-957.27.2.el7_lustre.pl2.x86_64 #1 SMP Thu Nov 7 15:26:16 
PST 2019 [Thu Dec 12 01:44:54 2019][242874.227023] Call Trace: [Thu Dec 12 01:44:54 2019][242874.229600] [] wait_transaction_locked+0x85/0xd0 [jbd2] [Thu Dec 12 01:44:54 2019][242874.236597] [] add_transaction_credits+0x268/0x2f0 [jbd2] [Thu Dec 12 01:44:54 2019][242874.243781] [] start_this_handle+0x1a1/0x430 [jbd2] [Thu Dec 12 01:44:54 2019][242874.250431] [] jbd2__journal_start+0xf3/0x1f0 [jbd2] [Thu Dec 12 01:44:54 2019][242874.257179] [] __ldiskfs_journal_start_sb+0x69/0xe0 [ldiskfs] [Thu Dec 12 01:44:54 2019][242874.264704] [] ldiskfs_acquire_dquot+0x53/0xb0 [ldiskfs] [Thu Dec 12 01:44:54 2019][242874.271799] [] dqget+0x3fa/0x450 [Thu Dec 12 01:44:54 2019][242874.276819] [] dquot_get_dqblk+0x14/0x1f0 [Thu Dec 12 01:44:54 2019][242874.282607] [] osd_acct_index_lookup+0x235/0x480 [osd_ldiskfs] [Thu Dec 12 01:44:54 2019][242874.290219] [] lquotactl_slv+0x27d/0x9d0 [lquota] [Thu Dec 12 01:44:54 2019][242874.296708] [] ofd_quotactl+0x13c/0x380 [ofd] [Thu Dec 12 01:44:54 2019][242874.302838] [] tgt_request_handle+0xaea/0x1580 [ptlrpc] [Thu Dec 12 01:44:54 2019][242874.309882] [] ptlrpc_server_handle_request+0x24b/0xab0 [ptlrpc] [Thu Dec 12 01:44:54 2019][242874.317685] [] ptlrpc_main+0xb2c/0x1460 [ptlrpc] [Thu Dec 12 01:44:54 2019][242874.324111] [] kthread+0xd1/0xe0 [Thu Dec 12 01:44:54 2019][242874.329117] [] ret_from_fork_nospec_begin+0xe/0x21 [Thu Dec 12 01:44:54 2019][242874.335708] [] 0xffffffffffffffff [Thu Dec 12 01:44:54 2019][242874.340826] LustreError: dumping log to /tmp/lustre-log.1576143894.67631 [Thu Dec 12 01:45:14 2019][242893.983421] Lustre: fir-OST005c: Connection restored to e2e512e9-5e98-1086-a71a-3e4545e26e0b (at 10.8.25.1@o2ib6) [Thu Dec 12 01:45:14 2019][242893.993788] Lustre: Skipped 625 previous similar messages [Thu Dec 12 01:45:17 2019][242897.264212] Lustre: fir-OST0056: Export ffff88fc8fff5800 already connecting from 10.8.22.17@o2ib6 [Thu Dec 12 01:45:17 2019][242897.273187] Lustre: Skipped 71 previous similar messages [Thu Dec 12 01:45:19 2019][242898.792992] LustreError: dumping log to /tmp/lustre-log.1576143919.66096 [Thu Dec 12 01:45:28 2019][242907.918468] Lustre: fir-OST005a: deleting orphan objects from 0x1980000402:3089681 to 0x1980000402:3089697 [Thu Dec 12 01:45:57 2019][242936.801426] Lustre: fir-OST0058: deleting orphan objects from 0x1900000401:11797383 to 0x1900000401:11797409 [Thu Dec 12 01:46:00 2019][242939.953635] Lustre: fir-OST0058: deleting orphan objects from 0x1900000402:3085869 to 0x1900000402:3085889 [Thu Dec 12 01:46:16 2019][242956.138147] LNet: Service thread pid 67663 was inactive for 1201.35s. Watchdog stack traces are limited to 3 per 300 seconds, skipping this one. 
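Each watchdog dump above begins with a "Pid: <pid>, comm: <thread>" header, then "Call Trace:" and the stack frames. A minimal sketch for grouping the hung ll_ost threads by the top frame of their trace (wait_transaction_locked, ldlm_completion_ast, ofd_create_hdl, ...) follows; it assumes one kernel message per line, as in the raw console file, and takes the log path as a command-line argument.

#!/usr/bin/env python3
# Sketch: count dumped service-thread stacks by their topmost frame.
# Assumes one kernel message per line (raw console file); log path given as argv[1].
import re
import sys
from collections import Counter

pid_re = re.compile(r"Pid: (\d+), comm: (\S+)")
frame_re = re.compile(r"\[\] (\w+)\+0x")

top_frames = Counter()
want_frame = False
with open(sys.argv[1]) as f:
    for line in f:
        if pid_re.search(line):
            want_frame = True          # the next frame line is the top of this stack
        elif want_frame:
            m = frame_re.search(line)
            if m:
                top_frames[m.group(1)] += 1
                want_frame = False

for func, n in top_frames.most_common():
    print(f"{n:5d} threads blocked in {func}")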
[Thu Dec 12 01:46:16 2019][242956.151186] LNet: Skipped 14 previous similar messages [Thu Dec 12 01:46:16 2019][242956.156422] LustreError: dumping log to /tmp/lustre-log.1576143976.67663 [Thu Dec 12 01:47:01 2019][243001.195060] LustreError: dumping log to /tmp/lustre-log.1576144021.67679 [Thu Dec 12 01:47:25 2019][243024.833392] Lustre: fir-OST005e: deleting orphan objects from 0x1a80000402:3081963 to 0x1a80000402:3081985 [Thu Dec 12 01:47:26 2019][243025.771548] LustreError: dumping log to /tmp/lustre-log.1576144046.67637 [Thu Dec 12 01:47:30 2019][243029.867631] LustreError: dumping log to /tmp/lustre-log.1576144050.67668 [Thu Dec 12 01:47:54 2019][243053.796775] Lustre: fir-OST005e: deleting orphan objects from 0x1a80000401:1122117 to 0x1a80000401:1122145 [Thu Dec 12 01:49:29 2019][243149.189864] Lustre: fir-OST0054: deleting orphan objects from 0x1800000400:3070575 to 0x1800000400:3070625 [Thu Dec 12 01:49:33 2019][243152.750103] Pid: 112506, comm: ll_ost02_072 3.10.0-957.27.2.el7_lustre.pl2.x86_64 #1 SMP Thu Nov 7 15:26:16 PST 2019 [Thu Dec 12 01:49:33 2019][243152.760730] Call Trace: [Thu Dec 12 01:49:33 2019][243152.763303] [] wait_transaction_locked+0x85/0xd0 [jbd2] [Thu Dec 12 01:49:33 2019][243152.770306] [] add_transaction_credits+0x268/0x2f0 [jbd2] [Thu Dec 12 01:49:33 2019][243152.777498] [] start_this_handle+0x1a1/0x430 [jbd2] [Thu Dec 12 01:49:33 2019][243152.784148] [] jbd2__journal_start+0xf3/0x1f0 [jbd2] [Thu Dec 12 01:49:33 2019][243152.790894] [] __ldiskfs_journal_start_sb+0x69/0xe0 [ldiskfs] [Thu Dec 12 01:49:33 2019][243152.798431] [] osd_trans_start+0x20e/0x4e0 [osd_ldiskfs] [Thu Dec 12 01:49:33 2019][243152.805568] [] ofd_trans_start+0x75/0xf0 [ofd] [Thu Dec 12 01:49:33 2019][243152.811786] [] ofd_destroy+0x5d0/0x960 [ofd] [Thu Dec 12 01:49:33 2019][243152.817850] [] ofd_destroy_by_fid+0x1f4/0x4a0 [ofd] [Thu Dec 12 01:49:33 2019][243152.824499] [] ofd_destroy_hdl+0x267/0x970 [ofd] [Thu Dec 12 01:49:33 2019][243152.830901] [] tgt_request_handle+0xaea/0x1580 [ptlrpc] [Thu Dec 12 01:49:33 2019][243152.837951] [] ptlrpc_server_handle_request+0x24b/0xab0 [ptlrpc] [Thu Dec 12 01:49:33 2019][243152.845766] [] ptlrpc_main+0xb2c/0x1460 [ptlrpc] [Thu Dec 12 01:49:33 2019][243152.852181] [] kthread+0xd1/0xe0 [Thu Dec 12 01:49:33 2019][243152.857197] [] ret_from_fork_nospec_begin+0xe/0x21 [Thu Dec 12 01:49:33 2019][243152.863762] [] 0xffffffffffffffff [Thu Dec 12 01:49:33 2019][243152.868887] LustreError: dumping log to /tmp/lustre-log.1576144173.112506 [Thu Dec 12 01:49:33 2019][243152.876872] Pid: 67639, comm: ll_ost02_017 3.10.0-957.27.2.el7_lustre.pl2.x86_64 #1 SMP Thu Nov 7 15:26:16 PST 2019 [Thu Dec 12 01:49:33 2019][243152.887421] Call Trace: [Thu Dec 12 01:49:33 2019][243152.889970] [] wait_transaction_locked+0x85/0xd0 [jbd2] [Thu Dec 12 01:49:33 2019][243152.896964] [] add_transaction_credits+0x268/0x2f0 [jbd2] [Thu Dec 12 01:49:33 2019][243152.904147] [] start_this_handle+0x1a1/0x430 [jbd2] [Thu Dec 12 01:49:33 2019][243152.910795] [] jbd2__journal_start+0xf3/0x1f0 [jbd2] [Thu Dec 12 01:49:33 2019][243152.917546] [] __ldiskfs_journal_start_sb+0x69/0xe0 [ldiskfs] [Thu Dec 12 01:49:33 2019][243152.925063] [] osd_trans_start+0x20e/0x4e0 [osd_ldiskfs] [Thu Dec 12 01:49:33 2019][243152.932157] [] ofd_trans_start+0x75/0xf0 [ofd] [Thu Dec 12 01:49:33 2019][243152.938388] [] ofd_destroy+0x5d0/0x960 [ofd] [Thu Dec 12 01:49:33 2019][243152.944446] [] ofd_destroy_by_fid+0x1f4/0x4a0 [ofd] [Thu Dec 12 01:49:33 2019][243152.951098] [] ofd_destroy_hdl+0x267/0x970 [ofd] [Thu Dec 12 
01:49:33 2019][243152.957499] [] tgt_request_handle+0xaea/0x1580 [ptlrpc] [Thu Dec 12 01:49:33 2019][243152.964531] [] ptlrpc_server_handle_request+0x24b/0xab0 [ptlrpc] [Thu Dec 12 01:49:33 2019][243152.972355] [] ptlrpc_main+0xb2c/0x1460 [ptlrpc] [Thu Dec 12 01:49:33 2019][243152.978778] [] kthread+0xd1/0xe0 [Thu Dec 12 01:49:33 2019][243152.983794] [] ret_from_fork_nospec_begin+0xe/0x21 [Thu Dec 12 01:49:33 2019][243152.990349] [] 0xffffffffffffffff [Thu Dec 12 01:49:41 2019][243161.526995] Lustre: fir-OST0058: deleting orphan objects from 0x1900000400:1124364 to 0x1900000400:1124385 [Thu Dec 12 01:49:49 2019][243169.631148] Lustre: fir-OST0054: deleting orphan objects from 0x1800000402:1117411 to 0x1800000402:1117441 [Thu Dec 12 01:50:05 2019][243185.518767] Pid: 67689, comm: ll_ost00_031 3.10.0-957.27.2.el7_lustre.pl2.x86_64 #1 SMP Thu Nov 7 15:26:16 PST 2019 [Thu Dec 12 01:50:05 2019][243185.529287] Call Trace: [Thu Dec 12 01:50:05 2019][243185.531853] [] wait_transaction_locked+0x85/0xd0 [jbd2] [Thu Dec 12 01:50:05 2019][243185.538854] [] add_transaction_credits+0x268/0x2f0 [jbd2] [Thu Dec 12 01:50:05 2019][243185.546037] [] start_this_handle+0x1a1/0x430 [jbd2] [Thu Dec 12 01:50:05 2019][243185.552688] [] jbd2__journal_start+0xf3/0x1f0 [jbd2] [Thu Dec 12 01:50:05 2019][243185.559434] [] __ldiskfs_journal_start_sb+0x69/0xe0 [ldiskfs] [Thu Dec 12 01:50:05 2019][243185.566959] [] ldiskfs_acquire_dquot+0x53/0xb0 [ldiskfs] [Thu Dec 12 01:50:05 2019][243185.574058] [] dqget+0x3fa/0x450 [Thu Dec 12 01:50:05 2019][243185.579060] [] dquot_get_dqblk+0x14/0x1f0 [Thu Dec 12 01:50:05 2019][243185.584846] [] osd_acct_index_lookup+0x235/0x480 [osd_ldiskfs] [Thu Dec 12 01:50:05 2019][243185.592457] [] lquotactl_slv+0x27d/0x9d0 [lquota] [Thu Dec 12 01:50:05 2019][243185.598947] [] ofd_quotactl+0x13c/0x380 [ofd] [Thu Dec 12 01:50:05 2019][243185.605077] [] tgt_request_handle+0xaea/0x1580 [ptlrpc] [Thu Dec 12 01:50:05 2019][243185.612129] [] ptlrpc_server_handle_request+0x24b/0xab0 [ptlrpc] [Thu Dec 12 01:50:05 2019][243185.619932] [] ptlrpc_main+0xb2c/0x1460 [ptlrpc] [Thu Dec 12 01:50:05 2019][243185.626361] [] kthread+0xd1/0xe0 [Thu Dec 12 01:50:06 2019][243185.631363] [] ret_from_fork_nospec_begin+0xe/0x21 [Thu Dec 12 01:50:06 2019][243185.637940] [] 0xffffffffffffffff [Thu Dec 12 01:50:06 2019][243185.643040] LustreError: dumping log to /tmp/lustre-log.1576144205.67689 [Thu Dec 12 01:50:18 2019][243197.807078] Pid: 67863, comm: ll_ost01_067 3.10.0-957.27.2.el7_lustre.pl2.x86_64 #1 SMP Thu Nov 7 15:26:16 PST 2019 [Thu Dec 12 01:50:18 2019][243197.817596] Call Trace: [Thu Dec 12 01:50:18 2019][243197.820150] [] ldlm_completion_ast+0x4e5/0x860 [ptlrpc] [Thu Dec 12 01:50:18 2019][243197.827189] [] ldlm_cli_enqueue_local+0x231/0x830 [ptlrpc] [Thu Dec 12 01:50:18 2019][243197.834478] [] ofd_destroy_by_fid+0x1dd/0x4a0 [ofd] [Thu Dec 12 01:50:18 2019][243197.841127] [] ofd_destroy_hdl+0x267/0x970 [ofd] [Thu Dec 12 01:50:18 2019][243197.847527] [] tgt_request_handle+0xaea/0x1580 [ptlrpc] [Thu Dec 12 01:50:18 2019][243197.854569] [] ptlrpc_server_handle_request+0x24b/0xab0 [ptlrpc] [Thu Dec 12 01:50:18 2019][243197.862386] [] ptlrpc_main+0xb2c/0x1460 [ptlrpc] [Thu Dec 12 01:50:18 2019][243197.868800] [] kthread+0xd1/0xe0 [Thu Dec 12 01:50:18 2019][243197.873799] [] ret_from_fork_nospec_begin+0xe/0x21 [Thu Dec 12 01:50:18 2019][243197.880378] [] 0xffffffffffffffff [Thu Dec 12 01:50:18 2019][243197.885485] LustreError: dumping log to /tmp/lustre-log.1576144218.67863 [Thu Dec 12 01:50:50 2019][243230.575679] 
Pid: 67666, comm: ll_ost01_030 3.10.0-957.27.2.el7_lustre.pl2.x86_64 #1 SMP Thu Nov 7 15:26:16 PST 2019 [Thu Dec 12 01:50:50 2019][243230.586199] Call Trace: [Thu Dec 12 01:50:51 2019][243230.588756] [] ofd_create_hdl+0xcb3/0x20e0 [ofd] [Thu Dec 12 01:50:51 2019][243230.595157] [] tgt_request_handle+0xaea/0x1580 [ptlrpc] [Thu Dec 12 01:50:51 2019][243230.602227] [] ptlrpc_server_handle_request+0x24b/0xab0 [ptlrpc] [Thu Dec 12 01:50:51 2019][243230.610042] [] ptlrpc_main+0xb2c/0x1460 [ptlrpc] [Thu Dec 12 01:50:51 2019][243230.616474] [] kthread+0xd1/0xe0 [Thu Dec 12 01:50:51 2019][243230.621478] [] ret_from_fork_nospec_begin+0xe/0x21 [Thu Dec 12 01:50:51 2019][243230.628053] [] 0xffffffffffffffff [Thu Dec 12 01:50:51 2019][243230.633161] LustreError: dumping log to /tmp/lustre-log.1576144250.67666 [Thu Dec 12 01:51:25 2019][243264.812787] Lustre: fir-OST005a: deleting orphan objects from 0x1980000400:1125699 to 0x1980000400:1125729 [Thu Dec 12 01:51:28 2019][243268.243254] Lustre: fir-OST005e: deleting orphan objects from 0x1a80000400:11800455 to 0x1a80000400:11800481 [Thu Dec 12 01:51:56 2019][243296.482338] Lustre: fir-OST005e: deleting orphan objects from 0x0:27467430 to 0x0:27467457 [Thu Dec 12 01:52:16 2019][243315.830430] LustreError: 67608:0:(ldlm_request.c:129:ldlm_expired_completion_wait()) ### lock timed out (enqueued at 1576144036, 300s ago); not entering recovery in server code, just going back to sleep ns: filter-fir-OST0058_UUID lock: ffff88f5fed1ad00/0x7066c9c190af3456 lrc: 3/0,1 mode: --/PW res: [0x1900000400:0x112809:0x0].0x0 rrc: 3 type: EXT [0->18446744073709551615] (req 0->18446744073709551615) flags: 0x40010080000000 nid: local remote: 0x0 expref: -99 pid: 67608 timeout: 0 lvb_type: 0 [Thu Dec 12 01:52:16 2019][243315.874153] LustreError: 67608:0:(ldlm_request.c:129:ldlm_expired_completion_wait()) Skipped 9 previous similar messages [Thu Dec 12 01:52:16 2019][243316.593424] LustreError: dumping log to /tmp/lustre-log.1576144336.66093 [Thu Dec 12 01:52:41 2019][243341.169907] LustreError: dumping log to /tmp/lustre-log.1576144361.67678 [Thu Dec 12 01:53:06 2019][243365.746399] LustreError: dumping log to /tmp/lustre-log.1576144386.112520 [Thu Dec 12 01:53:10 2019][243369.842480] LustreError: dumping log to /tmp/lustre-log.1576144390.67726 [Thu Dec 12 01:53:49 2019][243408.828570] Lustre: fir-OST005c: deleting orphan objects from 0x0:27192412 to 0x0:27192449 [Thu Dec 12 01:53:59 2019][243419.323023] Lustre: fir-OST0056: Client 8c3ccd99-dc20-24d2-79f4-f8d2c0329cfb (at 10.8.24.19@o2ib6) reconnecting [Thu Dec 12 01:53:59 2019][243419.333200] Lustre: Skipped 710 previous similar messages [Thu Dec 12 01:54:23 2019][243443.571970] LustreError: dumping log to /tmp/lustre-log.1576144463.67649 [Thu Dec 12 01:54:32 2019][243451.764129] LustreError: dumping log to /tmp/lustre-log.1576144472.67818 [Thu Dec 12 01:54:34 2019][243453.852918] Lustre: fir-OST005c: deleting orphan objects from 0x1a00000402:1105411 to 0x1a00000402:1105441 [Thu Dec 12 01:54:37 2019][243456.869525] Lustre: fir-OST0058: deleting orphan objects from 0x0:27506724 to 0x0:27506753 [Thu Dec 12 01:54:40 2019][243459.956299] LNet: Service thread pid 67858 was inactive for 1201.49s. The thread might be hung, or it might only be slow and will resume later. 
Dumping the stack trace for debugging purposes: [Thu Dec 12 01:54:40 2019][243459.973412] LNet: Skipped 9 previous similar messages [Thu Dec 12 01:54:40 2019][243459.978559] Pid: 67858, comm: ll_ost03_059 3.10.0-957.27.2.el7_lustre.pl2.x86_64 #1 SMP Thu Nov 7 15:26:16 PST 2019 [Thu Dec 12 01:54:40 2019][243459.989099] Call Trace: [Thu Dec 12 01:54:40 2019][243459.991654] [] ofd_create_hdl+0xcb3/0x20e0 [ofd] [Thu Dec 12 01:54:40 2019][243459.998055] [] tgt_request_handle+0xaea/0x1580 [ptlrpc] [Thu Dec 12 01:54:40 2019][243460.005117] [] ptlrpc_server_handle_request+0x24b/0xab0 [ptlrpc] [Thu Dec 12 01:54:40 2019][243460.012934] [] ptlrpc_main+0xb2c/0x1460 [ptlrpc] [Thu Dec 12 01:54:40 2019][243460.019365] [] kthread+0xd1/0xe0 [Thu Dec 12 01:54:40 2019][243460.024369] [] ret_from_fork_nospec_begin+0xe/0x21 [Thu Dec 12 01:54:40 2019][243460.030945] [] 0xffffffffffffffff [Thu Dec 12 01:54:40 2019][243460.036054] LustreError: dumping log to /tmp/lustre-log.1576144480.67858 [Thu Dec 12 01:54:52 2019][243471.880550] Lustre: 27252:0:(service.c:1372:ptlrpc_at_send_early_reply()) @@@ Couldn't add any time (5/5), not sending early reply [Thu Dec 12 01:54:52 2019][243471.880550] req@ffff891237d13050 x1650929716577312/t0(0) o10->68d8bbaa-aab1-a78f-323f-aaff4c375c30@10.8.7.4@o2ib6:457/0 lens 440/0 e 0 to 0 dl 1576144497 ref 2 fl New:/2/ffffffff rc 0/-1 [Thu Dec 12 01:54:52 2019][243471.909187] Lustre: 27252:0:(service.c:1372:ptlrpc_at_send_early_reply()) Skipped 1246 previous similar messages [Thu Dec 12 01:54:52 2019][243472.244537] Pid: 67670, comm: ll_ost00_023 3.10.0-957.27.2.el7_lustre.pl2.x86_64 #1 SMP Thu Nov 7 15:26:16 PST 2019 [Thu Dec 12 01:54:52 2019][243472.255055] Call Trace: [Thu Dec 12 01:54:52 2019][243472.257632] [] wait_transaction_locked+0x85/0xd0 [jbd2] [Thu Dec 12 01:54:52 2019][243472.264640] [] add_transaction_credits+0x268/0x2f0 [jbd2] [Thu Dec 12 01:54:52 2019][243472.271823] [] start_this_handle+0x1a1/0x430 [jbd2] [Thu Dec 12 01:54:52 2019][243472.278471] [] jbd2__journal_start+0xf3/0x1f0 [jbd2] [Thu Dec 12 01:54:52 2019][243472.285223] [] __ldiskfs_journal_start_sb+0x69/0xe0 [ldiskfs] [Thu Dec 12 01:54:52 2019][243472.292746] [] ldiskfs_acquire_dquot+0x53/0xb0 [ldiskfs] [Thu Dec 12 01:54:52 2019][243472.299840] [] dqget+0x3fa/0x450 [Thu Dec 12 01:54:52 2019][243472.304845] [] dquot_get_dqblk+0x14/0x1f0 [Thu Dec 12 01:54:52 2019][243472.310640] [] osd_acct_index_lookup+0x235/0x480 [osd_ldiskfs] [Thu Dec 12 01:54:52 2019][243472.318253] [] lquotactl_slv+0x27d/0x9d0 [lquota] [Thu Dec 12 01:54:52 2019][243472.324740] [] ofd_quotactl+0x13c/0x380 [ofd] [Thu Dec 12 01:54:52 2019][243472.330886] [] tgt_request_handle+0xaea/0x1580 [ptlrpc] [Thu Dec 12 01:54:52 2019][243472.337942] [] ptlrpc_server_handle_request+0x24b/0xab0 [ptlrpc] [Thu Dec 12 01:54:52 2019][243472.345744] [] ptlrpc_main+0xb2c/0x1460 [ptlrpc] [Thu Dec 12 01:54:52 2019][243472.352170] [] kthread+0xd1/0xe0 [Thu Dec 12 01:54:52 2019][243472.357175] [] ret_from_fork_nospec_begin+0xe/0x21 [Thu Dec 12 01:54:52 2019][243472.363751] [] 0xffffffffffffffff [Thu Dec 12 01:54:52 2019][243472.368852] LustreError: dumping log to /tmp/lustre-log.1576144492.67670 [Thu Dec 12 01:55:09 2019][243488.700547] Lustre: fir-OST0054: deleting orphan objects from 0x1800000401:11785320 to 0x1800000401:11785345 [Thu Dec 12 01:55:13 2019][243493.140727] Lustre: fir-OST005c: deleting orphan objects from 0x1a00000401:3048914 to 0x1a00000401:3048929 [Thu Dec 12 01:55:15 2019][243494.959161] Lustre: fir-OST005a: Connection restored to 
f5f4e1fb-09a1-cbb0-925c-f94d3727005b (at 10.9.101.44@o2ib4) [Thu Dec 12 01:55:15 2019][243494.969719] Lustre: Skipped 658 previous similar messages [Thu Dec 12 01:55:17 2019][243496.821040] Pid: 67641, comm: ll_ost00_019 3.10.0-957.27.2.el7_lustre.pl2.x86_64 #1 SMP Thu Nov 7 15:26:16 PST 2019 [Thu Dec 12 01:55:17 2019][243496.831560] Call Trace: [Thu Dec 12 01:55:17 2019][243496.834130] [] wait_transaction_locked+0x85/0xd0 [jbd2] [Thu Dec 12 01:55:17 2019][243496.841127] [] add_transaction_credits+0x268/0x2f0 [jbd2] [Thu Dec 12 01:55:17 2019][243496.848311] [] start_this_handle+0x1a1/0x430 [jbd2] [Thu Dec 12 01:55:17 2019][243496.854962] [] jbd2__journal_start+0xf3/0x1f0 [jbd2] [Thu Dec 12 01:55:17 2019][243496.861711] [] __ldiskfs_journal_start_sb+0x69/0xe0 [ldiskfs] [Thu Dec 12 01:55:17 2019][243496.869252] [] ldiskfs_acquire_dquot+0x53/0xb0 [ldiskfs] [Thu Dec 12 01:55:17 2019][243496.876347] [] dqget+0x3fa/0x450 [Thu Dec 12 01:55:17 2019][243496.881352] [] dquot_get_dqblk+0x14/0x1f0 [Thu Dec 12 01:55:17 2019][243496.887136] [] osd_acct_index_lookup+0x235/0x480 [osd_ldiskfs] [Thu Dec 12 01:55:17 2019][243496.894749] [] lquotactl_slv+0x27d/0x9d0 [lquota] [Thu Dec 12 01:55:17 2019][243496.901239] [] ofd_quotactl+0x13c/0x380 [ofd] [Thu Dec 12 01:55:17 2019][243496.907368] [] tgt_request_handle+0xaea/0x1580 [ptlrpc] [Thu Dec 12 01:55:17 2019][243496.914413] [] ptlrpc_server_handle_request+0x24b/0xab0 [ptlrpc] [Thu Dec 12 01:55:17 2019][243496.922218] [] ptlrpc_main+0xb2c/0x1460 [ptlrpc] [Thu Dec 12 01:55:17 2019][243496.928645] [] kthread+0xd1/0xe0 [Thu Dec 12 01:55:17 2019][243496.933662] [] ret_from_fork_nospec_begin+0xe/0x21 [Thu Dec 12 01:55:17 2019][243496.940241] [] 0xffffffffffffffff [Thu Dec 12 01:55:17 2019][243496.945340] LustreError: dumping log to /tmp/lustre-log.1576144517.67641 [Thu Dec 12 01:55:19 2019][243499.387679] Lustre: fir-OST0056: Export ffff88fc8fff5800 already connecting from 10.8.22.17@o2ib6 [Thu Dec 12 01:55:19 2019][243499.396651] Lustre: Skipped 71 previous similar messages [Thu Dec 12 01:55:49 2019][243529.589697] Pid: 67625, comm: ll_ost03_015 3.10.0-957.27.2.el7_lustre.pl2.x86_64 #1 SMP Thu Nov 7 15:26:16 PST 2019 [Thu Dec 12 01:55:49 2019][243529.600223] Call Trace: [Thu Dec 12 01:55:49 2019][243529.602789] [] ldlm_completion_ast+0x4e5/0x860 [ptlrpc] [Thu Dec 12 01:55:49 2019][243529.609831] [] ldlm_cli_enqueue_local+0x231/0x830 [ptlrpc] [Thu Dec 12 01:55:49 2019][243529.617168] [] ofd_destroy_by_fid+0x1dd/0x4a0 [ofd] [Thu Dec 12 01:55:49 2019][243529.623835] [] ofd_destroy_hdl+0x267/0x970 [ofd] [Thu Dec 12 01:55:49 2019][243529.630239] [] tgt_request_handle+0xaea/0x1580 [ptlrpc] [Thu Dec 12 01:55:50 2019][243529.637270] [] ptlrpc_server_handle_request+0x24b/0xab0 [ptlrpc] [Thu Dec 12 01:55:50 2019][243529.645103] [] ptlrpc_main+0xb2c/0x1460 [ptlrpc] [Thu Dec 12 01:55:50 2019][243529.651520] [] kthread+0xd1/0xe0 [Thu Dec 12 01:55:50 2019][243529.656518] [] ret_from_fork_nospec_begin+0xe/0x21 [Thu Dec 12 01:55:50 2019][243529.663088] [] 0xffffffffffffffff [Thu Dec 12 01:55:50 2019][243529.668198] LustreError: dumping log to /tmp/lustre-log.1576144549.67625 [Thu Dec 12 01:56:54 2019][243594.438729] Lustre: fir-OST005a: deleting orphan objects from 0x1980000401:11842091 to 0x1980000401:11842113 [Thu Dec 12 01:56:55 2019][243595.127017] Pid: 66114, comm: ll_ost03_001 3.10.0-957.27.2.el7_lustre.pl2.x86_64 #1 SMP Thu Nov 7 15:26:16 PST 2019 [Thu Dec 12 01:56:55 2019][243595.137533] Call Trace: [Thu Dec 12 01:56:55 2019][243595.140111] [] 
ldlm_completion_ast+0x4e5/0x860 [ptlrpc] [Thu Dec 12 01:56:55 2019][243595.147141] [] ldlm_cli_enqueue_local+0x231/0x830 [ptlrpc] [Thu Dec 12 01:56:55 2019][243595.154464] [] ofd_destroy_by_fid+0x1dd/0x4a0 [ofd] [Thu Dec 12 01:56:55 2019][243595.161125] [] ofd_destroy_hdl+0x267/0x970 [ofd] [Thu Dec 12 01:56:55 2019][243595.167565] [] tgt_request_handle+0xaea/0x1580 [ptlrpc] [Thu Dec 12 01:56:55 2019][243595.174608] [] ptlrpc_server_handle_request+0x24b/0xab0 [ptlrpc] [Thu Dec 12 01:56:55 2019][243595.182430] [] ptlrpc_main+0xb2c/0x1460 [ptlrpc] [Thu Dec 12 01:56:55 2019][243595.188847] [] kthread+0xd1/0xe0 [Thu Dec 12 01:56:55 2019][243595.193847] [] ret_from_fork_nospec_begin+0xe/0x21 [Thu Dec 12 01:56:55 2019][243595.200415] [] 0xffffffffffffffff [Thu Dec 12 01:56:55 2019][243595.205522] LustreError: dumping log to /tmp/lustre-log.1576144615.66114 [Thu Dec 12 01:56:55 2019][243595.213232] LNet: Service thread pid 67660 was inactive for 1202.61s. Watchdog stack traces are limited to 3 per 300 seconds, skipping this one. [Thu Dec 12 01:56:55 2019][243595.226294] LNet: Skipped 9 previous similar messages [Thu Dec 12 01:57:24 2019][243623.799598] LustreError: dumping log to /tmp/lustre-log.1576144644.67655 [Thu Dec 12 01:57:26 2019][243625.898880] Lustre: fir-OST005e: deleting orphan objects from 0x1a80000402:3081992 to 0x1a80000402:3082017 [Thu Dec 12 01:57:28 2019][243627.895668] LustreError: dumping log to /tmp/lustre-log.1576144648.112554 [Thu Dec 12 01:57:32 2019][243631.991757] LustreError: dumping log to /tmp/lustre-log.1576144652.67731 [Thu Dec 12 01:57:36 2019][243636.087837] LustreError: dumping log to /tmp/lustre-log.1576144656.67615 [Thu Dec 12 01:57:52 2019][243652.472166] LustreError: dumping log to /tmp/lustre-log.1576144672.112492 [Thu Dec 12 01:57:56 2019][243656.568278] LustreError: dumping log to /tmp/lustre-log.1576144676.67686 [Thu Dec 12 01:58:04 2019][243663.760132] Lustre: fir-OST005a: deleting orphan objects from 0x1980000402:3089704 to 0x1980000402:3089729 [Thu Dec 12 01:58:33 2019][243692.946701] Lustre: fir-OST0058: deleting orphan objects from 0x1900000401:11797424 to 0x1900000401:11797441 [Thu Dec 12 01:58:36 2019][243696.104758] Lustre: fir-OST0058: deleting orphan objects from 0x1900000402:3085896 to 0x1900000402:3085921 [Thu Dec 12 01:59:35 2019][243754.874206] LustreError: dumping log to /tmp/lustre-log.1576144775.67740 [Thu Dec 12 02:00:03 2019][243783.546787] Pid: 67611, comm: ll_ost00_012 3.10.0-957.27.2.el7_lustre.pl2.x86_64 #1 SMP Thu Nov 7 15:26:16 PST 2019 [Thu Dec 12 02:00:03 2019][243783.557305] Call Trace: [Thu Dec 12 02:00:03 2019][243783.559881] [] wait_transaction_locked+0x85/0xd0 [jbd2] [Thu Dec 12 02:00:03 2019][243783.566885] [] add_transaction_credits+0x268/0x2f0 [jbd2] [Thu Dec 12 02:00:03 2019][243783.574083] [] start_this_handle+0x1a1/0x430 [jbd2] [Thu Dec 12 02:00:03 2019][243783.580736] [] jbd2__journal_start+0xf3/0x1f0 [jbd2] [Thu Dec 12 02:00:03 2019][243783.587484] [] __ldiskfs_journal_start_sb+0x69/0xe0 [ldiskfs] [Thu Dec 12 02:00:03 2019][243783.595010] [] ldiskfs_acquire_dquot+0x53/0xb0 [ldiskfs] [Thu Dec 12 02:00:03 2019][243783.602104] [] dqget+0x3fa/0x450 [Thu Dec 12 02:00:03 2019][243783.607109] [] dquot_get_dqblk+0x14/0x1f0 [Thu Dec 12 02:00:03 2019][243783.612882] [] osd_acct_index_lookup+0x235/0x480 [osd_ldiskfs] [Thu Dec 12 02:00:03 2019][243783.620498] [] lquotactl_slv+0x27d/0x9d0 [lquota] [Thu Dec 12 02:00:03 2019][243783.626975] [] ofd_quotactl+0x13c/0x380 [ofd] [Thu Dec 12 02:00:03 2019][243783.633129] [] 
tgt_request_handle+0xaea/0x1580 [ptlrpc] [Thu Dec 12 02:00:03 2019][243783.640173] [] ptlrpc_server_handle_request+0x24b/0xab0 [ptlrpc] [Thu Dec 12 02:00:03 2019][243783.647987] [] ptlrpc_main+0xb2c/0x1460 [ptlrpc] [Thu Dec 12 02:00:03 2019][243783.654404] [] kthread+0xd1/0xe0 [Thu Dec 12 02:00:04 2019][243783.659419] [] ret_from_fork_nospec_begin+0xe/0x21 [Thu Dec 12 02:00:04 2019][243783.665982] [] 0xffffffffffffffff [Thu Dec 12 02:00:04 2019][243783.671095] LustreError: dumping log to /tmp/lustre-log.1576144803.67611 [Thu Dec 12 02:00:31 2019][243811.012004] Lustre: fir-OST005e: deleting orphan objects from 0x1a80000401:1122155 to 0x1a80000401:1122177 [Thu Dec 12 02:00:40 2019][243820.411516] Pid: 67840, comm: ll_ost01_062 3.10.0-957.27.2.el7_lustre.pl2.x86_64 #1 SMP Thu Nov 7 15:26:16 PST 2019 [Thu Dec 12 02:00:40 2019][243820.422034] Call Trace: [Thu Dec 12 02:00:40 2019][243820.424595] [] ldlm_completion_ast+0x4e5/0x860 [ptlrpc] [Thu Dec 12 02:00:40 2019][243820.431626] [] ldlm_cli_enqueue_local+0x231/0x830 [ptlrpc] [Thu Dec 12 02:00:40 2019][243820.438923] [] ofd_destroy_by_fid+0x1dd/0x4a0 [ofd] [Thu Dec 12 02:00:40 2019][243820.445572] [] ofd_destroy_hdl+0x267/0x970 [ofd] [Thu Dec 12 02:00:40 2019][243820.451973] [] tgt_request_handle+0xaea/0x1580 [ptlrpc] [Thu Dec 12 02:00:40 2019][243820.459004] [] ptlrpc_server_handle_request+0x24b/0xab0 [ptlrpc] [Thu Dec 12 02:00:40 2019][243820.466820] [] ptlrpc_main+0xb2c/0x1460 [ptlrpc] [Thu Dec 12 02:00:40 2019][243820.473235] [] kthread+0xd1/0xe0 [Thu Dec 12 02:00:40 2019][243820.478238] [] ret_from_fork_nospec_begin+0xe/0x21 [Thu Dec 12 02:00:40 2019][243820.484822] [] 0xffffffffffffffff [Thu Dec 12 02:00:40 2019][243820.489930] LustreError: dumping log to /tmp/lustre-log.1576144840.67840 [Thu Dec 12 02:01:17 2019][243857.276255] Pid: 67826, comm: ll_ost02_048 3.10.0-957.27.2.el7_lustre.pl2.x86_64 #1 SMP Thu Nov 7 15:26:16 PST 2019 [Thu Dec 12 02:01:17 2019][243857.286773] Call Trace: [Thu Dec 12 02:01:17 2019][243857.289325] [] ldlm_completion_ast+0x4e5/0x860 [ptlrpc] [Thu Dec 12 02:01:17 2019][243857.296365] [] ldlm_cli_enqueue_local+0x231/0x830 [ptlrpc] [Thu Dec 12 02:01:17 2019][243857.303662] [] ofd_destroy_by_fid+0x1dd/0x4a0 [ofd] [Thu Dec 12 02:01:17 2019][243857.310309] [] ofd_destroy_hdl+0x267/0x970 [ofd] [Thu Dec 12 02:01:17 2019][243857.316714] [] tgt_request_handle+0xaea/0x1580 [ptlrpc] [Thu Dec 12 02:01:17 2019][243857.323753] [] ptlrpc_server_handle_request+0x24b/0xab0 [ptlrpc] [Thu Dec 12 02:01:17 2019][243857.331575] [] ptlrpc_main+0xb2c/0x1460 [ptlrpc] [Thu Dec 12 02:01:17 2019][243857.338028] [] kthread+0xd1/0xe0 [Thu Dec 12 02:01:17 2019][243857.343045] [] ret_from_fork_nospec_begin+0xe/0x21 [Thu Dec 12 02:01:17 2019][243857.349638] [] 0xffffffffffffffff [Thu Dec 12 02:01:17 2019][243857.354764] LustreError: dumping log to /tmp/lustre-log.1576144877.67826 [Thu Dec 12 02:01:26 2019][243866.349098] Lustre: fir-OST005a: deleting orphan objects from 0x1980000400:1125734 to 0x1980000400:1125761 [Thu Dec 12 02:02:05 2019][243905.236943] Lustre: fir-OST0054: deleting orphan objects from 0x1800000400:3070636 to 0x1800000400:3070657 [Thu Dec 12 02:02:17 2019][243916.958121] Lustre: fir-OST0058: deleting orphan objects from 0x1900000400:1124391 to 0x1900000400:1124417 [Thu Dec 12 02:02:19 2019][243918.717473] Pid: 112489, comm: ll_ost00_070 3.10.0-957.27.2.el7_lustre.pl2.x86_64 #1 SMP Thu Nov 7 15:26:16 PST 2019 [Thu Dec 12 02:02:19 2019][243918.728081] Call Trace: [Thu Dec 12 02:02:19 2019][243918.730648] [] 
wait_transaction_locked+0x85/0xd0 [jbd2] [Thu Dec 12 02:02:19 2019][243918.737656] [] add_transaction_credits+0x268/0x2f0 [jbd2] [Thu Dec 12 02:02:19 2019][243918.744856] [] start_this_handle+0x1a1/0x430 [jbd2] [Thu Dec 12 02:02:19 2019][243918.751506] [] jbd2__journal_start+0xf3/0x1f0 [jbd2] [Thu Dec 12 02:02:19 2019][243918.758254] [] __ldiskfs_journal_start_sb+0x69/0xe0 [ldiskfs] [Thu Dec 12 02:02:19 2019][243918.765780] [] ldiskfs_acquire_dquot+0x53/0xb0 [ldiskfs] [Thu Dec 12 02:02:19 2019][243918.772876] [] dqget+0x3fa/0x450 [Thu Dec 12 02:02:19 2019][243918.777880] [] dquot_get_dqblk+0x14/0x1f0 [Thu Dec 12 02:02:19 2019][243918.783651] [] osd_acct_index_lookup+0x235/0x480 [osd_ldiskfs] [Thu Dec 12 02:02:19 2019][243918.791268] [] lquotactl_slv+0x27d/0x9d0 [lquota] [Thu Dec 12 02:02:19 2019][243918.797743] [] ofd_quotactl+0x13c/0x380 [ofd] [Thu Dec 12 02:02:19 2019][243918.803901] [] tgt_request_handle+0xaea/0x1580 [ptlrpc] [Thu Dec 12 02:02:19 2019][243918.810944] [] ptlrpc_server_handle_request+0x24b/0xab0 [ptlrpc] [Thu Dec 12 02:02:19 2019][243918.818742] [] ptlrpc_main+0xb2c/0x1460 [ptlrpc] [Thu Dec 12 02:02:19 2019][243918.825157] [] kthread+0xd1/0xe0 [Thu Dec 12 02:02:19 2019][243918.830172] [] ret_from_fork_nospec_begin+0xe/0x21 [Thu Dec 12 02:02:19 2019][243918.836737] [] 0xffffffffffffffff [Thu Dec 12 02:02:19 2019][243918.841847] LustreError: dumping log to /tmp/lustre-log.1576144939.112489 [Thu Dec 12 02:02:25 2019][243924.942253] Lustre: fir-OST0054: deleting orphan objects from 0x1800000402:1117450 to 0x1800000402:1117473 [Thu Dec 12 02:02:31 2019][243931.005736] Pid: 67695, comm: ll_ost01_037 3.10.0-957.27.2.el7_lustre.pl2.x86_64 #1 SMP Thu Nov 7 15:26:16 PST 2019 [Thu Dec 12 02:02:31 2019][243931.016255] Call Trace: [Thu Dec 12 02:02:31 2019][243931.018813] [] ldlm_completion_ast+0x4e5/0x860 [ptlrpc] [Thu Dec 12 02:02:31 2019][243931.025846] [] ldlm_cli_enqueue_local+0x231/0x830 [ptlrpc] [Thu Dec 12 02:02:31 2019][243931.033145] [] ofd_destroy_by_fid+0x1dd/0x4a0 [ofd] [Thu Dec 12 02:02:31 2019][243931.039790] [] ofd_destroy_hdl+0x267/0x970 [ofd] [Thu Dec 12 02:02:31 2019][243931.046192] [] tgt_request_handle+0xaea/0x1580 [ptlrpc] [Thu Dec 12 02:02:31 2019][243931.053233] [] ptlrpc_server_handle_request+0x24b/0xab0 [ptlrpc] [Thu Dec 12 02:02:31 2019][243931.061048] [] ptlrpc_main+0xb2c/0x1460 [ptlrpc] [Thu Dec 12 02:02:31 2019][243931.067465] [] kthread+0xd1/0xe0 [Thu Dec 12 02:02:31 2019][243931.072465] [] ret_from_fork_nospec_begin+0xe/0x21 [Thu Dec 12 02:02:31 2019][243931.079042] [] 0xffffffffffffffff [Thu Dec 12 02:02:31 2019][243931.084150] LustreError: dumping log to /tmp/lustre-log.1576144951.67695 [Thu Dec 12 02:02:41 2019][243940.790925] LustreError: 67642:0:(ldlm_request.c:129:ldlm_expired_completion_wait()) ### lock timed out (enqueued at 1576144661, 300s ago); not entering recovery in server code, just going back to sleep ns: filter-fir-OST005e_UUID lock: ffff890020f4f500/0x7066c9c190b32db0 lrc: 3/0,1 mode: --/PW res: [0x1a80000401:0x111f69:0x0].0x0 rrc: 3 type: EXT [0->18446744073709551615] (req 0->18446744073709551615) flags: 0x40010080000000 nid: local remote: 0x0 expref: -99 pid: 67642 timeout: 0 lvb_type: 0 [Thu Dec 12 02:02:41 2019][243940.834649] LustreError: 67642:0:(ldlm_request.c:129:ldlm_expired_completion_wait()) Skipped 4 previous similar messages [Thu Dec 12 02:02:43 2019][243943.293967] LustreError: dumping log to /tmp/lustre-log.1576144963.67216 [Thu Dec 12 02:03:04 2019][243963.774379] LustreError: dumping log to 
/tmp/lustre-log.1576144984.67405 [Thu Dec 12 02:03:28 2019][243988.350855] LustreError: dumping log to /tmp/lustre-log.1576145008.67627 [Thu Dec 12 02:04:00 2019][244019.723529] Lustre: fir-OST0054: Client fe16bc49-4bbe-dc30-a069-fee92bf3e984 (at 10.9.104.23@o2ib4) reconnecting [Thu Dec 12 02:04:00 2019][244019.733802] Lustre: Skipped 743 previous similar messages [Thu Dec 12 02:04:04 2019][244023.943208] Lustre: fir-OST005e: deleting orphan objects from 0x1a80000400:11800494 to 0x1a80000400:11800513 [Thu Dec 12 02:04:05 2019][244025.215578] LustreError: dumping log to /tmp/lustre-log.1576145045.67589 [Thu Dec 12 02:04:33 2019][244052.833348] Lustre: fir-OST005e: deleting orphan objects from 0x0:27467461 to 0x0:27467489 [Thu Dec 12 02:04:35 2019][244055.592844] Lustre: fir-OST005c: deleting orphan objects from 0x1a00000402:1105448 to 0x1a00000402:1105473 [Thu Dec 12 02:04:53 2019][244073.568554] Lustre: 27252:0:(service.c:1372:ptlrpc_at_send_early_reply()) @@@ Couldn't add any time (5/5), not sending early reply [Thu Dec 12 02:04:53 2019][244073.568554] req@ffff890d6ff65050 x1650929716577312/t0(0) o10->68d8bbaa-aab1-a78f-323f-aaff4c375c30@10.8.7.4@o2ib6:303/0 lens 440/0 e 0 to 0 dl 1576145098 ref 2 fl New:/2/ffffffff rc 0/-1 [Thu Dec 12 02:04:53 2019][244073.597191] Lustre: 27252:0:(service.c:1372:ptlrpc_at_send_early_reply()) Skipped 1551 previous similar messages [Thu Dec 12 02:04:54 2019][244074.368546] LustreError: dumping log to /tmp/lustre-log.1576145094.112526 [Thu Dec 12 02:05:16 2019][244096.361888] Lustre: fir-OST005c: Connection restored to cec884d3-ca4b-8127-2f6b-7762665aa5f8 (at 10.9.0.64@o2ib4) [Thu Dec 12 02:05:16 2019][244096.372243] Lustre: Skipped 765 previous similar messages [Thu Dec 12 02:05:19 2019][244098.945044] LNet: Service thread pid 112509 was inactive for 1204.01s. The thread might be hung, or it might only be slow and will resume later. 
Dumping the stack trace for debugging purposes: [Thu Dec 12 02:05:19 2019][244098.962237] LNet: Skipped 9 previous similar messages [Thu Dec 12 02:05:19 2019][244098.967385] Pid: 112509, comm: ll_ost00_081 3.10.0-957.27.2.el7_lustre.pl2.x86_64 #1 SMP Thu Nov 7 15:26:16 PST 2019 [Thu Dec 12 02:05:19 2019][244098.978009] Call Trace: [Thu Dec 12 02:05:19 2019][244098.980586] [] wait_transaction_locked+0x85/0xd0 [jbd2] [Thu Dec 12 02:05:19 2019][244098.987583] [] add_transaction_credits+0x268/0x2f0 [jbd2] [Thu Dec 12 02:05:19 2019][244098.994751] [] start_this_handle+0x1a1/0x430 [jbd2] [Thu Dec 12 02:05:19 2019][244099.001416] [] jbd2__journal_start+0xf3/0x1f0 [jbd2] [Thu Dec 12 02:05:19 2019][244099.008150] [] __ldiskfs_journal_start_sb+0x69/0xe0 [ldiskfs] [Thu Dec 12 02:05:19 2019][244099.015704] [] ldiskfs_acquire_dquot+0x53/0xb0 [ldiskfs] [Thu Dec 12 02:05:19 2019][244099.022806] [] dqget+0x3fa/0x450 [Thu Dec 12 02:05:19 2019][244099.027820] [] dquot_get_dqblk+0x14/0x1f0 [Thu Dec 12 02:05:19 2019][244099.033606] [] osd_acct_index_lookup+0x235/0x480 [osd_ldiskfs] [Thu Dec 12 02:05:19 2019][244099.041229] [] lquotactl_slv+0x27d/0x9d0 [lquota] [Thu Dec 12 02:05:19 2019][244099.047704] [] ofd_quotactl+0x13c/0x380 [ofd] [Thu Dec 12 02:05:19 2019][244099.053847] [] tgt_request_handle+0xaea/0x1580 [ptlrpc] [Thu Dec 12 02:05:19 2019][244099.060887] [] ptlrpc_server_handle_request+0x24b/0xab0 [ptlrpc] [Thu Dec 12 02:05:19 2019][244099.068712] [] ptlrpc_main+0xb2c/0x1460 [ptlrpc] [Thu Dec 12 02:05:19 2019][244099.075128] [] kthread+0xd1/0xe0 [Thu Dec 12 02:05:19 2019][244099.080129] [] ret_from_fork_nospec_begin+0xe/0x21 [Thu Dec 12 02:05:19 2019][244099.086719] [] 0xffffffffffffffff [Thu Dec 12 02:05:19 2019][244099.091815] LustreError: dumping log to /tmp/lustre-log.1576145119.112509 [Thu Dec 12 02:05:21 2019][244101.511858] Lustre: fir-OST0056: Export ffff88fc8fff5800 already connecting from 10.8.22.17@o2ib6 [Thu Dec 12 02:05:21 2019][244101.520826] Lustre: Skipped 71 previous similar messages [Thu Dec 12 02:05:43 2019][244123.521532] Pid: 67603, comm: ll_ost02_010 3.10.0-957.27.2.el7_lustre.pl2.x86_64 #1 SMP Thu Nov 7 15:26:16 PST 2019 [Thu Dec 12 02:05:43 2019][244123.532052] Call Trace: [Thu Dec 12 02:05:43 2019][244123.534604] [] ofd_create_hdl+0xcb3/0x20e0 [ofd] [Thu Dec 12 02:05:43 2019][244123.541002] [] tgt_request_handle+0xaea/0x1580 [ptlrpc] [Thu Dec 12 02:05:43 2019][244123.548066] [] ptlrpc_server_handle_request+0x24b/0xab0 [ptlrpc] [Thu Dec 12 02:05:43 2019][244123.555866] [] ptlrpc_main+0xb2c/0x1460 [ptlrpc] [Thu Dec 12 02:05:43 2019][244123.562296] [] kthread+0xd1/0xe0 [Thu Dec 12 02:05:43 2019][244123.567300] [] ret_from_fork_nospec_begin+0xe/0x21 [Thu Dec 12 02:05:43 2019][244123.573874] [] 0xffffffffffffffff [Thu Dec 12 02:05:43 2019][244123.578983] LustreError: dumping log to /tmp/lustre-log.1576145143.67603 [Thu Dec 12 02:05:47 2019][244127.617601] Pid: 67406, comm: ll_ost00_007 3.10.0-957.27.2.el7_lustre.pl2.x86_64 #1 SMP Thu Nov 7 15:26:16 PST 2019 [Thu Dec 12 02:05:47 2019][244127.628123] Call Trace: [Thu Dec 12 02:05:47 2019][244127.630693] [] wait_transaction_locked+0x85/0xd0 [jbd2] [Thu Dec 12 02:05:47 2019][244127.637692] [] add_transaction_credits+0x268/0x2f0 [jbd2] [Thu Dec 12 02:05:47 2019][244127.644875] [] start_this_handle+0x1a1/0x430 [jbd2] [Thu Dec 12 02:05:47 2019][244127.651524] [] jbd2__journal_start+0xf3/0x1f0 [jbd2] [Thu Dec 12 02:05:47 2019][244127.658287] [] __ldiskfs_journal_start_sb+0x69/0xe0 [ldiskfs] [Thu Dec 12 02:05:48 2019][244127.665814] [] 
ldiskfs_acquire_dquot+0x53/0xb0 [ldiskfs] [Thu Dec 12 02:05:48 2019][244127.672911] [] dqget+0x3fa/0x450 [Thu Dec 12 02:05:48 2019][244127.677913] [] dquot_get_dqblk+0x14/0x1f0 [Thu Dec 12 02:05:48 2019][244127.683701] [] osd_acct_index_lookup+0x235/0x480 [osd_ldiskfs] [Thu Dec 12 02:05:48 2019][244127.691311] [] lquotactl_slv+0x27d/0x9d0 [lquota] [Thu Dec 12 02:05:48 2019][244127.697801] [] ofd_quotactl+0x13c/0x380 [ofd] [Thu Dec 12 02:05:48 2019][244127.703938] [] tgt_request_handle+0xaea/0x1580 [ptlrpc] [Thu Dec 12 02:05:48 2019][244127.710979] [] ptlrpc_server_handle_request+0x24b/0xab0 [ptlrpc] [Thu Dec 12 02:05:48 2019][244127.718792] [] ptlrpc_main+0xb2c/0x1460 [ptlrpc] [Thu Dec 12 02:05:48 2019][244127.725227] [] kthread+0xd1/0xe0 [Thu Dec 12 02:05:48 2019][244127.730244] [] ret_from_fork_nospec_begin+0xe/0x21 [Thu Dec 12 02:05:48 2019][244127.736806] [] 0xffffffffffffffff [Thu Dec 12 02:05:48 2019][244127.741919] LustreError: dumping log to /tmp/lustre-log.1576145148.67406 [Thu Dec 12 02:06:25 2019][244165.564579] Lustre: fir-OST005c: deleting orphan objects from 0x0:27192451 to 0x0:27192481 [Thu Dec 12 02:07:13 2019][244213.564536] Lustre: fir-OST0058: deleting orphan objects from 0x0:27506755 to 0x0:27506785 [Thu Dec 12 02:07:18 2019][244217.731400] Pid: 67608, comm: ll_ost01_011 3.10.0-957.27.2.el7_lustre.pl2.x86_64 #1 SMP Thu Nov 7 15:26:16 PST 2019 [Thu Dec 12 02:07:18 2019][244217.741917] Call Trace: [Thu Dec 12 02:07:18 2019][244217.744478] [] ldlm_completion_ast+0x4e5/0x860 [ptlrpc] [Thu Dec 12 02:07:18 2019][244217.751519] [] ldlm_cli_enqueue_local+0x231/0x830 [ptlrpc] [Thu Dec 12 02:07:18 2019][244217.758816] [] ofd_destroy_by_fid+0x1dd/0x4a0 [ofd] [Thu Dec 12 02:07:18 2019][244217.765472] [] ofd_destroy_hdl+0x267/0x970 [ofd] [Thu Dec 12 02:07:18 2019][244217.771875] [] tgt_request_handle+0xaea/0x1580 [ptlrpc] [Thu Dec 12 02:07:18 2019][244217.778905] [] ptlrpc_server_handle_request+0x24b/0xab0 [ptlrpc] [Thu Dec 12 02:07:18 2019][244217.786718] [] ptlrpc_main+0xb2c/0x1460 [ptlrpc] [Thu Dec 12 02:07:18 2019][244217.793136] [] kthread+0xd1/0xe0 [Thu Dec 12 02:07:18 2019][244217.798135] [] ret_from_fork_nospec_begin+0xe/0x21 [Thu Dec 12 02:07:18 2019][244217.804706] [] 0xffffffffffffffff [Thu Dec 12 02:07:18 2019][244217.809824] LustreError: dumping log to /tmp/lustre-log.1576145238.67608 [Thu Dec 12 02:07:18 2019][244217.817430] Pid: 112565, comm: ll_ost03_078 3.10.0-957.27.2.el7_lustre.pl2.x86_64 #1 SMP Thu Nov 7 15:26:16 PST 2019 [Thu Dec 12 02:07:18 2019][244217.828059] Call Trace: [Thu Dec 12 02:07:18 2019][244217.830606] [] ofd_create_hdl+0xcb3/0x20e0 [ofd] [Thu Dec 12 02:07:18 2019][244217.836998] [] tgt_request_handle+0xaea/0x1580 [ptlrpc] [Thu Dec 12 02:07:18 2019][244217.844034] [] ptlrpc_server_handle_request+0x24b/0xab0 [ptlrpc] [Thu Dec 12 02:07:18 2019][244217.851836] [] ptlrpc_main+0xb2c/0x1460 [ptlrpc] [Thu Dec 12 02:07:18 2019][244217.858256] [] kthread+0xd1/0xe0 [Thu Dec 12 02:07:18 2019][244217.863250] [] ret_from_fork_nospec_begin+0xe/0x21 [Thu Dec 12 02:07:18 2019][244217.869803] [] 0xffffffffffffffff [Thu Dec 12 02:07:27 2019][244226.910827] Lustre: fir-OST005e: deleting orphan objects from 0x1a80000402:3082024 to 0x1a80000402:3082049 [Thu Dec 12 02:07:30 2019][244230.019650] LNet: Service thread pid 67899 was inactive for 1203.17s. Watchdog stack traces are limited to 3 per 300 seconds, skipping this one. 
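The "LustreError: dumping log to /tmp/lustre-log.<epoch>.<pid>" lines embed a Unix timestamp in the file name (e.g. 1576145238). A minimal sketch for converting each dump to wall-clock time, so the debug dumps can be lined up with the console timestamps, follows; console.log is again a placeholder path.

#!/usr/bin/env python3
# Sketch: list the Lustre debug dumps with their embedded timestamps converted
# to local time. "console.log" is a placeholder path.
import re
from datetime import datetime

dump_re = re.compile(r"dumping log to (/tmp/lustre-log\.(\d+)\.(\d+))")

with open("console.log") as f:
    for line in f:
        for path, epoch, pid in dump_re.findall(line):
            when = datetime.fromtimestamp(int(epoch))
            print(f"{when:%Y-%m-%d %H:%M:%S}  pid {pid}  {path}")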
[Thu Dec 12 02:07:30 2019][244230.032681] LNet: Skipped 15 previous similar messages [Thu Dec 12 02:07:30 2019][244230.037916] LustreError: dumping log to /tmp/lustre-log.1576145250.67899 [Thu Dec 12 02:07:45 2019][244245.307605] Lustre: fir-OST0054: deleting orphan objects from 0x1800000401:11785353 to 0x1800000401:11785377 [Thu Dec 12 02:07:49 2019][244249.307760] Lustre: fir-OST005c: deleting orphan objects from 0x1a00000401:3048936 to 0x1a00000401:3048961 [Thu Dec 12 02:07:54 2019][244254.596126] LustreError: dumping log to /tmp/lustre-log.1576145274.67867 [Thu Dec 12 02:08:05 2019][244264.732052] Lustre: fir-OST005a: deleting orphan objects from 0x1980000402:3089736 to 0x1980000402:3089761 [Thu Dec 12 02:09:30 2019][244350.109693] Lustre: fir-OST005a: deleting orphan objects from 0x1980000401:11842123 to 0x1980000401:11842145 [Thu Dec 12 02:10:06 2019][244385.670715] LustreError: dumping log to /tmp/lustre-log.1576145405.112513 [Thu Dec 12 02:10:30 2019][244410.247206] Pid: 112497, comm: ll_ost00_077 3.10.0-957.27.2.el7_lustre.pl2.x86_64 #1 SMP Thu Nov 7 15:26:16 PST 2019 [Thu Dec 12 02:10:30 2019][244410.257812] Call Trace: [Thu Dec 12 02:10:30 2019][244410.260387] [] wait_transaction_locked+0x85/0xd0 [jbd2] [Thu Dec 12 02:10:30 2019][244410.267386] [] add_transaction_credits+0x268/0x2f0 [jbd2] [Thu Dec 12 02:10:30 2019][244410.274573] [] start_this_handle+0x1a1/0x430 [jbd2] [Thu Dec 12 02:10:30 2019][244410.281219] [] jbd2__journal_start+0xf3/0x1f0 [jbd2] [Thu Dec 12 02:10:30 2019][244410.287967] [] __ldiskfs_journal_start_sb+0x69/0xe0 [ldiskfs] [Thu Dec 12 02:10:30 2019][244410.295494] [] ldiskfs_acquire_dquot+0x53/0xb0 [ldiskfs] [Thu Dec 12 02:10:30 2019][244410.302591] [] dqget+0x3fa/0x450 [Thu Dec 12 02:10:30 2019][244410.307609] [] dquot_get_dqblk+0x14/0x1f0 [Thu Dec 12 02:10:30 2019][244410.313382] [] osd_acct_index_lookup+0x235/0x480 [osd_ldiskfs] [Thu Dec 12 02:10:30 2019][244410.321008] [] lquotactl_slv+0x27d/0x9d0 [lquota] [Thu Dec 12 02:10:30 2019][244410.327483] [] ofd_quotactl+0x13c/0x380 [ofd] [Thu Dec 12 02:10:30 2019][244410.333625] [] tgt_request_handle+0xaea/0x1580 [ptlrpc] [Thu Dec 12 02:10:30 2019][244410.340664] [] ptlrpc_server_handle_request+0x24b/0xab0 [ptlrpc] [Thu Dec 12 02:10:30 2019][244410.348481] [] ptlrpc_main+0xb2c/0x1460 [ptlrpc] [Thu Dec 12 02:10:30 2019][244410.354894] [] kthread+0xd1/0xe0 [Thu Dec 12 02:10:30 2019][244410.359914] [] ret_from_fork_nospec_begin+0xe/0x21 [Thu Dec 12 02:10:30 2019][244410.366475] [] 0xffffffffffffffff [Thu Dec 12 02:10:30 2019][244410.371602] LustreError: dumping log to /tmp/lustre-log.1576145430.112497 [Thu Dec 12 02:10:30 2019][244410.379116] Pid: 112490, comm: ll_ost00_071 3.10.0-957.27.2.el7_lustre.pl2.x86_64 #1 SMP Thu Nov 7 15:26:16 PST 2019 [Thu Dec 12 02:10:30 2019][244410.389754] Call Trace: [Thu Dec 12 02:10:30 2019][244410.392302] [] wait_transaction_locked+0x85/0xd0 [jbd2] [Thu Dec 12 02:10:30 2019][244410.399311] [] add_transaction_credits+0x268/0x2f0 [jbd2] [Thu Dec 12 02:10:30 2019][244410.406480] [] start_this_handle+0x1a1/0x430 [jbd2] [Thu Dec 12 02:10:30 2019][244410.413144] [] jbd2__journal_start+0xf3/0x1f0 [jbd2] [Thu Dec 12 02:10:30 2019][244410.419882] [] __ldiskfs_journal_start_sb+0x69/0xe0 [ldiskfs] [Thu Dec 12 02:10:30 2019][244410.427418] [] ldiskfs_acquire_dquot+0x53/0xb0 [ldiskfs] [Thu Dec 12 02:10:30 2019][244410.434502] [] dqget+0x3fa/0x450 [Thu Dec 12 02:10:30 2019][244410.439529] [] dquot_get_dqblk+0x14/0x1f0 [Thu Dec 12 02:10:30 2019][244410.445310] [] osd_acct_index_lookup+0x235/0x480 
[Thu Dec 12 02:10:30 2019][244410.452923] [] lquotactl_slv+0x27d/0x9d0 [lquota]
[Thu Dec 12 02:10:30 2019][244410.459400] [] ofd_quotactl+0x13c/0x380 [ofd]
[Thu Dec 12 02:10:30 2019][244410.465544] [] tgt_request_handle+0xaea/0x1580 [ptlrpc]
[Thu Dec 12 02:10:30 2019][244410.472565] [] ptlrpc_server_handle_request+0x24b/0xab0 [ptlrpc]
[Thu Dec 12 02:10:30 2019][244410.480381] [] ptlrpc_main+0xb2c/0x1460 [ptlrpc]
[Thu Dec 12 02:10:30 2019][244410.486798] [] kthread+0xd1/0xe0
[Thu Dec 12 02:10:30 2019][244410.491802] [] ret_from_fork_nospec_begin+0xe/0x21
[Thu Dec 12 02:10:30 2019][244410.498358] [] 0xffffffffffffffff
[Thu Dec 12 02:11:09 2019][244449.103660] Lustre: fir-OST0058: deleting orphan objects from 0x1900000401:11797453 to 0x1900000401:11797473
[Thu Dec 12 02:11:12 2019][244452.007748] Lustre: fir-OST0058: deleting orphan objects from 0x1900000402:3085933 to 0x1900000402:3085953
[Thu Dec 12 02:11:27 2019][244466.964684] Lustre: fir-OST005a: deleting orphan objects from 0x1980000400:1125767 to 0x1980000400:1125793
[Thu Dec 12 02:11:27 2019][244467.592340] Pid: 113354, comm: ll_ost02_092 3.10.0-957.27.2.el7_lustre.pl2.x86_64 #1 SMP Thu Nov 7 15:26:16 PST 2019
[Thu Dec 12 02:11:27 2019][244467.602943] Call Trace:
[Thu Dec 12 02:11:27 2019][244467.605494] [] ldlm_completion_ast+0x4e5/0x860 [ptlrpc]
[Thu Dec 12 02:11:27 2019][244467.612527] [] ldlm_cli_enqueue_local+0x231/0x830 [ptlrpc]
[Thu Dec 12 02:11:27 2019][244467.619814] [] ofd_destroy_by_fid+0x1dd/0x4a0 [ofd]
[Thu Dec 12 02:11:27 2019][244467.626480] [] ofd_destroy_hdl+0x267/0x970 [ofd]
[Thu Dec 12 02:11:27 2019][244467.632881] [] tgt_request_handle+0xaea/0x1580 [ptlrpc]
[Thu Dec 12 02:11:27 2019][244467.639913] [] ptlrpc_server_handle_request+0x24b/0xab0 [ptlrpc]
[Thu Dec 12 02:11:27 2019][244467.647729] [] ptlrpc_main+0xb2c/0x1460 [ptlrpc]
[Thu Dec 12 02:11:27 2019][244467.654147] [] kthread+0xd1/0xe0
[Thu Dec 12 02:11:28 2019][244467.659148] [] ret_from_fork_nospec_begin+0xe/0x21
[Thu Dec 12 02:11:28 2019][244467.665723] [] 0xffffffffffffffff
[Thu Dec 12 02:11:28 2019][244467.670832] LustreError: dumping log to /tmp/lustre-log.1576145487.113354
[Thu Dec 12 02:11:48 2019][244488.072775] Pid: 112524, comm: ll_ost03_068 3.10.0-957.27.2.el7_lustre.pl2.x86_64 #1 SMP Thu Nov 7 15:26:16 PST 2019
[Thu Dec 12 02:11:48 2019][244488.083381] Call Trace:
[Thu Dec 12 02:11:48 2019][244488.085941] [] ldlm_completion_ast+0x4e5/0x860 [ptlrpc]
[Thu Dec 12 02:11:48 2019][244488.092980] [] ldlm_cli_enqueue_local+0x231/0x830 [ptlrpc]
[Thu Dec 12 02:11:48 2019][244488.100279] [] ofd_destroy_by_fid+0x1dd/0x4a0 [ofd]
[Thu Dec 12 02:11:48 2019][244488.106927] [] ofd_destroy_hdl+0x267/0x970 [ofd]
[Thu Dec 12 02:11:48 2019][244488.113332] [] tgt_request_handle+0xaea/0x1580 [ptlrpc]
[Thu Dec 12 02:11:48 2019][244488.120370] [] ptlrpc_server_handle_request+0x24b/0xab0 [ptlrpc]
[Thu Dec 12 02:11:48 2019][244488.128185] [] ptlrpc_main+0xb2c/0x1460 [ptlrpc]
[Thu Dec 12 02:11:48 2019][244488.134599] [] kthread+0xd1/0xe0
[Thu Dec 12 02:11:48 2019][244488.139601] [] ret_from_fork_nospec_begin+0xe/0x21
[Thu Dec 12 02:11:48 2019][244488.146177] [] 0xffffffffffffffff
[Thu Dec 12 02:11:48 2019][244488.151286] LustreError: dumping log to /tmp/lustre-log.1576145508.112524
[Thu Dec 12 02:12:13 2019][244512.649214] Pid: 67654, comm: ll_ost02_022 3.10.0-957.27.2.el7_lustre.pl2.x86_64 #1 SMP Thu Nov 7 15:26:16 PST 2019
[Thu Dec 12 02:12:13 2019][244512.659732] Call Trace:
[Thu Dec 12 02:12:13 2019][244512.662292] [] ofd_create_hdl+0xcb3/0x20e0 [ofd]
[Thu Dec 12 02:12:13 2019][244512.668691] [] tgt_request_handle+0xaea/0x1580 [ptlrpc]
[Thu Dec 12 02:12:13 2019][244512.675754] [] ptlrpc_server_handle_request+0x24b/0xab0 [ptlrpc]
[Thu Dec 12 02:12:13 2019][244512.683555] [] ptlrpc_main+0xb2c/0x1460 [ptlrpc]
[Thu Dec 12 02:12:13 2019][244512.689983] [] kthread+0xd1/0xe0
[Thu Dec 12 02:12:13 2019][244512.694996] [] ret_from_fork_nospec_begin+0xe/0x21
[Thu Dec 12 02:12:13 2019][244512.701584] [] 0xffffffffffffffff
[Thu Dec 12 02:12:13 2019][244512.706706] LustreError: dumping log to /tmp/lustre-log.1576145533.67654
[Thu Dec 12 02:12:17 2019][244516.745305] LustreError: dumping log to /tmp/lustre-log.1576145537.67815
[Thu Dec 12 02:12:41 2019][244541.321771] LustreError: dumping log to /tmp/lustre-log.1576145561.112563
[Thu Dec 12 02:14:01 2019][244621.267131] Lustre: fir-OST0056: Client 8c3ccd99-dc20-24d2-79f4-f8d2c0329cfb (at 10.8.24.19@o2ib6) reconnecting
[Thu Dec 12 02:14:01 2019][244621.277306] Lustre: Skipped 789 previous similar messages
[Thu Dec 12 02:14:15 2019][244635.401272] Lustre: fir-OST005e: deleting orphan objects from 0x1a80000401:1122183 to 0x1a80000401:1122209
[Thu Dec 12 02:14:36 2019][244656.004751] Lustre: fir-OST005c: deleting orphan objects from 0x1a00000402:1105448 to 0x1a00000402:1105505
[Thu Dec 12 02:14:41 2019][244661.355914] Lustre: fir-OST0054: deleting orphan objects from 0x1800000400:3070661 to 0x1800000400:3070689
[Thu Dec 12 02:14:52 2019][244672.396368] LustreError: dumping log to /tmp/lustre-log.1576145692.67860
[Thu Dec 12 02:14:53 2019][244673.133159] Lustre: fir-OST0058: deleting orphan objects from 0x1900000400:1124428 to 0x1900000400:1124449
[Thu Dec 12 02:14:54 2019][244674.380427] Lustre: 27252:0:(service.c:1372:ptlrpc_at_send_early_reply()) @@@ Couldn't add any time (5/5), not sending early reply
[Thu Dec 12 02:14:54 2019][244674.380427] req@ffff890df37ee850 x1650929716577312/t0(0) o10->68d8bbaa-aab1-a78f-323f-aaff4c375c30@10.8.7.4@o2ib6:149/0 lens 440/0 e 0 to 0 dl 1576145699 ref 2 fl New:/2/ffffffff rc 0/-1
[Thu Dec 12 02:14:54 2019][244674.409087] Lustre: 27252:0:(service.c:1372:ptlrpc_at_send_early_reply()) Skipped 1589 previous similar messages
[Thu Dec 12 02:15:01 2019][244680.829213] Lustre: fir-OST0054: deleting orphan objects from 0x1800000402:1117480 to 0x1800000402:1117505
[Thu Dec 12 02:15:17 2019][244696.972864] LustreError: dumping log to /tmp/lustre-log.1576145717.112502
[Thu Dec 12 02:15:17 2019][244696.984894] Lustre: fir-OST005a: Connection restored to f5f4e1fb-09a1-cbb0-925c-f94d3727005b (at 10.9.101.44@o2ib4)
[Thu Dec 12 02:15:17 2019][244696.995421] Lustre: Skipped 794 previous similar messages
[Thu Dec 12 02:15:24 2019][244703.635682] Lustre: fir-OST0056: Export ffff88fc8fff5800 already connecting from 10.8.22.17@o2ib6
[Thu Dec 12 02:15:24 2019][244703.644647] Lustre: Skipped 71 previous similar messages
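Each "LustreError: dumping log to /tmp/lustre-log.EPOCH.PID" line above refers to a binary Lustre debug dump written on this OSS when a watchdog or timeout fired; those files are normally converted to readable text with "lctl debug_file INPUT OUTPUT" before analysis. The following is a minimal batch-decoding sketch, not part of the original log: it assumes the dumps are still present under /tmp on the server, that the lctl utility from the Lustre tools is installed, and the ".txt" suffix is purely illustrative.

#!/usr/bin/env python3
# Illustrative helper (not from the original log): decode the binary debug
# dumps referenced above using "lctl debug_file", which converts a binary
# Lustre debug log into ASCII text.
import glob
import subprocess

def decode_dumps(pattern="/tmp/lustre-log.*", suffix=".txt"):
    for path in sorted(glob.glob(pattern)):
        if path.endswith(suffix):
            continue  # skip files we already decoded
        out = path + suffix
        # lctl debug_file <input> <output> writes the text form of the dump.
        subprocess.run(["lctl", "debug_file", path, out], check=True)
        print("decoded", path, "->", out)

if __name__ == "__main__":
    decode_dumps()

The decoded text can then be correlated with the console timestamps above, since the dump filename embeds the epoch time and the PID of the stuck service thread.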
[Thu Dec 12 02:16:06 2019][244746.125835] LNet: Service thread pid 67779 was inactive for 1200.45s. The thread might be hung, or it might only be slow and will resume later. Dumping the stack trace for debugging purposes:
[Thu Dec 12 02:16:06 2019][244746.142959] LNet: Skipped 9 previous similar messages
[Thu Dec 12 02:16:06 2019][244746.148106] Pid: 67779, comm: ll_ost01_050 3.10.0-957.27.2.el7_lustre.pl2.x86_64 #1 SMP Thu Nov 7 15:26:16 PST 2019
[Thu Dec 12 02:16:06 2019][244746.158643] Call Trace:
[Thu Dec 12 02:16:06 2019][244746.161193] [] ldlm_completion_ast+0x4e5/0x860 [ptlrpc]
[Thu Dec 12 02:16:06 2019][244746.168227] [] ldlm_cli_enqueue_local+0x231/0x830 [ptlrpc]
[Thu Dec 12 02:16:06 2019][244746.175513] [] ofd_destroy_by_fid+0x1dd/0x4a0 [ofd]
[Thu Dec 12 02:16:06 2019][244746.182163] [] ofd_destroy_hdl+0x267/0x970 [ofd]
[Thu Dec 12 02:16:06 2019][244746.188565] [] tgt_request_handle+0xaea/0x1580 [ptlrpc]
[Thu Dec 12 02:16:06 2019][244746.195596] [] ptlrpc_server_handle_request+0x24b/0xab0 [ptlrpc]
[Thu Dec 12 02:16:06 2019][244746.203411] [] ptlrpc_main+0xb2c/0x1460 [ptlrpc]
[Thu Dec 12 02:16:06 2019][244746.209844] [] kthread+0xd1/0xe0
[Thu Dec 12 02:16:06 2019][244746.214859] [] ret_from_fork_nospec_begin+0xe/0x21
[Thu Dec 12 02:16:06 2019][244746.221423] [] 0xffffffffffffffff
[Thu Dec 12 02:16:06 2019][244746.226545] LustreError: dumping log to /tmp/lustre-log.1576145766.67779
[Thu Dec 12 02:16:06 2019][244746.234164] Pid: 112545, comm: ll_ost01_089 3.10.0-957.27.2.el7_lustre.pl2.x86_64 #1 SMP Thu Nov 7 15:26:16 PST 2019
[Thu Dec 12 02:16:06 2019][244746.244796] Call Trace:
[Thu Dec 12 02:16:06 2019][244746.247343] [] ofd_create_hdl+0xcb3/0x20e0 [ofd]
[Thu Dec 12 02:16:06 2019][244746.253733] [] tgt_request_handle+0xaea/0x1580 [ptlrpc]
[Thu Dec 12 02:16:06 2019][244746.260769] [] ptlrpc_server_handle_request+0x24b/0xab0 [ptlrpc]
[Thu Dec 12 02:16:06 2019][244746.268588] [] ptlrpc_main+0xb2c/0x1460 [ptlrpc]
[Thu Dec 12 02:16:06 2019][244746.275032] [] kthread+0xd1/0xe0
[Thu Dec 12 02:16:06 2019][244746.280038] [] ret_from_fork_nospec_begin+0xe/0x21
[Thu Dec 12 02:16:06 2019][244746.286604] [] 0xffffffffffffffff
[Thu Dec 12 02:16:26 2019][244766.606244] Pid: 66903, comm: ll_ost02_006 3.10.0-957.27.2.el7_lustre.pl2.x86_64 #1 SMP Thu Nov 7 15:26:16 PST 2019
[Thu Dec 12 02:16:26 2019][244766.616760] Call Trace:
[Thu Dec 12 02:16:26 2019][244766.619312] [] ldlm_completion_ast+0x4e5/0x860 [ptlrpc]
[Thu Dec 12 02:16:26 2019][244766.626355] [] ldlm_cli_enqueue_local+0x231/0x830 [ptlrpc]
[Thu Dec 12 02:16:26 2019][244766.633651] [] ofd_destroy_by_fid+0x1dd/0x4a0 [ofd]
[Thu Dec 12 02:16:26 2019][244766.640310] [] ofd_destroy_hdl+0x267/0x970 [ofd]
[Thu Dec 12 02:16:26 2019][244766.646712] [] tgt_request_handle+0xaea/0x1580 [ptlrpc]
[Thu Dec 12 02:16:26 2019][244766.653768] [] ptlrpc_server_handle_request+0x24b/0xab0 [ptlrpc]
[Thu Dec 12 02:16:26 2019][244766.661591] [] ptlrpc_main+0xb2c/0x1460 [ptlrpc]
[Thu Dec 12 02:16:26 2019][244766.668010] [] kthread+0xd1/0xe0
[Thu Dec 12 02:16:26 2019][244766.673009] [] ret_from_fork_nospec_begin+0xe/0x21
[Thu Dec 12 02:16:27 2019][244766.679578] [] 0xffffffffffffffff
[Thu Dec 12 02:16:27 2019][244766.684687] LustreError: dumping log to /tmp/lustre-log.1576145786.66903
[Thu Dec 12 02:16:40 2019][244779.766220] Lustre: fir-OST005e: deleting orphan objects from 0x1a80000400:11800522 to 0x1a80000400:11800545
[Thu Dec 12 02:16:56 2019][244796.436850] LustreError: 112547:0:(ldlm_request.c:129:ldlm_expired_completion_wait()) ### lock timed out (enqueued at 1576145516, 300s ago); not entering recovery in server code, just going back to sleep ns: filter-fir-OST005a_UUID lock: ffff88e9f0f4d340/0x7066c9c190b38bea lrc: 3/0,1 mode: --/PW res: [0x1a490d4:0x0:0x0].0x0 rrc: 3 type: EXT [0->18446744073709551615] (req 0->18446744073709551615) flags: 0x40010080000000 nid: local remote: 0x0 expref: -99 pid: 112547 timeout: 0 lvb_type: 0
[Thu Dec 12 02:16:56 2019][244796.480062] LustreError: 112547:0:(ldlm_request.c:129:ldlm_expired_completion_wait()) Skipped 9 previous similar messages
[Thu Dec 12 02:17:09 2019][244809.056343] Lustre: fir-OST005e: deleting orphan objects from 0x0:27467491 to 0x0:27467521
[Thu Dec 12 02:17:28 2019][244827.882738] Lustre: fir-OST005e: deleting orphan objects from 0x1a80000402:3082051 to 0x1a80000402:3082081
[Thu Dec 12 02:17:28 2019][244828.047456] Pid: 66384, comm: ll_ost00_004 3.10.0-957.27.2.el7_lustre.pl2.x86_64 #1 SMP Thu Nov 7 15:26:16 PST 2019
[Thu Dec 12 02:17:28 2019][244828.057974] Call Trace:
[Thu Dec 12 02:17:28 2019][244828.060544] [] wait_transaction_locked+0x85/0xd0 [jbd2]
[Thu Dec 12 02:17:28 2019][244828.067543] [] add_transaction_credits+0x268/0x2f0 [jbd2]
[Thu Dec 12 02:17:28 2019][244828.074733] [] start_this_handle+0x1a1/0x430 [jbd2]
[Thu Dec 12 02:17:28 2019][244828.081382] [] jbd2__journal_start+0xf3/0x1f0 [jbd2]
[Thu Dec 12 02:17:28 2019][244828.088118] [] __ldiskfs_journal_start_sb+0x69/0xe0 [ldiskfs]
[Thu Dec 12 02:17:28 2019][244828.095669] [] ldiskfs_acquire_dquot+0x53/0xb0 [ldiskfs]
[Thu Dec 12 02:17:28 2019][244828.102765] [] dqget+0x3fa/0x450
[Thu Dec 12 02:17:28 2019][244828.107779] [] dquot_get_dqblk+0x14/0x1f0
[Thu Dec 12 02:17:28 2019][244828.113553] [] osd_acct_index_lookup+0x235/0x480 [osd_ldiskfs]
[Thu Dec 12 02:17:28 2019][244828.121178] [] lquotactl_slv+0x27d/0x9d0 [lquota]
[Thu Dec 12 02:17:28 2019][244828.127663] [] ofd_quotactl+0x13c/0x380 [ofd]
[Thu Dec 12 02:17:28 2019][244828.133812] [] tgt_request_handle+0xaea/0x1580 [ptlrpc]
[Thu Dec 12 02:17:28 2019][244828.140859] [] ptlrpc_server_handle_request+0x24b/0xab0 [ptlrpc]
[Thu Dec 12 02:17:28 2019][244828.148676] [] ptlrpc_main+0xb2c/0x1460 [ptlrpc]
[Thu Dec 12 02:17:28 2019][244828.155122] [] kthread+0xd1/0xe0
[Thu Dec 12 02:17:28 2019][244828.160123] [] ret_from_fork_nospec_begin+0xe/0x21
[Thu Dec 12 02:17:28 2019][244828.166708] [] 0xffffffffffffffff
[Thu Dec 12 02:17:28 2019][244828.171807] LustreError: dumping log to /tmp/lustre-log.1576145848.66384
[Thu Dec 12 02:17:44 2019][244844.431793] Pid: 67642, comm: ll_ost01_020 3.10.0-957.27.2.el7_lustre.pl2.x86_64 #1 SMP Thu Nov 7 15:26:16 PST 2019
[Thu Dec 12 02:17:44 2019][244844.442314] Call Trace:
[Thu Dec 12 02:17:44 2019][244844.444868] [] ldlm_completion_ast+0x4e5/0x860 [ptlrpc]
[Thu Dec 12 02:17:44 2019][244844.451906] [] ldlm_cli_enqueue_local+0x231/0x830 [ptlrpc]
[Thu Dec 12 02:17:44 2019][244844.459202] [] ofd_destroy_by_fid+0x1dd/0x4a0 [ofd]
[Thu Dec 12 02:17:44 2019][244844.465852] [] ofd_destroy_hdl+0x267/0x970 [ofd]
[Thu Dec 12 02:17:44 2019][244844.472255] [] tgt_request_handle+0xaea/0x1580 [ptlrpc]
[Thu Dec 12 02:17:44 2019][244844.479309] [] ptlrpc_server_handle_request+0x24b/0xab0 [ptlrpc]
[Thu Dec 12 02:17:44 2019][244844.487144] [] ptlrpc_main+0xb2c/0x1460 [ptlrpc]
[Thu Dec 12 02:17:44 2019][244844.493577] [] kthread+0xd1/0xe0
[Thu Dec 12 02:17:44 2019][244844.498577] [] ret_from_fork_nospec_begin+0xe/0x21
[Thu Dec 12 02:17:44 2019][244844.505161] [] 0xffffffffffffffff
[Thu Dec 12 02:17:44 2019][244844.510274] LustreError: dumping log to /tmp/lustre-log.1576145864.67642
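The watchdog dumps in this stretch fall into two recurring signatures: OST service threads blocked in the quota path (ofd_quotactl through osd_acct_index_lookup down to jbd2 wait_transaction_locked, i.e. waiting on a journal transaction to open) and threads blocked in ldlm_completion_ast during ofd_destroy_hdl, i.e. waiting on an extent lock; the ldlm_expired_completion_wait message above confirms one such lock had been enqueued roughly 300 seconds earlier and still not granted. The sketch below, not part of the original log, tallies the dumped stack traces by a few telling frames so the dominant wait path is easy to see; the input filename is an assumption and should point at a saved copy of this console log.

#!/usr/bin/env python3
# Illustrative analysis (not from the original log): group the "Pid: N, comm: ..."
# watchdog stack dumps by a handful of distinguishing frames.
import re
from collections import Counter

PID_RE = re.compile(r"Pid: (\d+), comm: (\S+)")
MARKERS = ("wait_transaction_locked", "ldlm_completion_ast",
           "ofd_quotactl", "ofd_destroy_hdl", "ofd_create_hdl")

def classify(frames):
    # Signature = which marker frames appear anywhere in the trace.
    hits = tuple(m for m in MARKERS if any(m in f for f in frames))
    return hits or ("other",)

def tally(path="oss-console.log"):
    counts, frames, current = Counter(), [], None
    for line in open(path, errors="replace"):
        m = PID_RE.search(line)
        if m:
            if current:
                counts[classify(frames)] += 1
            current, frames = m.group(2), []
        elif current and "+0x" in line:
            frames.append(line)  # stack frame lines all carry "+0xoffset/size"
    if current:
        counts[classify(frames)] += 1
    for sig, n in counts.most_common():
        print(n, " / ".join(sig))

if __name__ == "__main__":
    tally()

Run against this section, a tally like that would show the jbd2/quota signature and the ldlm/destroy signature dominating, which is consistent with a single stalled journal on the backing ldiskfs device backing up every service thread behind it.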
[Thu Dec 12 02:17:52 2019][244852.623954] LNet: Service thread pid 112540 was inactive for 1201.63s. Watchdog stack traces are limited to 3 per 300 seconds, skipping this one.
[Thu Dec 12 02:17:52 2019][244852.637077] LNet: Skipped 6 previous similar messages
[Thu Dec 12 02:17:52 2019][244852.642225] LustreError: dumping log to /tmp/lustre-log.1576145872.112540
[Thu Dec 12 02:18:06 2019][244866.327961] Lustre: fir-OST005a: deleting orphan objects from 0x1980000402:3089765 to 0x1980000402:3089793
[Thu Dec 12 02:18:13 2019][244873.104370] LustreError: dumping log to /tmp/lustre-log.1576145893.67681
[Thu Dec 12 02:18:17 2019][244877.200447] LustreError: dumping log to /tmp/lustre-log.1576145897.67776
[Thu Dec 12 02:18:21 2019][244881.296515] LustreError: dumping log to /tmp/lustre-log.1576145901.67811
[Thu Dec 12 02:19:01 2019][244921.210547] Lustre: fir-OST005c: deleting orphan objects from 0x0:27192484 to 0x0:27192513
[Thu Dec 12 02:19:06 2019][244926.353407] LustreError: dumping log to /tmp/lustre-log.1576145946.67613
[Thu Dec 12 02:19:10 2019][244930.449508] LustreError: dumping log to /tmp/lustre-log.1576145950.67704
[Thu Dec 12 02:19:14 2019][244934.545583] LustreError: dumping log to /tmp/lustre-log.1576145954.67857
[Thu Dec 12 02:19:35 2019][244955.025978] LustreError: dumping log to /tmp/lustre-log.1576145975.67909
[Thu Dec 12 02:19:43 2019][244963.218142] LustreError: dumping log to /tmp/lustre-log.1576145983.67854
[Thu Dec 12 02:19:49 2019][244968.739568] Lustre: fir-OST0058: deleting orphan objects from 0x0:27506790 to 0x0:27506817
[Thu Dec 12 02:19:55 2019][244975.506392] LustreError: dumping log to /tmp/lustre-log.1576145995.67648
[Thu Dec 12 02:20:04 2019][244983.698547] LustreError: dumping log to /tmp/lustre-log.1576146003.67735
[Thu Dec 12 02:20:21 2019][245000.890591] Lustre: fir-OST0054: deleting orphan objects from 0x1800000401:11785385 to 0x1800000401:11785409
[Thu Dec 12 02:20:24 2019][245004.178952] LustreError: dumping log to /tmp/lustre-log.1576146024.67661
[Thu Dec 12 02:20:25 2019][245004.874736] Lustre: fir-OST005c: deleting orphan objects from 0x1a00000401:3048967 to 0x1a00000401:3048993
[Thu Dec 12 02:20:28 2019][245008.275039] LustreError: dumping log to /tmp/lustre-log.1576146028.67748
[Thu Dec 12 02:21:28 2019][245068.352651] Lustre: fir-OST005a: deleting orphan objects from 0x1980000400:1125800 to 0x1980000400:1125825
[Thu Dec 12 02:22:06 2019][245105.804728] Lustre: fir-OST005a: deleting orphan objects from 0x1980000401:11842158 to 0x1980000401:11842177
[Thu Dec 12 02:22:10 2019][245110.677064] Pid: 67149, comm: ll_ost01_007 3.10.0-957.27.2.el7_lustre.pl2.x86_64 #1 SMP Thu Nov 7 15:26:16 PST 2019
[Thu Dec 12 02:22:10 2019][245110.687584] Call Trace:
[Thu Dec 12 02:22:10 2019][245110.690153] [] wait_transaction_locked+0x85/0xd0 [jbd2]
[Thu Dec 12 02:22:10 2019][245110.697151] [] add_transaction_credits+0x268/0x2f0 [jbd2]
[Thu Dec 12 02:22:11 2019][245110.704337] [] start_this_handle+0x1a1/0x430 [jbd2]
[Thu Dec 12 02:22:11 2019][245110.710976] [] jbd2__journal_start+0xf3/0x1f0 [jbd2]
[Thu Dec 12 02:22:11 2019][245110.717701] [] __ldiskfs_journal_start_sb+0x69/0xe0 [ldiskfs]
[Thu Dec 12 02:22:11 2019][245110.725216] [] ldiskfs_acquire_dquot+0x53/0xb0 [ldiskfs]
[Thu Dec 12 02:22:11 2019][245110.732310] [] dqget+0x3fa/0x450
[Thu Dec 12 02:22:11 2019][245110.737330] [] dquot_get_dqblk+0x14/0x1f0
[Thu Dec 12 02:22:11 2019][245110.743115] [] osd_acct_index_lookup+0x235/0x480 [osd_ldiskfs]
[Thu Dec 12 02:22:11 2019][245110.750723] [] lquotactl_slv+0x27d/0x9d0 [lquota]
[Thu Dec 12 02:22:11 2019][245110.757209] [] ofd_quotactl+0x13c/0x380 [ofd]
[Thu Dec 12 02:22:11 2019][245110.763341] [] tgt_request_handle+0xaea/0x1580 [ptlrpc]
[Thu Dec 12 02:22:11 2019][245110.770400] [] ptlrpc_server_handle_request+0x24b/0xab0 [ptlrpc]
[Thu Dec 12 02:22:11 2019][245110.778206] [] ptlrpc_main+0xb2c/0x1460 [ptlrpc]
[Thu Dec 12 02:22:11 2019][245110.784631] [] kthread+0xd1/0xe0
[Thu Dec 12 02:22:11 2019][245110.789634] [] ret_from_fork_nospec_begin+0xe/0x21
[Thu Dec 12 02:22:11 2019][245110.796203] [] 0xffffffffffffffff
[Thu Dec 12 02:22:11 2019][245110.801317] LustreError: dumping log to /tmp/lustre-log.1576146131.67149
[Thu Dec 12 02:22:15 2019][245114.773156] Pid: 67601, comm: ll_ost00_009 3.10.0-957.27.2.el7_lustre.pl2.x86_64 #1 SMP Thu Nov 7 15:26:16 PST 2019
[Thu Dec 12 02:22:15 2019][245114.783692] Call Trace:
[Thu Dec 12 02:22:15 2019][245114.786264] [] wait_transaction_locked+0x85/0xd0 [jbd2]
[Thu Dec 12 02:22:15 2019][245114.793260] [] add_transaction_credits+0x268/0x2f0 [jbd2]
[Thu Dec 12 02:22:15 2019][245114.800445] [] start_this_handle+0x1a1/0x430 [jbd2]
[Thu Dec 12 02:22:15 2019][245114.807092] [] jbd2__journal_start+0xf3/0x1f0 [jbd2]
[Thu Dec 12 02:22:15 2019][245114.813841] [] __ldiskfs_journal_start_sb+0x69/0xe0 [ldiskfs]
[Thu Dec 12 02:22:15 2019][245114.821365] [] ldiskfs_acquire_dquot+0x53/0xb0 [ldiskfs]
[Thu Dec 12 02:22:15 2019][245114.828463] [] dqget+0x3fa/0x450
[Thu Dec 12 02:22:15 2019][245114.833481] [] dquot_get_dqblk+0x14/0x1f0
[Thu Dec 12 02:22:15 2019][245114.839271] [] osd_acct_index_lookup+0x235/0x480 [osd_ldiskfs]
[Thu Dec 12 02:22:15 2019][245114.846902] [] lquotactl_slv+0x27d/0x9d0 [lquota]
[Thu Dec 12 02:22:15 2019][245114.853390] [] ofd_quotactl+0x13c/0x380 [ofd]
[Thu Dec 12 02:22:15 2019][245114.859548] [] tgt_request_handle+0xaea/0x1580 [ptlrpc]
[Thu Dec 12 02:22:15 2019][245114.866591] [] ptlrpc_server_handle_request+0x24b/0xab0 [ptlrpc]
[Thu Dec 12 02:22:15 2019][245114.874420] [] ptlrpc_main+0xb2c/0x1460 [ptlrpc]
[Thu Dec 12 02:22:15 2019][245114.880837] [] kthread+0xd1/0xe0
[Thu Dec 12 02:22:15 2019][245114.885854] [] ret_from_fork_nospec_begin+0xe/0x21
[Thu Dec 12 02:22:15 2019][245114.892417] [] 0xffffffffffffffff
[Thu Dec 12 02:22:15 2019][245114.897544] LustreError: dumping log to /tmp/lustre-log.1576146135.67601
[Thu Dec 12 02:22:19 2019][245118.869229] Pid: 67927, comm: ll_ost01_075 3.10.0-957.27.2.el7_lustre.pl2.x86_64 #1 SMP Thu Nov 7 15:26:16 PST 2019
[Thu Dec 12 02:22:19 2019][245118.879747] Call Trace:
[Thu Dec 12 02:22:19 2019][245118.882317] [] wait_transaction_locked+0x85/0xd0 [jbd2]
[Thu Dec 12 02:22:19 2019][245118.889312] [] add_transaction_credits+0x268/0x2f0 [jbd2]
[Thu Dec 12 02:22:19 2019][245118.896497] [] start_this_handle+0x1a1/0x430 [jbd2]
[Thu Dec 12 02:22:19 2019][245118.903146] [] jbd2__journal_start+0xf3/0x1f0 [jbd2]
[Thu Dec 12 02:22:19 2019][245118.909893] [] __ldiskfs_journal_start_sb+0x69/0xe0 [ldiskfs]
[Thu Dec 12 02:22:19 2019][245118.917421] [] ldiskfs_acquire_dquot+0x53/0xb0 [ldiskfs]
[Thu Dec 12 02:22:19 2019][245118.924516] [] dqget+0x3fa/0x450
[Thu Dec 12 02:22:19 2019][245118.929535] [] dquot_get_dqblk+0x14/0x1f0
[Thu Dec 12 02:22:19 2019][245118.935322] [] osd_acct_index_lookup+0x235/0x480 [osd_ldiskfs]
[Thu Dec 12 02:22:19 2019][245118.942935] [] lquotactl_slv+0x27d/0x9d0 [lquota]
[Thu Dec 12 02:22:19 2019][245118.949424] [] ofd_quotactl+0x13c/0x380 [ofd]
[Thu Dec 12 02:22:19 2019][245118.955553] [] tgt_request_handle+0xaea/0x1580 [ptlrpc]
[Thu Dec 12 02:22:19 2019][245118.962615] [] ptlrpc_server_handle_request+0x24b/0xab0 [ptlrpc]
[Thu Dec 12 02:22:19 2019][245118.970419] [] ptlrpc_main+0xb2c/0x1460 [ptlrpc]
[Thu Dec 12 02:22:19 2019][245118.976845] [] kthread+0xd1/0xe0
[Thu Dec 12 02:22:19 2019][245118.981851] [] ret_from_fork_nospec_begin+0xe/0x21
[Thu Dec 12 02:22:19 2019][245118.988424] [] 0xffffffffffffffff
[Thu Dec 12 02:22:19 2019][245118.993539] LustreError: dumping log to /tmp/lustre-log.1576146139.67927
[Thu Dec 12 02:22:39 2019][245139.349639] Pid: 67852, comm: ll_ost00_059 3.10.0-957.27.2.el7_lustre.pl2.x86_64 #1 SMP Thu Nov 7 15:26:16 PST 2019
[Thu Dec 12 02:22:39 2019][245139.360174] Call Trace:
[Thu Dec 12 02:22:39 2019][245139.362753] [] wait_transaction_locked+0x85/0xd0 [jbd2]
[Thu Dec 12 02:22:39 2019][245139.369776] [] add_transaction_credits+0x268/0x2f0 [jbd2]
[Thu Dec 12 02:22:39 2019][245139.377005] [] start_this_handle+0x1a1/0x430 [jbd2]
[Thu Dec 12 02:22:39 2019][245139.383678] [] jbd2__journal_start+0xf3/0x1f0 [jbd2]
[Thu Dec 12 02:22:39 2019][245139.390454] [] __ldiskfs_journal_start_sb+0x69/0xe0 [ldiskfs]
[Thu Dec 12 02:22:39 2019][245139.398003] [] ldiskfs_acquire_dquot+0x53/0xb0 [ldiskfs]
[Thu Dec 12 02:22:39 2019][245139.405123] [] dqget+0x3fa/0x450
[Thu Dec 12 02:22:39 2019][245139.410137] [] dquot_get_dqblk+0x14/0x1f0
[Thu Dec 12 02:22:39 2019][245139.415917] [] osd_acct_index_lookup+0x235/0x480 [osd_ldiskfs]
[Thu Dec 12 02:22:39 2019][245139.423577] [] lquotactl_slv+0x27d/0x9d0 [lquota]
[Thu Dec 12 02:22:39 2019][245139.430078] [] ofd_quotactl+0x13c/0x380 [ofd]
[Thu Dec 12 02:22:39 2019][245139.436230] [] tgt_request_handle+0xaea/0x1580 [ptlrpc]
[Thu Dec 12 02:22:39 2019][245139.443270] [] ptlrpc_server_handle_request+0x24b/0xab0 [ptlrpc]
[Thu Dec 12 02:22:39 2019][245139.451097] [] ptlrpc_main+0xb2c/0x1460 [ptlrpc]
[Thu Dec 12 02:22:39 2019][245139.457509] [] kthread+0xd1/0xe0
[Thu Dec 12 02:22:39 2019][245139.462511] [] ret_from_fork_nospec_begin+0xe/0x21
[Thu Dec 12 02:22:39 2019][245139.469079] [] 0xffffffffffffffff
[Thu Dec 12 02:22:39 2019][245139.474179] LustreError: dumping log to /tmp/lustre-log.1576146159.67852
[Thu Dec 12 02:23:04 2019][245163.926129] Pid: 67634, comm: ll_ost00_018 3.10.0-957.27.2.el7_lustre.pl2.x86_64 #1 SMP Thu Nov 7 15:26:16 PST 2019
[Thu Dec 12 02:23:04 2019][245163.936663] Call Trace:
[Thu Dec 12 02:23:04 2019][245163.939239] [] wait_transaction_locked+0x85/0xd0 [jbd2]
[Thu Dec 12 02:23:04 2019][245163.946230] [] add_transaction_credits+0x268/0x2f0 [jbd2]
[Thu Dec 12 02:23:04 2019][245163.953406] [] start_this_handle+0x1a1/0x430 [jbd2]
[Thu Dec 12 02:23:04 2019][245163.960055] [] jbd2__journal_start+0xf3/0x1f0 [jbd2]
[Thu Dec 12 02:23:04 2019][245163.966804] [] __ldiskfs_journal_start_sb+0x69/0xe0 [ldiskfs]
[Thu Dec 12 02:23:04 2019][245163.974345] [] ldiskfs_acquire_dquot+0x53/0xb0 [ldiskfs]
[Thu Dec 12 02:23:04 2019][245163.981442] [] dqget+0x3fa/0x450
[Thu Dec 12 02:23:04 2019][245163.986436] [] dquot_get_dqblk+0x14/0x1f0
[Thu Dec 12 02:23:04 2019][245163.992223] [] osd_acct_index_lookup+0x235/0x480 [osd_ldiskfs]
[Thu Dec 12 02:23:04 2019][245163.999835] [] lquotactl_slv+0x27d/0x9d0 [lquota]
[Thu Dec 12 02:23:04 2019][245164.006323] [] ofd_quotactl+0x13c/0x380 [ofd]
[Thu Dec 12 02:23:04 2019][245164.012453] [] tgt_request_handle+0xaea/0x1580 [ptlrpc]
[Thu Dec 12 02:23:04 2019][245164.019498] [] ptlrpc_server_handle_request+0x24b/0xab0 [ptlrpc]
[Thu Dec 12 02:23:04 2019][245164.027300] [] ptlrpc_main+0xb2c/0x1460 [ptlrpc]
[Thu Dec 12 02:23:04 2019][245164.033744] [] kthread+0xd1/0xe0
[Thu Dec 12 02:23:04 2019][245164.038757] [] ret_from_fork_nospec_begin+0xe/0x21
[Thu Dec 12 02:23:04 2019][245164.045350] [] 0xffffffffffffffff
[Thu Dec 12 02:23:04 2019][245164.050451] LustreError: dumping log to /tmp/lustre-log.1576146184.67634
[Thu Dec 12 02:23:08 2019][245168.022259] LustreError: dumping log to /tmp/lustre-log.1576146188.67821
[Thu Dec 12 02:23:45 2019][245205.326654] Lustre: fir-OST0058: deleting orphan objects from 0x1900000401:11797482 to 0x1900000401:11797505
[Thu Dec 12 02:23:48 2019][245207.798809] Lustre: fir-OST0058: deleting orphan objects from 0x1900000402:3085959 to 0x1900000402:3085985
[Thu Dec 12 02:23:57 2019][245217.175196] LustreError: dumping log to /tmp/lustre-log.1576146237.112558
[Thu Dec 12 02:24:02 2019][245222.073473] Lustre: fir-OST005a: Client 717fa73e-8071-a76f-931e-8957a8ca32aa (at 10.9.101.41@o2ib4) reconnecting
[Thu Dec 12 02:24:02 2019][245222.083737] Lustre: Skipped 792 previous similar messages
[Thu Dec 12 02:24:32 2019][245252.465167] Lustre: fir-OST005a: deleting orphan objects from 0x0:27562200 to 0x0:27562241
[Thu Dec 12 02:24:37 2019][245256.976675] Lustre: fir-OST005c: deleting orphan objects from 0x1a00000402:1105514 to 0x1a00000402:1105537
[Thu Dec 12 02:24:50 2019][245270.424257] LustreError: dumping log to /tmp/lustre-log.1576146290.67691
[Thu Dec 12 02:24:54 2019][245274.520335] LustreError: dumping log to /tmp/lustre-log.1576146294.67632
[Thu Dec 12 02:24:54 2019][245274.584357] Lustre: 27257:0:(service.c:1372:ptlrpc_at_send_early_reply()) @@@ Couldn't add any time (5/5), not sending early reply
[Thu Dec 12 02:24:54 2019][245274.584357] req@ffff88ebc457c050 x1648527343518128/t0(0) o10->1d4d1153-82cd-6bbc-4932-1e6a2a506ca0@10.8.30.27@o2ib6:749/0 lens 440/0 e 0 to 0 dl 1576146299 ref 2 fl New:/2/ffffffff rc 0/-1
[Thu Dec 12 02:24:54 2019][245274.613172] Lustre: 27257:0:(service.c:1372:ptlrpc_at_send_early_reply()) Skipped 1630 previous similar messages
[Thu Dec 12 02:25:18 2019][245297.957831] Lustre: fir-OST005a: Connection restored to f5f4e1fb-09a1-cbb0-925c-f94d3727005b (at 10.9.101.44@o2ib4)
[Thu Dec 12 02:25:18 2019][245297.968354] Lustre: Skipped 811 previous similar messages
[Thu Dec 12 02:25:19 2019][245299.096823] LustreError: dumping log to /tmp/lustre-log.1576146319.67916
[Thu Dec 12 02:25:26 2019][245305.759932] Lustre: fir-OST0056: Export ffff88fc8fff5800 already connecting from 10.8.22.17@o2ib6
[Thu Dec 12 02:25:26 2019][245305.768898] Lustre: Skipped 71 previous similar messages
[Thu Dec 12 02:25:43 2019][245323.082001] Lustre: fir-OST005e: deleting orphan objects from 0x1a80000401:1122212 to 0x1a80000401:1122241
[Thu Dec 12 02:26:33 2019][245372.826286] LustreError: dumping log to /tmp/lustre-log.1576146393.112536
[Thu Dec 12 02:27:17 2019][245417.682987] Lustre: fir-OST0054: deleting orphan objects from 0x1800000400:3070697 to 0x1800000400:3070721
[Thu Dec 12 02:27:29 2019][245428.780099] Lustre: fir-OST0058: deleting orphan objects from 0x1900000400:1124451 to 0x1900000400:1124481
[Thu Dec 12 02:27:29 2019][245428.958687] Lustre: fir-OST005e: deleting orphan objects from 0x1a80000402:3082088 to 0x1a80000402:3082113
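The recurring "deleting orphan objects from SEQ:start to SEQ:end" lines are each OST reclaiming precreated objects that were orphaned while clients and the MDS reconnected during the stalls above; the start and end object IDs give a rough count of how much cleanup each target performed. The sketch below, not part of the original log, sums those ranges per OST; the input filename is an assumption and should point at a saved copy of this console log.

#!/usr/bin/env python3
# Illustrative analysis (not from the original log): rough per-OST count of
# orphan objects deleted, taken from the "deleting orphan objects" lines.
import re
from collections import Counter

ORPHAN_RE = re.compile(
    r"Lustre: (fir-OST[0-9a-f]+): deleting orphan objects from "
    r"(0x[0-9a-f]+):(\d+) to \2:(\d+)")

def tally_orphans(path="oss-console.log"):
    deleted = Counter()
    for line in open(path, errors="replace"):
        m = ORPHAN_RE.search(line)
        if m:
            target, _seq, start, end = m.group(1, 2, 3, 4)
            deleted[target] += int(end) - int(start)  # approximate object count
    for target, n in sorted(deleted.items()):
        print(target, n)

if __name__ == "__main__":
    tally_orphans()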
[Thu Dec 12 02:27:30 2019][245430.171442] LNet: Service thread pid 112567 was inactive for 1203.28s. The thread might be hung, or it might only be slow and will resume later. Dumping the stack trace for debugging purposes:
[Thu Dec 12 02:27:30 2019][245430.188635] LNet: Skipped 9 previous similar messages
[Thu Dec 12 02:27:30 2019][245430.193785] Pid: 112567, comm: ll_ost00_097 3.10.0-957.27.2.el7_lustre.pl2.x86_64 #1 SMP Thu Nov 7 15:26:16 PST 2019
[Thu Dec 12 02:27:30 2019][245430.204406] Call Trace:
[Thu Dec 12 02:27:30 2019][245430.206974] [] wait_transaction_locked+0x85/0xd0 [jbd2]
[Thu Dec 12 02:27:30 2019][245430.213973] [] add_transaction_credits+0x268/0x2f0 [jbd2]
[Thu Dec 12 02:27:30 2019][245430.221158] [] start_this_handle+0x1a1/0x430 [jbd2]
[Thu Dec 12 02:27:30 2019][245430.227806] [] jbd2__journal_start+0xf3/0x1f0 [jbd2]
[Thu Dec 12 02:27:30 2019][245430.234554] [] __ldiskfs_journal_start_sb+0x69/0xe0 [ldiskfs]
[Thu Dec 12 02:27:30 2019][245430.242080] [] ldiskfs_acquire_dquot+0x53/0xb0 [ldiskfs]
[Thu Dec 12 02:27:30 2019][245430.249176] [] dqget+0x3fa/0x450
[Thu Dec 12 02:27:30 2019][245430.254182] [] dquot_get_dqblk+0x14/0x1f0
[Thu Dec 12 02:27:30 2019][245430.259970] [] osd_acct_index_lookup+0x235/0x480 [osd_ldiskfs]
[Thu Dec 12 02:27:30 2019][245430.267602] [] lquotactl_slv+0x27d/0x9d0 [lquota]
[Thu Dec 12 02:27:30 2019][245430.274080] [] ofd_quotactl+0x13c/0x380 [ofd]
[Thu Dec 12 02:27:30 2019][245430.280222] [] tgt_request_handle+0xaea/0x1580 [ptlrpc]
[Thu Dec 12 02:27:30 2019][245430.287261] [] ptlrpc_server_handle_request+0x24b/0xab0 [ptlrpc]
[Thu Dec 12 02:27:30 2019][245430.295077] [] ptlrpc_main+0xb2c/0x1460 [ptlrpc]
[Thu Dec 12 02:27:30 2019][245430.301493] [] kthread+0xd1/0xe0
[Thu Dec 12 02:27:30 2019][245430.306507] [] ret_from_fork_nospec_begin+0xe/0x21
[Thu Dec 12 02:27:30 2019][245430.313070] [] 0xffffffffffffffff
[Thu Dec 12 02:27:30 2019][245430.318184] LustreError: dumping log to /tmp/lustre-log.1576146450.112567
[Thu Dec 12 02:27:37 2019][245436.852255] Lustre: fir-OST0054: deleting orphan objects from 0x1800000402:1117514 to 0x1800000402:1117537
[Thu Dec 12 02:27:55 2019][245454.747927] Pid: 66253, comm: ll_ost00_003 3.10.0-957.27.2.el7_lustre.pl2.x86_64 #1 SMP Thu Nov 7 15:26:16 PST 2019
[Thu Dec 12 02:27:55 2019][245454.758449] Call Trace:
[Thu Dec 12 02:27:55 2019][245454.761018] [] wait_transaction_locked+0x85/0xd0 [jbd2]
[Thu Dec 12 02:27:55 2019][245454.768017] [] add_transaction_credits+0x268/0x2f0 [jbd2]
[Thu Dec 12 02:27:55 2019][245454.775203] [] start_this_handle+0x1a1/0x430 [jbd2]
[Thu Dec 12 02:27:55 2019][245454.781847] [] jbd2__journal_start+0xf3/0x1f0 [jbd2]
[Thu Dec 12 02:27:55 2019][245454.788597] [] __ldiskfs_journal_start_sb+0x69/0xe0 [ldiskfs]
[Thu Dec 12 02:27:55 2019][245454.796124] [] ldiskfs_acquire_dquot+0x53/0xb0 [ldiskfs]
[Thu Dec 12 02:27:55 2019][245454.803218] [] dqget+0x3fa/0x450
[Thu Dec 12 02:27:55 2019][245454.808221] [] dquot_get_dqblk+0x14/0x1f0
[Thu Dec 12 02:27:55 2019][245454.813992] [] osd_acct_index_lookup+0x235/0x480 [osd_ldiskfs]
[Thu Dec 12 02:27:55 2019][245454.